# miniml.optim

Optimization algorithms for MiniML models.
## ScipyOptimizer

Bases: `MiniMLOptimizer`

Optimizer that wraps `scipy.optimize.minimize` and supports the following methods:
- 'Nelder-Mead'
- 'Powell'
- 'CG'
- 'BFGS'
- 'L-BFGS-B'
- 'Newton-CG'
- 'trust-ncg'
- 'trust-krylov'
- 'trust-constr'
- 'dogleg'
- 'trust-exact'
- 'COBYLA'
### `__init__(method='L-BFGS-B', options={}, tol=None)`

Initialize the ScipyOptimizer.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| method | str | The optimization method to use. | 'L-BFGS-B' |
| options | dict | Options to pass to `scipy.optimize.minimize`. | {} |
| tol | float \| None | Tolerance for termination. | None |
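Since ScipyOptimizer forwards `method`, `options`, and `tol` to `scipy.optimize.minimize`, the delegated call can be sketched directly with SciPy. This is an illustration of the underlying call, not the MiniML-facing API, whose exact minimize entry point is not shown in this section.

```python
import numpy as np
from scipy.optimize import minimize

# A simple convex loss with a known minimum at (1, -2).
def loss(x):
    return (x[0] - 1.0) ** 2 + 10.0 * (x[1] + 2.0) ** 2

# The same method/options/tol triplet that ScipyOptimizer accepts
# is passed straight through to scipy.optimize.minimize.
result = minimize(loss, x0=np.zeros(2), method='L-BFGS-B',
                  options={'maxiter': 200}, tol=1e-8)
```

Any of the methods listed above (e.g. 'Nelder-Mead', 'BFGS') can be substituted for 'L-BFGS-B'; gradient-based methods will use SciPy's finite-difference approximation when no gradient is supplied.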
## AdamOptimizer

Bases: `MiniMLOptimizer`

Adaptive Moment Estimation (Adam) optimizer.
References

- Diederik P. Kingma and Jimmy Ba. "Adam: A Method for Stochastic Optimization." ICLR 2015. arXiv:1412.6980.
### `__init__(alpha=0.001, beta_1=0.9, beta_2=0.999, eps=1e-08, tol=0.0, maxiter=1000)`

Initialize the Adam optimizer.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| alpha | float | Learning rate. | 0.001 |
| beta_1 | float | Exponential decay rate for the first moment estimates. | 0.9 |
| beta_2 | float | Exponential decay rate for the second moment estimates. | 0.999 |
| eps | float | Small constant for numerical stability. | 1e-08 |
| tol | float | Tolerance for stopping criterion based on the norm of the first moment. | 0.0 |
| maxiter | int | Maximum number of iterations. | 1000 |
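For reference, the update rule from Kingma and Ba's paper can be sketched in NumPy with the same hyperparameter names and defaults as above. This is an illustrative standalone implementation, not MiniML's internal code; `adam_minimize` and its gradient-function argument are hypothetical names for this sketch.

```python
import numpy as np

def adam_minimize(grad, x0, alpha=0.001, beta_1=0.9, beta_2=0.999,
                  eps=1e-8, tol=0.0, maxiter=1000):
    """Minimize a function given its gradient, following Kingma & Ba (2015)."""
    x = np.asarray(x0, dtype=float).copy()
    m = np.zeros_like(x)  # first moment estimate (mean of gradients)
    v = np.zeros_like(x)  # second moment estimate (uncentered variance)
    for t in range(1, maxiter + 1):
        g = grad(x)
        m = beta_1 * m + (1 - beta_1) * g
        v = beta_2 * v + (1 - beta_2) * g * g
        m_hat = m / (1 - beta_1 ** t)  # bias-corrected first moment
        v_hat = v / (1 - beta_2 ** t)  # bias-corrected second moment
        x -= alpha * m_hat / (np.sqrt(v_hat) + eps)
        # Stop once the norm of the first moment falls below tol
        # (disabled at the default tol=0.0, since the norm is never negative).
        if np.linalg.norm(m) < tol:
            break
    return x
```

Note that with the default `tol=0.0` the stopping test never fires and the optimizer always runs for `maxiter` iterations.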