
Index

__all__ module-attribute

__all__ = [
    "FitnessBase",
    "LBFGSBOptimizer",
    "NelderMeadOptimizer",
    "optimizer_registry",
    "OptimizerBase",
    "OptimizerResult",
    "PowellOptimizer",
    "PYCMAOptimizer",
    "PYCMAOptimizerResult",
    "ScipyOptimizerResult",
    "TNCOptimizer",
]

optimizer_registry module-attribute

optimizer_registry: Registry[str, Type[OptimizerBase]] = Registry()
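The registry maps optimizer names to optimizer classes. A minimal sketch of the lookup-and-registration pattern, using a hypothetical dict-backed `Registry` stand-in (the library's actual `Registry` API may differ):

```python
from typing import Generic, TypeVar

K = TypeVar("K")
V = TypeVar("V")

class Registry(dict, Generic[K, V]):
    """Illustrative stand-in: a dict with a register() decorator."""

    def register(self, key: K):
        def decorator(value: V) -> V:
            self[key] = value   # map the name to the class
            return value
        return decorator

optimizer_registry: Registry[str, type] = Registry()

@optimizer_registry.register("nelder-mead")
class NelderMeadStub:   # placeholder class, for illustration only
    pass

print(optimizer_registry["nelder-mead"].__name__)  # NelderMeadStub
```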

FitnessBase

Bases: HashableBaseModelIO

__call__

__call__(x: Iterable | dict, *args, **kwargs) -> float

Evaluate the fitness function.

create classmethod

create(
    fitness: Union["FitnessBase", Type["FitnessBase"], str, None] = None,
    **kwargs
) -> "FitnessBase"

Create a fitness object from the input.

PARAMETER DESCRIPTION
fitness

Custom fitness object, fitness class, or the full name of the class; by default None

TYPE: 'FitnessBase' | Type['FitnessBase'] | str DEFAULT: None

kwargs

Additional keyword arguments to pass to the fitness object

DEFAULT: {}

RETURNS DESCRIPTION
FitnessBase

The fitness object
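`create` accepts an instance, a class, a class name, or nothing. A sketch of the presumed dispatch, with a minimal `FitnessBase` stand-in (the real implementation may differ in details such as name resolution):

```python
import importlib
from typing import Type, Union

class FitnessBase:
    """Illustrative stand-in for the documented FitnessBase."""

    def __init__(self, **kwargs):
        self.kwargs = kwargs

    @classmethod
    def create(cls,
               fitness: Union["FitnessBase", Type["FitnessBase"], str, None] = None,
               **kwargs) -> "FitnessBase":
        if fitness is None:
            return cls(**kwargs)          # default: instantiate this class
        if isinstance(fitness, FitnessBase):
            return fitness                # already an instance: pass through
        if isinstance(fitness, type) and issubclass(fitness, FitnessBase):
            return fitness(**kwargs)      # a class: instantiate it
        if isinstance(fitness, str):
            # Assumed: resolve a dotted name like "pkg.mod.MyFitness".
            module_name, _, class_name = fitness.rpartition(".")
            klass = getattr(importlib.import_module(module_name), class_name)
            return klass(**kwargs)
        raise TypeError(f"Cannot create a fitness from {fitness!r}")

class MyFitness(FitnessBase):
    pass

obj = FitnessBase.create(MyFitness)   # class in, instance out
```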

evaluate

evaluate(
    results: Dict,
) -> Tuple[Dict[str, Dict[str, float]], Dict[str, Dict[str, float]]]

Evaluate the fitness function and return the fitness values and weights

lineOption

lineOption(
    option: str,
    experiment_name: str,
    *,
    option_name: str = "line_{option}",
    experiment_option_name="line_experiment_{option}"
) -> Any

Get line options or experiment-specific line options.

match classmethod

match(name: str, pattern: str | List[str]) -> bool

Check if the name matches any of the given patterns.
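A sketch of such a matcher, assuming glob-style patterns via the standard `fnmatch` module (the library may use different pattern semantics, e.g. regexes):

```python
from fnmatch import fnmatch
from typing import List, Union

def match(name: str, pattern: Union[str, List[str]]) -> bool:
    """Return True if `name` matches any of the given glob-style patterns."""
    patterns = [pattern] if isinstance(pattern, str) else pattern
    return any(fnmatch(name, p) for p in patterns)

print(match("experiment_3", ["experiment_*", "control"]))  # True
```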

parseOptimizeKeys

parseOptimizeKeys()

Parse the keys of the fitness function

run

run(
    x: Iterable | dict, *args, return_results: bool = False, **kwargs
) -> float | Tuple[float, Dict[str, Any]]

Evaluate the fitness function.
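The `return_results` flag switches between returning only the fitness value and returning the value together with the raw results. A hypothetical sketch of that calling convention (the stand-in body below is not the library's actual evaluation logic):

```python
from typing import Any, Dict, Iterable, Tuple, Union

def run(x: Iterable, *, return_results: bool = False
        ) -> Union[float, Tuple[float, Dict[str, Any]]]:
    results = {"sum": sum(x)}        # stand-in for simulation output
    fitness = float(results["sum"])  # stand-in for the fitness value
    return (fitness, results) if return_results else fitness

print(run([1, 2]))                       # 3.0
print(run([1, 2], return_results=True))  # (3.0, {'sum': 3})
```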

setup

setup(**kwargs) -> Self

Set up the arguments of the fitness function

simulate

simulate(x: Iterable | dict) -> Dict

Simulate the model and return the results as a dictionary of SVExporter objects. If the simulation fails, return False instead.

LBFGSBOptimizer

Bases: ScipyOptimizer

method class-attribute instance-attribute

method = 'L-BFGS-B'

options instance-attribute

NelderMeadOptimizer

Bases: ScipyOptimizer

method class-attribute instance-attribute

method = 'Nelder-Mead'

options instance-attribute

OptimizerBase

Base class for optimizers

args instance-attribute

args: tuple = args

Additional arguments for the objective function

bounds instance-attribute

bounds: ndarray | None = bounds

Parameter bounds

defaultOptions class-attribute instance-attribute

defaultOptions: Dict = {}

n_obj class-attribute instance-attribute

n_obj: int = 1

Number of objectives

objective_function instance-attribute

objective_function: FitnessBase | Callable = objective_function

Objective function to optimize; its signature must be f(x: Iterable) -> float

options instance-attribute

options: OptimizerOptions = OptimizerOptions(defaultOptions)

Optimizer options

user_callback instance-attribute

user_callback: Callable[[Iterable], Any] | None = callback

Callback function that will be evaluated after each iteration

x0 instance-attribute

x0: ndarray = x0

Initial parameters

__init__

__init__(
    objective_function: FitnessBase | Callable,
    x0: ndarray,
    *,
    args: tuple = (),
    bounds: ndarray | None = None,
    callback: Callable[[Iterable], Any] | None = None,
    maxiter: int = 1000,
    maxfun: int = 10000,
    **options
)

Initialize the optimizer

PARAMETER DESCRIPTION
objective_function

Objective function

TYPE: FitnessBase | Callable

x0

Initial parameters

TYPE: ndarray

args

Additional arguments for the objective function

TYPE: tuple DEFAULT: ()

bounds

Bounds of the parameters

TYPE: ndarray | None DEFAULT: None

maxiter

Maximum number of iterations

TYPE: int DEFAULT: 1000

maxfun

Maximum number of function evaluations

TYPE: int DEFAULT: 10000

callback

Callback function that will be evaluated after each iteration

TYPE: Callable DEFAULT: None

options

Optimizer-specific options

TYPE: int | float | bool | dict | Any DEFAULT: {}
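The documented signature can be exercised with a toy subclass. The sketch below is illustrative only: plain lists stand in for ndarrays, `SketchResult` mirrors a few `OptimizerResult` fields, and the fixed-step coordinate descent is a stand-in for a real algorithm:

```python
from typing import Any, Callable, Iterable, List, Optional

class SketchResult:
    """Minimal result holder mirroring a few OptimizerResult fields."""
    def __init__(self):
        self.x: List[float] = []
        self.fun: float = 0.0
        self.nit: int = 0
        self.nfev: int = 0

class CoordinateDescentOptimizer:
    """Toy optimizer following the documented __init__ signature."""

    def __init__(self, objective_function: Callable, x0: List[float], *,
                 args: tuple = (), bounds=None,
                 callback: Optional[Callable[[Iterable], Any]] = None,
                 maxiter: int = 1000, maxfun: int = 10000, **options):
        self.objective_function = objective_function
        self.x0 = x0
        self.args = args
        self.bounds = bounds
        self.user_callback = callback
        self.maxiter = maxiter
        self.maxfun = maxfun
        self.options = options

    def optimize(self) -> SketchResult:
        x = list(self.x0)
        f = self.objective_function(x, *self.args)
        nfev, nit = 1, 0
        for nit in range(1, self.maxiter + 1):
            improved = False
            for i in range(len(x)):
                for step in (0.1, -0.1):   # fixed steps keep the sketch short
                    if nfev >= self.maxfun:
                        break
                    trial = list(x)
                    trial[i] += step
                    ft = self.objective_function(trial, *self.args)
                    nfev += 1
                    if ft < f:
                        x, f, improved = trial, ft, True
            if self.user_callback is not None:
                self.user_callback(x)      # evaluated after each iteration
            if not improved:
                break
        res = SketchResult()
        res.x, res.fun, res.nit, res.nfev = x, f, nit, nfev
        return res

opt = CoordinateDescentOptimizer(lambda x: sum(v * v for v in x),
                                 [1.0, -1.0], maxiter=50)
res = opt.optimize()   # converges near the origin on the sphere function
```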

callback

callback(x: ndarray | Any)

Callback function that will be evaluated after each iteration

fmin classmethod

fmin(*args, **kwargs) -> OptimizerResult

Class method that runs the optimization

PARAMETER DESCRIPTION
args

Positional arguments for the optimizer

DEFAULT: ()

kwargs

Keyword arguments for the optimizer

DEFAULT: {}

RETURNS DESCRIPTION
res

Optimization result

TYPE: OptimizerResult
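`fmin` presumably wraps construction and optimization in a single call. A sketch of that pattern with a trivial stand-in optimizer (the library's actual argument handling may differ):

```python
class ToyOptimizer:
    """Illustrative stand-in showing the construct-then-optimize pattern."""

    def __init__(self, objective_function, x0, **options):
        self.objective_function = objective_function
        self.x0 = x0
        self.options = options

    def optimize(self):
        # Trivial "optimization": evaluate at x0 and report it.
        return {"x": self.x0, "fun": self.objective_function(self.x0)}

    @classmethod
    def fmin(cls, *args, **kwargs):
        # One-shot helper: build the optimizer and run it immediately.
        return cls(*args, **kwargs).optimize()

res = ToyOptimizer.fmin(lambda x: sum(v * v for v in x), [3.0, 4.0])
print(res["fun"])  # 25.0
```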

optimize

optimize() -> OptimizerResult

Optimize the objective function

setup

setup()

Perform post-initialization setup of the optimizer

OptimizerResult

Bases: HashableBaseModelIO

Optimization result for the optimizers

duration class-attribute instance-attribute

duration: float = 0.0

Duration

experiments class-attribute instance-attribute

experiments: Dict[str, Any] = {}

experiments

fitness class-attribute instance-attribute

fitness: Dict[str, Dict[str, float]] = {}

Fitness for all experiments and lines

fitness_options class-attribute instance-attribute

fitness_options: Dict[str, Any] = {}

Fitness options

fun class-attribute instance-attribute

fun: float = 0.0

Value of the objective function

nfev class-attribute instance-attribute

nfev: int = 0

Number of evaluations of the objective function

nit class-attribute instance-attribute

nit: int = 0

Number of iterations performed by the optimizer.

optimize_keys class-attribute instance-attribute

optimize_keys: List[str] = []

Optimize keys

optimizer_options class-attribute instance-attribute

optimizer_options: Dict[str, Any] = {}

Optimizer options

options class-attribute instance-attribute

options: Dict[str, Any] = {}

Calculation options

parameters class-attribute instance-attribute

parameters: Dict[str, Any] = {}

A dict of parameters

stds class-attribute instance-attribute

stds: List[float] = []

Standard deviations of the parameters

weights class-attribute instance-attribute

weights: Dict[str, Dict[str, float]] = {}

Weights for all experiments and lines

x class-attribute instance-attribute

x: List[float] = []

The solution of the optimization.
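The fields above can be mirrored in a small dataclass to show how a result is typically consumed. Everything below is illustrative; the real class is a `HashableBaseModelIO` subclass, not a dataclass:

```python
from dataclasses import dataclass, field
from typing import Any, Dict, List

@dataclass
class OptimizerResultSketch:
    """Dataclass mirroring a few documented OptimizerResult fields."""
    x: List[float] = field(default_factory=list)   # solution
    fun: float = 0.0                               # objective value
    nit: int = 0                                   # iterations performed
    nfev: int = 0                                  # function evaluations
    duration: float = 0.0                          # duration
    parameters: Dict[str, Any] = field(default_factory=dict)
    fitness: Dict[str, Dict[str, float]] = field(default_factory=dict)
    weights: Dict[str, Dict[str, float]] = field(default_factory=dict)

res = OptimizerResultSketch(x=[0.5], fun=0.25, nit=12, nfev=40, duration=1.5)
summary = f"f={res.fun:.3g} after {res.nit} iterations ({res.nfev} evaluations)"
print(summary)  # f=0.25 after 12 iterations (40 evaluations)
```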

PYCMAOptimizer

Bases: OptimizerBase

defaultOptions class-attribute instance-attribute

defaultOptions: Dict = dict(min_iterations=10, n_jobs=1, verb_disp=100)

options instance-attribute

stds class-attribute instance-attribute

stds: ndarray | None = None

Standard deviations

callback

callback(x: 'CMAEvolutionStrategy')

optimize

optimize() -> PYCMAOptimizerResult

setup

setup()

PYCMAOptimizerResult

Bases: OptimizerResult

Optimization result for the PYCMA package; used only for type annotation

evals_best class-attribute instance-attribute

evals_best: int = 0

Number of function evaluations

sigma class-attribute instance-attribute

sigma: float = 0.0

Overall standard deviation

stop class-attribute instance-attribute

stop: Dict[str, Any] = {}

Termination criteria that stopped the optimizer

PowellOptimizer

Bases: ScipyOptimizer

method class-attribute instance-attribute

method = 'Powell'

options instance-attribute

ScipyOptimizerResult

Bases: OptimizerResult

Represents the optimization result for SciPy optimization algorithms

Notes
``OptimizeResult`` may have additional attributes not listed here depending
on the specific solver being used. Since this class is essentially a
subclass of dict with attribute accessors, one can see which
attributes are available using the `OptimizeResult.keys` method.
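The "subclass of dict with attribute accessors" idea can be shown with a minimal stand-in (this is not SciPy's actual `OptimizeResult` implementation, just the access pattern it supports):

```python
class AttrDict(dict):
    """Minimal dict with attribute access, mimicking the OptimizeResult idea."""

    def __getattr__(self, name):
        try:
            return self[name]        # attribute access reads the dict entry
        except KeyError:
            raise AttributeError(name)

res = AttrDict(fun=0.5, nit=7, success=True)
print(sorted(res.keys()))  # ['fun', 'nit', 'success'] -- discover attributes
print(res.fun)             # 0.5 -- same data via attribute access
```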

hess class-attribute instance-attribute

hess: List[List[float]] = []

Values of the Hessian. The Hessian may be an approximation; see the documentation of the solver in question.

jac class-attribute instance-attribute

jac: List[float] = []

Values of the Jacobian

maxcv class-attribute instance-attribute

maxcv: float = 0.0

The maximum constraint violation.

message class-attribute instance-attribute

message: str = ''

Description of the cause of the termination.

nhev class-attribute instance-attribute

nhev: int = 0

Number of evaluations of the Hessian

njev class-attribute instance-attribute

njev: int = 0

Number of evaluations of the Jacobian

status class-attribute instance-attribute

status: int = -1

Termination status of the optimizer. Its value depends on the underlying solver. Refer to message for details.

success class-attribute instance-attribute

success: bool = False

Whether the optimizer exited successfully.

TNCOptimizer

Bases: ScipyOptimizer

method class-attribute instance-attribute

method = 'TNC'

options instance-attribute