leaspy.algo#

Submodules#

Attributes#

Classes#

AlgorithmName

The available algorithms in Leaspy.

AlgorithmType

The type of the algorithms.

BaseAlgorithm

Base class containing common methods for all algorithm classes.

AlgorithmSettings

Used to set the algorithms' settings.

OutputsSettings

Used to create the logs folder to monitor the convergence of the fit algorithm.

Functions#

algorithm_factory(settings)

Return the requested algorithm based on the provided settings.

get_algorithm_class(name)

Return the algorithm class.

get_algorithm_type(name)

Return the algorithm type.

Package Contents#

class AlgorithmName#

Bases: str, enum.Enum

The available algorithms in Leaspy.

FIT_MCMC_SAEM = 'mcmc_saem'#
FIT_LME = 'lme_fit'#
PERSONALIZE_SCIPY_MINIMIZE = 'scipy_minimize'#
PERSONALIZE_MEAN_POSTERIOR = 'mean_posterior'#
PERSONALIZE_MODE_POSTERIOR = 'mode_posterior'#
PERSONALIZE_CONSTANT = 'constant_prediction'#
PERSONALIZE_LME = 'lme_personalize'#
SIMULATE = 'simulate'#
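Because AlgorithmName subclasses both str and enum.Enum, algorithm names can be passed either as plain strings or as enum members. A minimal sketch of that pattern, using a trimmed, hypothetical AlgoName for illustration (not the actual leaspy class):

```python
from enum import Enum

# Sketch of the str-backed Enum pattern behind AlgorithmName, trimmed to
# two members: because members also subclass str, they compare equal to
# their plain-string values, so APIs can accept either form.
class AlgoName(str, Enum):
    FIT_MCMC_SAEM = "mcmc_saem"
    PERSONALIZE_SCIPY_MINIMIZE = "scipy_minimize"

# A raw string round-trips to the corresponding member...
assert AlgoName("mcmc_saem") is AlgoName.FIT_MCMC_SAEM
# ...and members compare equal to plain strings.
assert AlgoName.PERSONALIZE_SCIPY_MINIMIZE == "scipy_minimize"
```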
class AlgorithmType#

Bases: str, enum.Enum

The type of the algorithms.

FIT = 'fit'#
PERSONALIZE = 'personalize'#
SIMULATE = 'simulate'#
class BaseAlgorithm(settings)#

Bases: abc.ABC, Generic[ModelType, ReturnType]

Base class containing common methods for all algorithm classes.

Parameters:
settings : AlgorithmSettings

The specifications of the algorithm, as an AlgorithmSettings instance.

Attributes:
name : AlgorithmName

Name of the algorithm.

family : AlgorithmType

Family of the algorithm.

deterministic : bool

True if and only if the algorithm does not involve randomness; setting a seed has no effect on such algorithms.

algo_parameters : dict

Contains the algorithm’s parameters. Those are controlled by the leaspy.algo.AlgorithmSettings.parameters class attribute.

seed : int, optional

Seed used by numpy and torch.

Parameters:

settings (AlgorithmSettings)

name: AlgorithmName = None#
family: AlgorithmType = None#
deterministic: bool = False#
seed#
algo_parameters#
output_manager = None#
abstract set_output_manager(output_settings)#
Parameters:

output_settings (OutputsSettings)

Return type:

None

run(model, dataset=None, **kwargs)#

Main method: runs the algorithm.

Parameters:
model : BaseModel

The model to use.

dataset : Dataset

Contains all the subjects’ observations with corresponding timepoints, in torch format to speed up computations.

Returns:
ReturnType:

Depends on algorithm class.

Parameters:
  • model (ModelType)

  • dataset (Optional[Dataset])

Return type:

ReturnType

See also

AbstractFitAlgo
AbstractPersonalizeAlgo
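The Generic[ModelType, ReturnType] parameterization above means each concrete algorithm fixes both the model type it consumes and the type run returns. A sketch of that pattern with illustrative names (not the actual leaspy code):

```python
from abc import ABC, abstractmethod
from typing import Generic, Optional, TypeVar

ModelType = TypeVar("ModelType")
ReturnType = TypeVar("ReturnType")

class SketchAlgorithm(ABC, Generic[ModelType, ReturnType]):
    """Illustrative stand-in for BaseAlgorithm: subclasses pin down
    both type parameters, so run() has a precise signature."""

    @abstractmethod
    def run(self, model: ModelType, dataset: Optional[object] = None, **kwargs) -> ReturnType:
        ...

class MeanAlgorithm(SketchAlgorithm[list, float]):
    """A toy 'fit' algorithm: consumes a list of floats, returns their mean."""

    def run(self, model: list, dataset: Optional[object] = None, **kwargs) -> float:
        return sum(model) / len(model)

assert MeanAlgorithm().run([1.0, 2.0, 3.0]) == 2.0
```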
load_parameters(parameters)#

Update the algorithm’s parameters by the ones in the given dictionary.

Keys in the input that do not belong to the algorithm's parameters are ignored.

Parameters:
parameters : dict

Contains the pairs (key, value) of the requested parameters.

Parameters:

parameters (dict)

Examples

>>> from leaspy.algo import AlgorithmSettings, algorithm_factory, OutputsSettings
>>> my_algo = algorithm_factory(AlgorithmSettings("mcmc_saem"))
>>> my_algo.algo_parameters
{'progress_bar': True,
'n_iter': 10000,
'n_burn_in_iter': 9000,
'n_burn_in_iter_frac': 0.9,
'burn_in_step_power': 0.8,
'random_order_variables': True,
'sampler_ind': 'Gibbs',
'sampler_ind_params': {'acceptation_history_length': 25,
'mean_acceptation_rate_target_bounds': [0.2, 0.4],
'adaptive_std_factor': 0.1},
'sampler_pop': 'Gibbs',
'sampler_pop_params': {'random_order_dimension': True,
'acceptation_history_length': 25,
'mean_acceptation_rate_target_bounds': [0.2, 0.4],
'adaptive_std_factor': 0.1},
'annealing': {'do_annealing': False,
'initial_temperature': 10,
'n_plateau': 10,
'n_iter': None,
'n_iter_frac': 0.5}}
>>> parameters = {'n_iter': 5000, 'n_burn_in_iter': 4000}
>>> my_algo.load_parameters(parameters)
>>> my_algo.algo_parameters
{'progress_bar': True,
'n_iter': 5000,
'n_burn_in_iter': 4000,
'n_burn_in_iter_frac': 0.9,
'burn_in_step_power': 0.8,
'random_order_variables': True,
'sampler_ind': 'Gibbs',
'sampler_ind_params': {'acceptation_history_length': 25,
'mean_acceptation_rate_target_bounds': [0.2, 0.4],
'adaptive_std_factor': 0.1},
'sampler_pop': 'Gibbs',
'sampler_pop_params': {'random_order_dimension': True,
'acceptation_history_length': 25,
'mean_acceptation_rate_target_bounds': [0.2, 0.4],
'adaptive_std_factor': 0.1},
'annealing': {'do_annealing': False,
'initial_temperature': 10,
'n_plateau': 10,
'n_iter': None,
'n_iter_frac': 0.5}}
algorithm_factory(settings)#

Return the requested algorithm based on the provided settings.

Parameters:
settings : leaspy.algo.AlgorithmSettings

The algorithm settings.

Returns:
algorithm : child class of BaseAlgorithm

The requested algorithm, compatible with its algorithm family.

Parameters:

settings (AlgorithmSettings)

Return type:

BaseAlgorithm
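A settings-driven factory of this kind is typically a name-to-class lookup followed by instantiation. A hypothetical sketch with fake classes, not the actual leaspy implementation:

```python
# Hypothetical registry-based factory sketch: map algorithm names to
# classes, then instantiate the matching class with the settings object.
class FakeSettings:
    def __init__(self, name):
        self.name = name

class FakeMcmcSaem:
    def __init__(self, settings):
        self.settings = settings

_REGISTRY = {"mcmc_saem": FakeMcmcSaem}

def sketch_algorithm_factory(settings):
    # Unknown names fail loudly, mirroring the documented error behaviour.
    try:
        cls = _REGISTRY[settings.name]
    except KeyError:
        raise ValueError(f"Unknown algorithm: {settings.name!r}")
    return cls(settings)

algo = sketch_algorithm_factory(FakeSettings("mcmc_saem"))
assert isinstance(algo, FakeMcmcSaem)
```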

get_algorithm_class(name)#

Return the algorithm class.

Parameters:
name : str or AlgorithmName

The name of the algorithm.

Returns:
algorithm class : BaseAlgorithm

Parameters:

name (Union[str, AlgorithmName])

Return type:

Type[BaseAlgorithm]

get_algorithm_type(name)#

Return the algorithm type.

Parameters:
name : str or AlgorithmName

The name of the algorithm.

Returns:
algorithm type : leaspy.algo.AlgorithmType

Parameters:

name (Union[str, AlgorithmName])

Return type:

AlgorithmType
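One plausible way to resolve the type from the name is to group the AlgorithmName values by family, mirroring the enum members listed above. A sketch of that lookup (illustrative, not the actual implementation):

```python
# Sketch: resolve an algorithm's family from its name, mirroring the
# AlgorithmName -> AlgorithmType grouping documented in this module.
_FIT = {"mcmc_saem", "lme_fit"}
_PERSONALIZE = {"scipy_minimize", "mean_posterior", "mode_posterior",
                "constant_prediction", "lme_personalize"}

def sketch_get_algorithm_type(name: str) -> str:
    if name in _FIT:
        return "fit"
    if name in _PERSONALIZE:
        return "personalize"
    if name == "simulate":
        return "simulate"
    raise ValueError(f"Unknown algorithm name: {name!r}")

assert sketch_get_algorithm_type("mcmc_saem") == "fit"
assert sketch_get_algorithm_type("scipy_minimize") == "personalize"
```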

class AlgorithmSettings(name, **kwargs)#

Used to set the algorithms’ settings.

All parameters except the algorithm name have default values, which can be overwritten by the user.

Parameters:
name : str

Algorithm name. Accepted values:

  • fit: "mcmc_saem" or "lme_fit" (LME only)

  • personalize: "scipy_minimize", "mean_posterior", "mode_posterior", "constant_prediction" (constant model only), "lme_personalize" (LME only)

  • simulate: "simulate"

**kwargs

Optional algorithm-specific settings. Common keys:

  • seed (int | None): seed for stochastic algorithms.

  • algorithm_initialization_method (str | None): strategy name accepted by the target algorithm.

  • n_iter (int | None): number of iterations (no auto stopping for MCMC SAEM).

  • n_burn_in_iter (int | None): burn-in iterations for MCMC SAEM.

  • use_jacobian (bool): use the Jacobian in scipy_minimize to switch to L-BFGS (default True).

  • n_jobs (int): joblib parallelism for scipy_minimize (default 1).

  • progress_bar (bool): show a progress bar (default True).

  • device (int | torch.device | str): computation device for algorithms that support it.

Refer to each algorithm documentation in leaspy.algo for the full list of supported parameters.

Attributes:
name : str

The algorithm’s name.

algorithm_initialization_method : str, optional

The algorithm initialization method, among those available for the given algorithm (refer to its documentation in leaspy.algo).

seed : int, optional, default None

Used for stochastic algorithms.

parameters : dict

Contains the other parameters: n_iter, n_burn_in_iter, use_jacobian, n_jobs and progress_bar.

logs : OutputsSettings, optional

Used to create a logs folder containing convergence information during model fitting.

device : str (or torch.device), optional, default 'cpu'

Specifies the computation device. Only "cpu" and "cuda" are supported; indexed CUDA devices (such as "cuda:1") are not. To select a specific CUDA device index, use the CUDA_VISIBLE_DEVICES environment variable.

Raises:
LeaspyAlgoInputError

Parameters:

name (str)

Notes

Developers can use _dynamic_default_parameters to define settings that depend on other parameters when not explicitly specified by the user.
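For instance, in the default MCMC-SAEM parameters shown under load_parameters, n_burn_in_iter (9000) is consistent with n_burn_in_iter_frac (0.9) of n_iter (10000). A hypothetical sketch of such a dynamic default (illustrative rule only, not the actual _dynamic_default_parameters code):

```python
def resolve_burn_in(n_iter, n_burn_in_iter=None, n_burn_in_iter_frac=0.9):
    """If the burn-in length is not given explicitly, derive it from
    the fraction of total iterations (illustrative dynamic default)."""
    if n_burn_in_iter is not None:
        # An explicit user value always wins over the derived default.
        return n_burn_in_iter
    return int(n_burn_in_iter_frac * n_iter)

assert resolve_burn_in(10000) == 9000                       # derived from the fraction
assert resolve_burn_in(10000, n_burn_in_iter=4000) == 4000  # explicit value wins
```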

name: AlgorithmName#
parameters: leaspy.utils.typing.KwargsType | None = None#
seed: int | None = None#
algorithm_initialization_method: str | None = None#
logs: OutputsSettings | None = None#
check_consistency()#

Check internal consistency of algorithm settings and warn or raise a LeaspyAlgoInputError if not.

Return type:

None

classmethod load(path_to_algorithm_settings)#

Instantiate an AlgorithmSettings object from a JSON file.

Parameters:
path_to_algorithm_settings : str

Path to the JSON file.

Returns:
AlgorithmSettings

An instance of AlgorithmSettings with the specified parameters.

Raises:
LeaspyAlgoInputError

If anything is invalid in the algorithm settings.

Parameters:

path_to_algorithm_settings (Union[str, Path])

Examples

>>> from leaspy.algo import AlgorithmSettings
>>> leaspy_univariate = AlgorithmSettings.load('outputs/leaspy-univariate_model-settings.json')
save(path, **kwargs)#

Save an AlgorithmSettings object to a JSON file.

TODO? save leaspy version as well for retro/future-compatibility issues?

Parameters:
path : str

Path where the AlgorithmSettings will be stored.

kwargs : dict

Keyword arguments for the json.dump method. Default: dict(indent=2).

Parameters:

path (Union[str, Path])

Examples

>>> from leaspy.algo import AlgorithmSettings
>>> settings = AlgorithmSettings("scipy_minimize", seed=42)
>>> settings.save("outputs/scipy_minimize-settings.json")
set_logs(**kwargs)#

Use this method to monitor the convergence of a model fit.

This method creates CSV files and plots to track the evolution of population parameters (i.e., fixed effects) during the fitting.

Parameters:
**kwargs

Supported keys:

  • path (str | None): folder where graphs and CSV files will be saved. If None, DEFAULT_LOGS_DIR is used.

  • print_periodicity (int | None): print every N iterations (default 100 when provided).

  • save_periodicity (int | None): save CSV values every N iterations (default 50 when provided).

  • plot_periodicity (int | None): generate plots every N iterations; must be a multiple of save_periodicity (default 1000).

  • plot_patient_periodicity (int | None): frequency for saving patient reconstructions.

  • plot_sourcewise (bool): plot source-based parameters per source (default False).

  • overwrite_logs_folder (bool): overwrite existing content in path (default False).

  • nb_of_patients_to_plot (int): number of patients to plot when plotting is enabled (default 5).

Raises:
LeaspyAlgoInputError

If the folder given in path already exists and overwrite_logs_folder is False.

Notes

By default, if the folder given in path already exists, the method will raise an error. To overwrite the content of the folder, set overwrite_logs_folder to True.
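Because plots are generated from saved CSV values, plot_periodicity must be a multiple of save_periodicity. A sketch of that consistency check (hypothetical helper, not the actual leaspy code):

```python
def check_log_periodicities(save_periodicity, plot_periodicity):
    """Raise if plotting would be requested at iterations where no
    values were saved: plots must align with CSV saves."""
    if plot_periodicity is None or save_periodicity is None:
        return  # nothing to cross-check when either feature is disabled
    if plot_periodicity % save_periodicity != 0:
        raise ValueError(
            f"plot_periodicity ({plot_periodicity}) must be a multiple "
            f"of save_periodicity ({save_periodicity})"
        )

# The documented defaults (save every 50, plot every 1000) are consistent.
check_log_periodicities(50, 1000)
```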

class OutputsSettings(settings)#

Used to create the logs folder to monitor the convergence of the fit algorithm.

Parameters:
settings : dict[str, Any]

Configuration mapping. Supported keys:

  • path (str | None): destination folder for logs. If None, defaults to "./_outputs/" when saving.

  • print_periodicity (int | None): print information every N iterations.

  • save_periodicity (int | None): save convergence data every N iterations (default 50).

  • plot_periodicity (int | None): plot convergence data every N iterations; must not be more frequent than saves.

  • plot_sourcewise (bool): plot source-based parameters per source instead of per feature (default False).

  • overwrite_logs_folder (bool): remove any existing logs folder before writing (default False).

  • nb_of_patients_to_plot (int): number of patients plotted when enabled (default 5).

Raises:
LeaspyAlgoInputError

DEFAULT_LOGS_DIR = '_outputs'#
print_periodicity = None#
plot_periodicity = None#
save_periodicity = None#
plot_patient_periodicity = None#
plot_sourcewise = False#
nb_of_patients_to_plot = 5#
root_path = None#
parameter_convergence_path = None#
plot_path = None#
patients_plot_path = None#
algo_default_data_dir#
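The overwrite_logs_folder behaviour described above can be sketched as follows (hypothetical helper; the real class also sets up the parameter-convergence and plot subpaths listed above):

```python
import shutil
import tempfile
from pathlib import Path

def prepare_logs_folder(path, overwrite=False):
    """Create the logs folder; refuse to reuse a non-empty one unless
    overwrite is set, in which case its content is removed first."""
    folder = Path(path)
    if folder.exists() and any(folder.iterdir()):
        if not overwrite:
            raise RuntimeError(f"Logs folder {folder} already exists; "
                               "set overwrite_logs_folder=True to replace it")
        shutil.rmtree(folder)
    folder.mkdir(parents=True, exist_ok=True)
    return folder

root = Path(tempfile.mkdtemp())
logs = prepare_logs_folder(root / "logs")
(logs / "convergence.csv").touch()
# Reusing the non-empty folder without overwrite would now fail;
# with overwrite=True it is wiped and recreated empty.
fresh = prepare_logs_folder(root / "logs", overwrite=True)
assert list(fresh.iterdir()) == []
```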