Inheritance and composition of calibr8 models
Models in calibr8 are implemented as classes that inherit from the CalibrationModel interface*, either directly or via configurable calibr8.BaseXYZ model classes. Generalization across different distributions in the noise model is provided with mixins**. The reasoning for this design choice is explained in the following sections.

*Further reading on informal interfaces in Python. **Further reading on mixins.
Inheritance of model classes
The choice of the inheritance architecture is based on an analysis of two important aspects of a calibration model:
What kind of independent variable is the model working with? (continuous/discrete)
How many dimensions does the independent variable have? (univariate/multivariate)
Note that the (log)likelihood of a model is agnostic to whether the dependent variable is continuous or discrete, and therefore we do not have to consider it here.
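This agnosticism can be illustrated with a stand-in helper (hypothetical, not calibr8's API): SciPy's continuous distributions expose logpdf and discrete ones expose logpmf, so a generic loglikelihood only needs to dispatch between them.

```python
import scipy.stats

def loglikelihood(dist, y, **params):
    # Hypothetical dispatch helper, not calibr8's API: SciPy's continuous
    # distributions expose logpdf, discrete ones expose logpmf.
    method = getattr(dist, "logpdf", None) or dist.logpmf
    return method(y, **params)

# The same call works for a continuous and a discrete dependent variable:
cont = loglikelihood(scipy.stats.norm, 1.0, loc=0.0, scale=1.0)
disc = loglikelihood(scipy.stats.poisson, 3, mu=2.5)
```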
The following table summarizes the long-form matrix of the various combinations and corresponding implementations. Note that since MCMC can be used for any of them, the "inference strategy" column denotes the computationally simplest generic strategy.
| ndim | independent variable | integration/inference strategy | | model subclass |
|---|---|---|---|---|
| 1 | continuous | | | |
| >1 | continuous | MCMC | | |
| 1 | discrete | summation & division of likelihoods | * | * |
| >1 | discrete | summation & division of likelihoods | * | * |
| >1 | mix of continuous & discrete** 🤯 | MCMC | * | * |
*Not implemented. **Needs custom SciPy and PyMC distribution.
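The "summation & division of likelihoods" strategy amounts to evaluating the likelihood at every admissible value of the discrete independent variable and normalizing by the sum. A toy sketch with an assumed linear calibration and Gaussian noise (not calibr8's API):

```python
import numpy
import scipy.stats

# Toy sketch (not calibr8's API): discrete independent variable on a grid 0..10,
# with an assumed linear calibration (mean = 0.5 + 0.2*x) and Gaussian noise.
candidates = numpy.arange(0, 11)

def likelihood(y_obs, x):
    mu = 0.5 + 0.2 * x
    return scipy.stats.norm.pdf(y_obs, loc=mu, scale=0.05)

y_obs = 1.1
weights = likelihood(y_obs, candidates)   # evaluate likelihood at every candidate
posterior = weights / weights.sum()       # "summation & division of likelihoods"
# posterior mass concentrates at x = 3, where 0.5 + 0.2*3 == 1.1
```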
Composition of distribution mixins
The CalibrationModel, and specifically its loglikelihood implementation, can generalize across distributions (noise models) for the dependent variable based on a mapping of predicted distribution parameters (predict_dependent) to the corresponding SciPy, and optionally PyMC, distribution. This mapping is managed via class attributes and staticmethods, and is provided by subclassing a DistributionMixin.
For example, a user-implemented model may subclass calibr8.LogNormalNoise and implement a predict_dependent method that returns a tuple of the two distribution parameters.
```python
import numpy
import scipy.stats
try:
    import pymc as pm
    HAS_PYMC = True
except ModuleNotFoundError:
    HAS_PYMC = False

from calibr8 import DistributionMixin

class LogNormalNoise(DistributionMixin):
    """Log-Normal noise, predicted in logarithmic mean and standard deviation.

    ⚠ This corresponds to the NumPy/PyTensor/PyMC parametrization!
    """
    scipy_dist = scipy.stats.lognorm
    pymc_dist = pm.Lognormal if HAS_PYMC else None

    @staticmethod
    def to_scipy(*params):
        # SciPy wants the linear-scale mean and the log-scale standard deviation!
        return dict(scale=numpy.exp(params[0]), s=params[1])

    @staticmethod
    def to_pymc(*params):
        return dict(mu=params[0], sigma=params[1])
```
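As a sanity check of this parametrization, the SciPy keyword arguments that such a to_scipy mapping produces can be compared against the closed-form log-normal density (a standalone sketch with arbitrary example numbers):

```python
import numpy
import scipy.stats

# Arbitrary example values in the NumPy/PyTensor/PyMC parametrization:
mu, sigma = 1.5, 0.3
x = 4.0

# Keyword arguments as a to_scipy-style mapping would produce them:
kwargs = dict(scale=numpy.exp(mu), s=sigma)
logp = scipy.stats.lognorm.logpdf(x, **kwargs)

# Closed-form log-normal density with the same (mu, sigma):
expected = (
    -numpy.log(x * sigma * numpy.sqrt(2 * numpy.pi))
    - (numpy.log(x) - mu) ** 2 / (2 * sigma**2)
)
```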
Localization of implementations
Most of the source code of a calibr8 model, and certainly the most intricate code, is located in the classes provided by the library. For most applications the user may directly use a common model class from calibr8, or implement their own class with nothing more than the __init__ method.
If the desired model structure is not already provided by the calibr8.BaseXYZ types, a custom predict_dependent can be implemented by subclassing from a model subclass. Examples are the use of uncommon noise models, or multivariate calibration models.
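The division of labor can be sketched with deliberately simplified stand-in classes (these are not calibr8's actual implementations): the mixin contributes the distribution mapping, the CalibrationModel-like base contributes the generic loglikelihood, and the user-level class only specifies the model structure.

```python
import scipy.stats

class DistributionMixin:
    # stand-in for calibr8's mixin base: holds the noise-model mapping
    scipy_dist = None
    @staticmethod
    def to_scipy(*params):
        raise NotImplementedError

class NormalNoise(DistributionMixin):
    scipy_dist = scipy.stats.norm
    @staticmethod
    def to_scipy(*params):
        return dict(loc=params[0], scale=params[1])

class CalibrationModel:
    # stand-in base: the generic loglikelihood relies only on the mixin's mapping
    def loglikelihood(self, *, y, x):
        params = self.predict_dependent(x)
        return float(self.scipy_dist.logpdf(y, **self.to_scipy(*params)))
    def predict_dependent(self, x):
        raise NotImplementedError

class LinearModel(CalibrationModel, NormalNoise):
    # user-level class: only the model structure (toy linear mean, constant
    # noise) is specified here
    def predict_dependent(self, x):
        return 0.5 + 0.2 * x, 0.05

model = LinearModel()
ll = model.loglikelihood(y=1.1, x=3.0)
```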
| name \ class | | model subclass | | | |
|---|---|---|---|---|---|
| implemented by | | | user | | user |
| inherits from | | | model subclass | model subclass | |
| adds mixin | | user-specified | | | |
| generalizes for | all models | ndim and type of independent variable | common model structures (e.g. univariate polynomial) | | |
For each method or property of these classes, one of three cases applies at every level of the hierarchy: it raises NotImplementedError at that level, it is implemented at that class/mixin level, or it inherits the implementation from its parent; for some entries the necessity/feasibility depends on the type of model.