model.ipynb
{
"cells": [
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Cookbook: Models\n",
"================\n",
"\n",
"Model composition is the process of defining a probabilistic model as a collection of model components, which are\n",
"ultimate fitted to a dataset via a non-linear search.\n",
"\n",
"This cookbook provides an overview of basic model composition tools.\n",
"\n",
"__Contents__\n",
"\n",
"If first describes how to use the `af.Model` object to define models with a single model component from single\n",
"Python classes, with the following sections:\n",
"\n",
" - Python Class Template: The template of a model component written as a Python class.\n",
" - Model Composition (Model): Creating a model via `af.Model()`.\n",
" - Priors (Model): How the default priors of a model are set and how to customize them.\n",
" - Instances (Model): Creating an instance of a model via input parameters.\n",
" - Model Customization (Model): Customizing a model (e.g. fixing parameters or linking them to one another).\n",
" - Json Output (Model): Output a model in human readable text via a .json file and loading it back again.\n",
"\n",
"It then describes how to use the `af.Collection` object to define models with many model components from multiple\n",
"Python classes, with the following sections:\n",
"\n",
" - Model Composition (Collection): Creating a model via `af.Collection()`.\n",
" - Priors (Collection): How the default priors of a collection are set and how to customize them.\n",
" - Instances (Collection): Create an instance of a collection via input parameters.\n",
" - Model Customization (Collection): Customize a collection (e.g. fixing parameters or linking them to one another).\n",
" - Json Output (Collection): Output a collection in human readable text via a .json file and loading it back again.\n",
" - Extensible Models (Collection): Using collections to extend models with new model components, including the use\n",
" of Python dictionaries and lists."
]
},
{
"cell_type": "code",
"metadata": {},
"source": [
"%matplotlib inline\n",
"from pyprojroot import here\n",
"workspace_path = str(here())\n",
"%cd $workspace_path\n",
"print(f\"Working Directory has been set to `{workspace_path}`\")\n",
"\n",
"import json\n",
"import os\n",
"from os import path\n",
"\n",
"import autofit as af"
],
"outputs": [],
"execution_count": null
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"__Python Class Template__\n",
"\n",
"A model component is written as a Python class using the following format:\n",
"\n",
" - The name of the class is the name of the model component, in this case, \u201cGaussian\u201d.\n",
"\n",
" - The input arguments of the constructor are the parameters of the mode (here `centre`, `normalization` and `sigma`).\n",
"\n",
" - The default values of the input arguments tell PyAutoFit whether a parameter is a single-valued float or a\n",
" multi-valued tuple.\n",
"\n",
"We define a 1D Gaussian model component to illustrate model composition in PyAutoFit."
]
},
{
"cell_type": "code",
"metadata": {},
"source": [
"\n",
"\n",
"class Gaussian:\n",
" def __init__(\n",
" self,\n",
" centre=30.0, # <- **PyAutoFit** recognises these constructor arguments\n",
" normalization=1.0, # <- are the Gaussian``s model parameters.\n",
" sigma=5.0,\n",
" ):\n",
" self.centre = centre\n",
" self.normalization = normalization\n",
" self.sigma = sigma\n"
],
"outputs": [],
"execution_count": null
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"__Model Composition (Model)__\n",
"\n",
"We can instantiate a Python class as a model component using `af.Model()`."
]
},
{
"cell_type": "code",
"metadata": {},
"source": [
"model = af.Model(Gaussian)"
],
"outputs": [],
"execution_count": null
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"The model has 3 free parameters, corresponding to the 3 parameters defined above (`centre`, `normalization` \n",
"and `sigma`).\n",
"\n",
"Each parameter has a prior associated with it, meaning they are fitted for if the model is passed to a non-linear \n",
"search."
]
},
{
"cell_type": "code",
"metadata": {},
"source": [
"print(f\"Model Total Free Parameters = {model.total_free_parameters}\")"
],
"outputs": [],
"execution_count": null
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"If we print the `info` attribute of the model we get information on all of the parameters and their priors."
]
},
{
"cell_type": "code",
"metadata": {},
"source": [
"print(model.info)"
],
"outputs": [],
"execution_count": null
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"__Priors (Model)__\n",
"\n",
"The model has a set of default priors, which have been loaded from a config file in the PyAutoFit workspace.\n",
"\n",
"The config cookbook describes how to setup config files in order to produce custom priors, which means that you do not\n",
"need to manually specify priors in your Python code every time you compose a model.\n",
"\n",
"If you do not setup config files, all priors must be manually specified before you fit the model, as shown below."
]
},
{
"cell_type": "code",
"metadata": {},
"source": [
"model = af.Model(Gaussian)\n",
"model.centre = af.UniformPrior(lower_limit=0.0, upper_limit=100.0)\n",
"model.normalization = af.LogUniformPrior(lower_limit=1e-4, upper_limit=1e4)\n",
"model.sigma = af.GaussianPrior(mean=0.0, sigma=1.0, lower_limit=0.0, upper_limit=1e5)"
],
"outputs": [],
"execution_count": null
},
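{
"cell_type": "markdown",
"metadata": {},
"source": [
"As a quick check (reusing the `info` attribute shown above), we can print the model to confirm the customized priors\n",
"have replaced the defaults."
]
},
{
"cell_type": "code",
"metadata": {},
"source": [
"# Confirm the customized priors have replaced the defaults.\n",
"print(model.info)"
],
"outputs": [],
"execution_count": null
},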
{
"cell_type": "markdown",
"metadata": {},
"source": [
"__Instances (Model)__\n",
"\n",
"Instances of the model components above (created via `af.Model`) can be created, where an input `vector` of\n",
"parameters is mapped to create an instance of the Python class of the model.\n",
"\n",
"We first need to know the order of parameters in the model, so we know how to define the input `vector`. This\n",
"information is contained in the models `paths` attribute."
]
},
{
"cell_type": "code",
"metadata": {},
"source": [
"print(model.paths)"
],
"outputs": [],
"execution_count": null
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"We create an `instance` of the `Gaussian` class via the model where `centre=30.0`, `normalization=2.0` and `sigma=3.0`."
]
},
{
"cell_type": "code",
"metadata": {},
"source": [
"instance = model.instance_from_vector(vector=[30.0, 2.0, 3.0])\n",
"\n",
"print(\"\\nModel Instance:\")\n",
"print(instance)\n",
"\n",
"print(\"\\nInstance Parameters:\\n\")\n",
"print(\"centre = \", instance.centre)\n",
"print(\"normalization = \", instance.normalization)\n",
"print(\"sigma = \", instance.sigma)"
],
"outputs": [],
"execution_count": null
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"We can create an `instance` by inputting unit values (e.g. between 0.0 and 1.0) which are mapped to the input values \n",
"via the priors.\n",
"\n",
"The inputs of 0.5 below are mapped as follows:\n",
"\n",
" - `centre`: goes to 0.5 because this is the midpoint of a `UniformPrior` with `lower_limit=0.0` and `upper_limit=1.0`.\n",
"\n",
" - `normalization` goes to 1.0 because this is the midpoint of the `LogUniformPrior`' with `lower_limit=1e-4` \n",
" and `upper_limit=1e4`, corresponding to log10 space.\n",
"\n",
" - `sigma`: goes to 0.0 because this is the `mean` of the `GaussianPrior`."
]
},
{
"cell_type": "code",
"metadata": {},
"source": [
"instance = model.instance_from_unit_vector(unit_vector=[0.5, 0.5, 0.5])\n",
"\n",
"print(\"\\nModel Instance:\")\n",
"print(instance)\n",
"\n",
"print(\"\\nInstance Parameters:\\n\")\n",
"print(\"centre = \", instance.centre)\n",
"print(\"normalization = \", instance.normalization)\n",
"print(\"sigma = \", instance.sigma)"
],
"outputs": [],
"execution_count": null
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"We can create instances of the `Gaussian` using the median value of the prior of every parameter."
]
},
{
"cell_type": "code",
"metadata": {},
"source": [
"instance = model.instance_from_prior_medians()\n",
"\n",
"print(\"\\nInstance Parameters:\\n\")\n",
"print(\"centre = \", instance.centre)\n",
"print(\"normalization = \", instance.normalization)\n",
"print(\"sigma = \", instance.sigma)"
],
"outputs": [],
"execution_count": null
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"We can create a random instance, where the random values are unit values drawn between 0.0 and 1.0.\n",
"\n",
"This means the parameter values of this instance are randomly drawn from the priors."
]
},
{
"cell_type": "code",
"metadata": {},
"source": [
"model = af.Model(Gaussian)\n",
"instance = model.random_instance()"
],
"outputs": [],
"execution_count": null
},
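{
"cell_type": "markdown",
"metadata": {},
"source": [
"For illustration, we can print the randomly drawn parameter values using the same attribute access as above; rerunning\n",
"this cell should give different values each time."
]
},
{
"cell_type": "code",
"metadata": {},
"source": [
"# Print the randomly drawn parameter values of the instance.\n",
"print(\"\\nInstance Parameters (Random):\\n\")\n",
"print(\"centre = \", instance.centre)\n",
"print(\"normalization = \", instance.normalization)\n",
"print(\"sigma = \", instance.sigma)"
],
"outputs": [],
"execution_count": null
},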
{
"cell_type": "markdown",
"metadata": {},
"source": [
"__Model Customization (Model)__\n",
"\n",
"We can fix a free parameter to a specific value (reducing the dimensionality of parameter space by 1):"
]
},
{
"cell_type": "code",
"metadata": {},
"source": [
"model = af.Model(Gaussian)\n",
"model.centre = 0.0\n",
"\n",
"print(\n",
" f\"\\n Model Total Free Parameters After Fixing Centre = {model.total_free_parameters}\"\n",
")"
],
"outputs": [],
"execution_count": null
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"We can link two parameters together such they always assume the same value (reducing the dimensionality of \n",
"parameter space by 1):"
]
},
{
"cell_type": "code",
"metadata": {},
"source": [
"model.centre = model.normalization\n",
"\n",
"print(\n",
" f\"\\n Model Total Free Parameters After Linking Parameters = {model.total_free_parameters}\"\n",
")"
],
"outputs": [],
"execution_count": null
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Offsets between linked parameters or with certain values are possible:"
]
},
{
"cell_type": "code",
"metadata": {},
"source": [
"model.centre = model.normalization + model.sigma\n",
"\n",
"print(\n",
" f\"Model Total Free Parameters After Linking Parameters = {model.total_free_parameters}\"\n",
")"
],
"outputs": [],
"execution_count": null
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Assertions remove regions of parameter space (but do not reduce the dimensionality of parameter space):"
]
},
{
"cell_type": "code",
"metadata": {},
"source": [
"model.add_assertion(model.sigma > 5.0)\n",
"model.add_assertion(model.centre > model.normalization)"
],
"outputs": [],
"execution_count": null
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"The customized model can be inspected by printing its `info` attribute."
]
},
{
"cell_type": "code",
"metadata": {},
"source": [
"print(model.info)"
],
"outputs": [],
"execution_count": null
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"The overwriting of priors shown above can be achieved via the following alternative API:"
]
},
{
"cell_type": "code",
"metadata": {},
"source": [
"model = af.Model(\n",
" Gaussian,\n",
" centre=af.UniformPrior(lower_limit=0.0, upper_limit=1.0),\n",
" normalization=af.LogUniformPrior(lower_limit=1e-4, upper_limit=1e4),\n",
" sigma=af.GaussianPrior(mean=0.0, sigma=1.0),\n",
")"
],
"outputs": [],
"execution_count": null
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"This API can also be used for fixing a parameter to a certain value:"
]
},
{
"cell_type": "code",
"metadata": {},
"source": [
"model = af.Model(Gaussian, centre=0.0)"
],
"outputs": [],
"execution_count": null
},
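{
"cell_type": "markdown",
"metadata": {},
"source": [
"As before, fixing `centre` removes it as a free parameter, which we can confirm via `total_free_parameters` (it should\n",
"now be 2)."
]
},
{
"cell_type": "code",
"metadata": {},
"source": [
"# The fixed `centre` is no longer a free parameter.\n",
"print(f\"Model Total Free Parameters After Fixing Centre = {model.total_free_parameters}\")"
],
"outputs": [],
"execution_count": null
},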
{
"cell_type": "markdown",
"metadata": {},
"source": [
"__Json Outputs (Model)__\n",
"\n",
"A model has a `dict` attribute, which expresses all information about the model as a Python dictionary.\n",
"\n",
"By printing this dictionary we can therefore get a concise summary of the model."
]
},
{
"cell_type": "code",
"metadata": {},
"source": [
"model = af.Model(Gaussian)\n",
"\n",
"print(model.dict())"
],
"outputs": [],
"execution_count": null
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"The dictionary representation printed above can be saved to hard disk as a `.json` file.\n",
"\n",
"This means we can save any **PyAutoFit** model to hard-disk in a human readable format.\n",
"\n",
"Checkout the file `autofit_workspace/*/cookbooks/jsons/model.json` to see the model written as a .json."
]
},
{
"cell_type": "code",
"metadata": {},
"source": [
"model_path = path.join(\"scripts\", \"cookbooks\", \"jsons\")\n",
"\n",
"os.makedirs(model_path, exist_ok=True)\n",
"\n",
"model_file = path.join(model_path, \"model.json\")\n",
"\n",
"with open(model_file, \"w+\") as f:\n",
" json.dump(model.dict(), f, indent=4)"
],
"outputs": [],
"execution_count": null
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"We can load the model from its `.json` file, meaning that one can easily save a model to hard disk and load it \n",
"elsewhere."
]
},
{
"cell_type": "code",
"metadata": {},
"source": [
"model = af.Model.from_json(file=model_file)\n",
"\n",
"print(model.info)"
],
"outputs": [],
"execution_count": null
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"__Model Composition (Collection)__\n",
"\n",
"To illustrate `Collection` objects we define a second model component, representing a `Exponential` profile."
]
},
{
"cell_type": "code",
"metadata": {},
"source": [
"\n",
"\n",
"class Exponential:\n",
" def __init__(\n",
" self,\n",
" centre=0.0, # <- PyAutoFit recognises these constructor arguments are the model\n",
" normalization=0.1, # <- parameters of the Exponential.\n",
" rate=0.01,\n",
" ):\n",
" self.centre = centre\n",
" self.normalization = normalization\n",
" self.rate = rate\n"
],
"outputs": [],
"execution_count": null
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"To instantiate multiple Python classes into a combined model component we combine the `af.Collection()` and `af.Model()` \n",
"objects.\n",
"\n",
"By passing the key word arguments `gaussian` and `exponential` below, these are used as the names of the attributes of \n",
"instances created using this model (which is illustrated clearly below)."
]
},
{
"cell_type": "code",
"metadata": {},
"source": [
"model = af.Collection(gaussian=af.Model(Gaussian), exponential=af.Model(Exponential))"
],
"outputs": [],
"execution_count": null
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"We can check the model has a `total_free_parameters` of 6, meaning the 3 parameters defined \n",
"above (`centre`, `normalization`, `sigma` and `rate`) for both the `Gaussian` and `Exponential` classes all have \n",
"priors associated with them .\n",
"\n",
"This also means each parameter is fitted for if we fitted the model to data via a non-linear search."
]
},
{
"cell_type": "code",
"metadata": {},
"source": [
"print(f\"Model Total Free Parameters = {model.total_free_parameters}\")"
],
"outputs": [],
"execution_count": null
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Printing the `info` attribute of the model gives us information on all of the parameters. "
]
},
{
"cell_type": "code",
"metadata": {},
"source": [
"print(model.info)"
],
"outputs": [],
"execution_count": null
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"__Priors (Collection)__\n",
"\n",
"The model has a set of default priors, which have been loaded from a config file in the PyAutoFit workspace.\n",
"\n",
"The configs cookbook describes how to setup config files in order to produce custom priors, which means that you do not\n",
"need to manually specify priors in your Python code every time you compose a model.\n",
"\n",
"If you do not setup config files, all priors must be manually specified before you fit the model, as shown below."
]
},
{
"cell_type": "code",
"metadata": {},
"source": [
"model.gaussian.centre = af.UniformPrior(lower_limit=0.0, upper_limit=100.0)\n",
"model.gaussian.normalization = af.UniformPrior(lower_limit=0.0, upper_limit=1e2)\n",
"model.gaussian.sigma = af.UniformPrior(lower_limit=0.0, upper_limit=30.0)\n",
"model.exponential.centre = af.UniformPrior(lower_limit=0.0, upper_limit=100.0)\n",
"model.exponential.normalization = af.UniformPrior(lower_limit=0.0, upper_limit=1e2)\n",
"model.exponential.rate = af.UniformPrior(lower_limit=0.0, upper_limit=10.0)"
],
"outputs": [],
"execution_count": null
},
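{
"cell_type": "markdown",
"metadata": {},
"source": [
"As a quick check (again reusing the `info` attribute), we can print the model to confirm the customized priors of\n",
"both the `Gaussian` and `Exponential` components."
]
},
{
"cell_type": "code",
"metadata": {},
"source": [
"# Confirm the customized priors of both model components.\n",
"print(model.info)"
],
"outputs": [],
"execution_count": null
},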
{
"cell_type": "markdown",
"metadata": {},
"source": [
"When creating a model via a `Collection`, there is no need to actually pass the python classes as an `af.Model()`\n",
"because **PyAutoFit** implicitly assumes they are to be created as a `Model()`.\n",
"\n",
"This enables more concise code, whereby the following code:"
]
},
{
"cell_type": "code",
"metadata": {},
"source": [
"model = af.Collection(gaussian=af.Model(Gaussian), exponential=af.Model(Exponential))"
],
"outputs": [],
"execution_count": null
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Can instead be written as:"
]
},
{
"cell_type": "code",
"metadata": {},
"source": [
"model = af.Collection(gaussian=Gaussian, exponential=Exponential)"
],
"outputs": [],
"execution_count": null
},
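{
"cell_type": "markdown",
"metadata": {},
"source": [
"The concise form composes an identical model, which we can confirm by checking it still has 6 free parameters."
]
},
{
"cell_type": "code",
"metadata": {},
"source": [
"# The concise API composes an identical model with 6 free parameters.\n",
"print(f\"Model Total Free Parameters = {model.total_free_parameters}\")"
],
"outputs": [],
"execution_count": null
},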
{
"cell_type": "markdown",
"metadata": {},
"source": [
"__Instances (Collection)__\n",
"\n",
"We can create an instance of collection containing both the `Gaussian` and `Exponential` classes using this model.\n",
"\n",
"Below, we create an `instance` where: \n",
"\n",
"- The `Gaussian` class has `centre=30.0`, `normalization=2.0` and `sigma=3.0`.\n",
"- The `Exponential` class has `centre=60.0`, `normalization=4.0` and `rate=1.0``."
]
},
{
"cell_type": "code",
"metadata": {},
"source": [
"instance = model.instance_from_vector(vector=[30.0, 2.0, 3.0, 60.0, 4.0, 1.0])"
],
"outputs": [],
"execution_count": null
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Because we passed the key word arguments `gaussian` and `exponential` above, these are the names of the attributes of \n",
"instances created using this model (e.g. this is why we write `instance.gaussian`):"
]
},
{
"cell_type": "code",
"metadata": {},
"source": [
"\n",
"print(\"\\nModel Instance:\")\n",
"print(instance)\n",
"\n",
"print(\"\\nInstance Parameters:\\n\")\n",
"print(\"centre (Gaussian) = \", instance.gaussian.centre)\n",
"print(\"normalization (Gaussian) = \", instance.gaussian.normalization)\n",
"print(\"sigma (Gaussian) = \", instance.gaussian.sigma)\n",
"print(\"centre (Exponential) = \", instance.exponential.centre)\n",
"print(\"normalization (Exponential) = \", instance.exponential.normalization)\n",
"print(\"rate (Exponential) = \", instance.exponential.rate)"
],
"outputs": [],
"execution_count": null
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Alternatively, the instance's variables can also be accessed as a list, whereby instead of using attribute names\n",
"(e.g. `gaussian_0`) we input the list index.\n",
"\n",
"Note that the order of the instance model components is determined from the order the components are input into the \n",
"`Collection`.\n",
"\n",
"For example, for the line `af.Collection(gaussian=gaussian, exponential=exponential)`, the first entry in the list\n",
"is the gaussian because it is the first input to the `Collection`."
]
},
{
"cell_type": "code",
"metadata": {},
"source": [
"print(\"centre (Gaussian) = \", instance[0].centre)\n",
"print(\"normalization (Gaussian) = \", instance[0].normalization)\n",
"print(\"sigma (Gaussian) = \", instance[0].sigma)\n",
"print(\"centre (Gaussian) = \", instance[1].centre)\n",
"print(\"normalization (Gaussian) = \", instance[1].normalization)\n",
"print(\"rate (Exponential) = \", instance[1].rate)"
],
"outputs": [],
"execution_count": null
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"__Model Customization (Collection)__\n",
"\n",
"By setting up each Model first the model can be customized using either of the API\u2019s shown above:"
]
},
{
"cell_type": "code",
"metadata": {},
"source": [
"gaussian = af.Model(Gaussian)\n",
"gaussian.normalization = 1.0\n",
"gaussian.sigma = af.GaussianPrior(mean=0.0, sigma=1.0)\n",
"\n",
"exponential = af.Model(Exponential)\n",
"exponential.centre = 50.0\n",
"exponential.add_assertion(exponential.rate > 5.0)\n",
"\n",
"model = af.Collection(gaussian=gaussian, exponential=exponential)\n",
"\n",
"print(model.info)\n",
"\n",
"ddd\n",
"# %%\n",
"'''\n",
"Below is an alternative API that can be used to create the same model as above.\n",
"\n",
"Which API is used is up to the user and which they find most intuitive.\n",
"'''"
],
"outputs": [],
"execution_count": null
},
{
"cell_type": "code",
"metadata": {},
"source": [
"gaussian = af.Model(\n",
" Gaussian, normalization=1.0, sigma=af.GaussianPrior(mean=0.0, sigma=1.0)\n",
")\n",
"exponential = af.Model(Exponential, centre=50.0)\n",
"exponential.add_assertion(exponential.rate > 5.0)\n",
"\n",
"model = af.Collection(gaussian=gaussian, exponential=exponential)\n",
"\n",
"print(model.info)"
],
"outputs": [],
"execution_count": null
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"After creating the model as a `Collection` we can customize it afterwards:"
]
},
{
"cell_type": "code",
"metadata": {},
"source": [
"model = af.Collection(gaussian=Gaussian, exponential=Exponential)\n",
"\n",
"model.gaussian.normalization = 1.0\n",
"model.gaussian.sigma = af.GaussianPrior(mean=0.0, sigma=1.0)\n",
"\n",
"model.exponential.centre = 50.0\n",
"model.exponential.add_assertion(exponential.rate > 5.0)\n",
"\n",
"print(model.info)"
],
"outputs": [],
"execution_count": null
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"__JSon Outputs (Collection)__\n",
"\n",
"A `Collection` has a `dict` attribute, which express all information about the model as a Python dictionary.\n",
"\n",
"By printing this dictionary we can therefore get a concise summary of the model."
]
},
{
"cell_type": "code",
"metadata": {},
"source": [
"print(model.dict())"
],
"outputs": [],
"execution_count": null
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Python dictionaries can easily be saved to hard disk as a `.json` file.\n",
"\n",
"This means we can save any **PyAutoFit** model to hard-disk.\n",
"\n",
"Checkout the file `autofit_workspace/*/model/jsons/model.json` to see the model written as a .json."
]
},
{
"cell_type": "code",
"metadata": {},
"source": [
"model_path = path.join(\"scripts\", \"model\", \"jsons\")\n",
"\n",
"os.makedirs(model_path, exist_ok=True)\n",
"\n",
"model_file = path.join(model_path, \"collection.json\")\n",
"\n",
"with open(model_file, \"w+\") as f:\n",
" json.dump(model.dict(), f, indent=4)"
],
"outputs": [],
"execution_count": null
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"We can load the model from its `.json` file, meaning that one can easily save a model to hard disk and load it \n",
"elsewhere."
]
},
{
"cell_type": "code",
"metadata": {},
"source": [
"model = af.Model.from_json(file=model_file)\n",
"\n",
"print(f\"\\n Model via Json Prior Count = {model.prior_count}\")\n"
],
"outputs": [],
"execution_count": null
},
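{
"cell_type": "markdown",
"metadata": {},
"source": [
"Printing the loaded model's `info` confirms both the `Gaussian` and `Exponential` components and their priors were\n",
"loaded correctly."
]
},
{
"cell_type": "code",
"metadata": {},
"source": [
"# Inspect the loaded collection to confirm its components and priors.\n",
"print(model.info)"
],
"outputs": [],
"execution_count": null
},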
{
"cell_type": "markdown",
"metadata": {},
"source": [
"__Extensible Models (Collection)__\n",
"\n",
"There is no limit to the number of components we can use to set up a model via a `Collection`."
]
},
{
"cell_type": "code",
"metadata": {},
"source": [
"model = af.Collection(\n",
" gaussian_0=Gaussian,\n",
" gaussian_1=Gaussian,\n",
" exponential_0=Exponential,\n",
" exponential_1=Exponential,\n",
" exponential_2=Exponential,\n",
")\n",
"\n",
"print(model.info)"
],
"outputs": [],
"execution_count": null
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"A model can be created via `af.Collection()` where a dictionary of `af.Model()` objects are passed to it.\n",
"\n",
"The two models created below are identical - one uses the API detailed above whereas the second uses a dictionary."
]
},
{
"cell_type": "code",
"metadata": {},
"source": [
"model = af.Collection(gaussian_0=Gaussian, gaussian_1=Gaussian)\n",
"print(model.info)\n",
"\n",
"model_dict = {\"gaussian_0\": Gaussian, \"gaussian_1\": Gaussian}\n",
"model = af.Collection(**model_dict)\n",
"print(model.info)"
],
"outputs": [],
"execution_count": null
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"The keys of the dictionary passed to the model (e.g. `gaussian_0` and `gaussian_1` above) are used to create the\n",
"names of the attributes of instances of the model."
]
},
{
"cell_type": "code",
"metadata": {},
"source": [
"instance = model.instance_from_vector(vector=[1.0, 2.0, 3.0, 4.0, 5.0, 6.0])\n",
"\n",
"print(\"\\nModel Instance:\")\n",
"print(instance)\n",
"\n",
"print(\"\\nInstance Parameters:\\n\")\n",
"print(\"centre (Gaussian) = \", instance.gaussian_0.centre)\n",
"print(\"normalization (Gaussian) = \", instance.gaussian_0.normalization)\n",
"print(\"sigma (Gaussian) = \", instance.gaussian_0.sigma)\n",
"print(\"centre (Gaussian) = \", instance.gaussian_1.centre)\n",
"print(\"normalization (Gaussian) = \", instance.gaussian_1.normalization)\n",
"print(\"sigma (Gaussian) = \", instance.gaussian_1.sigma)"
],
"outputs": [],
"execution_count": null
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"A list of model components can also be passed to an `af.Collection` to create a model:"
]
},
{
"cell_type": "code",
"metadata": {},
"source": [
"model = af.Collection([Gaussian, Gaussian])\n",
"\n",
"print(model.info)"
],
"outputs": [],
"execution_count": null
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"When a list is used, there is no string with which to name the model components (e.g. we do not input `gaussian_0`\n",
"and `gaussian_1` anywhere.\n",
"\n",
"The `instance` therefore can only be accessed via list indexing."
]
},
{
"cell_type": "code",
"metadata": {},
"source": [
"instance = model.instance_from_vector(vector=[1.0, 2.0, 3.0, 4.0, 5.0, 6.0])\n",
"\n",
"print(\"\\nModel Instance:\")\n",
"print(instance)\n",
"\n",
"print(\"\\nInstance Parameters:\\n\")\n",
"print(\"centre (Gaussian) = \", instance[0].centre)\n",
"print(\"normalization (Gaussian) = \", instance[0].normalization)\n",
"print(\"sigma (Gaussian) = \", instance[0].sigma)\n",
"print(\"centre (Gaussian) = \", instance[1].centre)\n",
"print(\"normalization (Gaussian) = \", instance[1].normalization)\n",
"print(\"sigma (Gaussian) = \", instance[1].sigma)"
],
"outputs": [],
"execution_count": null
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"__Wrap Up__\n",
"\n",
"This cookbook shows how to compose models consisting of multiple components using the `af.Model()` \n",
"and `af.Collection()` object.\n",
"\n",
"Advanced model composition uses multi-level models, which compose models from hierarchies of Python classes. This is\n",
"described in the multi-level model cookbook. "
]
},
{
"cell_type": "code",
"metadata": {},
"source": [],
"outputs": [],
"execution_count": null
}
],
"metadata": {
"anaconda-cloud": {},
"kernelspec": {
"display_name": "Python 3",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.6.1"
}
},
"nbformat": 4,
"nbformat_minor": 4
}