swh:1:snp:d2bcff616bbf538fe8ce2a9c384200307730292a
Revision 7e717ccb69bf4a3e563f176b28ff2f516d65482c authored by st-- on 31 March 2020, 12:17:12 UTC, committed by GitHub on 31 March 2020, 12:17:12 UTC
This gives all `BayesianModel` subclasses a consistent interface for both optimization (MLE/MAP) and MCMC. Models are required to implement `maximum_log_likelihood_objective`, which is maximized for model training.

Optimization:

The `_training_loss` method is defined as `- (maximum_log_likelihood_objective + log_prior_density)`. It is exposed by the `InternalDataTrainingLossMixin` and `ExternalDataTrainingLossMixin` classes. For models that keep hold of the data internally, `training_loss` can be passed directly as a closure to an optimizer's `minimize`, for example:

```python
model = gpflow.models.GPR(data, ...)
gpflow.optimizers.Scipy().minimize(model.training_loss, model.trainable_variables)
```

If the model objective requires data to be passed in, a closure can be constructed on the fly using `model.training_loss_closure(data)`, which returns a no-argument closure:

```python
model = gpflow.models.SVGP(...)
gpflow.optimizers.Scipy().minimize(
    model.training_loss_closure(data), model.trainable_variables, ...
)
```

The `training_loss_closure()` method provided by both `InternalDataTrainingLossMixin` and `ExternalDataTrainingLossMixin` takes a boolean `compile` argument (default: `True`) that wraps the returned closure in `tf.function()`. Note that the return value should be cached in a variable if the `minimize()` step is run several times, to avoid re-compilation at each step!

MCMC:

The `log_posterior_density` method can be passed directly to the `SamplingHelper`. By default, `log_posterior_density` is implemented as `maximum_log_likelihood_objective + log_prior_density`. Models can override this if needed. Example:

```python
model = gpflow.models.GPMC(...)
hmc_helper = gpflow.optimizers.SamplingHelper(
    model.log_posterior_density, model.trainable_parameters
)
hmc = tfp.mcmc.HamiltonianMonteCarlo(
    target_log_prob_fn=hmc_helper.target_log_prob_fn, ...
)
```

In this case, the function that runs the MCMC chain should be wrapped in `tf.function()` (see the MCMC notebook).
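The caching advice above can be illustrated without GPflow or TensorFlow. In this minimal sketch, `fake_compile` and `training_loss_closure` are stand-ins (not the real GPflow/`tf.function` implementations) that count how often a closure gets (re)wrapped, mimicking the re-tracing cost of compiling a fresh closure on every optimization step:

```python
# Sketch only: `fake_compile` stands in for tf.function(), and
# `training_loss_closure` for the mixin method of the same name.
compilations = 0

def fake_compile(fn):
    # Count each wrap, analogous to a tf.function re-trace.
    global compilations
    compilations += 1
    return fn

def training_loss_closure(data, compile=True):
    closure = lambda: sum(data)  # stand-in for the model's loss
    return fake_compile(closure) if compile else closure

data = [1.0, 2.0, 3.0]

# Anti-pattern: a fresh closure is compiled on every step.
for _ in range(5):
    loss = training_loss_closure(data)()
assert compilations == 5

# Recommended: compile once, reuse the cached closure across steps.
compilations = 0
cached = training_loss_closure(data)
for _ in range(5):
    loss = cached()
assert compilations == 1
print(loss)  # 6.0
```

With the real API the same pattern applies: assign `closure = model.training_loss_closure(data)` once, then pass `closure` to every `minimize()` call.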
1 parent e61ee69
Tip revision: 3e4b9f16d5b0757975e9cc11bb63db44dd2acf4d authored by st-- on 06 July 2021, 08:24:44 UTC
Update RELEASE.md (#1701)