https://github.com/GPflow/GPflow

Revision Message Commit Date
435a7ff Merge pull request #1481 from GPflow/develop Release 2.0.4 20 May 2020, 13:59:30 UTC
f8676eb bump version to 2.0.4 (#1478) 20 May 2020, 12:19:40 UTC
a198764 enable mypy on CI (#1471) 19 May 2020, 13:50:17 UTC
61f8d84 Fix freeze and traverse utility methods (#1476) TensorFlow Probability 0.10 introduced a new _parameters attribute in tfp.bijectors.Bijector instances that contains a self-reference, which broke GPflow's module-traversal utilities (leaf_components, print_summary, deepcopy, freeze). This PR changes traverse_module to ignore the _parameters attribute and proposes a new deepcopy approach that simply uses copy.deepcopy()'s memo argument. Co-authored-by: st-- <st--@users.noreply.github.com> 18 May 2020, 20:58:12 UTC
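As a rough illustration of the memo-based approach mentioned above (the function and argument names below are hypothetical, not GPflow code): pre-populating `copy.deepcopy`'s memo with `id(obj) -> obj` makes deepcopy return those objects unchanged instead of recursing into their self-referential attributes.

```python
import copy

# Hypothetical sketch of the memo trick described in the commit message above:
# any object registered in `memo` is returned as-is by deepcopy, so deepcopy
# never recurses into its (possibly self-referencing) attributes.
def deepcopy_skipping(module, objects_to_skip):
    memo = {id(obj): obj for obj in objects_to_skip}
    return copy.deepcopy(module, memo)
```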
150a0d1 add type hints to gpflow.base (#1427) 14 May 2020, 13:37:09 UTC
bf35d66 Merge pull request #1468 from GPflow/vincent/master_203 Sync develop with master 2.0.3 13 May 2020, 09:28:43 UTC
56c2efe Merge remote-tracking branch 'origin/develop' into vincent/master_203 13 May 2020, 09:05:41 UTC
a4bfbe6 Add ipynb to gitignore (#1466) * Add ipynb to gitignore 13 May 2020, 07:58:09 UTC
b83f124 Merge branch 'develop' 13 May 2020, 07:49:01 UTC
49ffbcf Merge branch 'master' into develop 13 May 2020, 07:48:48 UTC
00d2e66 bump version to 2.0.3 (#1464) 12 May 2020, 17:21:58 UTC
7560936 Update setup.py for tensorflow 2(.2) (#1460) 12 May 2020, 12:44:01 UTC
5e599fb Release 2.0.2 (#1459)
  * add long_description and project_urls to setup.py (#1438)
  * add type hints for probability distributions (#1421)
  * remove comment (#1441)
  * Use a type alias in the function signature of leading_transpose (#1442)
  * refactor natgrads to be more efficient (#1443)
  * Fix dimensions of kernel evaluation of changepoint kernel (#1446)
  * Removed unused imports. (#1450)
  * Improve representation of GPflow objects in IPython/Jupyter notebook (#1453)
    * includes the repr() string in IPython/Jupyter notebook representation as well (i.e. fully-qualified class name and object hash (memory address), which helps distinguish objects from each other)
    * only displays the parameter table when it is not empty
    * makes use of default_summary_fmt() for IPython shell
  * Convert data structures to tensor in model init method (#1452)
  * Use a boolean for full covariance in sample_mvn. (#1448)
  * #1452 for GPMC model (#1458)
  * release candidate v2.0.2 (#1457)
  Co-authored-by: st-- <st--@users.noreply.github.com>
  Co-authored-by: joelberkeley-pio <joel.berkeley@prowler.io>
  Co-authored-by: John Mcleod <43960404+johnamcleod@users.noreply.github.com>
  Co-authored-by: Mark van der Wilk <markvanderw@gmail.com>
  Co-authored-by: Artem Artemev <art.art.v@gmail.com>
  07 May 2020, 15:01:50 UTC
6ce4203 release candidate v2.0.2 (#1457) 07 May 2020, 14:45:49 UTC
8f06b7c #1452 for GPMC model (#1458) 07 May 2020, 14:24:22 UTC
39d563c Use a boolean for full covariance in sample_mvn. (#1448) 07 May 2020, 13:00:10 UTC
7e3e514 Convert data structures to tensor in model init method (#1452) Resolves #1439 07 May 2020, 12:22:34 UTC
5ec7e91 Improve representation of GPflow objects in IPython/Jupyter notebook (#1453) * includes the repr() string in IPython/Jupyter notebook representation as well (i.e. fully-qualified class name and object hash (memory address), which helps distinguish objects from each other) * only displays the parameter table when it is not empty * makes use of default_summary_fmt() for IPython shell 04 May 2020, 13:13:48 UTC
08739ce Removed unused imports. (#1450) 02 May 2020, 18:11:17 UTC
8b24183 Fix dimensions of kernel evaluation of changepoint kernel (#1446) Closes #1440 26 April 2020, 21:36:15 UTC
b41d4f3 refactor natgrads to be more efficient (#1443)
  Previously, GPflow's NaturalGradient optimizer would call the loss_function once for each (q_mu, q_sqrt) set in the var_list. This is a light refactor that separates out applying the natural gradient step from computing the gradients (`_natgrad_apply_gradients`), and changes `_natgrad_steps` to only evaluate the loss function once, computing the gradients for all (q_mu, q_sqrt) tuples passed in the var_list.
  Other changes:
  - The no-longer-used `_natgrad_step` method got removed.
  - NaturalGradient now takes an `xi_transform` argument that is used for all parameter sets without an explicitly specified xi transform (i.e. tuples rather than triplets).
  - XiTransform has been changed to have staticmethods.
  None of this should affect any downstream code; this PR is backwards-compatible.
  23 April 2020, 18:17:42 UTC
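A minimal, hedged usage sketch of the pattern described above (toy data and model, not code from this PR): a single loss closure is handed to NaturalGradient.minimize and, after this change, it is evaluated once per step for all (q_mu, q_sqrt) pairs in var_list.

```python
import numpy as np
import gpflow

# Toy data and model for illustration only.
X = np.random.rand(20, 1)
Y = np.sin(10 * X) + 0.1 * np.random.randn(20, 1)
model = gpflow.models.SVGP(
    kernel=gpflow.kernels.SquaredExponential(),
    likelihood=gpflow.likelihoods.Gaussian(),
    inducing_variable=X[:5].copy(),
)
# The variational parameters are updated by the natural-gradient step only.
gpflow.utilities.set_trainable(model.q_mu, False)
gpflow.utilities.set_trainable(model.q_sqrt, False)

natgrad = gpflow.optimizers.NaturalGradient(gamma=0.1)
loss = model.training_loss_closure((X, Y))  # single closure, evaluated once per step
natgrad.minimize(loss, var_list=[(model.q_mu, model.q_sqrt)])
```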
c7550ce Use a type alias in the function signature of leading_transpose (#1442) Ensure that the `leading_transpose` function can be used with the @tf.function decorator (workaround for bug in tensorflow<=2.2.0rc3) 23 April 2020, 09:19:35 UTC
82f2db4 remove comment (#1441) 20 April 2020, 22:03:27 UTC
e1a3fa3 add type hints for probability distributions (#1421) 20 April 2020, 09:38:45 UTC
ca3b854 add long_description and project_urls to setup.py (#1438) 20 April 2020, 07:43:57 UTC
2358539 Release 2.0.1 (#1436)
  Release notes:
  - Improve structure of likelihoods subdirectory (#1416)
  - Update README.md (#1401) and GPflow 2 upgrade guide (#1414)
  - Improved handling of invalid values for constrained Parameters (#1408)
  - Improvements on types/function annotations (#1406, #1420)
  - Documentation improvements (metalearning with GPs: #1382, coregionalization notebook: #1402, MCMC notebook: #1410, intro to gpflow with tensorflow 2: #1413)
  - Minor documentation fixes (#1429, #1430, #1433)
  - Fix: move matplotlib import inside ImageToTensorBoard (#1399)
  - Fix: tf.function compilation of ndiagquad (#1418)
  - Fix: cache tensorboard file writers and re-use them (#1424)
  15 April 2020, 12:19:00 UTC
3fc050d Merge master (at 2.0.0) into develop (pull request #1437) 15 April 2020, 11:33:35 UTC
641b8ad Merge branch 'develop' of github.com:GPflow/GPflow into develop-master-merge 15 April 2020, 10:48:26 UTC
8afd2f9 Merge remote-tracking branch 'origin/master' into develop-master-merge 15 April 2020, 10:37:40 UTC
a6156e8 bump version to 2.0.1 (#1426) 15 April 2020, 10:14:37 UTC
2018243 fix kernel construction in multioutput notebook (#1430) * fix kernel construction in multioutput notebook * fix one more kernel in changepoints notebook 15 April 2020, 10:14:20 UTC
3fd78aa fix other-issue.md issue template (#1432) The "other issue" template wasn't being pulled in by GitHub's "new issue chooser" (https://github.com/GPflow/GPflow/issues/new/choose) because of a parsing failure due to the use of quotes in the `name:` field, which is fixed by this PR (changed version obtained by putting the text with quotes into GitHub's "create an issue template" interface). Includes a few minor copyedits. 14 April 2020, 16:57:44 UTC
5a5603d fix pyplot import in notebooks (#1433) * fix pyplot import for matplotlib 3.1.3 (closes #1423) * apply same fix to other notebooks 14 April 2020, 16:56:36 UTC
4821d03 Issue templates (#1425) This gives the GPflow repository four issue templates: * bugs (including performance and build issues) * feature requests * documentation issues * other issues (pointing to the stackoverflow gpflow tag) This will hopefully make new issues more easily addressable. :) Co-authored-by: joelberkeley-pio <joel.berkeley@prowler.io> 14 April 2020, 14:59:19 UTC
806004d fix link in sphinx intro.rst (#1429) Closes #1428. 14 April 2020, 12:57:55 UTC
12dafe9 move matplotlib import inside ImageToTensorBoard (#1399) * move matplotlib import inside ImageToTensorBoard class so that it is not a hard dependency for GPflow (`import gpflow` does not require matplotlib, only instantiating `ImageToTensorBoard` does) * split up monitor into base.py and tensorboard.py * add matplotlib to extras_require 14 April 2020, 09:55:15 UTC
e652e75 Fix issue where multiple file writers were created (#1424) Cache summary file writers and re-use them in ToTensorBoard monitor task. Fixes #1385 10 April 2020, 20:28:52 UTC
8e8ea85 type ci utils module (#1420) 09 April 2020, 21:07:52 UTC
6388133 Improve user experience on invalid assignments to constrained Parameters (#1408) Addresses #1407. * attempt to improve the error message in the gpflow.Parameter check on assigning a new value that is incompatible with the parameter's transform (e.g. a non-positive value to a parameter with a positive() transform) * Gaussian likelihood: add explicit `__init__`-time check that variance > variance_lower_bound, add `__init__` docstring, move "default variance lower bound" magic number into class-level constant * change gpflow.config's positive_minimum to always be a float (initialized to 0.0 by default) 08 April 2020, 13:52:11 UTC
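A small, hedged illustration of the behaviour this PR improves (the values are arbitrary and the exception type shown is indicative only, not taken from the PR): assigning a value that is incompatible with a Parameter's transform should now fail with a clearer message.

```python
import gpflow
from gpflow.utilities import positive

# Parameter constrained to be positive; assigning a non-positive value is invalid.
variance = gpflow.Parameter(1.0, transform=positive())
try:
    variance.assign(-1.0)  # incompatible with the positive() transform
except Exception as err:   # the exact exception type is an implementation detail
    print(f"{type(err).__name__}: {err}")
```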
92f914d fix compilation issue in ndiagquad when integrating over multiple dimensions (#1418) 08 April 2020, 13:45:08 UTC
5c89c1e Improve structure of likelihoods subdirectory (#1416)
  Addresses #1405
  * New structure underneath gpflow/likelihoods/:
    * base.py: all base classes (Likelihood, MonteCarloLikelihood, ScalarLikelihood) and SwitchedLikelihood
    * multiclass.py: multi-class classification (Softmax, MultiClass + RobustMax)
    * scalar_continuous.py: continuous-Y subclasses of ScalarLikelihood (Gaussian, StudentT, Exponential, Beta, Gamma)
    * scalar_discrete.py: discrete-Y subclasses of ScalarLikelihood (Bernoulli, Poisson, Ordinal)
    * utils.py: the `inv_probit` link function used by Bernoulli and Beta likelihoods
    * misc.py: GaussianMC - used for demonstration/tests only.
    (Note that usage, i.e. accessing gpflow.likelihoods.<LikelihoodClass>, has not changed.)
  * Tests for multi-class classification likelihoods moved out into their own test module (including stubs for the missing MultiClass quadrature tests of #1091)
  * Re-activates the quadrature tests for ScalarLikelihood subclasses with analytic variational_expectations/predict_log_density/predict_mean_and_var that inadvertently got disabled by #1334
  * Fixes a bug in Bernoulli._predict_log_density that was uncovered by these tests
  * Fixes random seed for mock data generation in test_natural_gradient to make svgp_vs_gpr test pass again
  08 April 2020, 09:48:25 UTC
dda8c39 update gpflow2 upgrade guide: likelihood section, predict_log_density, IsotropicStationary (#1414) 08 April 2020, 09:25:06 UTC
e7cfb8e Match original experimental setup for "metalearning with GPs" notebook. (#1382) * Match original experiment setup from paper. Replicate the original setup published in Fortuin and Ratsch (2019) and heavily comment the code with references to the paper. * Run `make format` and reduce hyperparameters to ease computation. * Format MSE and std to 2 decimals and sort randomly permuted indices. * Describe experimental modifications made in the notebook. * Plot target task training points. * Plot prediction variance/uncertainty. * Go with N=500, use more explicit name. * Address comments from @st-- 07 April 2020, 19:38:30 UTC
8094187 Use AUTOTUNE from tf.data.experimental (#1413) Best practice demonstrated in our notebooks 06 April 2020, 11:56:58 UTC
90f9a12 Fix MCMC sampler in documentation (#1410) * Increase MCMC sampling * Fix section links 03 April 2020, 17:04:51 UTC
ea10dea Fix mypy errors -variables are not types- (#1406) 03 April 2020, 13:12:53 UTC
7417180 update README for 2.0 release (#1401) 02 April 2020, 11:49:10 UTC
4dddadc Add clarity to coregionalization example (#1402) 01 April 2020, 13:40:56 UTC
1b023cb fixup website (#1398) 31 March 2020, 18:50:35 UTC
59f3e2c Update circleci to TF2.1 docker image (#1397) 31 March 2020, 15:15:00 UTC
47e788a Release 2.0.0 (#1396) 31 March 2020, 13:19:27 UTC
4f9aff7 Bump version to 2.0.0 (#1395) 31 March 2020, 13:04:10 UTC
7e717cc refactor training objective methods (#1276)
  This gives all `BayesianModel` subclasses a consistent interface both for optimization (MLE/MAP) and MCMC. Models are required to implement `maximum_log_likelihood_objective`, which is to be maximized for model training.

  Optimization: The `_training_loss` method is defined as `- (maximum_log_likelihood_objective + log_prior_density)`. This is exposed by the InternalDataTrainingLossMixin and ExternalDataTrainingLossMixin classes. For models that keep hold of the data internally, `training_loss` can directly be passed as a closure to an optimizer's `minimize`, for example:

  ```python
  model = gpflow.models.GPR(data, ...)
  gpflow.optimizers.Scipy().minimize(model.training_loss, model.trainable_variables)
  ```

  If the model objective requires data to be passed in, a closure can be constructed on the fly using `model.training_loss_closure(data)`, which returns a no-argument closure:

  ```python
  model = gpflow.models.SVGP(...)
  gpflow.optimizers.Scipy().minimize(
      model.training_loss_closure(data), model.trainable_variables, ...
  )
  ```

  The training_loss_closure() method provided by both InternalDataTrainingLossMixin and ExternalDataTrainingLossMixin takes a boolean `compile` argument (default: True) that wraps the returned closure in tf.function(). Note that the return value should be cached in a variable if the minimize() step is run several times to avoid re-compilation in each step!

  MCMC: The `log_posterior_density` method can be directly passed to the `SamplingHelper`. By default, `log_posterior_density` is implemented as `maximum_log_likelihood_objective + log_prior_density`. Models can override this if needed. Example:

  ```python
  model = gpflow.models.GPMC(...)
  hmc_helper = gpflow.optimizers.SamplingHelper(
      model.log_posterior_density, model.trainable_parameters
  )
  hmc = tfp.mcmc.HamiltonianMonteCarlo(
      target_log_prob_fn=hmc_helper.target_log_prob_fn, ...
  )
  ```

  In this case, the function that runs the MCMC chain should be wrapped in tf.function() (see MCMC notebook).
  31 March 2020, 12:17:12 UTC
e61ee69 Add a section about custom user config (#1394) 31 March 2020, 10:00:53 UTC
27bc0d4 _scalar_log_density -> _scalar_log_prob (#1393) In GPflow 2, the log probability density itself (summed over multiple outputs, if applicable) was called log_prob to be consistent with tensorflow_probability. In #1334 the element-wise log probability density to be implemented by ScalarLikelihood subclasses was called _scalar_log_density. This PR renames the latter to _scalar_log_prob to make it more self-consistent. 31 March 2020, 08:39:35 UTC
baa8bee Update upgrade guide (#1390) 31 March 2020, 08:29:19 UTC
d2ac01f Improve doc generation (#1391) * clarifying doc generation * add automodule for module level docstrings 31 March 2020, 08:23:27 UTC
72de787 rename jit to compile (#1392) 30 March 2020, 17:58:43 UTC
1e936e0 Increased tolerance for sqrt in kernels K_r2. (#1388) We had a report of numerical problems when the tolerance was 1e-40. In GPflow 1 it was therefore changed to 1e-36, but this wasn't ported to GPflow 2, which is why we're fixing it now. 30 March 2020, 12:08:03 UTC
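A hedged sketch of the kind of guard this refers to (the function name and its placement are illustrative, not the actual GPflow code): flooring the squared distance before taking the square root keeps the value and its gradient finite when r² is numerically zero.

```python
import tensorflow as tf

def clipped_sqrt_r2(r2, lower_bound=1e-36):
    # Floor r2 at a small positive tolerance so that sqrt (and its gradient)
    # stays well-defined when two inputs coincide.
    return tf.sqrt(tf.maximum(r2, lower_bound))
```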
792201d Added suggestion on how to open TensorBoard so it updates. (#1389) 30 March 2020, 11:35:07 UTC
c8e8f00 enable pickling of frozen models (#1338) Changes setattr_by_path to delete the attribute before setting it again. When simply replacing a Parameter with tf.Constant, tensorflow did not update its internal list of tracked variables, which prevented deepcopying and pickling of frozen models. Explicitly deleting the tracked variable before assigning a constant value forces tensorflow to update its state. Workaround for https://github.com/tensorflow/tensorflow/issues/37806 30 March 2020, 11:20:14 UTC
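A minimal, hedged sketch of what this change enables (toy data, illustrative names): a frozen model has its Parameters replaced by constants and can then be deep-copied or pickled.

```python
import pickle
import numpy as np
import gpflow

X = np.random.rand(10, 1)
Y = np.sin(X)
model = gpflow.models.GPR((X, Y), kernel=gpflow.kernels.SquaredExponential())

frozen = gpflow.utilities.freeze(model)  # Parameters become constant tensors
blob = pickle.dumps(frozen)              # now possible thanks to this change
restored = pickle.loads(blob)
```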
0b9e1f0 MCMC cleanup (#1374) * SamplingHelper: clean up docstrings * SamplingHelper: do not convert to numpy in convert_to_constrained_samples * MCMC notebook: clean up; make faster (ci_niter); remove opaque use of utility function * MCMC notebook: clarify MCMC over Z for SGPMC - inducing points should be non-trainable * tests: use pytest.raises * tests: clean up match argument 30 March 2020, 11:17:24 UTC
1272141 fix notebook LaTeX equations + upgrade guide link (#1370) * fix notebook LaTeX equations * remove empty cells at bottom of notebooks * fix link to intro_to_gpflow2 in docs * Apply suggestions from code review Co-Authored-By: Vincent Dutordoir <dutordoirv@gmail.com> * fix internal notebook link * more fixes Co-authored-by: Vincent Dutordoir <dutordoirv@gmail.com> 30 March 2020, 10:33:32 UTC
aff62bc Make Scipy more robust & faster (#1377) * Adds an explicit check to Scipy() that all elements of `variables` are actually tf.Variable instances (unconstrained variables) - accidentally passing `model.trainable_parameters` may work but give the wrong answers for parameters with transforms! Passing `model.trainable_variables` is the right thing to do. * Updates _compute_loss_and_gradients to only compute gradients wrt passed-in variables. * Fix mixture density network notebook, which had erroneously passed model.trainable_parameters instead of model.trainable_variables. 30 March 2020, 09:32:58 UTC
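For illustration (toy data; a hedged sketch rather than code from this PR), the pattern the new check enforces is to hand Scipy() the unconstrained tf.Variables:

```python
import numpy as np
import gpflow

X = np.random.rand(10, 1)
Y = np.sin(X)
model = gpflow.models.GPR((X, Y), kernel=gpflow.kernels.Matern52())

opt = gpflow.optimizers.Scipy()
# Pass trainable_variables (unconstrained tf.Variables), not trainable_parameters.
opt.minimize(model.training_loss, model.trainable_variables)
```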
d41d6eb Set kernelspec to Py3 in monitoring.pct.py (#1386) Quickfix for the failing monitoring.pct.py notebook during the doc generation on CircleCi. Problem was caused by an unknown Python version in the notebook's kernelspec. It has to be 'name: python3', whereas conda may add a user-specific custom name there. 30 March 2020, 08:49:58 UTC
82999c9 minimum required multipledispatch version is 0.6 (#1384) As shown by #1381, gpflow does not work with older versions of multipledispatch - this change reflects this in the setup.py version requirements. 30 March 2020, 08:47:02 UTC
7ec7955 speed up VFF notebook (#1383) 28 March 2020, 16:12:50 UTC
30824d2 Cleaning some todo (#1347) 27 March 2020, 18:15:25 UTC
0d97bc0 MCMC notebook prior update & helper refactoring (#1372) 27 March 2020, 16:57:51 UTC
cf35758 moved multioutput module files (#1353) Moved all `mo_` files to be in their own multioutput directory. This should be more sustainable as the number of multioutput items increases 27 March 2020, 15:23:42 UTC
66010a4 Update Readme (#1378) * added slack badge * prepare for release 2.0 * update contrib list 27 March 2020, 14:23:02 UTC
b4168fa shared interface with inducing variables (#1311) Added a shared interface for multioutput inducing variables and kernels Co-authored-by: ST John <st@prowler.io> Co-authored-by: st-- <st--@users.noreply.github.com> 27 March 2020, 13:44:49 UTC
1196151 Update link to monitoring notebook in intro (#1376) 27 March 2020, 09:09:35 UTC
8d0128c fix Cosine kernel for multivariate inputs (#1357) The previous version operating on Euclidean distance was returning indefinite covariance matrices on multivariate data. This version, derived from eq. 4.7 of Wilson (2014), is always positive semidefinite. Closes #1328. This PR also changes the definition of the cosine kernel slightly, from sigma * cos(|x - x'| / l) to sigma * cos(2 * pi * (x - x') / l). This makes the lengthscale parameter directly interpretable as period length. It introduces new IsotropicStationary and AnisotropicStationary base classes. 26 March 2020, 21:02:24 UTC
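A small, hedged check of the property described above (random toy inputs; illustrative only): with the anisotropic definition, the Gram matrix on multivariate inputs stays positive semidefinite, and each lengthscale acts as a period length.

```python
import numpy as np
import gpflow

X = np.random.rand(50, 3)
kernel = gpflow.kernels.Cosine(lengthscales=[1.0, 2.0, 0.5])
K = kernel(X).numpy()
eigenvalues = np.linalg.eigvalsh(K)
print(eigenvalues.min() >= -1e-10)  # non-negative up to numerical error
```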
c442e87 Monitoring of optimisation (#1344) 26 March 2020, 18:45:58 UTC
118dcfe Fixing up broken notebook links (#1369) Fixing broken notebook links in the README, GPFlow intro, and gpflow2 upgrade guide. Mostly links to files which used to be .ipynb jupyter files, and are now jupytext files. Removing (commenting out) a couple links to now-removed notebooks. Updating link to monitoring notebook, such that #1344 just needs to uncomment that line 26 March 2020, 16:46:56 UTC
26ee155 fix typing in gpflow.optimizers.scipy (#1364) 25 March 2020, 19:08:28 UTC
9481826 update contributing.md (#1326) * updates * bring sphinx version up to date * travis->circleci, patch coverage * clarification * further cleanup * update copyright years in doc/source/conf.py 25 March 2020, 18:58:14 UTC
310dfef Move TODO into NotImplementedError message (#1365) Move TODO in uncertain_conditional into NotImplementedError message 25 March 2020, 09:19:30 UTC
718da13 Do not install `dataclasses` for python >= 3.7 (#1355) 24 March 2020, 21:17:03 UTC
89e573d Added references to the recently released multioutput paper (#1361) * Added references to the recently released multioutput paper. * Apply st--'s suggestions from code review Co-Authored-By: st-- <st--@users.noreply.github.com> * Changed latex math from '$$' to equation environment. Co-authored-by: st-- <st--@users.noreply.github.com> 24 March 2020, 18:11:23 UTC
834ed79 improve shape robustness in likelihoods (#1334)
  This gives GPflow likelihoods a stronger contract regarding what input shapes are expected and what shapes are returned. In particular, we should obey something akin to tensorflow_probability's event-shape/batch-shape/sample-shape. Very little changes for most users, except that some shapes will be asserted. Advanced users will benefit from more shape checks and better defined return shapes for methods attached to likelihoods.
  Likelihoods now need to define:
  - self.observation_dim: what is the last dimension of Y supposed to be?
  - self.latent_dim: what is the last dimension of F, F_mu and F_var expected to be?
  We'll check that the dimensions of tensors passed in match up. Return shapes for all methods will be the broadcast-shape of the tensors passed in, with the last dimension removed. Example: likelihood.variational_expectations(F_mu, F_var, Y) might take tensors of dimensions [..., 2], [..., 2], [..., 2] and return a tensor of shape [...]
  The shape checks are handled by the public methods log_prob, predict_mean_and_var, predict_log_density, and variational_expectations; new likelihoods should implement the corresponding private methods with leading underscore.
  ## Standard likelihoods
  Most likelihoods in GPflow are univariate, and treat columns of Y and F as independent variables. For these, observation_dim and latent_dim are None, and we shape-check F and Y on-the-fly for matching last-column dimensions. Note that the return shape contract changes as per the above, and the likelihood methods return the sum over observation dimensions.
  ## Fancy likelihoods
  Likelihoods that depart from the univariate standard include:
  - SwitchedLikelihood [we'll check that latent_dim = observation_dim - 1]
  - MultiClass/Softmax [observation_dim = 1, latent_dim = number of classes (e.g. 10 for MNIST)]
  - HeteroskedasticGaussian, e.g. see GPflow notebook [observation_dim = 1, latent_dim = 2]
  Note that this change deprecates Likelihood.predict_density in favour of Likelihood.predict_log_density.
  24 March 2020, 18:09:29 UTC
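A hedged illustration of the contract described above (toy shapes; the Gaussian likelihood is used only as an example): the inputs share a broadcast shape [..., D] and the returned tensor drops that last observation dimension.

```python
import tensorflow as tf
import gpflow

likelihood = gpflow.likelihoods.Gaussian()
F_mu = tf.zeros([7, 2], dtype=tf.float64)   # [..., latent_dim]
F_var = tf.ones([7, 2], dtype=tf.float64)   # [..., latent_dim]
Y = tf.zeros([7, 2], dtype=tf.float64)      # [..., observation_dim]

ve = likelihood.variational_expectations(F_mu, F_var, Y)
print(ve.shape)  # (7,) - summed over the 2 observation dimensions
```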
0919a92 Improve config documentation (#1267) 23 March 2020, 18:25:47 UTC
ce65114 gpflow2 upgrade guide: fix the Parameter.value -> Parameter.numpy() section (#1358) 23 March 2020, 13:40:55 UTC
43408ce Clean up type annotations and doc strings (#1346) * fix SGPR.__init__() argument types - fixes #1248 * improve predict_f_samples docstring - fixes #1249 * unify type annotations in models 23 March 2020, 12:11:53 UTC
1a7bb92 replace images with textual diff and add section on Parameter.trainable (#1356) * replace images with textual diff (removes the images) * move one level up as no longer requiring a separate directory * fix lengthscale -> lengthscales * additional section on Parameter.trainable 23 March 2020, 12:09:48 UTC
05647ca Fix "comprehensiveness" tests (#1340) There is a test in test_likelihoods that checks whether we missed any likelihood in the test - this is so we are reminded to add tests when we add new likelihoods! But the test was broken, and this PR fixes the test. I've also done the same for the kernel broadcasting test. 23 March 2020, 10:50:46 UTC
d47446f Clean up TODOs in the code (#1322) * clean up TODO in test_likelihoods.py * clean up TODO in kullback_leiblers * L-BFGS-B is fine as optimizer * move to tf based pca_reduce and add test 23 March 2020, 10:49:18 UTC
b32d13a Black formatting is missing for documentation and setup.py (#1349) 22 March 2020, 16:33:47 UTC
aaf9339 fix gast dependency (#1350) Fixes #1348 by * only adding gast as a requirement if tensorflow is not already installed; * checking the latest available version of tensorflow on PyPI and selecting a version of gast accordingly (gast<0.3 for tf<2.2 and gast>=0.3 for tf>=2.2) 21 March 2020, 10:59:27 UTC
67dc76d Fix SwitchedLikelihood vs num_latent_gps bug (#1316) Fixes #951 in gpflow-2. Adds methods to GPModel that compute the number of latent GPs required by the combination of kernel, likelihood, and data. Co-authored-by: Eric Hammy <6815729+condnsdmatters@users.noreply.github.com> 18 March 2020, 18:13:42 UTC
f36052b assert_shapes in code instead of just comments (#1219) Adds shape asserts to some of the modules listed in #1241: - conditionals.mo_conditionals (not 100%) - conditionals.util - kullback_leiblers - logdensities (only `multivariate_normal`) 18 March 2020, 14:32:52 UTC
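As a hedged sketch of the kind of assertion added (the function and symbolic dimension names below are illustrative, not the actual GPflow code):

```python
import tensorflow as tf

def check_conditional_shapes(Kmn, Kmm, f):
    # Symbolic dimensions: M inducing points, N data points, R output columns.
    tf.debugging.assert_shapes([
        (Kmn, ["M", "N"]),
        (Kmm, ["M", "M"]),
        (f, ["M", "R"]),
    ])
```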
d3ca3c9 remove trainable setter on Parameter (#1323) TensorFlow does not allow setting a tf.Variable's `trainable` property; this brings GPflow Parameter objects in line with that behaviour. gpflow.utilities.set_trainable() can be used to change the _trainable state for gpflow.Parameter and tf.Variable objects as well as recursively for tf/gpflow Modules. Note that set_trainable() should not be used inside functions that will be wrapped in tf.function() (see #1335). 17 March 2020, 19:12:46 UTC
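For illustration (toy model; a hedged sketch, not code from this PR), the utility-based replacement for the removed setter looks like:

```python
import numpy as np
import gpflow
from gpflow.utilities import set_trainable

X = np.random.rand(10, 1)
Y = np.sin(X)
model = gpflow.models.GPR((X, Y), kernel=gpflow.kernels.SquaredExponential())

# Instead of `model.kernel.lengthscales.trainable = False` (no longer allowed):
set_trainable(model.kernel.lengthscales, False)  # single Parameter
set_trainable(model.likelihood, False)           # recursively, for a whole Module
```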
e680397 log_prior -> log_prior_density (#1329) 17 March 2020, 09:27:35 UTC
516f97c Fix more potential tf.cast() loss-of-precision issues (#1213) 17 March 2020, 00:56:12 UTC
bb099e4 Fix predict_f_samples (#1327) Fix model.predict_f_samples 16 March 2020, 18:24:10 UTC
6d989a8 Rename model.predict_... arguments to Xnew (#1278) 16 March 2020, 15:02:01 UTC
e563056 Rename lengthscale to lengthscales (#1324) Rename lengthscale to lengthscales 16 March 2020, 13:47:39 UTC
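A one-line, hedged illustration of the renamed keyword (values are arbitrary):

```python
import gpflow

# `lengthscale=...` becomes `lengthscales=...`; a list gives per-dimension (ARD) values.
kernel = gpflow.kernels.SquaredExponential(lengthscales=[1.0, 2.0])
```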
8704514 rename kernel full to full_cov (#1319) 13 March 2020, 23:38:43 UTC
e7879b0 Reactivate notebook tests (#1317) * fix notebooks tests * add test to ensure notebook tests cannot silently fail anymore * fix num_latent -> num_latent_gps in notebooks * clarify comment in notebook 13 March 2020, 21:28:34 UTC