https://github.com/GPflow/GPflow
Revision b41d4f38436e4a090c940dbd3bc7e2afd39a283e authored by st-- on 23 April 2020, 18:17:42 UTC, committed by GitHub on 23 April 2020, 18:17:42 UTC
Previously, GPflow's `NaturalGradient` optimizer would call the loss function once for each (q_mu, q_sqrt) pair in `var_list`. This light refactor separates applying the natural-gradient step from computing the gradients (`_natgrad_apply_gradients`), and changes `_natgrad_steps` to evaluate the loss function only once, computing the gradients for all (q_mu, q_sqrt) tuples passed in `var_list`.

Other changes:
- The no-longer-used `_natgrad_step` method was removed.
- `NaturalGradient` now takes an `xi_transform` argument that is used for all parameter sets without an explicitly specified xi transform (i.e., tuples rather than triplets).
- `XiTransform`'s methods have been changed to staticmethods.

None of this should affect any downstream code; this PR is backwards-compatible.
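The core idea of the refactor — one forward pass through the loss, with gradients taken for every (q_mu, q_sqrt) pair at once — can be sketched with plain TensorFlow. This is an illustrative toy, not GPflow's actual implementation: `var_list`, `loss_function`, and the re-pairing of gradients are hypothetical stand-ins for the real variational parameters and `_natgrad_steps` internals.

```python
import tensorflow as tf

# Hypothetical stand-ins for the (q_mu, q_sqrt) variational parameter pairs;
# GPflow wraps the real ones in gpflow.Parameter objects.
var_list = [
    (tf.Variable([1.0, 2.0]), tf.Variable([0.5, 0.5])),
    (tf.Variable([3.0]), tf.Variable([0.1])),
]

def loss_function():
    # Toy loss that touches every parameter pair.
    return sum(
        tf.reduce_sum(q_mu ** 2) + tf.reduce_sum(q_sqrt ** 2)
        for q_mu, q_sqrt in var_list
    )

# Single loss evaluation, gradients for all pairs in one tape pass —
# instead of re-evaluating the loss once per (q_mu, q_sqrt) pair.
flat_vars = [v for pair in var_list for v in pair]
with tf.GradientTape() as tape:
    loss = loss_function()
grads = tape.gradient(loss, flat_vars)

# Re-pair the flat gradient list so each (dq_mu, dq_sqrt) tuple can be
# handed to a separate apply step (the _natgrad_apply_gradients analogue).
grad_pairs = list(zip(grads[0::2], grads[1::2]))
```

With this split, the gradient computation happens once per optimization step, while the per-pair natural-gradient update can still be applied independently to each tuple.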
1 parent c7550ce
refactor natgrads to be more efficient (#1443)
.gitignore
# Byte-compiled / optimized / DLL files
__pycache__/
*.py[cod]

# C extensions
*.so

# Distribution / packaging
.Python
env/
build/
develop-eggs/
dist/
downloads/
eggs/
.eggs/
lib/
lib64/
parts/
sdist/
var/
*.egg-info/
.installed.cfg
*.egg

# PyInstaller
#  Usually these files are written by a python script from a template
#  before PyInstaller builds the exe, so as to inject date/other infos into it.
*.manifest
*.spec

# Installer logs
pip-log.txt
pip-delete-this-directory.txt

# Unit test / coverage reports
htmlcov/
.tox/
.coverage
.coverage.*
.cache
nosetests.xml
coverage.xml
*,cover

# Translations
*.mo
*.pot

# Django stuff:
*.log

# Sphinx documentation
docs/_build/

# PyBuilder
target/

# Emacs backups
*~

# Pycharm IDE directory
.idea

# IPython Notebooks
.ipynb_checkpoints

# VSCode
.vscode

# OSX
.DS_Store

# mypy artifacts
.mypy_cache