GPflow
======

GPflow is a package for building Gaussian Process models in Python, using `TensorFlow
<http://www.tensorflow.org>`_. A Gaussian Process is a kind of supervised learning model.
Some advantages of Gaussian Processes are:

* Uncertainty is an inherent part of Gaussian Processes. A Gaussian Process can tell you when
  it does not know the answer.
* Works well with small datasets. If your data is limited, Gaussian Processes can get the most out of
  it.
* Can scale to large datasets. Although Gaussian Processes can, admittedly, be computationally
  intensive, there are ways to scale them to large datasets.
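
As a quick illustration of the basic workflow, here is a minimal sketch of fitting an exact GP
regression model (the toy data and the squared-exponential kernel below are arbitrary placeholder
choices):

.. code-block:: python

    import numpy as np
    import gpflow

    # Toy 1D regression data (placeholder values).
    X = np.linspace(0.0, 1.0, 20).reshape(-1, 1)
    Y = np.sin(10.0 * X) + 0.1 * np.random.randn(20, 1)

    # Exact GP regression with a squared-exponential kernel.
    model = gpflow.models.GPR((X, Y), kernel=gpflow.kernels.SquaredExponential())

    # Maximum-likelihood estimation of the kernel and noise parameters.
    gpflow.optimizers.Scipy().minimize(model.training_loss, model.trainable_variables)

    # Posterior mean and variance at new inputs.
    Xnew = np.linspace(0.0, 1.0, 100).reshape(-1, 1)
    mean, var = model.predict_f(Xnew)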

GPflow was originally created by `James Hensman <http://www.lancaster.ac.uk/staff/hensmanj/>`_
and `Alexander G. de G. Matthews <http://mlg.eng.cam.ac.uk/?portfolio=alex-matthews>`_.
Today it is primarily maintained by the company `Secondmind <https://www.secondmind.ai/>`_.


Documentation
-------------

If you're new to GPflow, we suggest you continue with:

.. toctree::
   :maxdepth: 2

   getting_started

For more in-depth documentation see:

.. toctree::
   :maxdepth: 1

   user_guide
   API reference <api/gpflow/index>
   benchmarks
   bibliography

.. _implemented_models:


What models are implemented?
----------------------------
GPflow has a slew of kernels that can be combined in a straightforward way. As for inference, the options are currently:

Regression
""""""""""
For GP regression with Gaussian noise, it's possible to marginalize the function values exactly: you'll find this in :class:`gpflow.models.GPR`. You can do maximum likelihood or MCMC for the covariance function parameters.

It's also possible to do Sparse GP regression using the :class:`gpflow.models.SGPR` class. This is based on work by Michalis Titsias :cite:p:`titsias2009variational`.
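
As a brief sketch of the sparse variant, the same workflow applies with an additional set of
inducing points (the data and inducing-point locations below are placeholder choices):

.. code-block:: python

    import numpy as np
    import gpflow

    # Placeholder data; in practice X and Y come from your dataset.
    X = np.random.rand(500, 1)
    Y = np.sin(10.0 * X) + 0.1 * np.random.randn(500, 1)

    # A small set of inducing points summarizes the full dataset.
    Z = np.linspace(0.0, 1.0, 20).reshape(-1, 1)

    model = gpflow.models.SGPR(
        (X, Y), kernel=gpflow.kernels.SquaredExponential(), inducing_variable=Z
    )
    gpflow.optimizers.Scipy().minimize(model.training_loss, model.trainable_variables)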

MCMC
""""
For non-Gaussian likelihoods, GPflow has a model that can jointly sample over the function values and the covariance parameters: :class:`gpflow.models.GPMC`. There's also a sparse equivalent in :class:`gpflow.models.SGPMC`, based on :cite:t:`hensman2015mcmc`.
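
A rough sketch of how such a model might be set up, assuming a binary classification task and
illustrative sampler settings (see the MCMC notebook for a complete treatment): the model's log
posterior is handed to a TensorFlow Probability sampler via :class:`gpflow.optimizers.SamplingHelper`.

.. code-block:: python

    import numpy as np
    import tensorflow_probability as tfp
    import gpflow

    # Placeholder binary classification data.
    X = np.random.rand(50, 1)
    Y = (np.random.rand(50, 1) > 0.5).astype(float)

    model = gpflow.models.GPMC(
        (X, Y), kernel=gpflow.kernels.Matern52(), likelihood=gpflow.likelihoods.Bernoulli()
    )
    # In practice you would also place priors on the kernel parameters before sampling them.

    # SamplingHelper exposes the unconstrained parameters and log posterior to TFP.
    helper = gpflow.optimizers.SamplingHelper(
        model.log_posterior_density, model.trainable_parameters
    )
    hmc = tfp.mcmc.HamiltonianMonteCarlo(
        target_log_prob_fn=helper.target_log_prob_fn, num_leapfrog_steps=10, step_size=0.01
    )
    samples, _ = tfp.mcmc.sample_chain(
        num_results=100,
        current_state=helper.current_state,
        kernel=hmc,
        trace_fn=lambda *args: (),
    )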

Variational inference
"""""""""""""""""""""
It's often sufficient to approximate the function values as a Gaussian, for which we follow :cite:t:`Opper:2009` in :class:`gpflow.models.VGP`. In addition, there is a sparse version based on :cite:t:`hensman2014scalable` in :class:`gpflow.models.SVGP`. In the Gaussian likelihood case, some of the optimization may be done analytically as discussed in :cite:t:`titsias2009variational` and implemented in :class:`gpflow.models.SGPR`. All of the sparse methods in GPflow are solidified in :cite:t:`matthews2016sparse`.
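
A minimal sketch of constructing and training the sparse variational model (the data, inducing
inputs, and Gaussian likelihood here are placeholder assumptions):

.. code-block:: python

    import numpy as np
    import gpflow

    X = np.random.rand(1000, 1)
    Y = np.sin(10.0 * X) + 0.1 * np.random.randn(1000, 1)
    Z = np.linspace(0.0, 1.0, 30).reshape(-1, 1)  # inducing inputs

    # SVGP keeps the data outside the model, which enables minibatch training.
    model = gpflow.models.SVGP(
        kernel=gpflow.kernels.SquaredExponential(),
        likelihood=gpflow.likelihoods.Gaussian(),
        inducing_variable=Z,
        num_data=len(X),
    )
    loss = model.training_loss_closure((X, Y))  # or build it from a minibatch iterator
    gpflow.optimizers.Scipy().minimize(loss, model.trainable_variables)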

The following table summarizes the model options in GPflow.

+----------------------+----------------------------+----------------------------+------------------------------+
|                      | Gaussian                   | Non-Gaussian (variational) | Non-Gaussian                 |
|                      | Likelihood                 |                            | (MCMC)                       |
+======================+============================+============================+==============================+
| Full-covariance      | :class:`gpflow.models.GPR` | :class:`gpflow.models.VGP` | :class:`gpflow.models.GPMC`  |
+----------------------+----------------------------+----------------------------+------------------------------+
| Sparse approximation | :class:`gpflow.models.SGPR`| :class:`gpflow.models.SVGP`| :class:`gpflow.models.SGPMC` |
+----------------------+----------------------------+----------------------------+------------------------------+

A unified view of many of the relevant references, along with some extensions, and an early discussion of GPflow itself, is given in the PhD thesis of Matthews :cite:p:`matthews2017scalable`.

Interdomain inference and multioutput GPs
"""""""""""""""""""""""""""""""""""""""""
GPflow has an extensive and flexible framework for specifying interdomain inducing variables for variational approximations.
Interdomain variables can greatly improve the effectiveness of a variational approximation, and are used in e.g.
:doc:`notebooks/advanced/convolutional`. In particular, they are crucial for defining sensible sparse
approximations for multioutput GPs (:doc:`notebooks/advanced/multioutput`).

GPflow has a unifying design for using multioutput GPs and specifying interdomain approximations. A review of the
mathematical background and the resulting software design is described in :cite:t:`GPflow2020multioutput`.
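
As a rough sketch of the multioutput interface (the number of outputs, shared-kernel choice, and
inducing inputs are assumptions for illustration), independent latent GPs with a shared kernel and
shared inducing inputs can be combined in an SVGP model:

.. code-block:: python

    import numpy as np
    import gpflow

    P = 3  # number of outputs
    L = 3  # number of latent GPs
    Z = np.linspace(0.0, 1.0, 20).reshape(-1, 1)

    kernel = gpflow.kernels.SharedIndependent(
        gpflow.kernels.SquaredExponential(), output_dim=P
    )
    inducing_variable = gpflow.inducing_variables.SharedIndependentInducingVariables(
        gpflow.inducing_variables.InducingPoints(Z)
    )
    model = gpflow.models.SVGP(
        kernel, gpflow.likelihoods.Gaussian(), inducing_variable, num_latent_gps=L
    )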

GPLVM
"""""
For visualisation, the GPLVM :cite:p:`lawrence2003gaussian` and Bayesian GPLVM :cite:p:`titsias2010bayesian` models are implemented
in GPflow (:doc:`notebooks/advanced/GPLVM`).
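
A loose sketch of setting up the Bayesian GPLVM, assuming high-dimensional observations ``Y`` and
borrowing a PCA-style initialization of the latent means (the latent dimensionality and
inducing-input choice are illustrative):

.. code-block:: python

    import numpy as np
    import tensorflow as tf
    import gpflow

    Y = np.random.randn(100, 12)  # placeholder high-dimensional observations
    latent_dim = 2

    # PCA-based initialization of the latent means.
    X_mean_init = gpflow.utilities.ops.pca_reduce(tf.convert_to_tensor(Y), latent_dim)
    X_var_init = tf.ones((Y.shape[0], latent_dim), dtype=gpflow.default_float())
    Z = X_mean_init.numpy()[:20].copy()  # illustrative choice of inducing inputs

    model = gpflow.models.BayesianGPLVM(
        Y,
        X_data_mean=X_mean_init,
        X_data_var=X_var_init,
        kernel=gpflow.kernels.SquaredExponential(),
        inducing_variable=gpflow.inducing_variables.InducingPoints(Z),
    )
    gpflow.optimizers.Scipy().minimize(model.training_loss, model.trainable_variables)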

Heteroskedastic models
""""""""""""""""""""""
GPflow supports heteroskedastic models by configuring a likelihood object. See examples in :doc:`notebooks/advanced/varying_noise` and :doc:`notebooks/advanced/heteroskedastic`.
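
As a rough sketch of the likelihood-driven construction (following the pattern of the
heteroskedastic notebook; the kernel and inducing-input choices are illustrative): one latent GP
models the mean, a second models the noise scale, and a single likelihood object combines them:

.. code-block:: python

    import numpy as np
    import tensorflow_probability as tfp
    import gpflow

    # One latent GP for the mean, one for the (log) noise scale.
    likelihood = gpflow.likelihoods.HeteroskedasticTFPConditional(
        distribution_class=tfp.distributions.Normal,
        scale_transform=tfp.bijectors.Exp(),
    )
    kernel = gpflow.kernels.SeparateIndependent(
        [gpflow.kernels.SquaredExponential(), gpflow.kernels.SquaredExponential()]
    )
    Z = np.linspace(0.0, 1.0, 20).reshape(-1, 1)
    inducing_variable = gpflow.inducing_variables.SeparateIndependentInducingVariables(
        [
            gpflow.inducing_variables.InducingPoints(Z),
            gpflow.inducing_variables.InducingPoints(Z.copy()),
        ]
    )
    model = gpflow.models.SVGP(
        kernel, likelihood, inducing_variable, num_latent_gps=likelihood.latent_dim
    )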


Contact
-------

* GPflow is an open source project, and you can find this project on `GitHub
  <https://github.com/GPflow/GPflow>`_.
* If you find any bugs, please `file a ticket <https://github.com/GPflow/GPflow/issues/new/choose>`_.
* If you need help, please use `Stack Overflow <https://stackoverflow.com/tags/gpflow>`_.
* If you otherwise need to contact us, the easiest way to get in touch is
  through our `Slack workspace
  <https://join.slack.com/t/gpflow/shared_invite/enQtOTE5MDA0Nzg5NjA2LTYwZWI3MzhjYjNlZWI1MWExYzZjMGNhOWIwZWMzMGY0YjVkYzAyYjQ4NjgzNDUyZTgyNzcwYjAyY2QzMWRmYjE>`_.


If you feel you have some relevant skills and are interested in contributing then please read our
`notes for contributors <https://github.com/GPflow/GPflow/blob/develop/CONTRIBUTING.md>`_ and contact
us. We maintain a `full list of contributors
<https://github.com/GPflow/GPflow/blob/develop/CONTRIBUTORS.md>`_.


Citing GPflow
-------------

To cite GPflow, please reference :cite:t:`GPflow2017`. Sample BibTeX is given below:

.. code-block:: bib

    @ARTICLE{GPflow2017,
        author = {Matthews, Alexander G. de G. and
                  {van der Wilk}, Mark and
                  Nickson, Tom and
Fujii, Keisuke and
                  {Boukouvalas}, Alexis and
                  {Le{\'o}n-Villagr{\'a}}, Pablo and
                  Ghahramani, Zoubin and
                  Hensman, James},
        title = "{{GP}flow: A {G}aussian process library using {T}ensor{F}low}",
        journal = {Journal of Machine Learning Research},
        year = {2017},
        month = {apr},
        volume = {18},
        number = {40},
        pages = {1-6},
        url = {http://jmlr.org/papers/v18/16-537.html}
    }

Since the publication of the GPflow paper, the software has been significantly extended
with the framework for interdomain approximations and multioutput priors. We review the
framework and describe the design in :cite:t:`GPflow2020multioutput`, which can be cited by users:

.. code-block:: bib

    @article{GPflow2020multioutput,
      author = {{van der Wilk}, Mark and
                Dutordoir, Vincent and
                John, ST and
                Artemev, Artem and
                Adam, Vincent and
                Hensman, James},
      title = {A Framework for Interdomain and Multioutput {G}aussian Processes},
      year = {2020},
      journal = {arXiv:2003.01115},
      url = {https://arxiv.org/abs/2003.01115}
    }


Acknowledgements
----------------

James Hensman was supported by an MRC fellowship and Alexander G. de G. Matthews was supported by EPSRC grants EP/I036575/1 and EP/N014162/1.