conditionals.py
# Copyright 2017-2020 The GPflow Contributors. All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

from typing import Optional

import tensorflow as tf

from ..base import MeanAndVariance
from ..experimental.check_shapes import check_shapes
from ..inducing_variables import InducingVariables
from ..kernels import Kernel
from ..posteriors import BasePosterior, VGPPosterior, get_posterior_class
from .dispatch import conditional


@conditional._gpflow_internal_register(object, InducingVariables, Kernel, object)
@check_shapes(
    "Xnew: [batch..., N, D]",
    "inducing_variable: [M, D, broadcast R]",
    "f: [M, R]",
    "q_sqrt: [M_R_or_R_M_M...]",
    "return[0]: [batch..., N, R]",
    "return[1]: [batch..., N, R] if (not full_cov) and (not full_output_cov)",
    "return[1]: [batch..., R, N, N] if full_cov and (not full_output_cov)",
    "return[1]: [batch..., N, R, R] if (not full_cov) and full_output_cov",
    "return[1]: [batch..., N, R, N, R] if full_cov and full_output_cov",
)
def _sparse_conditional(
    Xnew: tf.Tensor,
    inducing_variable: InducingVariables,
    kernel: Kernel,
    f: tf.Tensor,
    *,
    full_cov: bool = False,
    full_output_cov: bool = False,
    q_sqrt: Optional[tf.Tensor] = None,
    white: bool = False,
) -> MeanAndVariance:
    """
    Single-output GP conditional.

    The covariance matrices used to calculate the conditional have the following shape:

    - Kuu: [M, M]
    - Kuf: [M, N]
    - Kff: [N, N]

    Further reference:

    - See `gpflow.conditionals._dense_conditional` (below) for a detailed explanation of
      the conditional in the single-output case.
    - See the multioutput notebook for more information about the multioutput framework.

    :param Xnew: data matrix; evaluate the GP at these new points.
    :param f: data matrix of function values at the inducing points, for R functions.
    :param full_cov: return the covariance between the datapoints
    :param full_output_cov: return the covariance between the outputs.
        NOTE: as we are using a single-output kernel with repetitions
        these covariances will be zero.
    :param q_sqrt: matrix of standard-deviations or Cholesky matrices,
        size [M, R] or [R, M, M].
    :param white: boolean of whether to use the whitened representation
    :return: mean and variance
    """
    posterior_class = get_posterior_class(kernel, inducing_variable)

    posterior: BasePosterior = posterior_class(
        kernel,
        inducing_variable,
        f,
        q_sqrt,
        whiten=white,
        mean_function=None,
        precompute_cache=None,
    )
    return posterior.fused_predict_f(Xnew, full_cov=full_cov, full_output_cov=full_output_cov)
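

# NOTE: illustrative usage sketch, not part of the original GPflow module.
# A minimal example, assuming GPflow 2.x where `InducingPoints` lives in
# `gpflow.inducing_variables` and `SquaredExponential` in `gpflow.kernels`;
# the helper name and the sizes below are made up for illustration. Calling
# the public `conditional` dispatcher with an `InducingVariables` instance
# resolves to `_sparse_conditional` above.
def _example_sparse_conditional() -> MeanAndVariance:
    from ..inducing_variables import InducingPoints
    from ..kernels import SquaredExponential

    M, N, D, R = 5, 10, 2, 1
    Z = tf.random.normal([M, D], dtype=tf.float64)  # inducing inputs, [M, D]
    Xnew = tf.random.normal([N, D], dtype=tf.float64)  # test inputs, [N, D]
    f = tf.random.normal([M, R], dtype=tf.float64)  # mean of q(u), [M, R]
    q_sqrt = tf.eye(M, batch_shape=[R], dtype=tf.float64)  # Cholesky factors, [R, M, M]

    iv = InducingPoints(Z)
    kernel = SquaredExponential()

    # Returns the marginal mean [N, R] and variance [N, R], since full_cov and
    # full_output_cov default to False; white=True selects the whitened
    # parameterisation described in the docstrings.
    return conditional(Xnew, iv, kernel, f, q_sqrt=q_sqrt, white=True)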


@conditional._gpflow_internal_register(object, object, Kernel, object)
@check_shapes(
    "Xnew: [batch..., N, D]",
    "X: [M, D]",
    "f: [M, R]",
    "return[0]: [batch..., N, R]",
    "return[1]: [batch..., N, R] if (not full_cov) and (not full_output_cov)",
    "return[1]: [batch..., R, N, N] if full_cov and (not full_output_cov)",
    "return[1]: [batch..., N, R, R] if (not full_cov) and full_output_cov",
    "return[1]: [batch..., N, R, N, R] if full_cov and full_output_cov",
)
def _dense_conditional(
    Xnew: tf.Tensor,
    X: tf.Tensor,
    kernel: Kernel,
    f: tf.Tensor,
    *,
    full_cov: bool = False,
    full_output_cov: bool = False,
    q_sqrt: Optional[tf.Tensor] = None,
    white: bool = False,
) -> MeanAndVariance:
    """
    Given f, representing the GP at the points X, produce the mean and
    (co-)variance of the GP at the points Xnew.

    Additionally, there may be Gaussian uncertainty about f as represented by
    q_sqrt. In this case `f` represents the mean of the distribution and
    q_sqrt the square-root of the covariance.

    Additionally, the GP may have been centered (whitened) so that::

        p(v) = 𝒩(𝟎, 𝐈)
        f = 𝐋v

    thus::

        p(f) = 𝒩(𝟎, 𝐋𝐋ᵀ) = 𝒩(𝟎, 𝐊).

    In this case `f` represents the values taken by v.

    The method can either return the diagonals of the covariance matrix for
    each output (default) or the full covariance matrix (full_cov=True).

    We assume R independent GPs, represented by the columns of f (and the
    first dimension of q_sqrt).

    :param Xnew: data matrix. Evaluate the GP at these new points
    :param X: data points.
    :param kernel: GPflow kernel.
    :param f: data matrix, representing the function values at X,
        for R functions.
    :param q_sqrt: matrix of standard-deviations or Cholesky matrices,
        size [M, R] or [R, M, M].
    :param white: boolean of whether to use the whitened representation as
        described above.
    :return: mean and variance
    """
    posterior = VGPPosterior(
        kernel=kernel,
        X=X,
        q_mu=f,
        q_sqrt=q_sqrt,
        white=white,
        precompute_cache=None,
    )
    return posterior.fused_predict_f(Xnew, full_cov=full_cov, full_output_cov=full_output_cov)
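

# NOTE: illustrative usage sketch, not part of the original GPflow module.
# A minimal example, assuming GPflow 2.x and `gpflow.kernels.SquaredExponential`;
# the helper name and the sizes below are made up for illustration. Passing a
# plain tensor of conditioning inputs as the second argument resolves to
# `_dense_conditional` above, which wraps the computation in a `VGPPosterior`.
def _example_dense_conditional() -> MeanAndVariance:
    from ..kernels import SquaredExponential

    M, N, D, R = 5, 10, 2, 1
    X = tf.random.normal([M, D], dtype=tf.float64)  # conditioning inputs, [M, D]
    Xnew = tf.random.normal([N, D], dtype=tf.float64)  # test inputs, [N, D]
    f = tf.random.normal([M, R], dtype=tf.float64)  # (whitened) values at X, [M, R]
    q_sqrt = tf.eye(M, batch_shape=[R], dtype=tf.float64)  # Cholesky factors, [R, M, M]

    kernel = SquaredExponential()

    # With white=True, `f` holds the values taken by v (where f = 𝐋v), as
    # described in the docstring above; returns mean [N, R] and variance [N, R].
    return conditional(Xnew, X, kernel, f, q_sqrt=q_sqrt, white=True)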
