Revision c4b2c08fcdc2664e886be357161da815abb8f2bc authored by Jean Kossaifi on 27 August 2017, 20:04:05 UTC, committed by Jean Kossaifi on 27 August 2017, 20:04:05 UTC
Updated doc and website
1 parent 72f040b
Path: tensorly/regression/tucker_regression.py

To reference or cite the objects present in the Software Heritage archive, permalinks based on SoftWare Hash IDentifiers (SWHIDs) must be used. The SWHIDs of the objects browsed here are:

  • revision: swh:1:rev:c4b2c08fcdc2664e886be357161da815abb8f2bc
  • directory: swh:1:dir:4baeb3f15a23683b7de9a68cafd62ac76427963c
  • content: swh:1:cnt:2f8a25d7879b9fddb6993a2a297361fba48677a2

tucker_regression.py
import numpy as np
from ..base import unfold, vec_to_tensor
from ..base import partial_tensor_to_vec, partial_unfold
from ..tenalg import kronecker
from ..tucker_tensor import tucker_to_tensor, tucker_to_vec
from ..random import check_random_state
from .. import backend as T

# Author: Jean Kossaifi

# License: BSD 3 clause



class TuckerRegressor():
    """Tucker tensor regression

        Learns a low-rank Tucker-structured weight tensor for the regression

    Parameters
    ----------
    weight_ranks : int list
        dimension of each mode of the core Tucker weight
    tol : float
        convergence tolerance: optimisation stops when the relative change
        in the norm of the weight tensor falls below this value
    reg_W : int, optional, default is 1
        regularisation coefficient applied to the weights
    n_iter_max : int, optional, default is 100
        maximum number of iterations
    random_state : None, int or RandomState, optional, default is None
        seed or random number generator used to initialise the weights
    verbose : int, default is 1
        level of verbosity
    """

    def __init__(self, weight_ranks, tol=10e-7, reg_W=1, n_iter_max=100, random_state=None, verbose=1):
        self.weight_ranks = weight_ranks
        self.tol = tol
        self.reg_W = reg_W
        self.n_iter_max = n_iter_max
        self.random_state = random_state
        self.verbose = verbose

    def get_params(self, **kwargs):
        """Returns a dictionary of parameters
        """
        params = ['weight_ranks', 'tol', 'reg_W', 'n_iter_max', 'random_state', 'verbose']
        return {param_name: getattr(self, param_name) for param_name in params}

    def set_params(self, **parameters):
        """Sets the value of the provided parameters"""
        for parameter, value in parameters.items():
            setattr(self, parameter, value)
        return self

    def fit(self, X, y):
        """Fits the model to the data (X, y)

        Parameters
        ----------
        X : ndarray of shape (n_samples, N1, ..., NS)
            tensor data
        y : array of shape (n_samples)
            labels associated with each sample

        Returns
        -------
        self
        """
        rng = check_random_state(self.random_state)

        # Randomly initialise the core G and the factors W of the Tucker weight
        G = T.tensor(rng.randn(*self.weight_ranks))
        W = []
        for i in range(1, X.ndim):  # First dimension of X = number of samples
            W.append(T.tensor(rng.randn(X.shape[i], G.shape[i - 1])))

        # Norm of the weight tensor at each iteration
        norm_W = []

        for iteration in range(self.n_iter_max):

            # Optimise each factor W[i] in turn, keeping the other factors and the core G fixed
            for i in range(len(W)):
                phi = partial_tensor_to_vec(
                            T.dot(partial_unfold(X, i),
                                  T.dot(kronecker(W, skip_matrix=i),
                                          unfold(G, i).T)))
                # Regress phi on y: we could call a package here, e.g. scikit-learn
                inv_term = T.dot(phi.T, phi) + self.reg_W * T.tensor(np.eye(phi.shape[1]))
                W_i = vec_to_tensor(T.solve(inv_term, T.dot(phi.T, y)),
                                    (X.shape[i + 1], G.shape[i]))
                W[i] = W_i

            # Refit the core G by ridge regression, given the updated factors
            phi = T.dot(partial_tensor_to_vec(X), kronecker(W))
            G = vec_to_tensor(T.solve(T.dot(phi.T, phi) + self.reg_W * T.tensor(np.eye(phi.shape[1])),
                                      T.dot(phi.T, y)), G.shape)

            weight_tensor_ = tucker_to_tensor(G, W)
            norm_W.append(T.norm(weight_tensor_, 2))

            # Convergence check
            if iteration > 1:
                weight_evolution = abs(norm_W[-1] - norm_W[-2]) / norm_W[-1]

                if (weight_evolution <= self.tol):
                    if self.verbose:
                        print('\nConverged in {} iterations'.format(iteration))
                    break

        self.weight_tensor_ = weight_tensor_
        self.tucker_weight_ = (G, W)
        self.vec_W_ = tucker_to_vec(G, W)
        self.n_iterations_ = iteration + 1
        self.norm_W_ = norm_W

        return self

    def predict(self, X):
        """Returns the predicted labels for a new data tensor

        Parameters
        ----------
        X : ndarray
            tensor data of shape (n_samples, N1, ..., NS)
        """
        return T.dot(partial_tensor_to_vec(X), self.vec_W_)
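
For context, here is a minimal usage sketch of the estimator defined above. It is not part of the archived file: it assumes the module is importable as tensorly.regression.tucker_regression (matching the path above), that the active tensorly backend is NumPy, and it uses synthetic random data purely for illustration.

# Minimal usage sketch (illustrative only, not part of the archived file).
# Assumes tensorly.regression.tucker_regression is importable and the
# active backend is NumPy.
import numpy as np
from tensorly.regression.tucker_regression import TuckerRegressor

rng = np.random.RandomState(0)

# 100 samples, each a 5 x 5 tensor
X = rng.randn(100, 5, 5)

# Low-rank ground-truth weight and near-noiseless labels y_i = <X_i, W_true>
W_true = np.outer(rng.randn(5), rng.randn(5))            # rank-1 weight matrix
y = X.reshape(100, -1).dot(W_true.ravel()) + 0.01 * rng.randn(100)

# Fit a Tucker regression with a 2 x 2 core
estimator = TuckerRegressor(weight_ranks=[2, 2], tol=1e-6,
                            n_iter_max=100, reg_W=1, verbose=0)
estimator.fit(X, y)

y_pred = estimator.predict(X)            # predicted labels, shape (100,)
print(estimator.weight_tensor_.shape)    # learned full weight tensor, (5, 5)

The learned weight is exposed both as the full tensor weight_tensor_ (tucker_to_tensor(G, W)) and in vectorised form vec_W_, which predict uses to compute the inner product between each sample and the weight tensor.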