{
 "cells": [
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "# Cross-approximation\n",
    "\n",
    "Often, we would like to build a $N$-dimensional tensor from a **black-box function** $f: \\Omega \\subset \\mathbb{R}^N \\to \\mathbb{R}$, where $\\Omega$ is a tensor product grid. That is, we are free to sample *whatever entries we want* within our domain $\\Omega$, but we cannot afford to sample the entire domain (as it contains an **exponentially large number of points**). One way to build such a tensor is using **cross-approximation** ([I. Oseledets, E. Tyrtyshnikov: \"TT-cross Approximation for Multidimensional Arrays\"](http://www.mat.uniroma2.it/~tvmsscho/papers/Tyrtyshnikov5.pdf)) from well-chosen fibers in the domain.\n",
    "\n",
    "We support two major use cases of cross-approximation in the TT format.\n",
    "\n",
    "## Approximating a Function over a Domain\n",
    "\n",
    "This is the more basic setting. We just need to specify:\n",
    "\n",
    "- Our function of interest\n",
    "- The tensor product domain $\\Omega = u_1 \\otimes \\dots \\otimes u_N$"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 1,
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "Cross-approximation over a 5D domain containing 3.35544e+07 grid points:\n",
      "iter: 0  | eps: 9.221e-01 | total time:   0.0110 | largest rank:   1\n",
      "iter: 1  | eps: 4.867e-03 | total time:   0.0350 | largest rank:   4\n",
      "iter: 2  | eps: 4.295e-06 | total time:   0.0609 | largest rank:   7\n",
      "iter: 3  | eps: 8.606e-09 | total time:   0.1027 | largest rank:  10 <- converged: eps < 1e-06\n",
      "Did 33984 function evaluations, which took 0.001594s (2.133e+07 evals/s)\n",
      "\n",
      "5D TT tensor:\n",
      "\n",
      " 32  32  32  32  32\n",
      "  |   |   |   |   |\n",
      " (0) (1) (2) (3) (4)\n",
      " / \\ / \\ / \\ / \\ / \\\n",
      "1   10  10  10  10  1\n",
      "\n"
     ]
    }
   ],
   "source": [
    "import tntorch as tn\n",
    "import torch\n",
    "torch.set_default_dtype(torch.float64)\n",
    "\n",
    "def function(x, y, z, t, w):  # Input arguments are vectors\n",
    "    return 1 / (x + y + z + t + w)  # Hilbert tensor\n",
    "\n",
    "domain = [torch.arange(1, 33) for n in range(5)]\n",
    "t = tn.cross(function=function, domain=domain)\n",
    "\n",
    "print(t)"
   ]
  },
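  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "The log above shows that iterations stop once the estimated relative error `eps` drops below `1e-6`. The stopping tolerance and iteration budget can usually be adjusted through keyword arguments of `tn.cross`; the cell below is only a sketch that assumes `eps`, `max_iter` and `verbose` are available in your tntorch version (check `help(tn.cross)`):"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "# Sketch: adjust the stopping tolerance and iteration budget of cross-approximation.\n",
    "# Assumes tn.cross accepts eps, max_iter and verbose keyword arguments; check help(tn.cross).\n",
    "t_tight = tn.cross(function=function, domain=domain,\n",
    "                   eps=1e-9,        # stop when the estimated relative error falls below 1e-9\n",
    "                   max_iter=10,     # give up after at most 10 iterations\n",
    "                   verbose=False)   # suppress the per-iteration log\n",
    "print(t_tight)"
   ]
  },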
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "Sometimes it's more convenient to work with functions that accept matrices (instead of a list of vectors) as input. We can do this with the `function_arg='matrix'` flag:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 2,
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "Cross-approximation over a 5D domain containing 3.35544e+07 grid points:\n",
      "iter: 0  | eps: 9.355e-01 | total time:   0.0138 | largest rank:   1\n",
      "iter: 1  | eps: 4.148e-03 | total time:   0.0341 | largest rank:   4\n",
      "iter: 2  | eps: 5.244e-06 | total time:   0.0610 | largest rank:   7\n",
      "iter: 3  | eps: 7.581e-09 | total time:   0.0961 | largest rank:  10 <- converged: eps < 1e-06\n",
      "Did 33984 function evaluations, which took 0.00437s (7.777e+06 evals/s)\n",
      "\n"
     ]
    }
   ],
   "source": [
    "def function(Xs):  # Matrix (one row per sample, one column per input variable) and return a vector with one result per sample\n",
    "    return 1/torch.sum(Xs, dim=1)\n",
    "\n",
    "t = tn.cross(function=function, domain=domain, function_arg='matrix')"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## Element-wise Operations on Tensors\n",
    "\n",
    "Here we have one (or several) $N$-dimensional tensors that we want to transform element-wise. For instance, we may want to square each element of our tensor:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 3,
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "Cross-approximation over a 5D domain containing 3.35544e+07 grid points:\n",
      "iter: 0  | eps: 9.539e-01 | total time:   0.0062 | largest rank:   1\n",
      "iter: 1  | eps: 2.066e-02 | total time:   0.0174 | largest rank:   4\n",
      "iter: 2  | eps: 5.644e-05 | total time:   0.0338 | largest rank:   7\n",
      "iter: 3  | eps: 6.255e-08 | total time:   0.0627 | largest rank:  10 <- converged: eps < 1e-06\n",
      "Did 33984 function evaluations, which took 0.0005157s (6.59e+07 evals/s)\n",
      "\n"
     ]
    }
   ],
   "source": [
    "t2 = tn.cross(function=lambda x: x**2, tensors=t)"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "Just for practice, let's do this now in a slightly different way by passing two tensors:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 4,
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "Cross-approximation over a 5D domain containing 3.35544e+07 grid points:\n",
      "iter: 0  | eps: 9.757e-01 | total time:   0.0081 | largest rank:   1\n",
      "iter: 1  | eps: 2.939e-02 | total time:   0.0228 | largest rank:   4\n",
      "iter: 2  | eps: 1.086e-04 | total time:   0.0440 | largest rank:   7\n",
      "iter: 3  | eps: 8.331e-08 | total time:   0.0675 | largest rank:  10 <- converged: eps < 1e-06\n",
      "Did 33984 function evaluations, which took 0.0005171s (6.572e+07 evals/s)\n",
      "\n"
     ]
    }
   ],
   "source": [
    "t2 = tn.cross(function=lambda x, y: x*y, tensors=[t, t])"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "Let's check the accuracy of our cross-approximated squaring operation, compared to the groundtruth `t*t`:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 5,
   "metadata": {},
   "outputs": [
    {
     "data": {
      "text/plain": [
       "tensor(8.6986e-08)"
      ]
     },
     "execution_count": 5,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "tn.relative_error(t*t, t2)"
   ]
  },
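  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "As a further sanity check, we can also cross-approximate the squared Hilbert function directly from its analytic expression (reusing the `domain` defined above) and compare it against the element-wise result `t2`; this is just a sketch built from the same calls used earlier:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "# Sketch: build the squared Hilbert tensor straight from the black-box function\n",
    "# and compare it with the element-wise squaring obtained above.\n",
    "t2_direct = tn.cross(function=lambda x, y, z, w, v: 1 / (x + y + z + w + v)**2, domain=domain)\n",
    "print(tn.relative_error(t2_direct, t2))"
   ]
  },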
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "See [this notebook](arithmetics.ipynb) for more examples on element-wise tensor operations."
   ]
  }
 ],
 "metadata": {
  "kernelspec": {
   "display_name": "Python 3",
   "language": "python",
   "name": "python3"
  },
  "language_info": {
   "codemirror_mode": {
    "name": "ipython",
    "version": 3
   },
   "file_extension": ".py",
   "mimetype": "text/x-python",
   "name": "python",
   "nbconvert_exporter": "python",
   "pygments_lexer": "ipython3",
   "version": "3.7.3"
  }
 },
 "nbformat": 4,
 "nbformat_minor": 2
}
