tntorch -- Tensor Network Learning with PyTorch
===============================================

.. image:: tntorch.svg
   :width: 300 px
   :align: center

This is a `PyTorch <http://pytorch.org/>`__-powered library for tensor modeling and learning that features transparent support for the `tensor train (TT) model <https://epubs.siam.org/doi/pdf/10.1137/090752286>`_, `CANDECOMP/PARAFAC (CP) <https://epubs.siam.org/doi/pdf/10.1137/07070111X>`_, the `Tucker model <https://epubs.siam.org/doi/pdf/10.1137/S0895479898346995>`_, and more. Supported operations (CPU and GPU) include:

- Basic and fancy `indexing <tutorials/introduction.html>`_ of tensors, broadcasting, assignment, etc.
- Tensor `decomposition and reconstruction <tutorials/decompositions.html>`_
- Element-wise and tensor-tensor `arithmetic <tutorials/arithmetics.html>`_
- Building tensors from black-box functions using `cross-approximation <tutorials/cross.html>`_
- Statistics and `sensitivity analysis <tutorials/sobol.html>`_
- Optimization via automatic differentiation, useful e.g. for `regression <tutorials/completion.html>`_ or `classification <tutorials/classification.html>`_
- Misc. operations on tensors: stacking, unfolding, sampling, `differentiation <tutorials/derivatives.html>`_, etc.
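
For intuition about the tensor train model linked above: a TT decomposition factors an N-dimensional tensor into a chain of 3-dimensional cores, classically computed via sequential truncated SVDs (the "TT-SVD" algorithm). Below is a minimal NumPy sketch of that idea, independent of *tntorch*'s own API (``tt_svd`` and ``tt_to_full`` are illustrative names, not library functions):

.. code-block:: python

   import numpy as np

   def tt_svd(full, max_rank):
       """Factor a full tensor into TT cores via sequential truncated SVDs."""
       cores = []
       mat = np.asarray(full)
       r_prev = 1
       for n in full.shape[:-1]:
           mat = mat.reshape(r_prev * n, -1)
           u, s, vt = np.linalg.svd(mat, full_matrices=False)
           r = min(max_rank, len(s))
           cores.append(u[:, :r].reshape(r_prev, n, r))
           mat = s[:r, None] * vt[:r]  # carry the remainder on to the next mode
           r_prev = r
       cores.append(mat.reshape(r_prev, full.shape[-1], 1))
       return cores

   def tt_to_full(cores):
       """Contract the chain of TT cores back into the full tensor."""
       out = cores[0]
       for c in cores[1:]:
           out = np.tensordot(out, c, axes=([-1], [0]))
       return out.reshape([c.shape[1] for c in cores])

   # x_i + x_j + x_k is a sum of univariate terms, so its TT-rank is 2
   x = np.arange(4.0)
   t = x[:, None, None] + x[None, :, None] + x[None, None, :]
   cores = tt_svd(t, max_rank=2)
   print(np.allclose(tt_to_full(cores), t))  # True

Since the example tensor genuinely has TT-rank 2, truncating every SVD to rank 2 loses nothing and the reconstruction is exact (up to floating-point error).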

Get the Code
------------

You can clone the project from `tntorch's GitHub page <https://github.com/rballester/tntorch>`_:

.. code-block:: bash

    git clone https://github.com/rballester/tntorch.git

or get it as a `zip file <https://github.com/rballester/tntorch/archive/master.zip>`_.
    
Installation
------------

The main dependencies are `NumPy <http://www.numpy.org/>`_ and `PyTorch <https://pytorch.org/>`_ (we recommend installing them with `Conda <https://conda.io/en/latest/>`_ or `Miniconda <https://conda.io/en/latest/miniconda.html>`_). To install *tntorch*, run:

.. code-block:: bash

   cd tntorch
   pip install .

First Steps
-----------

Some basic tensor manipulation:

.. code-block:: python

   import tntorch as tn
   
   t = tn.ones(64, 64)  # 64 x 64 tensor, filled with ones
   t = t[:, :, None] + 2*t[:, None, :]  # Singleton dimensions, broadcasting, and arithmetics
   print(tn.mean(t))  # Result: 3

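The broadcasting rules here are the same as NumPy's and PyTorch's, so the result can be sanity-checked in plain NumPy without *tntorch* installed:

.. code-block:: python

   import numpy as np

   t = np.ones((64, 64))
   u = t[:, :, None] + 2 * t[:, None, :]  # broadcasts to shape (64, 64, 64)
   print(u.shape, u.mean())  # every entry is 1 + 2, so the mean is 3.0
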
Decomposing a tensor:
   
.. code-block:: python

   import tntorch as tn
   
   data = ...  # A NumPy or PyTorch tensor
   t1 = tn.Tensor(data, ranks_cp=5)  # A CP decomposition
   t2 = tn.Tensor(data, ranks_tucker=5)  # A Tucker decomposition
   t3 = tn.Tensor(data, ranks_tt=5)  # A tensor train decomposition

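For intuition about what these ranks mean: a tensor of CP-rank R, for example, is a sum of R outer products of vectors, one vector per dimension. A small NumPy sketch (independent of *tntorch*; the factor matrices here are random, purely for illustration) reconstructing a full tensor from its CP factors:

.. code-block:: python

   import numpy as np

   rng = np.random.default_rng(0)
   R = 5  # CP rank
   # One factor matrix per mode of a 10 x 11 x 12 tensor
   A = rng.standard_normal((10, R))
   B = rng.standard_normal((11, R))
   C = rng.standard_normal((12, R))

   # Full tensor: sum over r of the outer product of A[:, r], B[:, r], C[:, r]
   full = np.einsum('ir,jr,kr->ijk', A, B, C)
   print(full.shape)  # (10, 11, 12)

Storing the three factor matrices takes (10 + 11 + 12) * R numbers instead of 10 * 11 * 12, which is the point of such decompositions.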
To get fully on board, check out the complete documentation:

.. toctree::
   :hidden:

   Welcome <self>

.. toctree::
   :maxdepth: 1

   goals
   api
   tutorial-notebooks
   contact