tntorch -- Tensor Network Learning with PyTorch
===============================================

.. image:: tntorch.svg
   :width: 300 px
   :align: center

This is a `PyTorch `__-powered library for tensor modeling and learning that features transparent support for the `tensor train (TT) model `_, `CANDECOMP/PARAFAC (CP) `_, the `Tucker model `_, and more.

Supported operations (CPU and GPU) include:

- Basic and fancy `indexing `_ of tensors, broadcasting, assignment, etc.
- Tensor `decomposition and reconstruction `_
- Element-wise and tensor-tensor `arithmetics `_
- Building tensors from black-box functions using `cross-approximation `_
- Statistics and `sensitivity analysis `_
- Optimization using autodifferentiation, useful e.g. for `regression `_ or `classification `_
- Misc. operations on tensors: stacking, unfolding, sampling, `derivating `_, etc.

Get the Code
------------

You can clone the project from `tntorch's GitHub page `_:

.. code-block:: bash

    git clone https://github.com/rballester/tntorch.git

or get it as a `zip file `_.

Installation
------------

The main dependencies are `NumPy `_ and `PyTorch `_ (we recommend installing them with `Conda `_ or `Miniconda `_). To install *tntorch*, run:

.. code-block:: bash

    cd tntorch
    pip install .

First Steps
-----------

Some basic tensor manipulation:

.. code-block:: python

    import tntorch as tn

    t = tn.ones(64, 64)  # A 64 x 64 tensor, filled with ones
    t = t[:, :, None] + 2 * t[:, None, :]  # Singleton dimensions, broadcasting, and arithmetics
    print(tn.mean(t))  # Result: 3

Decomposing a tensor:

.. code-block:: python

    import tntorch as tn

    data = ...  # A NumPy or PyTorch tensor
    t1 = tn.Tensor(data, ranks_cp=5)  # A CP decomposition
    t2 = tn.Tensor(data, ranks_tucker=5)  # A Tucker decomposition
    t3 = tn.Tensor(data, ranks_tt=5)  # A tensor train decomposition

To get fully on board, check out the complete documentation:

.. toctree::
   :hidden:

   Welcome

.. toctree::
   :maxdepth: 1

   goals
   api
   tutorial-notebooks
   contact
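As background, the tensor train (TT) format represents a d-dimensional tensor as a chain of 3-dimensional cores linked by summed rank indices. The following is a minimal NumPy sketch, independent of tntorch's API (the core shapes and the helper name ``tt_to_full`` are illustrative, not part of the library), showing how such a chain contracts back into a full tensor:

```python
import numpy as np

# A 3D tensor of shape (4, 5, 6) in TT format with TT-ranks (1, 2, 3, 1):
# the three cores have shapes (1, 4, 2), (2, 5, 3), and (3, 6, 1).
rng = np.random.default_rng(0)
cores = [rng.standard_normal(s) for s in [(1, 4, 2), (2, 5, 3), (3, 6, 1)]]

def tt_to_full(cores):
    """Contract a list of TT cores back into the full (dense) tensor."""
    full = cores[0]  # shape (1, n1, r1)
    for core in cores[1:]:
        # Sum over the shared rank index connecting consecutive cores
        full = np.einsum('...a,aib->...ib', full, core)
    return full.squeeze(axis=(0, -1))  # drop the boundary rank-1 axes

print(tt_to_full(cores).shape)  # (4, 5, 6)
```

In *tntorch*, ``tn.Tensor(data, ranks_tt=5)`` (shown above) builds such a chain of cores for you, so you never need to contract them by hand.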