tnc/_tutorial/tensors_structure.rs

//! # Structure of tensors
//!
//! In contrast to the common design of treating tensors and tensor networks as
//! separate concepts, this library allows arbitrary nesting of tensors. This
//! means that the children of a tensor network can themselves be tensor networks
//! again. We call tensors with children "composite tensors" and tensors without
//! children "leaf tensors".
//!
//! ## Composite tensors
//! Composite tensors are the equivalent of tensor networks, as they own a list of
//! tensors that are their children. However, these child tensors can themselves be
//! composite, allowing for a hierarchical tree structure. The outer legs of a
//! composite tensor can be obtained with the [`external_tensor`] method.
//!
//! ## Leaf tensors
//! Leaf tensors have legs with corresponding bond dimensions and can optionally have
//! data.
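//!
//! To make the composite/leaf distinction concrete, the following is a minimal,
//! self-contained sketch of such a recursive structure. It is an illustration
//! only: the names `Node`, `Leaf`, and `Composite` are made up for this example
//! and are not part of the actual [`Tensor`] API.
//!
//! ```
//! // Illustrative stand-in for a tensor tree; not a type from this library.
//! #[allow(dead_code)]
//! enum Node {
//!     // A leaf tensor: legs with bond dimensions and (optionally) data.
//!     Leaf { dims: Vec<usize>, data: Option<Vec<f64>> },
//!     // A composite tensor: owns its children, which may themselves be composite.
//!     Composite { children: Vec<Node> },
//! }
//!
//! // A composite whose children are a leaf tensor and another composite tensor.
//! let _tree = Node::Composite {
//!     children: vec![
//!         Node::Leaf { dims: vec![2, 2], data: None },
//!         Node::Composite {
//!             children: vec![Node::Leaf { dims: vec![2, 3], data: None }],
//!         },
//!     ],
//! };
//! ```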
//!
//! ## Rationale
//! This recursive design arises naturally: when looking only at the outer
//! (i.e. open) legs of a tensor network, it can be seen as a plain tensor again.
//! It has legs with sizes, and it has data (obtained by contracting the tensor
//! network).
//!
//! In addition, this format is useful for multi-level parallelization based on
//! partitioning the network: for example, the top level could be a composite tensor
//! whose children are each assigned to one compute node. Each of these children is
//! again a composite tensor whose children are each assigned to one core. Finally,
//! each of those children is again a composite tensor that is an actual tensor
//! network (i.e., with leaf tensors as children).
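//!
//! The following sketch illustrates that layout. `Group` is again a made-up
//! stand-in used only for this tutorial, not a type from this library.
//!
//! ```
//! // Illustrative only: each `Group` stands for one composite tensor.
//! #[allow(dead_code)]
//! struct Group(Vec<Group>);
//!
//! // Top level: one child per compute node; each node: one child per core;
//! // each core: an (empty, for brevity) group standing in for a tensor network.
//! let _job = Group(vec![
//!     Group(vec![Group(vec![]), Group(vec![])]), // node 0 with cores 0 and 1
//!     Group(vec![Group(vec![]), Group(vec![])]), // node 1 with cores 0 and 1
//! ]);
//! ```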
//!
//! [`external_tensor`]: Tensor::external_tensor
//!
#![allow(unused_imports)]
use crate::tensornetwork::tensor::Tensor;

pub use crate::_tutorial as table_of_contents;