What is a tensor
a multidimensional array; an extension of a matrix to more than two dimensions
Kronecker tensor product
MATLAB kron
\(\mathbf{A} \otimes \mathbf{B}=\left[\begin{array}{cccc}{a_{11} \mathbf{B}} & {a_{12} \mathbf{B}} & {\cdots} & {a_{1J} \mathbf{B}} \\ {\vdots} & {\vdots} & {\ddots} & {\vdots} \\ {a_{I1} \mathbf{B}} & {a_{I2} \mathbf{B}} & {\cdots} & {a_{IJ} \mathbf{B}}\end{array}\right] \in \mathbb{R}^{IK \times JL}\), for \(\mathbf{A} \in \mathbb{R}^{I \times J}\) and \(\mathbf{B} \in \mathbb{R}^{K \times L}\)
K = kron(A,B) returns the Kronecker tensor product of matrices A and B. If A is an m-by-n matrix and B is a p-by-q matrix, then kron(A,B) is an m*p-by-n*q matrix formed by taking all possible products between the elements of A and the matrix B.
Examples
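A minimal MATLAB sketch (A and B are arbitrary illustrative matrices):

    A = [1 2; 3 4];        % 2-by-2
    B = [0 1; 1 0];        % 2-by-2
    K = kron(A, B);        % 4-by-4: each a_ij is replaced by the block a_ij * B
    % K =
    %     0  1  0  2
    %     1  0  2  0
    %     0  3  0  4
    %     3  0  4  0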
Tensor unfolding
Matricization
convert a tensor into a matrix: the mode-n unfolding \(\mathbf{X}_{(n)}\) arranges the mode-n fibers as its columns
Vectorization
convert a tensor into a vector by stacking all of its elements (see the sketch below)
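A minimal MATLAB sketch of matricization and vectorization for a 3-way tensor, using only reshape and permute; the sizes and the random X are illustrative, and the column ordering follows the convention of Kolda and Bader (2009):

    I = 3; J = 4; K = 2;
    X = randn(I, J, K);                          % an I-by-J-by-K tensor

    % Mode-n unfolding: the mode-n fibers become the columns of a matrix
    X1 = reshape(X, I, J*K);                     % mode-1 unfolding, I-by-(J*K)
    X2 = reshape(permute(X, [2 1 3]), J, I*K);   % mode-2 unfolding, J-by-(I*K)
    X3 = reshape(permute(X, [3 1 2]), K, I*J);   % mode-3 unfolding, K-by-(I*J)

    % Vectorization: stack all elements into one column vector
    x = X(:);                                    % (I*J*K)-by-1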
Tensor multiplication
The n-mode product:
multiplied by a vector
multiplying by a vector contracts that mode away, reducing the order by one
multiplied by a matrix: the order stays the same, but the size of that mode changes (see the sketch below)
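A minimal MATLAB sketch of the 1-mode product, reusing the unfolding convention above; the matrix U, the vector v, and the sizes are illustrative:

    I = 3; J = 4; K = 2;
    X = randn(I, J, K);
    U = randn(5, I);                             % matrix applied along mode 1
    v = randn(I, 1);                             % vector applied along mode 1

    % Mode-1 product with a matrix: unfold, multiply, fold back.
    % Y = X x_1 U is 5-by-J-by-K: mode 1 changes size but is not removed.
    Y = reshape(U * reshape(X, I, J*K), [5, J, K]);

    % Mode-1 product with a vector: mode 1 is contracted away,
    % leaving a J-by-K array (one dimension fewer).
    Z = reshape(reshape(X, I, J*K)' * v, [J, K]);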
Tensor rank
Rank-one tensor: an N-way tensor that can be written as the outer product of N vectors (see the sketch after this list)
Tensor rank: the smallest number of rank-one tensors whose sum equals the tensor
Differences with matrix rank
1) tensor rank can differ over the real and complex fields
2) determining the rank of a given tensor is NP-hard; there is no straightforward algorithm for it
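A minimal MATLAB sketch of a 3-way rank-one tensor built as an outer product (the vectors a, b, c are illustrative):

    a = [1; 2; 3];
    b = [1; 0; -1; 2];
    c = [2; 5];

    % T(i,j,k) = a(i) * b(j) * c(k): a rank-one 3-by-4-by-2 tensor.
    % kron builds the products in column-major order, so reshape recovers the 3-way array.
    T = reshape(kron(c, kron(b, a)), [3, 4, 2]);

    % Spot check of one entry:
    % T(2,4,1) equals a(2)*b(4)*c(1) = 2*2*2 = 8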
Tensor factorization
an extension of matrix SVD and PCA to higher-order tensors
CANDECOMP/PARAFAC (CP) factorization
Reinvented twice in the 1970s
Canonical Decomposition / Parallel Factors
also called the polyadic decomposition
Uniqueness: the CP decomposition of a higher-order tensor is unique under fairly mild conditions (e.g., Kruskal's condition).
How to compute:
Alternating Least Squares (ALS): fix all but one factor matrix and solve a least-squares problem for that one, cycling through the factors until convergence (a sketch follows below).
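A minimal sketch of CP-ALS for a 3-way tensor in plain MATLAB; the sizes, the rank R, the fixed iteration count, and the inline Khatri-Rao helper kr are illustrative assumptions, not a reference implementation:

    I = 10; J = 8; K = 6; R = 3;                 % illustrative sizes and CP rank

    % Column-wise Khatri-Rao product: column r is kron(P(:,r), Q(:,r))
    kr = @(P, Q) reshape(bsxfun(@times, permute(Q, [1 3 2]), ...
                                        permute(P, [3 1 2])), [], size(P, 2));

    % Build an exactly rank-R test tensor from known factors
    At = randn(I, R); Bt = randn(J, R); Ct = randn(K, R);
    X = reshape(At * kr(Ct, Bt)', [I, J, K]);

    % Mode-n unfoldings (same convention as above)
    X1 = reshape(X, I, J*K);
    X2 = reshape(permute(X, [2 1 3]), J, I*K);
    X3 = reshape(permute(X, [3 1 2]), K, I*J);

    % Random initialization, then alternating least-squares updates:
    % each step fixes two factor matrices and solves for the third.
    A = randn(I, R); B = randn(J, R); C = randn(K, R);
    for it = 1:100
        A = X1 * kr(C, B) * pinv((C'*C) .* (B'*B));
        B = X2 * kr(C, A) * pinv((C'*C) .* (A'*A));
        C = X3 * kr(B, A) * pinv((B'*B) .* (A'*A));
    end

    % Relative reconstruction error of the fitted rank-R model
    err = norm(X1 - A * kr(C, B)', 'fro') / norm(X1, 'fro');

Each update is an ordinary linear least-squares problem because the other two factors are held fixed; in practice one also normalizes the factor columns and stops on a fit tolerance rather than a fixed iteration count.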
Differences between matrix SVD and tensor CP
Low-rank approximation behaves differently: truncating the matrix SVD gives the best low-rank approximation (Eckart–Young), but the best rank-R approximation of a higher-order tensor cannot in general be obtained by truncating a CP decomposition, and it may not even exist.
a simple form of tensor decomposition known from different sources as the polyadic decomposition, PARAFAC, CANDECOMP, or CP decomposition (Cichocki et al., 2015; Kolda and Bader, 2009). The general idea of the method is to represent an observed tensor \(\chi\) as a sum of R rank-one tensors (an N-way tensor is rank one if it can be written as the outer product of N vectors):
\(\chi \approx \sum_{r=1}^{R} \mathbf{a}_r \circ \mathbf{b}_r \circ \mathbf{c}_r\)
where the symbol '\(\circ\)' denotes the vector outer product.
Tucker factorization
has three variants (Tucker1, Tucker2, and Tucker3), depending on how many modes are compressed
Uniqueness: unlike CP, the Tucker factorization is not unique (the core can be transformed if the factor matrices are transformed accordingly).
How to compute:
Higher-order SVD (HOSVD): for each mode n, take the factor matrix \(\mathbf{U}^{(n)}\) as the \(R_n\) leading left singular vectors of the mode-n unfolding \(\mathbf{X}_{(n)}\); the core tensor is then obtained by multiplying the tensor by \(\mathbf{U}^{(n)\mathsf{T}}\) in each mode (a sketch follows below).
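A minimal MATLAB sketch of a truncated HOSVD for a 3-way tensor, reusing the unfolding and mode-product conventions above; the sizes and the multilinear ranks R1, R2, R3 are illustrative assumptions:

    I = 10; J = 8; K = 6;
    R1 = 4; R2 = 3; R3 = 2;                      % chosen multilinear ranks
    X = randn(I, J, K);

    % Factor matrices: leading left singular vectors of each mode-n unfolding
    [U1, ~, ~] = svd(reshape(X, I, J*K), 'econ');
    U1 = U1(:, 1:R1);
    [U2, ~, ~] = svd(reshape(permute(X, [2 1 3]), J, I*K), 'econ');
    U2 = U2(:, 1:R2);
    [U3, ~, ~] = svd(reshape(permute(X, [3 1 2]), K, I*J), 'econ');
    U3 = U3(:, 1:R3);

    % Core tensor: multiply X by U1', U2', U3' in modes 1, 2, 3 (one mode at a time)
    G = reshape(U1' * reshape(X, I, J*K), [R1, J, K]);
    G = permute(reshape(U2' * reshape(permute(G, [2 1 3]), J, R1*K), [R2, R1, K]), [2 1 3]);
    G = permute(reshape(U3' * reshape(permute(G, [3 1 2]), K, R1*R2), [R3, R1, R2]), [2 3 1]);

    % Reconstruction: multiply the core by U1, U2, U3 in modes 1, 2, 3
    Xh = reshape(U1 * reshape(G, R1, R2*R3), [I, R2, R3]);
    Xh = permute(reshape(U2 * reshape(permute(Xh, [2 1 3]), R2, I*R3), [J, I, R3]), [2 1 3]);
    Xh = permute(reshape(U3 * reshape(permute(Xh, [3 1 2]), R3, I*J), [K, I, J]), [2 3 1]);

    err = norm(X(:) - Xh(:)) / norm(X(:));       % relative approximation error

The truncated HOSVD gives a good but generally not optimal Tucker approximation; it is often refined further by higher-order orthogonal iteration (HOOI).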