Tensors
Tensor Transposition
Scalars
The transpose of a scalar is the scalar itself, i.e. x^T = x
Vectors
Transposing a vector transforms columns to rows, and rows to columns, i.e.
\begin{bmatrix} x_{1,1} & x_{1,2} \\ x_{2,1} & x_{2,2} \\ x_{3,1} & x_{3,2} \end{bmatrix}^T = \begin{bmatrix} x_{1,1} & x_{2,1} & x_{3,1} \\ x_{1,2} & x_{2,2} & x_{3,2} \end{bmatrix}
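As an illustration, a minimal NumPy sketch (assuming NumPy is available; the values are placeholders) of a column vector becoming a row vector. The column is stored with an explicit second axis, since a 1-D NumPy array is returned unchanged by `.T`:

```python
import numpy as np

# Column vector stored as a 3x1 array so transposition has a visible effect
# (a 1-D NumPy array is unaffected by .T).
x = np.array([[1],
              [2],
              [3]])

print(x.shape)    # (3, 1) -- a column
print(x.T)        # [[1 2 3]]
print(x.T.shape)  # (1, 3) -- a row
```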
Scalar and vector transposition are special cases of matrix transposition, where the axes are “flipped” across the main diagonal.
(\boldsymbol X^T)_{i,j} = (\boldsymbol X)_{j,i}
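The index rule can be checked numerically; a small NumPy sketch (array values are arbitrary) using the 3×2 shape from the example above:

```python
import numpy as np

# 3x2 matrix with arbitrary values, matching the shape of the example above.
X = np.array([[1, 2],
              [3, 4],
              [5, 6]])

XT = X.T  # shape (2, 3)

# (X^T)_{i,j} == (X)_{j,i} for every valid index pair.
for i in range(XT.shape[0]):
    for j in range(XT.shape[1]):
        assert XT[i, j] == X[j, i]
```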
Tensor Reduction
Sum
Calculate the sum across all elements of the vector or matrix.
Vector: \sum_{i=1}^n x_i
Matrix: \sum_{i=1}^m \sum_{j=1}^n X_{i,j}
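A short NumPy sketch of both reductions (the arrays here are arbitrary examples):

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0])
X = np.array([[1.0, 2.0],
              [3.0, 4.0],
              [5.0, 6.0]])

print(x.sum())  # 6.0  -- sum over the n elements of the vector
print(X.sum())  # 21.0 -- sum over all m*n elements of the matrix

# The reduction can also be restricted to a single axis:
print(X.sum(axis=0))  # column sums: [ 9. 12.]
print(X.sum(axis=1))  # row sums:    [ 3.  7. 11.]
```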
Dot Product
The dot product (or scalar product) differs from the Hadamard product in that the element-wise products are summed to produce a single scalar value.
The dot product can be calculated for two vector tensors \boldsymbol x and \boldsymbol y of the same length n, and is denoted \boldsymbol x \cdot \boldsymbol y, \boldsymbol x^T \boldsymbol y, or \langle \boldsymbol x, \boldsymbol y \rangle. The products are taken element-wise, then summed to produce a scalar value:
\boldsymbol x \cdot \boldsymbol y = \sum_{i=1}^n x_i y_i
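A minimal NumPy sketch contrasting the Hadamard product with the dot product (vector values are placeholders):

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0])
y = np.array([4.0, 5.0, 6.0])

hadamard = x * y    # element-wise product: [ 4. 10. 18.]
dot = np.dot(x, y)  # element-wise products summed: 32.0

assert dot == hadamard.sum()
print(x @ y)        # the @ operator gives the same scalar: 32.0
```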