If we label the axes 1, 2, and 3, we can write the dot product as a sum
$$\mathbf{u}\cdot\mathbf{v}=\sum_{i=1}^{3}u_{i}v_{i}$$
If we number the elements of a matrix similarly,
$$\mathbf{A}={\begin{pmatrix}A_{11}&A_{12}&A_{13}\\A_{21}&A_{22}&A_{23}\\A_{31}&A_{32}&A_{33}\end{pmatrix}}\quad \mathbf{B}={\begin{pmatrix}B_{11}&B_{12}&B_{13}\\B_{21}&B_{22}&B_{23}\\B_{31}&B_{32}&B_{33}\end{pmatrix}}$$
we can write similar expressions for matrix multiplications
$$(\mathbf{A}\mathbf{u})_{i}=\sum_{j=1}^{3}A_{ij}u_{j}\quad (\mathbf{A}\mathbf{B})_{ik}=\sum_{j=1}^{3}A_{ij}B_{jk}$$
Notice that in each case we are summing over the repeated index. Since this is so common, it is now conventional to omit the summation sign; this is the Einstein summation convention.
Instead we simply write
$$\mathbf{u}\cdot\mathbf{v}=u_{i}v_{i}\quad (\mathbf{A}\mathbf{u})_{i}=A_{ij}u_{j}\quad (\mathbf{A}\mathbf{B})_{ik}=A_{ij}B_{jk}$$
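These contractions map directly onto NumPy's `einsum`, whose subscript strings mirror the index notation exactly: repeated indices are summed, free indices survive. A minimal sketch with arbitrary example values:

```python
import numpy as np

u = np.array([1.0, 2.0, 3.0])
v = np.array([4.0, 5.0, 6.0])
A = np.arange(9.0).reshape(3, 3)
B = np.arange(9.0, 18.0).reshape(3, 3)

# u_i v_i : i is repeated, so it is summed; no free index remains
dot = np.einsum('i,i->', u, v)

# (A u)_i = A_ij u_j : j is summed, i is free
Au = np.einsum('ij,j->i', A, u)

# (A B)_ik = A_ij B_jk : j is summed, i and k are free
AB = np.einsum('ij,jk->ik', A, B)

assert np.isclose(dot, u @ v)
assert np.allclose(Au, A @ u)
assert np.allclose(AB, A @ B)
```

The `->` side of the subscript string lists the free indices, so reading an `einsum` call is the same as reading the index expression it implements.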
We can then also number the unit vectors, $\hat{\mathbf{e}}_i$, and write
$$\mathbf{u}=u_{i}\hat{\mathbf{e}}_{i}$$
which can be convenient in a rotating coordinate system.
The Kronecker delta is
$$\delta_{ij}={\begin{cases}1&i=j\\0&i\neq j\end{cases}}$$
These are the components of the identity matrix, written in index notation.
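A short sketch of the two facts above: the Kronecker delta is the identity matrix, and contracting it with a vector merely relabels the free index.

```python
import numpy as np

delta = np.eye(3)  # delta_ij: 1 when i == j, 0 otherwise

# delta_ij u_j = u_i : the contraction substitutes i for j
u = np.array([1.0, 2.0, 3.0])
assert np.allclose(np.einsum('ij,j->i', delta, u), u)
```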
Another useful quantity, the Levi-Civita (or permutation) symbol, can be defined by
$$\epsilon_{ijk}={\begin{cases}1&(i,j,k)=(1,2,3){\mbox{ or }}(2,3,1){\mbox{ or }}(3,1,2)\\-1&(i,j,k)=(2,1,3){\mbox{ or }}(3,2,1){\mbox{ or }}(1,3,2)\\0&{\mbox{otherwise}}\end{cases}}$$
With this definition it turns out that
$$\mathbf{u}\times\mathbf{v}=\epsilon_{ijk}\hat{\mathbf{e}}_{i}u_{j}v_{k}$$
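This can be checked numerically: build $\epsilon_{ijk}$ directly from its definition (zero-based indices in code), contract it against two arbitrary example vectors, and compare with NumPy's built-in cross product.

```python
import numpy as np

# Levi-Civita symbol epsilon_ijk (indices shifted to 0,1,2)
eps = np.zeros((3, 3, 3))
eps[0, 1, 2] = eps[1, 2, 0] = eps[2, 0, 1] = 1.0   # even permutations
eps[1, 0, 2] = eps[2, 1, 0] = eps[0, 2, 1] = -1.0  # odd permutations

u = np.array([1.0, 2.0, 3.0])
v = np.array([4.0, 5.0, 6.0])

# (u x v)_i = epsilon_ijk u_j v_k : j and k are summed, i is free
cross = np.einsum('ijk,j,k->i', eps, u, v)
assert np.allclose(cross, np.cross(u, v))
```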
and
$$\epsilon_{ijk}\epsilon_{ipq}=\delta_{jp}\delta_{kq}-\delta_{jq}\delta_{kp}$$
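The epsilon-delta identity is also easy to verify by brute force: contract two copies of $\epsilon$ over their first index and compare, component by component, with the delta expression on the right-hand side.

```python
import numpy as np

eps = np.zeros((3, 3, 3))
eps[0, 1, 2] = eps[1, 2, 0] = eps[2, 0, 1] = 1.0
eps[1, 0, 2] = eps[2, 1, 0] = eps[0, 2, 1] = -1.0
delta = np.eye(3)

# epsilon_ijk epsilon_ipq : i is summed, leaving free indices j,k,p,q
lhs = np.einsum('ijk,ipq->jkpq', eps, eps)

# delta_jp delta_kq - delta_jq delta_kp, as a rank-4 array over j,k,p,q
rhs = np.einsum('jp,kq->jkpq', delta, delta) \
    - np.einsum('jq,kp->jkpq', delta, delta)

assert np.allclose(lhs, rhs)
```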
This will let us write many formulae more compactly.