Figure: Two vectors are added head to tail.
If $\mathbf{a}$ and $\mathbf{b}$ are vectors, then the sum $\mathbf{c} = \mathbf{a} + \mathbf{b}$ is also a vector. The two vectors can also be subtracted from one another to give another vector $\mathbf{d} = \mathbf{a} - \mathbf{b}$.
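As a quick numerical illustration (a minimal sketch using Python and NumPy; the component values are arbitrary and chosen only for this example), addition and subtraction act component by component:

```python
import numpy as np

# Arbitrary example vectors (any components would do).
a = np.array([1.0, 2.0, 3.0])
b = np.array([4.0, -1.0, 0.5])

c = a + b   # vector sum:        c_i = a_i + b_i
d = a - b   # vector difference: d_i = a_i - b_i

print(c)    # [5.  1.  3.5]
print(d)    # [-3.   3.   2.5]
```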
Figure: Multiplication by 2 doubles the length of a vector.
Multiplication of a vector $\mathbf{b}$ by a scalar $\lambda$ has the effect of stretching or shrinking the vector. You can form a unit vector $\hat{\mathbf{b}}$ that is parallel to $\mathbf{b}$ by dividing by the length of the vector, $|\mathbf{b}|$. Thus,
$$\hat{\mathbf{b}} = \frac{\mathbf{b}}{|\mathbf{b}|}.$$
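The following NumPy sketch (example values only; `np.linalg.norm` computes the vector length $|\mathbf{b}|$) shows scalar multiplication and the construction of the unit vector:

```python
import numpy as np

b = np.array([3.0, 4.0, 0.0])    # arbitrary example vector, |b| = 5
lam = 2.0                        # arbitrary scalar

stretched = lam * b              # multiplication by a scalar stretches/shrinks b
b_hat = b / np.linalg.norm(b)    # unit vector parallel to b

print(stretched)                 # [6. 8. 0.]
print(b_hat)                     # [0.6 0.8 0. ]
print(np.linalg.norm(b_hat))     # 1.0
```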
Figure: The scalar product depends on the cosine of the angle between two vectors.
The scalar product (also called the inner product or dot product) of two vectors is defined as
$$\mathbf{a}\cdot\mathbf{b} = |\mathbf{a}|\,|\mathbf{b}|\cos(\theta)$$
where $\theta$ is the angle between the two vectors (see Figure 2(b)). If $\mathbf{a}$ and $\mathbf{b}$ are perpendicular to each other, then $\theta = \pi/2$ and $\cos(\theta) = 0$. Therefore, $\mathbf{a}\cdot\mathbf{b} = 0$.
The dot product therefore has a geometric interpretation: the quantity $\mathbf{a}\cdot\hat{\mathbf{b}} = |\mathbf{a}|\cos(\theta)$ is the length of the projection of $\mathbf{a}$ onto the unit vector $\hat{\mathbf{b}}$ when the two vectors are placed so that they start from the same point (tail-to-tail).
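The definition, the perpendicularity test, and the projection interpretation can all be checked numerically; the sketch below uses arbitrary example vectors:

```python
import numpy as np

a = np.array([1.0, 2.0, 2.0])    # arbitrary example vectors
b = np.array([2.0, 0.0, 0.0])

dot = np.dot(a, b)                                           # a . b
cos_theta = dot / (np.linalg.norm(a) * np.linalg.norm(b))
theta = np.arccos(cos_theta)                                 # angle between a and b

b_hat = b / np.linalg.norm(b)
proj_length = np.dot(a, b_hat)   # |a| cos(theta): length of the projection of a onto b_hat

print(dot, np.degrees(theta), proj_length)   # 2.0  70.5...  1.0

# Perpendicular vectors give a zero dot product.
print(np.dot(np.array([1.0, 0.0, 0.0]), np.array([0.0, 1.0, 0.0])))  # 0.0
```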
The scalar product yields a scalar quantity and can also be written in component form (with respect to a given basis) as
$$\mathbf{a}\cdot\mathbf{b} = a_1 b_1 + a_2 b_2 + a_3 b_3 = \sum_{i=1}^{3} a_i b_i.$$
If the vectors are $n$-dimensional, the dot product is written as
$$\mathbf{a}\cdot\mathbf{b} = \sum_{i=1}^{n} a_i b_i.$$
Using the Einstein summation convention, we can also write the scalar product as
$$\mathbf{a}\cdot\mathbf{b} = a_i b_i.$$
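The explicit component sum and the Einstein-summation shorthand describe the same computation. A sketch (example values only) using `np.einsum`, where the repeated index `i` is summed over:

```python
import numpy as np

a = np.array([1.0, 2.0, 3.0])
b = np.array([4.0, 5.0, 6.0])

explicit = a[0]*b[0] + a[1]*b[1] + a[2]*b[2]   # a_1 b_1 + a_2 b_2 + a_3 b_3
summed   = np.sum(a * b)                       # sum over i of a_i b_i
einstein = np.einsum('i,i->', a, b)            # repeated index i is summed

print(explicit, summed, einstein)              # 32.0 32.0 32.0
```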
The scalar product also satisfies
$$\mathbf{a}\cdot\mathbf{b} = \mathbf{b}\cdot\mathbf{a} \qquad \text{(commutative law)}$$
and
$$\mathbf{a}\cdot(\mathbf{b}+\mathbf{c}) = \mathbf{a}\cdot\mathbf{b} + \mathbf{a}\cdot\mathbf{c} \qquad \text{(distributive law)}.$$
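Both laws are easy to spot-check numerically (arbitrary example vectors):

```python
import numpy as np

a = np.array([1.0, -2.0, 0.5])
b = np.array([3.0, 1.0, 4.0])
c = np.array([-1.0, 2.0, 2.0])

print(np.isclose(np.dot(a, b), np.dot(b, a)))                     # commutative law
print(np.isclose(np.dot(a, b + c), np.dot(a, b) + np.dot(a, c)))  # distributive law
```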
Figure: The area of the parallelogram generated by two vectors is the length of their cross product.
The vector product (or cross product) of two vectors $\mathbf{a}$ and $\mathbf{b}$ is another vector $\mathbf{c}$ defined as
$$\mathbf{c} = \mathbf{a}\times\mathbf{b} = |\mathbf{a}|\,|\mathbf{b}|\sin(\theta)\,\hat{\mathbf{c}}$$
where $\theta$ is the angle between $\mathbf{a}$ and $\mathbf{b}$, and $\hat{\mathbf{c}}$ is a unit vector perpendicular to the plane containing $\mathbf{a}$ and $\mathbf{b}$ in the right-handed sense.
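A NumPy sketch (arbitrary example vectors) confirming that the magnitude of $\mathbf{a}\times\mathbf{b}$ is $|\mathbf{a}|\,|\mathbf{b}|\sin(\theta)$ and that the result is perpendicular to both $\mathbf{a}$ and $\mathbf{b}$:

```python
import numpy as np

a = np.array([1.0, 2.0, 3.0])    # arbitrary example vectors
b = np.array([-1.0, 0.5, 2.0])

c = np.cross(a, b)

# |a x b| agrees with |a||b| sin(theta).
cos_theta = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
sin_theta = np.sqrt(1.0 - cos_theta**2)
print(np.isclose(np.linalg.norm(c),
                 np.linalg.norm(a) * np.linalg.norm(b) * sin_theta))  # True

# c is perpendicular to both a and b.
print(np.isclose(np.dot(a, c), 0.0), np.isclose(np.dot(b, c), 0.0))   # True True
```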
In terms of the orthonormal basis $(\mathbf{e}_1, \mathbf{e}_2, \mathbf{e}_3)$, the cross product can be written in the form of a determinant:
$$\mathbf{a}\times\mathbf{b} = \begin{vmatrix} \mathbf{e}_1 & \mathbf{e}_2 & \mathbf{e}_3 \\ a_1 & a_2 & a_3 \\ b_1 & b_2 & b_3 \end{vmatrix}.$$
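Expanding the determinant along the first row gives the familiar component formula; the sketch below (example values only) compares that expansion with `np.cross`:

```python
import numpy as np

a = np.array([1.0, 2.0, 3.0])
b = np.array([4.0, 5.0, 6.0])

# Cofactor expansion of the determinant along the first row.
c = np.array([a[1]*b[2] - a[2]*b[1],
              a[2]*b[0] - a[0]*b[2],
              a[0]*b[1] - a[1]*b[0]])

print(c)                               # [-3.  6. -3.]
print(np.allclose(c, np.cross(a, b)))  # True
```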
In index notation, the components of the cross product can be written as
$$[\mathbf{a}\times\mathbf{b}]_i = e_{ijk}\,a_j b_k$$
where $e_{ijk}$ is the Levi-Civita symbol (also called the permutation symbol or alternating tensor).
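One way to evaluate $e_{ijk}\,a_j b_k$ directly is to build the Levi-Civita symbol as a $3\times 3\times 3$ array and contract it with `np.einsum` (the construction below is just one convenient choice):

```python
import numpy as np

# Levi-Civita symbol e_{ijk}: +1 for even permutations of (0, 1, 2),
# -1 for odd permutations, 0 whenever an index repeats.
eps = np.zeros((3, 3, 3))
eps[0, 1, 2] = eps[1, 2, 0] = eps[2, 0, 1] = 1.0
eps[0, 2, 1] = eps[2, 1, 0] = eps[1, 0, 2] = -1.0

a = np.array([1.0, 2.0, 3.0])
b = np.array([4.0, 5.0, 6.0])

# [a x b]_i = e_{ijk} a_j b_k  (sum over the repeated indices j and k)
cross_index = np.einsum('ijk,j,k->i', eps, a, b)

print(cross_index)                               # [-3.  6. -3.]
print(np.allclose(cross_index, np.cross(a, b)))  # True
```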
Some useful vector identities are given below.
$$\mathbf{a}\times\mathbf{b} = -\,\mathbf{b}\times\mathbf{a}$$
$$\mathbf{a}\times(\mathbf{b}+\mathbf{c}) = \mathbf{a}\times\mathbf{b} + \mathbf{a}\times\mathbf{c}$$
$$\mathbf{a}\times(\mathbf{b}\times\mathbf{c}) = \mathbf{b}\,(\mathbf{a}\cdot\mathbf{c}) - \mathbf{c}\,(\mathbf{a}\cdot\mathbf{b})$$
$$(\mathbf{a}\times\mathbf{b})\times\mathbf{c} = \mathbf{b}\,(\mathbf{a}\cdot\mathbf{c}) - \mathbf{a}\,(\mathbf{b}\cdot\mathbf{c})$$
$$\mathbf{a}\times\mathbf{a} = \mathbf{0}$$
$$\mathbf{a}\cdot(\mathbf{a}\times\mathbf{b}) = \mathbf{b}\cdot(\mathbf{a}\times\mathbf{b}) = 0$$
$$(\mathbf{a}\times\mathbf{b})\cdot\mathbf{c} = \mathbf{a}\cdot(\mathbf{b}\times\mathbf{c})$$
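All of these identities can be spot-checked numerically; a minimal sketch with arbitrary example vectors:

```python
import numpy as np

a = np.array([1.0, -2.0, 3.0])
b = np.array([4.0, 0.5, -1.0])
c = np.array([-2.0, 1.0, 2.0])

cross, dot = np.cross, np.dot

print(np.allclose(cross(a, b), -cross(b, a)))                         # a x b = -b x a
print(np.allclose(cross(a, b + c), cross(a, b) + cross(a, c)))        # distributivity
print(np.allclose(cross(a, cross(b, c)), b*dot(a, c) - c*dot(a, b)))  # a x (b x c)
print(np.allclose(cross(cross(a, b), c), b*dot(a, c) - a*dot(b, c)))  # (a x b) x c
print(np.allclose(cross(a, a), np.zeros(3)))                          # a x a = 0
print(np.isclose(dot(a, cross(a, b)), 0.0))                           # a . (a x b) = 0
print(np.isclose(dot(cross(a, b), c), dot(a, cross(b, c))))           # scalar triple product
```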
For more details on the topics of this chapter, see Vectors in the wikibook on Calculus.