# Vector spaces
We will primarily work with vectors in \(\mathbb{R}^n\) and occasionally \(\mathbb{C}^n\).
An element of \(\mathbb{R}^n\) is a column vector

\[
x = \begin{bmatrix} x_1 \\ x_2 \\ \vdots \\ x_n \end{bmatrix}, \qquad x_i \in \mathbb{R}.
\]
## Vector addition

Given \(x,y \in \mathbb{R}^n\), addition is defined componentwise:

\[
x + y = \begin{bmatrix} x_1 + y_1 \\ \vdots \\ x_n + y_n \end{bmatrix}.
\]
## Scalar multiplication

For \(\alpha \in \mathbb{R}\) and \(x \in \mathbb{R}^n\),

\[
\alpha x = \begin{bmatrix} \alpha x_1 \\ \vdots \\ \alpha x_n \end{bmatrix}.
\]
Vectors in \(\mathbb{C}^n\) are defined the same way, with scalars \(\alpha \in \mathbb{C}\).
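A quick numerical illustration of these componentwise operations; NumPy is my choice of library here, not something the notes prescribe.

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0])
y = np.array([4.0, 5.0, 6.0])

# Vector addition is componentwise: (x + y)_i = x_i + y_i.
print(x + y)       # [5. 7. 9.]

# Scalar multiplication scales every component: (alpha x)_i = alpha x_i.
alpha = 2.0
print(alpha * x)   # [2. 4. 6.]

# Vectors in C^n work the same way, with complex scalars.
z = np.array([1 + 1j, 2 - 1j])
print(1j * z)      # [-1.+1.j  1.+2.j]
```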
## Definition: Vector space
A vector space \(V\) over a field \(\mathbb{F}\) (such as \(\mathbb{R}\) or \(\mathbb{C}\)) is a set equipped with addition \((x,y)\mapsto x+y\) and scalar multiplication \((\alpha,x)\mapsto \alpha x\) satisfying, for all \(x,y,z\in V\) and \(\alpha,\beta\in\mathbb{F}\):
- \(x+(y+z)=(x+y)+z\) (Associativity of addition)
- \(x+y=y+x\) (Commutativity of addition)
- There exists \(0\in V\) with \(x+0=x\) (Additive identity)
- For each \(x\) there exists \(-x\in V\) with \(x+(-x)=0\) (Additive inverse)
- \(\alpha(\beta x)=(\alpha\beta)x\) (Compatibility with field multiplication)
- \(1x=x\) (Scalar identity)
- \(\alpha(x+y)=\alpha x+\alpha y\) (Distributivity over vector addition)
- \((\alpha+\beta)x=\alpha x+\beta x\) (Distributivity over scalar addition)
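The axioms can be spot-checked (not proved, of course) on randomly drawn vectors; a minimal sketch, again assuming NumPy:

```python
import numpy as np

rng = np.random.default_rng(0)
x, y, z = rng.standard_normal((3, 5))   # three random vectors in R^5
alpha, beta = rng.standard_normal(2)    # two random scalars

# Associativity and commutativity of addition.
assert np.allclose(x + (y + z), (x + y) + z)
assert np.allclose(x + y, y + x)

# Compatibility with field multiplication and both distributive laws.
assert np.allclose(alpha * (beta * x), (alpha * beta) * x)
assert np.allclose(alpha * (x + y), alpha * x + alpha * y)
assert np.allclose((alpha + beta) * x, alpha * x + beta * x)
```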
## Subspaces
A subspace \(S \subseteq \mathbb{R}^n\) is a nonempty subset closed under linear combinations: for any \(x,y \in S\) and \(\alpha,\beta \in \mathbb{R}\),

\[
\alpha x + \beta y \in S.
\]
Every subspace of \(\mathbb{R}^n\) is itself a vector space under the same operations.
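As a concrete spot-check, take \(S = \{x \in \mathbb{R}^3 : x_3 = 0\}\) (my example, not one from the notes) and verify closure for particular vectors and scalars:

```python
import numpy as np

def in_S(v, tol=1e-12):
    """Membership in S = {x in R^3 : x_3 = 0}, up to a tolerance."""
    return abs(v[2]) < tol

x = np.array([1.0, 2.0, 0.0])
y = np.array([-3.0, 0.5, 0.0])
alpha, beta = 2.0, -4.0

# Closure: any linear combination of members of S stays in S.
assert in_S(x) and in_S(y)
assert in_S(alpha * x + beta * y)
```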
## Linear combinations and span
Given vectors \(x_1,\dots,x_k \in \mathbb{R}^n\), a linear combination has the form

\[
\alpha_1 x_1 + \alpha_2 x_2 + \cdots + \alpha_k x_k, \qquad \alpha_1,\dots,\alpha_k \in \mathbb{R}.
\]
The span of \(\{x_1,\dots,x_k\}\) is the set of all linear combinations:

\[
\operatorname{span}\{x_1,\dots,x_k\} = \left\{ \sum_{i=1}^{k} \alpha_i x_i : \alpha_1,\dots,\alpha_k \in \mathbb{R} \right\}.
\]
The span is always a subspace of \(\mathbb{R}^n\).
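In floating-point arithmetic, membership \(b \in \operatorname{span}\{x_1,\dots,x_k\}\) can be tested with a rank comparison: appending \(b\) as an extra column must not increase the rank. A sketch; the helper name `in_span` and the tolerance are my choices.

```python
import numpy as np

def in_span(vectors, b, tol=1e-10):
    """True if b lies in span{vectors}, tested via a rank comparison."""
    X = np.column_stack(vectors)  # matrix with columns x_1, ..., x_k
    return (np.linalg.matrix_rank(np.column_stack([X, b]), tol=tol)
            == np.linalg.matrix_rank(X, tol=tol))

x1 = np.array([1.0, 0.0, 0.0])
x2 = np.array([0.0, 1.0, 0.0])
print(in_span([x1, x2], np.array([2.0, -3.0, 0.0])))  # True
print(in_span([x1, x2], np.array([0.0, 0.0, 1.0])))   # False
```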
## Linear independence
Vectors \(x_1,\dots,x_k\) are linearly independent if the only solution to the homogeneous combination equaling zero is the trivial one:

\[
\alpha_1 x_1 + \cdots + \alpha_k x_k = 0 \quad \Longrightarrow \quad \alpha_1 = \cdots = \alpha_k = 0.
\]
If there exists a nontrivial choice of coefficients yielding zero, the vectors are linearly dependent.
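Numerically, \(x_1,\dots,x_k\) are independent exactly when the matrix with columns \(x_1,\dots,x_k\) has rank \(k\); a sketch (the helper name is mine):

```python
import numpy as np

def linearly_independent(vectors):
    """x_1, ..., x_k are independent iff [x_1 ... x_k] has rank k."""
    X = np.column_stack(vectors)
    return np.linalg.matrix_rank(X) == X.shape[1]

print(linearly_independent([np.array([1.0, 0.0]), np.array([1.0, 1.0])]))  # True
print(linearly_independent([np.array([1.0, 2.0]), np.array([2.0, 4.0])]))  # False
```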
## Bases and dimension
A set of vectors \(x_1,\dots,x_k\) is a basis for a subspace \(S\) if:
1. \(x_1,\dots,x_k\) are linearly independent, and
2. \(\operatorname{span}\{x_1,\dots,x_k\} = S\).
If \(x_1,\dots,x_k\) form a basis for \(S\), then \(k\) is the dimension of \(S\), written

\[
\dim(S) = k.
\]
While a subspace can have many different bases, each basis has the same number of vectors. Hence dimension is well defined.
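A quick numerical illustration: two different bases of the same plane in \(\mathbb{R}^3\) have the same number of vectors, and the rank of the matrix built from either basis recovers the dimension. The particular bases below are my choices.

```python
import numpy as np

# Two different bases for the same plane {x in R^3 : x_3 = 0} ...
B1 = np.column_stack([[1.0, 0.0, 0.0], [0.0, 1.0, 0.0]])
B2 = np.column_stack([[1.0, 1.0, 0.0], [1.0, -1.0, 0.0]])

# ... each have two vectors, and both matrices have rank 2 = dim.
print(np.linalg.matrix_rank(B1), np.linalg.matrix_rank(B2))  # 2 2
```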
Example (a plane in \(\mathbb{R}^3\)). Let

\[
a_1 = \begin{bmatrix} 1 \\ 0 \\ 0 \end{bmatrix}, \qquad
a_2 = \begin{bmatrix} 0 \\ 1 \\ 0 \end{bmatrix}.
\]

Then \(a_1\) and \(a_2\) are linearly independent, and

\[
\operatorname{span}\{a_1,a_2\} = \left\{ \begin{bmatrix} x_1 \\ x_2 \\ 0 \end{bmatrix} : x_1, x_2 \in \mathbb{R} \right\},
\]
which is the \(x_1\)-\(x_2\) plane in \(\mathbb{R}^3\). Therefore \(\dim(\operatorname{span}\{a_1,a_2\})=2\).
## Direct sum
If \(U\) and \(V\) are subspaces, then \(U+V=\{u+v : u \in U,\ v \in V\}\) is a subspace. We say that \(W\) is the direct sum of \(U\) and \(V\), written \(W = U \oplus V\), if \(W = U + V\) and \(U \cap V = \{0\}\). Equivalently, every \(w \in W\) decomposes uniquely as \(w = u + v\) with \(u \in U\) and \(v \in V\).
Example: verify that if \(x_1,\dots,x_k\) are linearly independent, then the space \(S\) spanned by these vectors is the direct sum of their individual spans:

\[
S = \operatorname{span}\{x_1\} \oplus \operatorname{span}\{x_2\} \oplus \cdots \oplus \operatorname{span}\{x_k\}.
\]
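A numerical sketch of this exercise; the random independent vectors and the use of NumPy's least-squares routine are my choices. Full column rank makes the coefficient recovery, and hence the decomposition, unique.

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.standard_normal((5, 3))        # columns x_1, x_2, x_3 in R^5
assert np.linalg.matrix_rank(X) == 3   # independent (with probability 1)

# Any s in S = span{x_1, x_2, x_3} is s = X @ alpha for some coefficients.
alpha = np.array([2.0, -1.0, 0.5])
s = X @ alpha

# Full column rank => least squares recovers alpha exactly, so the
# decomposition s = sum_i alpha_i x_i into the individual spans is unique.
alpha_rec, *_ = np.linalg.lstsq(X, s, rcond=None)
assert np.allclose(alpha_rec, alpha)

# Pairwise, span{x_i} and span{x_j} meet only at 0: any two columns
# together still have rank 2.
for i in range(3):
    for j in range(i + 1, 3):
        assert np.linalg.matrix_rank(X[:, [i, j]]) == 2
```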