Basis and Dimension


A basis is a set of linearly independent vectors that span a vector space. The dimension of a vector space is the number of vectors in its basis.

Vector spaces: A vector space is a collection of vectors that can be added together and multiplied by scalars. It forms the foundation of linear algebra.
Linear independence: A set of vectors is said to be linearly independent if the only way to get a linear combination of them to equal zero is if all the coefficients are zero.
Span: The span of a set of vectors is the set of all linear combinations of those vectors.
Basis: A basis is a set of linearly independent vectors that span a given vector space.
Dimension: The dimension of a vector space is the number of vectors in a basis.
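As a rough numerical sketch of these ideas (using NumPy, with vectors chosen purely for illustration), the rank of the matrix whose columns are the given vectors reveals both linear independence and the dimension of their span:

```python
import numpy as np

# Three example vectors in R^3, stacked as the columns of a matrix.
v1 = np.array([1.0, 0.0, 0.0])
v2 = np.array([1.0, 1.0, 0.0])
v3 = np.array([1.0, 1.0, 1.0])
A = np.column_stack([v1, v2, v3])

rank = np.linalg.matrix_rank(A)
print("rank =", rank)                       # 3
print("independent:", rank == A.shape[1])   # True: the three columns form a basis of R^3
print("dimension of their span:", rank)     # 3
```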
Basis transformation: Changing the basis of a vector space involves expressing the vectors in the new basis as linear combinations of the old basis vectors.
Matrix representation: Once a basis is chosen, every vector can be written as a column of coordinates, so vectors, and collections of vectors, can be represented in matrix form.
Change of basis matrix: The change of basis matrix is a matrix that describes how the coordinates of a given vector change when the basis of the vector space is changed.
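A small sketch of how this works in practice (NumPy, with an arbitrarily chosen basis of R^2): the matrix whose columns are the new basis vectors converts coordinates with respect to that basis into standard coordinates, and solving against it goes the other way:

```python
import numpy as np

# A new basis for R^2 (example vectors); its vectors are the columns of P.
b1 = np.array([1.0, 1.0])
b2 = np.array([-1.0, 1.0])
P = np.column_stack([b1, b2])   # change-of-basis matrix: B-coordinates -> standard coordinates

v = np.array([3.0, 1.0])        # a vector given in standard coordinates

c = np.linalg.solve(P, v)       # coordinates of v with respect to the basis {b1, b2}
print(c)                        # [ 2. -1.]
print(P @ c)                    # reconstructs v: [3. 1.]
```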
Orthogonality: Two vectors are orthogonal if their inner product (dot product) is zero.
Orthonormal basis: An orthonormal basis is a basis consisting of orthogonal unit vectors.
Linear transformations: Linear transformations are functions that preserve the structure of a vector space, i.e., they respect vector addition and scalar multiplication.
Matrix representation of linear transformations: Just like vector spaces, linear transformations can be represented by matrices.
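As a quick illustration (NumPy; the 90-degree rotation is just an arbitrary example of a linear transformation), applying a linear transformation amounts to multiplying by its matrix once a basis is fixed, and linearity can be checked directly:

```python
import numpy as np

# Matrix of the counter-clockwise 90-degree rotation of R^2 in the standard basis.
R = np.array([[0.0, -1.0],
              [1.0,  0.0]])

v = np.array([1.0, 2.0])
print(R @ v)                                                  # [-2.  1.]

# Linearity: T(a*u + b*w) == a*T(u) + b*T(w)
u, w, a, b = np.array([1.0, 0.0]), np.array([0.0, 1.0]), 2.0, -3.0
print(np.allclose(R @ (a*u + b*w), a*(R @ u) + b*(R @ w)))    # True
```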
Kernel and range: The kernel of a linear transformation is the set of all vectors that map to zero under the transformation, while the range is the set of all possible output vectors.
Rank-nullity theorem: The rank-nullity theorem states that the dimension of the kernel plus the dimension of the range equals the dimension of the domain.
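A sketch of how the theorem can be verified numerically (assuming NumPy and SciPy are available; the matrix entries are arbitrary example values):

```python
import numpy as np
from scipy.linalg import null_space

# An example 3x4 matrix, viewed as a linear map from R^4 to R^3.
A = np.array([[1.0, 2.0, 0.0, 1.0],
              [0.0, 1.0, 1.0, 1.0],
              [1.0, 3.0, 1.0, 2.0]])   # third row = first + second, so the rank is 2

rank = np.linalg.matrix_rank(A)        # dimension of the range (column space)
kernel = null_space(A)                 # columns form a basis of the kernel
nullity = kernel.shape[1]

print(rank, nullity, A.shape[1])       # 2 + 2 == 4, the dimension of the domain
```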
Eigenvectors and eigenvalues: Eigenvectors are vectors that, when transformed by a linear transformation, only get scaled, while the eigenvalues are the factors by which the eigenvectors are scaled.
Diagonalization: Diagonalization is the process of representing a linear transformation by a diagonal matrix with respect to a basis of eigenvectors; in matrix terms, A = P D P⁻¹, where the columns of P are eigenvectors and the diagonal entries of D are the corresponding eigenvalues.
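A minimal sketch of diagonalization with NumPy (the symmetric matrix below is just an example chosen to be diagonalizable):

```python
import numpy as np

# Example symmetric matrix (guaranteed diagonalizable).
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

eigvals, P = np.linalg.eig(A)      # columns of P are eigenvectors
D = np.diag(eigvals)               # diagonal matrix of eigenvalues

# A is recovered as P D P^{-1}
print(np.allclose(A, P @ D @ np.linalg.inv(P)))   # True
```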
Inner product spaces: An inner product space is a vector space equipped with a notion of an inner product, which allows for notions like angles and distances.
Gram-Schmidt process: The Gram-Schmidt process is a method for constructing an orthonormal basis from a set of linearly independent vectors.
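A minimal sketch of the classical Gram-Schmidt process (NumPy; it assumes the input vectors are linearly independent, and the example vectors are arbitrary):

```python
import numpy as np

def gram_schmidt(vectors):
    """Orthonormalize a list of linearly independent vectors (classical Gram-Schmidt)."""
    basis = []
    for v in vectors:
        w = v.astype(float)
        for q in basis:
            w = w - np.dot(q, v) * q      # subtract the projection onto each earlier direction
        basis.append(w / np.linalg.norm(w))
    return np.column_stack(basis)

Q = gram_schmidt([np.array([1.0, 1.0, 0.0]),
                  np.array([1.0, 0.0, 1.0]),
                  np.array([0.0, 1.0, 1.0])])
print(np.allclose(Q.T @ Q, np.eye(3)))    # True: the columns are orthonormal
```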
Hermitian matrices: Hermitian matrices are matrices that are equal to their adjoint (conjugate transpose).
Unitary matrices: Unitary matrices are matrices that preserve the length and inner product of vectors, i.e., they are orthogonal in the complex sense.
Singular value decomposition: The singular value decomposition (SVD) is a factorization of a matrix into the product of three matrices, one of which is diagonal with the singular values of the original matrix along its diagonal.
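A short sketch of the SVD with NumPy (the 2x3 matrix is an arbitrary example), showing that the three factors reconstruct the original matrix:

```python
import numpy as np

A = np.array([[ 3.0, 1.0, 1.0],
              [-1.0, 3.0, 1.0]])          # example 2x3 matrix

U, s, Vt = np.linalg.svd(A, full_matrices=False)
print(s)                                  # singular values, in decreasing order
print(np.allclose(A, U @ np.diag(s) @ Vt))   # True: A = U Σ V^T
```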
Linearly independent set: This is a set where none of the vectors can be expressed as a linear combination of the others. Every basis is, by definition, linearly independent.
Linearly dependent set: In this type of set, one or more vectors can be expressed as linear combinations of the others. A linearly dependent set can never be a basis, although a dependent spanning set can be reduced to one by discarding redundant vectors.
Finite basis: A finite basis is a basis consisting of finitely many vectors. A vector space with a finite basis is called finite-dimensional.
Infinite basis: An infinite basis is a basis consisting of infinitely many vectors. For example, the space of all polynomials has the infinite basis {1, x, x², ...}.
Standard basis: This is a basis that is formed by taking the unit vectors in each direction of a coordinate system.
Canonical basis: This is the basis singled out by convention for a particular space. For example, the canonical basis of Rⁿ is the standard basis, formed from the columns of the identity matrix.
Orthonormal basis: This is a basis where all the vectors are unit vectors and are mutually orthogonal.
Principal basis (eigenbasis): This is a basis formed by eigenvectors of a matrix. Such bases are what make diagonalizing matrices possible.
Nullspace basis: This is a basis formed from the set of vectors that form the nullspace of a matrix.
Row space basis: This type of basis is formed from the set of vectors that form the row space of a matrix.
Column space basis: This type of basis is formed from the set of vectors that form the column space of a matrix.
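A sketch of how bases for these three subspaces might be computed symbolically (using SymPy; the matrix entries are arbitrary example values):

```python
import sympy as sp

# Example matrix with a nontrivial nullspace (its rank is 2).
A = sp.Matrix([[1, 2, 0, 1],
               [0, 1, 1, 1],
               [1, 3, 1, 2]])

print(A.nullspace())     # basis vectors of the nullspace
print(A.rowspace())      # basis vectors of the row space
print(A.columnspace())   # basis vectors of the column space
```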
Power basis: This is a basis formed by taking successive powers of a single element. For example, the set {1, x, x²} forms a power basis for the space of polynomials of degree at most two.
Monomial basis: This is a basis formed from polynomial functions that are powers of a single variable.
Taylor basis: This is a basis formed from the terms of a Taylor expansion about a point a, such as {1, (x − a), (x − a)², ...}. It is useful for approximating functions near that point.
Fourier basis: This is a basis formed from sine and cosine functions (or complex exponentials). It is useful for representing periodic functions as sums of sine and cosine functions.
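As a discrete analogue of this idea, a sampled signal can be expressed in the discrete Fourier basis via the FFT (a NumPy sketch; the sampled sinusoids are arbitrary example data):

```python
import numpy as np

# Sample a signal made of two sinusoids at 64 equally spaced points (example data).
n = 64
t = np.arange(n) / n
x = np.sin(2 * np.pi * 3 * t) + 0.5 * np.cos(2 * np.pi * 10 * t)

c = np.fft.fft(x)                 # coordinates of x in the discrete Fourier basis
x_back = np.fft.ifft(c).real      # changing back to the standard basis recovers x
print(np.allclose(x, x_back))     # True
```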
"A set B of vectors in a vector space V is called a basis if every element of V may be written in a unique way as a finite linear combination of elements of B."
"The coefficients of this linear combination are referred to as components or coordinates of the vector with respect to B."
"The elements of a basis are called basis vectors."
"A set B is a basis if its elements are linearly independent and every element of V is a linear combination of elements of B."
"A set is linearly independent if none of the vectors in the set can be written as a linear combination of the other vectors."
"A set is a spanning set if every vector in the vector space can be expressed as a linear combination of vectors in that set."
"A vector space can have several bases."
"All the bases have the same number of elements, called the dimension of the vector space."
"This article deals mainly with finite-dimensional vector spaces. However, many of the principles are also valid for infinite-dimensional vector spaces."
"A basis allows us to represent any vector in a vector space as a unique combination of its basis vectors."
"...every element of V may be written in a unique way as a finite linear combination of elements of B."
"No, every element of V may be written in a unique way as a finite linear combination of elements of B."
"The coefficients of this linear combination [components] are referred to as components or coordinates of the vector with respect to B."
"A set is linearly independent if none of the vectors in the set can be written as a linear combination of the other vectors."
"A basis is a linearly independent spanning set."
"The principles are also valid for infinite-dimensional vector spaces."
"The same number of elements, called the dimension of the vector space."
"No, all the bases have the same number of elements."
"This article deals mainly with finite-dimensional vector spaces."
"The principles are also valid for infinite-dimensional vector spaces."