Definition of matrix representation of a linear transformation with respect to bases of the spaces, arbitrary vector space math.la.d.lintrans.mat.repn.arb
Definition of matrix representation of a linear transformation from a vector space to itself, with respect to a basis of the space, arbitrary vector space math.la.d.lintrans.mat.repn.self.arb
Definition of linearly dependent set of vectors: one of the vectors can be written as a linear combination of the other vectors, arbitrary vector space. math.la.d.vec.lindep.arb
Definition of linearly dependent set of vectors: one of the vectors can be written as a linear combination of the other vectors, coordinate vector space. math.la.d.vec.lindep.coord
Definition of linearly independent set of vectors: if a linear combination is zero, then every coefficient is zero, arbitrary vector space. math.la.d.vec.linindep.arb
Definition of linearly independent set of vectors: if a linear combination is zero, then every coefficient is zero, coordinate vector space. math.la.d.vec.linindep.coord
Example of solving a 3-by-3 homogeneous system of linear equations by row-reducing the augmented matrix, in the case of infinitely many solutions math.la.e.linsys.3x3.soln.homog.row_reduce.i
Example of solving a 3-by-3 homogeneous system of linear equations by row-reducing the augmented matrix, in the case of one solution math.la.e.linsys.3x3.soln.homog.row_reduce.o
Example of solving a 3-by-3 system of linear equations by row-reducing the augmented matrix, in the case of infinitely many solutions math.la.e.linsys.3x3.soln.row_reduce.i
Example of solving a 3-by-3 system of linear equations by row-reducing the augmented matrix, in the case of one solution math.la.e.linsys.3x3.soln.row_reduce.o
Example of solving a 3-by-3 system of linear equations by row-reducing the augmented matrix, in the case of no solutions math.la.e.linsys.3x3.soln.row_reduce.z
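The row-reduction examples above can be sketched in NumPy. This is a minimal illustration (the `rref` helper and the sample system are my own, not part of the tagged catalog), showing the one-solution case: row-reducing the augmented matrix of a 3-by-3 system leaves the identity in the coefficient columns and the solution in the last column.

```python
import numpy as np

def rref(M, tol=1e-12):
    """Reduce a matrix to reduced row echelon form by Gauss-Jordan elimination."""
    A = M.astype(float).copy()
    rows, cols = A.shape
    r = 0
    for c in range(cols):
        if r >= rows:
            break
        pivot = np.argmax(np.abs(A[r:, c])) + r   # partial pivoting
        if abs(A[pivot, c]) < tol:
            continue                               # no pivot in this column
        A[[r, pivot]] = A[[pivot, r]]
        A[r] = A[r] / A[r, c]                      # scale pivot row to 1
        for i in range(rows):
            if i != r:
                A[i] -= A[i, c] * A[r]             # clear the rest of the column
        r += 1
    return A

# Augmented matrix of a 3-by-3 system with the unique solution x=1, y=2, z=3.
aug = np.array([[2, 1, 1, 7],
                [1, 3, 2, 13],
                [1, 0, 0, 1]])
R = rref(aug)   # [I | solution] when the system has exactly one solution
```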
Formula for computing the least squares solution to a linear system, in terms of the QR factorization of the coefficient matrix. math.la.t.linsys.leastsquares.qr
The least squares solution to a linear system is unique if and only if the columns of the coefficient matrix are linearly independent. math.la.t.linsys.leastsquares.unique
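The QR least-squares formula above can be sketched in NumPy (the overdetermined system here is illustrative data of my own choosing): with the thin factorization A = QR and linearly independent columns, the least squares solution is obtained by solving R x = Qᵀb.

```python
import numpy as np

# Overdetermined system A x ≈ b: fit a line c0 + c1*t through (0,1), (1,2), (2,2).
A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0]])
b = np.array([1.0, 2.0, 2.0])

# Thin QR factorization A = QR; the normal equations reduce to R x = Q^T b.
Q, R = np.linalg.qr(A)
x_qr = np.linalg.solve(R, Q.T @ b)

# Cross-check against NumPy's built-in least squares solver.
x_ls, *_ = np.linalg.lstsq(A, b, rcond=None)
```

Because the two columns of A are linearly independent, R is invertible and the solution is unique, matching the uniqueness statement above.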
A linear transformation is one-to-one/injective if and only if the columns of its matrix are linearly independent. math.la.t.lintrans.injective.linindep
A linear transformation applied to a linear combination of vectors equals the same linear combination of the transformed vectors math.la.t.lintrans.lincomb
The determinant of a triangular matrix is the product of the entries on the diagonal. math.la.t.mat.det.trianglar
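The triangular-determinant statement can be checked numerically; the matrix below is an illustrative example of my own, with diagonal product 2 · 5 · (−3) = −30.

```python
import numpy as np

# Upper triangular matrix; its determinant is the product of the diagonal.
T = np.array([[2.0, 7.0, 1.0],
              [0.0, 5.0, 4.0],
              [0.0, 0.0, -3.0]])
det_formula = np.prod(np.diag(T))   # 2 * 5 * (-3) = -30
det_numpy = np.linalg.det(T)        # general determinant routine agrees
```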
An n-by-n matrix is diagonalizable if and only if it has n linearly independent eigenvectors. math.la.t.mat.diagonalizable
An n-by-n matrix is diagonalizable if and only if the union of bases for the eigenspaces is a basis for R^n (or C^n). math.la.t.mat.diagonalizable.basis
An n-by-n matrix is diagonalizable if and only if the characteristic polynomial factors completely, and the dimension of each eigenspace equals the multiplicity of the eigenvalue. math.la.t.mat.diagonalizable.charpoly
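The eigenvector criterion for diagonalizability can be sketched as follows (the 2-by-2 matrix is an illustrative choice of my own): n linearly independent eigenvectors means the eigenvector matrix P is invertible, and then P⁻¹AP is diagonal.

```python
import numpy as np

# A 2x2 matrix with distinct eigenvalues (4 and 2), so it has two
# linearly independent eigenvectors and is therefore diagonalizable.
A = np.array([[4.0, 1.0],
              [0.0, 2.0]])
vals, P = np.linalg.eig(A)          # columns of P are eigenvectors

# n independent eigenvectors <=> P has full rank; then P^{-1} A P is diagonal.
n_indep = np.linalg.matrix_rank(P)
D = np.linalg.inv(P) @ A @ P
```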
Matrix multiplication can be viewed as the dot product of a row vector of column vectors with a column vector of row vectors math.la.t.mat.mult.row.col
The transpose of a product of matrices is the product of the transposes in reverse order. math.la.t.mat.mult.transpose
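The transpose-of-a-product identity (AB)ᵀ = BᵀAᵀ can be spot-checked on random matrices; the shapes below are arbitrary illustrative choices.

```python
import numpy as np

# Random rectangular matrices (shapes chosen for illustration only).
rng = np.random.default_rng(0)
A = rng.standard_normal((2, 3))
B = rng.standard_normal((3, 4))

lhs = (A @ B).T        # transpose of the product
rhs = B.T @ A.T        # product of the transposes, in reverse order
```

Note the reversal is forced by the shapes: Aᵀ is 3×2 and Bᵀ is 4×3, so only BᵀAᵀ is even defined.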
The number of pivots in the reduced row echelon form of a consistent system determines the number of free variables in the solution set. math.la.t.rref.pivot.free
The number of pivots in the reduced row echelon form of a consistent system determines whether there is one or infinitely many solutions. math.la.t.rref.pivot.oi
Theorem: a set of vectors is linearly dependent if and only if one of the vectors can be written as a linear combination of the other vectors, arbitrary vector space. math.la.t.vec.lindep.arb
Theorem: a set of vectors is linearly dependent if and only if one of the vectors can be written as a linear combination of the other vectors, coordinate vector space. math.la.t.vec.lindep.coord
If a set of vectors in R^n (or C^n) contains more than n elements, then the set is linearly dependent. math.la.t.vec.lindep.more.rncn
A set of two vectors is linearly dependent if and only if one of the vectors is a scalar multiple of the other. math.la.t.vec.lindep.two
If a set of vectors contains the zero vector, then the set is linearly dependent. math.la.t.vec.lindep.zero
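The linear dependence statements above can be verified with a rank computation (the vectors below are illustrative examples of my own): a set of column vectors is dependent exactly when the rank of the matrix they form is less than the number of vectors.

```python
import numpy as np

# Four vectors in R^3: more vectors than the dimension of the space,
# so the set must be linearly dependent (the rank is at most 3 < 4).
V = np.array([[1.0, 0.0, 2.0, 1.0],
              [0.0, 1.0, 1.0, 3.0],
              [0.0, 0.0, 1.0, 2.0]])
rank_V = np.linalg.matrix_rank(V)

# A set containing the zero vector is likewise dependent.
W = np.column_stack([np.array([1.0, 2.0, 3.0]), np.zeros(3)])
rank_W = np.linalg.matrix_rank(W)
```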
Theorem: a set of vectors is linearly independent if and only if whenever a linear combination is zero, then every coefficient is zero, arbitrary vector space. math.la.t.vec.linindep.arb
Theorem: a set of vectors is linearly independent if and only if whenever a linear combination is zero, then every coefficient is zero, coordinate vector space. math.la.t.vec.linindep.coord
Two vectors u and v are orthogonal if and only if the Pythagorean Theorem holds: ||u + v||^2 = ||u||^2 + ||v||^2. math.la.t.vec.orthogonal
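The Pythagorean characterization of orthogonality can be checked on a concrete pair (the 3-4-5 example below is an illustrative choice of my own): the dot product is zero exactly when the squared norms add.

```python
import numpy as np

u = np.array([3.0, 0.0])
v = np.array([0.0, 4.0])   # orthogonal to u (dot product is zero)

dot = u @ v
# ||u + v||^2 should equal ||u||^2 + ||v||^2: here 25 = 9 + 16.
pythagoras_holds = np.isclose(np.linalg.norm(u + v)**2,
                              np.linalg.norm(u)**2 + np.linalg.norm(v)**2)
```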
Every basis for a vector space contains the same number of elements, arbitrary vector space. math.la.t.vsp.dim.arb
If a vector space has dimension n, then any subset of n vectors that is linearly independent must be a basis, arbitrary vector space. math.la.t.vsp.dim.linindep.arb
If a vector space has dimension n, then any subset of n vectors that is linearly independent must be a basis, coordinate vector space. math.la.t.vsp.dim.linindep.coord
A set of vectors containing more elements than the dimension of the space must be linearly dependent, arbitrary vector space. math.la.t.vsp.dim.more.lindep.arb
If a vector space has dimension n, then any subset of n vectors that spans the space must be a basis, arbitrary vector space. math.la.t.vsp.dim.span.arb
If a vector space has dimension n, then any subset of n vectors that spans the space must be a basis, coordinate vector space. math.la.t.vsp.dim.span.coord
Any linearly independent set can be expanded to a basis for the (sub)space, arbitrary vector space. math.la.t.vsp.linindep.basis.arb