Linear algebra is a fundamental branch of mathematics that studies linear equations and their representation in vector spaces using matrices. In short, it is the study of vector spaces and the linear transformations between them.
Linear functions, systems of linear equations, vector spaces, vectors, matrices, and linear transformations are the critical concepts in this branch of mathematics. Broadly, vectors are elements that we can add, and linear functions are functions of vectors that respect vector addition. A matrix emerges when the information describing a linear function is arranged in an organized, rectangular form.
Linear algebra is one of the most important mathematical tools, essential in pure and applied mathematics alike. It is also widely used in fields such as physics, engineering, economics, computational science, and the natural sciences. An introduction to linear algebra is therefore a building block for understanding many problems in the sciences.
What does it mean for vectors to be orthogonal?
In vector geometry, two vectors are orthogonal if they are perpendicular to each other; equivalently, their dot product is zero. A set of vectors is orthogonal if every pair of distinct vectors in it has a dot product of zero, as in the sketch below.
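For instance, a quick numerical check of orthogonality might look like the following sketch. It assumes NumPy is available; the vectors u, v, and w are made-up examples, not taken from the text above.

```python
# A minimal sketch: checking orthogonality via the dot product (assumes NumPy).
import numpy as np

u = np.array([1.0, 2.0])
v = np.array([-2.0, 1.0])   # perpendicular to u in the plane
w = np.array([3.0, 1.0])    # not perpendicular to u

print(np.dot(u, v))  # 0.0 -> u and v are orthogonal
print(np.dot(u, w))  # 5.0 -> u and w are not orthogonal
```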
What is a vector subspace?
A vector space V is a collection of objects, called vectors, together with two operations: addition and scalar multiplication. Vector addition combines two vectors, u and v, into a single vector u + v. Scalar multiplication combines a scalar, k, with a vector, v, to produce the vector kv.
Vector spaces are subject to the following ten axioms (a small numerical check of a few of them follows the list):
1. Closed under addition: u + v is in V.
2. Addition is commutative: u + v = v + u.
3. Addition is associative: (u + v) + w = u + (v + w).
4. Additive identity 0 (called the zero vector) exists: u + 0 = u.
5. Additive inverse −u exists: u + (−u) = 0.
6. Closed under scalar multiplication: cu is in V.
7. Distributive law for vectors: c(u + v) = cu + cv.
8. Distributive law for scalars: (c + d)u = cu + du.
9. Multiplication is associative: (cd)u = c(du).
10. Multiplicative identity exists: 1u = u.
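The following sketch spot-checks a few of these axioms numerically for vectors in R² with the usual operations; it assumes NumPy, and the particular vectors and scalars are arbitrary choices made for this example.

```python
# A minimal sketch: numerically verifying a few vector space axioms in R^2 (assumes NumPy).
import numpy as np

u = np.array([1.0, 2.0])
v = np.array([3.0, -1.0])
w = np.array([0.5, 4.0])
c, d = 2.0, -3.0

print(np.allclose(u + v, v + u))               # addition is commutative
print(np.allclose((u + v) + w, u + (v + w)))   # addition is associative
print(np.allclose(c * (u + v), c * u + c * v)) # distributive law for vectors
print(np.allclose((c + d) * u, c * u + d * u)) # distributive law for scalars
print(np.allclose(1.0 * u, u))                 # multiplicative identity
```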
A vector subspace S of a vector space V is a nonempty subset of V that is a vector space in its own right under the same addition and scalar multiplication. In practice, a subspace S lets us shrink the original space V down to a smaller set S that still carries the full vector-space structure.
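As an illustration, the line y = 2x through the origin is a subspace of R². The sketch below (assuming NumPy; the on_line helper and the sample points are hypothetical, chosen only for this example) spot-checks closure under addition and scalar multiplication.

```python
# A minimal sketch: the line y = 2x through the origin as a subspace of R^2 (assumes NumPy).
import numpy as np

def on_line(p):
    """True if the point p = (x, y) satisfies y = 2x."""
    return bool(np.isclose(p[1], 2 * p[0]))

u = np.array([1.0, 2.0])    # on the line
v = np.array([-3.0, -6.0])  # on the line
k = 5.0

print(on_line(u + v))                 # True: the sum stays on the line
print(on_line(k * u))                 # True: scalar multiples stay on the line
print(on_line(np.array([1.0, 3.0])))  # False: this point is not in the subspace
```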
A row vector vs. a column vector
Row vectors and column vectors are special cases of a rectangular array of values or elements. A matrix M has m rows and n columns; a vector v, when treated as a matrix, has either exactly one row or exactly one column.
Let us understand what a row vector and a column vector are (a short sketch follows the list).
1. A row vector is an ordered collection of numbers written in a row, i.e. horizontally. An example of a row vector is x = [x₁, x₂, ..., xₙ].
2. A column vector is an ordered collection of numbers written in a column, i.e. vertically; it is the transpose of the corresponding row vector.
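The sketch below (assuming NumPy) contrasts a row vector, stored as a 1 × n matrix, with a column vector, stored as an n × 1 matrix.

```python
# A minimal sketch: a row vector (1 x n) versus a column vector (n x 1) (assumes NumPy).
import numpy as np

row = np.array([[1, 2, 3]])        # shape (1, 3): one row, three columns
col = np.array([[1], [2], [3]])    # shape (3, 1): three rows, one column

print(row.shape)    # (1, 3)
print(col.shape)    # (3, 1)
print(row.T.shape)  # (3, 1): transposing the row vector gives a column vector
```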
What is an orthonormal basis?
A set of vectors is a basis of a vector space if every element of the space can be written as a linear combination of those vectors and the vectors are independent of one another. A basis must therefore fulfil two conditions: its vectors must be linearly independent, and they must span the whole space.
Now, let us understand what an orthonormal basis is. A basis is orthonormal if all of its vectors have a norm (length) of 1 and, additionally, any two distinct vectors in it are orthogonal (perpendicular) to each other, with an inner product of 0.
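One common way to obtain an orthonormal basis in practice is a QR factorization: the columns of Q form an orthonormal basis of the column space of A. The sketch below (assuming NumPy; the matrix A is an arbitrary example) verifies both conditions, unit norms and pairwise orthogonality, via Q^T Q = I.

```python
# A minimal sketch: orthonormal basis vectors from a QR factorization (assumes NumPy).
import numpy as np

A = np.array([[1.0, 1.0],
              [0.0, 1.0],
              [1.0, 0.0]])

Q, R = np.linalg.qr(A)          # the columns of Q are orthonormal

print(np.linalg.norm(Q[:, 0]))          # 1.0: each basis vector has length 1
print(np.dot(Q[:, 0], Q[:, 1]))         # ~0.0: distinct basis vectors are orthogonal
print(np.allclose(Q.T @ Q, np.eye(2)))  # True: Q^T Q = I summarizes both checks
```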