www.olevels.tk
Matrix Theory and Linear Algebra
I
INTRODUCTION
Matrix Theory and Linear Algebra are interconnected branches of mathematics that serve as fundamental tools in pure and applied mathematics and are becoming increasingly important in the physical, biological, and social sciences.
II
MATRIX THEORY
A matrix is a rectangular array of numbers or elements of a ring (see Algebra). One of the principal uses of matrices is in representing systems of equations of the first degree in several unknowns. Each matrix row represents one equation, and the entries in a row are the coefficients of the variables in the equation, in some fixed order. A matrix is usually enclosed in brackets:
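The display showing the four example matrices M1 through M4 is missing from this copy. The surrounding text fixes only their sizes (3 × 3, 3 × 3, 3 × 2, and 2 × 3), that 1 is the entry in the second row, third column of M1, and that a, b, and c are arbitrary numbers; one set of matrices consistent with those constraints (all other entries here are hypothetical) is:

```latex
M_1 = \begin{bmatrix} a & 2 & 0 \\ 3 & b & 1 \\ 5 & 0 & c \end{bmatrix}, \quad
M_2 = \begin{bmatrix} 1 & 0 & 2 \\ 0 & 3 & 1 \\ 4 & 0 & 5 \end{bmatrix}, \quad
M_3 = \begin{bmatrix} a & b \\ 1 & 2 \\ 3 & 4 \end{bmatrix}, \quad
M_4 = \begin{bmatrix} 1 & 2 & 3 \\ 4 & 5 & 6 \end{bmatrix}
```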
In the above matrices, a, b, and c are arbitrary numbers. In place of brackets, parentheses or double vertical lines may be used to enclose the arrays. The horizontal lines, called rows, are numbered from the top down; the vertical lines, or columns, are numbered from left to right; thus, 1 is the element in the second row, third column of M1. A row or column is called a line.
The size of a matrix is given by the number of rows and columns, so that M1, M2, M3, and M4 are, in that order, of sizes 3 × 3 (3 by 3), 3 × 3, 3 × 2, and 2 × 3. The general matrix of size m × n is frequently represented in double-subscript notation, with the first subscript i indicating the row number, and the second subscript j indicating the column number; a23 is the element in the second row, third column. This general matrix
may be abbreviated to A = [aij], in which the ranges i = 1, 2, ..., m and j = 1, 2, ..., n should be explicitly given if they are not implied by the text. If m = n, the matrix is square, and the number of rows (or columns) is the order of the matrix. Two matrices, A = [aij] and B = [bij], are equal if and only if they are of the same size and if, for every i and j, aij = bij. The elements a11, a22, a33, ... constitute the main or principal diagonal of the matrix A = [aij], if it is square. The transpose AT of a matrix A is the matrix in which the ith row is the ith column of A and in which the jth column is the jth row of A; thus, from the matrix M3, above,
which is the transpose of M3. Addition and multiplication of matrices can be defined so that certain sets of matrices form algebraic systems. Let the elements of the matrices considered be arbitrary real numbers, although the elements could have been chosen from other fields or rings. A zero matrix is one in which all the elements are zero; an identity matrix, Im of order m, is a square matrix of order m in which all the elements are zero except those on the main diagonal, which are 1. The order of an identity matrix may be omitted if implied by the text, and Im is then shortened to I. The sum of two matrices is defined only if they are of the same size; if A = [aij] and B = [bij] are of the same size, then C = A + B is defined as the matrix [cij], in which cij = aij + bij; that is, two matrices of the same size are added merely by adding corresponding elements. Thus, in the matrices given above
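Transposition and element-wise addition as defined above can be sketched in plain Python, representing a matrix as a list of rows (an illustrative sketch, not part of the original article):

```python
# Illustrative sketch: matrices as lists of rows.

def transpose(A):
    """Return A^T: the ith row of A becomes the ith column."""
    return [[A[i][j] for i in range(len(A))] for j in range(len(A[0]))]

def add(A, B):
    """Element-wise sum; defined only for matrices of the same size."""
    if len(A) != len(B) or len(A[0]) != len(B[0]):
        raise ValueError("matrices must be the same size")
    return [[A[i][j] + B[i][j] for j in range(len(A[0]))]
            for i in range(len(A))]

M = [[1, 2],
     [3, 4],
     [5, 6]]          # a 3 x 2 matrix, like M3 in the text
print(transpose(M))   # [[1, 3, 5], [2, 4, 6]] -- a 2 x 3 matrix
print(add(M, M))      # [[2, 4], [6, 8], [10, 12]]
```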
The set of all matrices of a fixed size has the property that addition is closed, associative, and commutative; a unique matrix O exists such that for any matrix A, A + O = O + A = A; and, corresponding to any matrix A, there exists a unique matrix B such that A + B = B + A = O. The product AB of two matrices, A and B, is defined only if the number of columns of the left factor A is the same as the number of rows of the right factor B; if A = [aij] is of size m × n and B = [bjk] is of size n × p, the product AB = C = [cik] is of size m × p, and cik is given by

cik = ai1b1k + ai2b2k + ... + ainbnk
That is, the element in the ith row and kth column of the product is the sum of the products of the elements of the ith row of the left factor multiplied by the corresponding elements of the kth column of the right factor.
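The row-by-column rule can be sketched directly from this definition (an illustrative sketch, not part of the original article):

```python
# Illustrative sketch: c_ik = sum over j of a_ij * b_jk,
# defined only when the number of columns of A equals the
# number of rows of B.

def matmul(A, B):
    m, n, p = len(A), len(B), len(B[0])
    if len(A[0]) != n:
        raise ValueError("columns of A must equal rows of B")
    return [[sum(A[i][j] * B[j][k] for j in range(n)) for k in range(p)]
            for i in range(m)]

A = [[1, 2, 3],
     [4, 5, 6]]        # 2 x 3
B = [[1, 0],
     [0, 1],
     [1, 1]]           # 3 x 2
print(matmul(A, B))    # 2 x 2 result: [[4, 5], [10, 11]]
```

Note that the sizes compose as the text requires: a 2 × 3 matrix times a 3 × 2 matrix yields a 2 × 2 matrix.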
III
LINEAR ALGEBRA
The geometric concept of a vector as a line segment of given length and direction can be advantageously generalized as follows. An n-vector (n-dimensional vector, vector of order n, vector of length n) is an ordered set of n elements of a field. As in matrix theory, the elements are assumed to be real numbers. An n-vector v is represented as:

v = [x1, x2, ..., xn]

In particular, the lines of a matrix are vectors; the horizontal lines are row vectors, the vertical lines are column vectors. The x's are called the components of the vector. Addition of vectors (of the same length) and scalar multiplication are defined as for matrices and satisfy the same laws. If w = [y1, y2, ..., yn] and k is a scalar (real number), then

v + w = [x1 + y1, x2 + y2, ..., xn + yn]
kv = [kx1, kx2, ..., kxn]
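These component-wise operations can be sketched in a few lines (an illustrative sketch, not part of the original article):

```python
# Illustrative sketch: vectors as lists of components.

def vadd(v, w):
    """Component-wise sum; defined only for vectors of the same length."""
    if len(v) != len(w):
        raise ValueError("vectors must have the same length")
    return [x + y for x, y in zip(v, w)]

def smul(k, v):
    """Scalar multiple: multiply every component by k."""
    return [k * x for x in v]

v = [1, 2, 3]
w = [4, 5, 6]
print(vadd(v, w))   # [5, 7, 9]
print(smul(2, v))   # [2, 4, 6]
```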
If k1, k2, ..., km are scalars and v1, v2, ..., vm are n-vectors, the n-vector

v = k1v1 + k2v2 + ... + kmvm

is called a linear combination of the vectors v1, v2, ..., vm. The m n-vectors are linearly independent if the only linear combination equal to the zero n-vector, 0 = [0, 0, ..., 0], is the one in which k1 = k2 = ... = km = 0; otherwise, the vectors are linearly dependent. For example, if v1 = [0, 1, 2, 3], v2 = [1, 2, 3, 4], v3 = [2, 2, 4, 4], and v4 = [3, 4, 7, 8], then v1, v2, and v3 are linearly independent, because k1v1 + k2v2 + k3v3 = 0 if and only if k1 = k2 = k3 = 0; v2, v3, and v4 are linearly dependent because v2 + v3 - v4 = 0. If A is a matrix of rank r, then at least one set of r row, or column, vectors is a linearly
independent set, and every set of more than r row, or column, vectors is a linearly dependent set. A vector space V is a nonempty set of vectors (see Set Theory), with the properties that (1) if v and w are in V, then v + w is in V, and (2) if v is in V and k is any scalar, then kv is in V. If S = {vi} is a set of vectors, all of the same length, all linear combinations of the vi's form a vector space said to be spanned by the vi's. If the set B = {wi} spans the same vector space V and is a linearly independent set, the set B is a basis for V. If a basis for V contains m vectors, every basis for V will contain exactly m vectors, and V is called a vector space of dimension m. Two- and three-dimensional Euclidean spaces are vector spaces when their points are regarded as specified by ordered pairs or triples of real numbers. Matrices may be used to describe linear changes from one vector space into another.
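The article's independence example can be checked mechanically: the rank of the matrix whose rows are the given vectors equals the size of the largest linearly independent subset of them. The sketch below (illustrative, not part of the original article) computes rank by exact Gaussian elimination using Python's fractions module:

```python
# Illustrative sketch: rank via Gauss-Jordan elimination with exact
# rational arithmetic, applied to the vectors from the text.
from fractions import Fraction

def rank(rows):
    M = [[Fraction(x) for x in row] for row in rows]
    r = 0  # number of pivot rows found so far
    for col in range(len(M[0])):
        # find a row at or below r with a nonzero entry in this column
        pivot = next((i for i in range(r, len(M)) if M[i][col] != 0), None)
        if pivot is None:
            continue
        M[r], M[pivot] = M[pivot], M[r]
        # clear this column in every other row
        for i in range(len(M)):
            if i != r and M[i][col] != 0:
                f = M[i][col] / M[r][col]
                M[i] = [a - f * b for a, b in zip(M[i], M[r])]
        r += 1
    return r

v1, v2, v3, v4 = [0, 1, 2, 3], [1, 2, 3, 4], [2, 2, 4, 4], [3, 4, 7, 8]
print(rank([v1, v2, v3]))   # 3: v1, v2, v3 are linearly independent
print(rank([v2, v3, v4]))   # 2: dependent, since v2 + v3 - v4 = 0
```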