Matrices and Determinants

About This Presentation

Fundamentals of matrices and determinants, with suitable examples.


Slide Content

Matrices & Determinants
Dr. N. B. Vyas, Humanities & Science Dept., Atmiya Institute of Technology & Science
[email protected], [email protected]

Introduction
A matrix is an ordered rectangular array of numbers or functions. The numbers or functions are called the elements or the entries of the matrix. We denote matrices by capital letters.

The following are some examples of matrices. The horizontal lines of elements are said to constitute the rows of the matrix, and the vertical lines of elements are said to constitute the columns of the matrix.

A matrix having m rows and n columns is called a matrix of order m × n, or simply an m × n matrix (read as an m by n matrix). In general, an m × n matrix has the following rectangular array:
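(The array itself is not reproduced here; written in LaTeX, the standard general form is:)

A = [a_{ij}]_{m \times n} =
\begin{bmatrix}
  a_{11} & a_{12} & \cdots & a_{1n} \\
  a_{21} & a_{22} & \cdots & a_{2n} \\
  \vdots & \vdots & \ddots & \vdots \\
  a_{m1} & a_{m2} & \cdots & a_{mn}
\end{bmatrix}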

Types of Matrices
Column matrix: A matrix is said to be a column matrix if it has only one column.
Row matrix: A matrix is said to be a row matrix if it has only one row.
Square matrix: A matrix in which the number of rows is equal to the number of columns is said to be a square matrix. Thus an m × n matrix is a square matrix if m = n, and it is then known as a square matrix of order n.

Diagonal matrix: A square matrix is said to be a diagonal matrix if all its non-diagonal elements are zero and at least one diagonal element is non-zero.
Scalar matrix: A diagonal matrix is said to be a scalar matrix if its diagonal elements are all equal.
Identity matrix: A square matrix in which the elements on the diagonal are all 1 and the rest are all zero is called an identity matrix.
Zero matrix: A matrix is said to be a zero matrix (or null matrix) if all its elements are zero.

Upper triangular matrix: A square matrix in which all the elements below the diagonal are zero is called an upper triangular matrix.
Lower triangular matrix: A square matrix in which all the elements above the diagonal are zero is called a lower triangular matrix.
Trace of a matrix: The sum of all the diagonal elements of a square matrix is called the trace of the matrix. A few of these types are constructed in the sketch below.
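As an added illustration (NumPy is used here for convenience; the example matrices are arbitrary and not the ones pictured on the original slides):

    import numpy as np

    D = np.diag([2, 5, 7])              # diagonal matrix: non-diagonal entries are zero
    S = 4 * np.eye(3)                   # scalar matrix: all diagonal entries equal (here 4)
    I = np.eye(3)                       # identity matrix: 1s on the diagonal, 0s elsewhere
    Z = np.zeros((2, 3))                # zero (null) matrix
    M = np.arange(1, 10).reshape(3, 3)  # [[1, 2, 3], [4, 5, 6], [7, 8, 9]]
    U = np.triu(M)                      # upper triangular: entries below the diagonal are zero
    L = np.tril(M)                      # lower triangular: entries above the diagonal are zero
    print(np.trace(M))                  # trace = 1 + 5 + 9 = 15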

Transpose of a matrix: A matrix obtained by interchanging the rows and columns of a matrix is called the transpose of that matrix and is denoted by A′ or Aᵀ.

Equality of matrices
Two matrices A = [aᵢⱼ] and B = [bᵢⱼ] are said to be equal if (i) they are of the same order and (ii) each element of A is equal to the corresponding element of B, that is, aᵢⱼ = bᵢⱼ for all i and j.

Operations on Matrices
Addition of matrices: The sum of two matrices of the same order is the matrix obtained by adding the corresponding elements of the given matrices.

Multiplication of a matrix by a scalar: If A is a matrix and k is a scalar, then kA is the matrix obtained by multiplying each element of A by the scalar k.
Negative of a matrix: The negative of a matrix A is denoted by –A, and we define –A = (–1)A.
Difference of matrices: If A and B are two matrices of the same order, say m × n, then the difference A – B is defined as the matrix D = A + (–1)B, that is, the sum of the matrix A and the matrix –B. These operations are illustrated in the sketch below.
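A small illustration of these operations (added here using NumPy; the matrices are arbitrary examples, not those on the original slides):

    import numpy as np

    A = np.array([[1, 2],
                  [3, 4]])
    B = np.array([[5, 6],
                  [7, 8]])

    print(A + B)      # addition: corresponding elements are added (requires equal orders)
    print(3 * A)      # multiplication by the scalar k = 3: every element multiplied by 3
    print(-A)         # negative of A, i.e. (-1) * A
    print(A - B)      # difference: A + (-1) * B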

Properties of matrix addition
Commutative law: If A and B are matrices of the same order, say m × n, then A + B = B + A.
Associative law: For any three matrices A, B and C of the same order, say m × n, (A + B) + C = A + (B + C).
Existence of additive identity: Let A be an m × n matrix and O the m × n zero matrix; then A + O = O + A = A. In other words, O is the additive identity for matrix addition.
Existence of additive inverse: Let A be any m × n matrix; then there is another matrix –A such that A + (–A) = (–A) + A = O, so –A is the additive inverse (or negative) of A.

Properties of scalar multiplication of a matrix
If A and B are two matrices of the same order, say m × n, and k and l are scalars, then
(i) k(A + B) = kA + kB
(ii) (k + l)A = kA + lA

Multiplication of matrices
The product of two matrices A and B is defined if the number of columns of A is equal to the number of rows of B. Let A be an m × n matrix and B be an n × p matrix; then the product AB is the matrix C of order m × p. To get the (i, k)-th element of C, we take the i-th row of A and the k-th column of B, multiply them element-wise and take the sum of all these products, as in the sketch below.
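The row-by-column rule can be spelled out directly; this sketch (added here, not from the slides) computes each (i, k) entry by hand and compares the result with NumPy's built-in product:

    import numpy as np

    A = np.array([[1, 2, 3],
                  [4, 5, 6]])          # order 2 x 3
    B = np.array([[7, 8],
                  [9, 10],
                  [11, 12]])           # order 3 x 2: columns of A = rows of B = 3

    m, n = A.shape
    p = B.shape[1]
    C = np.zeros((m, p), dtype=int)    # the product has order m x p = 2 x 2

    for i in range(m):
        for k in range(p):
            # (i, k)-th element: sum over j of a_ij * b_jk
            C[i, k] = sum(A[i, j] * B[j, k] for j in range(n))

    print(C)
    print(A @ B)                       # NumPy's matrix product gives the same matrix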

Properties of multiplication of matrices
Associative law: For any three matrices A, B and C, we have (AB)C = A(BC), whenever both sides of the equality are defined.
Distributive law: For any three matrices A, B and C, (i) A(B + C) = AB + AC and (ii) (A + B)C = AC + BC, whenever both sides of the equality are defined.
Existence of multiplicative identity: For every square matrix A, there exists an identity matrix I of the same order such that IA = AI = A.

Transpose of a Matrix
If A = [aᵢⱼ] is an m × n matrix, then the matrix obtained by interchanging the rows and columns of A is called the transpose of A. The transpose of A is denoted by A′ or Aᵀ. In other words, if A = [aᵢⱼ] is of order m × n, then A′ = [aⱼᵢ] is of order n × m.

Properties of transpose of matrices
For any matrices A and B of suitable orders, we have
(i) (A′)′ = A
(ii) (kA)′ = kA′ (where k is any constant)
(iii) (A + B)′ = A′ + B′
(iv) (AB)′ = B′A′
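These properties can be checked numerically; the snippet below is an added illustration (NumPy, sample matrices not taken from the original deck):

    import numpy as np

    A = np.array([[1, 2, 3],
                  [4, 5, 6]])              # 2 x 3
    B = np.array([[6, 5, 4],
                  [3, 2, 1]])              # 2 x 3, same order as A
    C = np.array([[1, 0],
                  [2, 1],
                  [3, 4]])                 # 3 x 2, so the product AC is defined
    k = 5

    print(np.array_equal(A.T.T, A))                  # (A')' = A
    print(np.array_equal((k * A).T, k * A.T))        # (kA)' = kA'
    print(np.array_equal((A + B).T, A.T + B.T))      # (A + B)' = A' + B'
    print(np.array_equal((A @ C).T, C.T @ A.T))      # (AC)' = C'A'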

Symmetric & Skew Symmetric Matrices
A square matrix A = [aᵢⱼ] is said to be symmetric if A′ = A, that is, aᵢⱼ = aⱼᵢ for all possible values of i and j.
A square matrix A = [aᵢⱼ] is said to be skew symmetric if A′ = –A, that is, aⱼᵢ = –aᵢⱼ for all possible values of i and j. If we put i = j, we have aᵢᵢ = –aᵢᵢ, so 2aᵢᵢ = 0, i.e., aᵢᵢ = 0 for all i. This means that all the diagonal elements of a skew symmetric matrix are zero.

Theorem: For any square matrix A with real number entries, A + A′ is a symmetric matrix and A – A′ is a skew symmetric matrix.
Theorem: Any square matrix can be expressed as the sum of a symmetric and a skew symmetric matrix. Let A be a square matrix; then we can write A = ½(A + A′) + ½(A – A′), where ½(A + A′) is symmetric and ½(A – A′) is skew symmetric.
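The decomposition is easy to verify numerically; the sketch below is an added illustration (NumPy, arbitrary matrix):

    import numpy as np

    A = np.array([[1, 2, 3],
                  [4, 5, 6],
                  [7, 8, 10]], dtype=float)

    P = (A + A.T) / 2                 # symmetric part: P' = P
    Q = (A - A.T) / 2                 # skew symmetric part: Q' = -Q

    print(np.array_equal(P.T, P))     # True: P is symmetric
    print(np.array_equal(Q.T, -Q))    # True: Q is skew symmetric
    print(np.array_equal(P + Q, A))   # True: A = P + Q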

Determinant
To every square matrix A = [aᵢⱼ] of order n we can associate a number, called the determinant of the square matrix A and denoted by det A or |A|.

Note: For matrix A, |A| is read as determinant of A and not modulus of A. Only square matrices have determinants.

Determinant of a matrix of order one: Let A = [a] be a matrix of order 1; then the determinant of A is defined to be equal to a.
Determinant of a matrix of order two: For a 2 × 2 matrix A = [aᵢⱼ], the determinant is |A| = a₁₁a₂₂ – a₁₂a₂₁.

Determinant of a matrix of order 3 × 3:
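(The formula is written out here in LaTeX, since the original slide showed it as an image; it is the standard expansion along the first row:)

|A| =
\begin{vmatrix}
  a_{11} & a_{12} & a_{13} \\
  a_{21} & a_{22} & a_{23} \\
  a_{31} & a_{32} & a_{33}
\end{vmatrix}
= a_{11}(a_{22}a_{33} - a_{23}a_{32})
- a_{12}(a_{21}a_{33} - a_{23}a_{31})
+ a_{13}(a_{21}a_{32} - a_{22}a_{31})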

Properties of Determinants
The value of a determinant remains unchanged if its rows and columns are interchanged.
If any two rows (or columns) of a determinant are interchanged, then the sign of the determinant changes.
If any two rows (or columns) of a determinant are identical (all corresponding elements are the same), then the value of the determinant is zero.
If each element of a row (or a column) of a determinant is multiplied by a constant k, then its value gets multiplied by k.

Area of a Triangle
The area of a triangle whose vertices are (x₁, y₁), (x₂, y₂) and (x₃, y₃) is given by
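(the formula appeared as an image on the original slide; in LaTeX, the standard expression is:)

\Delta = \frac{1}{2}
\begin{vmatrix}
  x_1 & y_1 & 1 \\
  x_2 & y_2 & 1 \\
  x_3 & y_3 & 1
\end{vmatrix}
= \frac{1}{2}\bigl[x_1(y_2 - y_3) + x_2(y_3 - y_1) + x_3(y_1 - y_2)\bigr]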

NOTE: Since area is a positive quantity, we always take the absolute value of the determinant. The area of the triangle formed by three collinear points is zero.

Example
Find the areas of the triangles whose vertices are given:
(i) (3, 8), (–4, 2) and (5, 1)
(ii) (1, 0), (6, 0) and (4, 3)
(iii) (2, 7), (1, 1) and (10, 8)
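A quick numerical check of these exercises (added here; it assumes the area formula above and uses plain Python):

    def triangle_area(p1, p2, p3):
        (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
        # half the absolute value of the determinant with rows (xi, yi, 1)
        return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2

    print(triangle_area((3, 8), (-4, 2), (5, 1)))   # 30.5
    print(triangle_area((1, 0), (6, 0), (4, 3)))    # 7.5
    print(triangle_area((2, 7), (1, 1), (10, 8)))   # 23.5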

Minors
The minor of an element aᵢⱼ of a determinant is the determinant obtained by deleting the i-th row and j-th column in which the element aᵢⱼ lies. The minor of an element aᵢⱼ is denoted by Mᵢⱼ.

Cofactors
The cofactor of an element aᵢⱼ, denoted by Aᵢⱼ, is defined by Aᵢⱼ = (–1)ⁱ⁺ʲ Mᵢⱼ, where Mᵢⱼ is the minor of aᵢⱼ.
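Minors and cofactors can be computed mechanically; the helper below is an added illustration (NumPy, with 0-based indices, whereas the slides use the usual 1-based aᵢⱼ convention, and np.linalg.det returns floating-point values):

    import numpy as np

    A = np.array([[1, 2, 3],
                  [4, 5, 6],
                  [7, 8, 10]])

    def minor(A, i, j):
        # delete row i and column j, then take the determinant of what remains
        sub = np.delete(np.delete(A, i, axis=0), j, axis=1)
        return np.linalg.det(sub)

    def cofactor(A, i, j):
        return (-1) ** (i + j) * minor(A, i, j)

    print(minor(A, 0, 0))      # M11 = 5*10 - 6*8 = 2 (up to floating-point rounding)
    print(cofactor(A, 0, 1))   # A12 = -(4*10 - 6*7) = 2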

Adjoint of a Matrix
The adjoint of a square matrix A = [aᵢⱼ] of order n × n is defined as the transpose of the matrix [Aᵢⱼ] of order n × n, where Aᵢⱼ is the cofactor of the element aᵢⱼ. The adjoint of the matrix A is denoted by adj A.

A square matrix A is said to be singular if |A| = 0. A square matrix A is said to be non-singular if |A| ≠ 0. If A and B are non-singular matrices of the same order, then AB and BA are also non-singular matrices of the same order.

Inverse of a matrix
If A is a square matrix of order m, and if there exists another square matrix B of the same order m such that AB = BA = I, then B is called the inverse matrix of A and is denoted by A⁻¹. In that case A is said to be invertible.
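Putting the last few slides together, a non-singular matrix can be inverted from the standard relation A⁻¹ = (1/|A|) adj A and checked against AB = BA = I; the snippet below is an added NumPy illustration with an arbitrary matrix:

    import numpy as np

    A = np.array([[1, 2, 3],
                  [0, 1, 4],
                  [5, 6, 0]], dtype=float)   # |A| = 1, so A is non-singular

    # cofactor matrix: entry (i, j) is (-1)^(i+j) times the minor M_ij
    cof = np.zeros_like(A)
    for i in range(3):
        for j in range(3):
            sub = np.delete(np.delete(A, i, axis=0), j, axis=1)
            cof[i, j] = (-1) ** (i + j) * np.linalg.det(sub)

    adjA = cof.T                              # adj A = transpose of the cofactor matrix
    A_inv = adjA / np.linalg.det(A)           # A^-1 = (1/|A|) adj A

    print(np.allclose(A @ A_inv, np.eye(3)))      # A A^-1 = I
    print(np.allclose(A_inv @ A, np.eye(3)))      # A^-1 A = I
    print(np.allclose(A_inv, np.linalg.inv(A)))   # agrees with NumPy's inverse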

Theorem: Inverse of a square matrix, if it exists, is unique.
Theorem: If A and B are invertible matrices of the same order, then (AB)⁻¹ = B⁻¹A⁻¹.

Solution of system of linear equations using inverse of a matrix
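The deck ends with this heading. In matrix form a system of linear equations is written AX = B, and when A is non-singular its unique solution is X = A⁻¹B. A minimal sketch of the method (the system below is illustrative, not taken from the slides):

    import numpy as np

    # Example system (illustrative):
    #   2x + 3y = 8
    #    x - 2y = -3
    A = np.array([[2, 3],
                  [1, -2]], dtype=float)
    B = np.array([[8],
                  [-3]], dtype=float)

    if np.linalg.det(A) != 0:          # a unique solution exists only when A is non-singular
        X = np.linalg.inv(A) @ B       # X = A^-1 B
        print(X)                       # [[1.], [2.]], i.e. x = 1, y = 2
    else:
        print("A is singular; the system does not have a unique solution")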