Inner Product Space

PatelRaj16 6,046 views 31 slides May 03, 2017

About This Presentation

Some information on inner product spaces.


Slide Content

Shree Swami Atmanand Saraswati Institute of Technology, Surat
(Managed by Shree Tapi Brahmcharyashram Sabha)
Shree Swami Atmanand Vidhya Sankul, Kapodra, Varachha Road, Surat, Gujarat, India 395006
Phone: 0261-2573552 | Fax: 0261-2573554 | Email: [email protected] | Web: www.ssasit.ac.in

Inner Product Space

Presented by: Patel Raj G.

Contents:
- Inner product spaces
- Orthogonal and orthonormal bases
- Gram-Schmidt process
- Orthogonal complements
- Orthogonal projections
- Least squares approximation

Inner Product Spaces

Inner product, represented by angle brackets ⟨u, v⟩: Let u, v, and w be vectors in a vector space V, and let c be any scalar. An inner product on V is a function that associates a real number ⟨u, v⟩ with each pair of vectors u and v and satisfies the following axioms (an abstract definition based on the properties of the dot product in Theorem 5.3 on Slide 5.12):
(1) ⟨u, v⟩ = ⟨v, u⟩ (commutative property of the inner product)
(2) ⟨u, v + w⟩ = ⟨u, v⟩ + ⟨u, w⟩ (distributive property of the inner product over vector addition)
(3) c⟨u, v⟩ = ⟨cu, v⟩ (associative property of scalar multiplication and the inner product)
(4) ⟨v, v⟩ ≥ 0
(5) ⟨v, v⟩ = 0 if and only if v = 0

Note: A vector space V with an inner product is called an inner product space.
Vector space: (V, vector addition, scalar multiplication)
Inner product space: (V, vector addition, scalar multiplication, ⟨·, ·⟩)

Properties of inner products: Let u, v, and w be vectors in an inner product space V, and let c be any real number.
(1) ⟨0, v⟩ = ⟨v, 0⟩ = 0
(2) ⟨u + v, w⟩ = ⟨u, w⟩ + ⟨v, w⟩
(3) ⟨u, cv⟩ = c⟨u, v⟩

The definitions of norm (length), distance, angle, orthogonality, and normalization for general inner product spaces closely parallel those based on the dot product in Euclidean n-space.
Norm (length) of u: ||u|| = √⟨u, u⟩
Distance between u and v: d(u, v) = ||u − v||
Angle between two nonzero vectors u and v: cos θ = ⟨u, v⟩ / (||u|| ||v||), 0 ≤ θ ≤ π
Orthogonal: u and v are orthogonal if ⟨u, v⟩ = 0

Properties of norm (the same as the properties for the dot product in Rⁿ on Slide 5.9):
(1) ||u|| ≥ 0
(2) ||u|| = 0 if and only if u = 0
(3) ||cu|| = |c| ||u||
Properties of distance:
(1) d(u, v) ≥ 0
(2) d(u, v) = 0 if and only if u = v
(3) d(u, v) = d(v, u)

Let u and v be vectors in an inner product space V.
(1) Cauchy-Schwarz inequality: |⟨u, v⟩| ≤ ||u|| ||v||
(2) Triangle inequality: ||u + v|| ≤ ||u|| + ||v||
(3) Pythagorean theorem: u and v are orthogonal if and only if ||u + v||² = ||u||² + ||v||²

Orthogonal vectors and orthogonal bases

Orthogonal vectors: Two vectors u and v in Rⁿ are orthogonal (perpendicular) if u · v = 0.
Note: The zero vector is orthogonal to every vector.

Ex: Finding orthogonal vectors. Determine all vectors in R² that are orthogonal to u = (4, 2).
Sol: Let v = (v₁, v₂). Then u · v = 4v₁ + 2v₂ = 0, so v₂ = −2v₁. Setting v₁ = t gives v = (t, −2t) = t(1, −2) for any real t.
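The solution just found can be sanity-checked numerically; a minimal sketch in plain Python (the function name is illustrative, not from the slides):

```python
# Check that every vector t(1, -2) is orthogonal to u = (4, 2)
# under the Euclidean dot product.

def dot(u, v):
    """Euclidean dot product of two same-length sequences."""
    return sum(a * b for a, b in zip(u, v))

u = (4, 2)
for t in (-3, 0.5, 1, 7):
    v = (t, -2 * t)        # the general solution v = t(1, -2)
    assert dot(u, v) == 0  # 4t + 2(-2t) = 0 for every t
```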

Orthogonal basis. Definition: For a, b in V, a ⊥ b if ⟨a, b⟩ = 0. The zero vector is orthogonal to every vector. An orthogonal set S is a set such that all pairs of distinct vectors are orthogonal. An orthonormal set S is an orthogonal set of unit vectors. Every nonzero finite-dimensional inner product space has an orthogonal basis.

Theorems:
- If S = {v₁, v₂, …, vₙ} is an orthogonal set of nonzero vectors in an inner product space V, then S is linearly independent.
- Any orthogonal set of n nonzero vectors in Rⁿ is a basis for Rⁿ.
- If S = {v₁, v₂, …, vₙ} is an orthonormal basis for an inner product space V and u is any vector in V, then u can be expressed as a linear combination of v₁, v₂, …, vₙ:
  u = ⟨u, v₁⟩v₁ + ⟨u, v₂⟩v₂ + … + ⟨u, vₙ⟩vₙ
- If S is an orthonormal basis for an n-dimensional inner product space, and the coordinate vectors of u and v with respect to S are [u]ₛ = (a₁, a₂, …, aₙ) and [v]ₛ = (b₁, b₂, …, bₙ), then
  ||u|| = √(a₁² + a₂² + … + aₙ²)
  d(u, v) = √((a₁ − b₁)² + (a₂ − b₂)² + … + (aₙ − bₙ)²)
  ⟨u, v⟩ = a₁b₁ + a₂b₂ + … + aₙbₙ = [u]ₛ · [v]ₛ
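The coordinate formulas in the last theorem can be verified with a small sketch in plain Python (the basis here is an illustrative orthonormal basis of R², not one from the slides):

```python
import math

def inner(u, v):
    """Euclidean inner product."""
    return sum(a * b for a, b in zip(u, v))

# An illustrative orthonormal basis of R^2: the standard basis rotated 45 degrees.
s = math.sqrt(2) / 2
v1, v2 = (s, s), (-s, s)

u = (3, 1)
# Coordinates a_i = <u, v_i>; then u = a1*v1 + a2*v2.
a1, a2 = inner(u, v1), inner(u, v2)
recon = tuple(a1 * x1 + a2 * x2 for x1, x2 in zip(v1, v2))
assert all(abs(r - c) < 1e-12 for r, c in zip(recon, u))

# ||u|| computed from coordinates equals the norm computed directly.
assert abs(math.sqrt(a1**2 + a2**2) - math.sqrt(inner(u, u))) < 1e-12
```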

Gram-Schmidt Process

The Gram-Schmidt process orthogonalizes a set of vectors. Let {u₁, u₂, …, uₙ} be a given set of vectors that is a basis for a vector space V with the Euclidean inner product. We construct an orthogonal set {v₁, v₂, …, vₙ} of vectors of V that is also a basis for V, as follows: apply the Gram-Schmidt process to transform the basis vectors u₁, u₂, …, uₙ into an orthogonal basis v₁, v₂, …, vₙ; then normalize the orthogonal basis vectors to obtain an orthonormal basis w₁, w₂, …, wₙ.

Step 1. Let v₁ = u₁.
Step 2. v₂ = u₂ − (⟨u₂, v₁⟩ / ⟨v₁, v₁⟩) v₁
Step 3. v₃ = u₃ − (⟨u₃, v₁⟩ / ⟨v₁, v₁⟩) v₁ − (⟨u₃, v₂⟩ / ⟨v₂, v₂⟩) v₂
and so on. The orthonormal basis is {w₁, w₂, …, wₙ}, where wᵢ = vᵢ / ||vᵢ||.

Ex: Use the Gram-Schmidt process on the set {u₁, u₂, u₃} = {(1, 1, 1), (−1, 1, 0), (1, 2, 1)}.
v₁ = u₁ = (1, 1, 1)
⟨u₂, v₁⟩ = ⟨(−1, 1, 0), (1, 1, 1)⟩ = −1 + 1 + 0 = 0
v₂ = u₂ − (⟨u₂, v₁⟩ / ⟨v₁, v₁⟩) v₁ = (−1, 1, 0) − 0 · v₁ = (−1, 1, 0)
⟨u₃, v₁⟩ = 1 + 2 + 1 = 4, ⟨v₁, v₁⟩ = 3, ⟨u₃, v₂⟩ = −1 + 2 + 0 = 1, ⟨v₂, v₂⟩ = 2
v₃ = u₃ − (4/3) v₁ − (1/2) v₂ = (1, 2, 1) − (4/3, 4/3, 4/3) − (−1/2, 1/2, 0) = (1/6, 1/6, −1/3)

The orthogonal basis set is {(1, 1, 1), (−1, 1, 0), (1/6, 1/6, −1/3)}.
Normalizing each vector, the orthonormal basis is {(1/√3, 1/√3, 1/√3), (−1/√2, 1/√2, 0), (1/√6, 1/√6, −2/√6)}.
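The Gram-Schmidt steps above translate directly into code; a minimal sketch in plain Python (function names are illustrative), run on the same basis {(1,1,1), (−1,1,0), (1,2,1)}:

```python
import math

def gram_schmidt(vectors):
    """Return an orthogonal basis via Gram-Schmidt (Euclidean inner product)."""
    def inner(u, v):
        return sum(a * b for a, b in zip(u, v))
    basis = []
    for u in vectors:
        w = list(u)
        # Subtract the projection of u onto every vector already in the basis.
        for v in basis:
            c = inner(u, v) / inner(v, v)
            w = [wi - c * vi for wi, vi in zip(w, v)]
        basis.append(w)
    return basis

def normalize(v):
    """Scale v to unit length."""
    n = math.sqrt(sum(x * x for x in v))
    return [x / n for x in v]

ortho = gram_schmidt([(1, 1, 1), (-1, 1, 0), (1, 2, 1)])
# The third orthogonal vector matches the worked example: (1/6, 1/6, -1/3).
assert all(abs(a - b) < 1e-12 for a, b in zip(ortho[2], (1/6, 1/6, -1/3)))
orthonormal = [normalize(v) for v in ortho]
# Every vector of the orthonormal basis has unit length.
assert all(abs(sum(x * x for x in w) - 1) < 1e-12 for w in orthonormal)
```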

Orthogonal complements

Let W be a subspace of an inner product space V. A vector u in V is orthogonal to W if u is orthogonal to every vector in W. The set of all vectors in V that are orthogonal to W is called the orthogonal complement of W and is denoted by W⊥.

Properties of orthogonal complements: If W is a subspace of an inner product space V, then
● A vector u is in W⊥ if and only if u is orthogonal to every vector in a set that spans W.
● The only vector common to W and W⊥ is 0.
● W⊥ is a subspace of V.
● (W⊥)⊥ = W

Ex: Find a basis for the orthogonal complement of the subspace W spanned by the vectors u₁ = (2, 0, −1) and u₂ = (4, 0, −2).
● Let W = span{u₁, u₂}. We form the matrix A whose rows are u₁ and u₂; a vector x is in W⊥ exactly when it solves the homogeneous system Ax = 0.

The augmented matrix is
[ 2 0 −1 | 0 ]
[ 4 0 −2 | 0 ]
Applying (1/2)R₁ and then R₂ + (−4)R₁ gives
[ 1 0 −1/2 | 0 ]
[ 0 0  0   | 0 ]
So x₁ = (1/2)x₃ with x₂ and x₃ free, and
x = (x₁, x₂, x₃) = x₂(0, 1, 0) + x₃(1/2, 0, 1)
A basis for W⊥ is {(0, 1, 0), (1/2, 0, 1)}.
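The same complement can be computed numerically as a null space; a sketch assuming NumPy (the SVD route is one standard way to get a null-space basis, not the method named on the slides):

```python
import numpy as np

# W-perp is the null space of the matrix A whose rows are u1 and u2,
# since u is in W-perp exactly when A u = 0.
A = np.array([[2.0, 0.0, -1.0],
              [4.0, 0.0, -2.0]])

# Null-space basis from the SVD: right singular vectors whose
# singular values are (numerically) zero.
_, s, Vt = np.linalg.svd(A)
rank = int(np.sum(s > 1e-10))
null_basis = Vt[rank:]            # rows span the null space of A, i.e. W-perp

assert null_basis.shape[0] == 2           # dim(W-perp) = 3 - rank(A) = 3 - 1 = 2
assert np.allclose(A @ null_basis.T, 0)   # each basis vector is orthogonal to W
```

The SVD basis is orthonormal, so it spans the same W⊥ as {(0, 1, 0), (1/2, 0, 1)} even though the individual vectors differ.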

Orthogonal projections: For the dot product in Rⁿ, we define the orthogonal projection of u onto v to be proj_v u = a v (a scalar multiple of v), and the coefficient a can be derived as a = (u · v) / (v · v).
For inner product spaces: Let u and v be two vectors in an inner product space V. If v ≠ 0, then the orthogonal projection of u onto v is given by proj_v u = (⟨u, v⟩ / ⟨v, v⟩) v.

Ex: Finding an orthogonal projection in R³. Use the Euclidean inner product in R³ to find the orthogonal projection of u = (6, 2, 4) onto v = (1, 2, 0).
Sol: u · v = 6 + 4 + 0 = 10 and v · v = 1 + 4 + 0 = 5, so
proj_v u = ((u · v) / (v · v)) v = (10/5)(1, 2, 0) = (2, 4, 0).
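This projection can be reproduced with a short sketch assuming NumPy (the helper name is illustrative):

```python
import numpy as np

def proj(u, v):
    """Orthogonal projection of u onto v (Euclidean inner product, v != 0)."""
    u, v = np.asarray(u, dtype=float), np.asarray(v, dtype=float)
    return (u @ v) / (v @ v) * v

p = proj((6, 2, 4), (1, 2, 0))
assert np.allclose(p, (2, 4, 0))   # (10/5)(1, 2, 0), as computed above
# The residual u - proj_v(u) is orthogonal to v.
assert abs((np.array([6.0, 2.0, 4.0]) - p) @ np.array([1.0, 2.0, 0.0])) < 1e-12
```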

Least Squares Approximation

Consider a system of linear equations Ax = b.
(1) When the system is consistent, we can use Gaussian elimination with back substitution to solve for x.
(2) When the system is inconsistent, only the "best possible" solution of the system can be found, i.e., a solution x for which the difference (the error) between Ax and b is smallest.
Note: the system of linear equations Ax = b is consistent if and only if b is in the column space of A.

Least Squares Approximation: Given a system Ax = b of m linear equations in n unknowns, the least squares problem is to find a vector x in Rⁿ that minimizes the distance ||Ax − b|| between Ax and b with respect to the Euclidean inner product in Rⁿ. Such a vector is called a least squares solution of Ax = b.
※ The term least squares comes from the fact that minimizing ||Ax − b|| is equivalent to minimizing ||Ax − b||² = (Ax − b) · (Ax − b), which is a sum of squared errors.

The least squares solution x̂ makes b − Ax̂ orthogonal to the column space of A, so b − Ax̂ lies in the nullspace of Aᵀ (the nullspace of Aᵀ is the solution space of the homogeneous system Aᵀx = 0):
Aᵀ(b − Ax̂) = 0
Rearranging gives the n × n linear system of normal equations associated with Ax = b:
AᵀAx̂ = Aᵀb
(To find the best solution, solve this equation.)

Theorem associated with the least squares approximation solution: For any linear system Ax = b, the associated normal system AᵀAx = Aᵀb is consistent, and all solutions of the normal system are least squares solutions of Ax = b. In other words, if W is the column space of A and x̂ is a least squares solution of Ax = b, then the orthogonal projection of b on W is Ax̂, i.e., proj_W b = Ax̂.
Note: The problem of finding the least squares solution of Ax = b is equal to the problem of finding an exact solution of the associated normal system.

Ex: Solving the normal equations. Find the least squares solution of the following system and find the orthogonal projection of b onto the column space of A.

Sol: Form the corresponding normal system AᵀAx = Aᵀb.

Solving it gives the least squares solution of Ax = b; the orthogonal projection of b onto the column space of A is then Ax̂.
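The procedure just described can be sketched end to end assuming NumPy. The system below is illustrative (the slides' specific matrices did not survive extraction, so these numbers are assumptions):

```python
import numpy as np

# An inconsistent 3x2 system: find the best x for A x ≈ b by solving
# the normal equations A^T A x = A^T b.
A = np.array([[1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0]])
b = np.array([0.0, 1.0, 3.0])

x_hat = np.linalg.solve(A.T @ A, A.T @ b)   # least squares solution
p = A @ x_hat                               # orthogonal projection of b onto col(A)

# The residual b - A x_hat is orthogonal to the column space of A ...
assert np.allclose(A.T @ (b - p), 0)
# ... and the answer matches NumPy's built-in least squares solver.
assert np.allclose(x_hat, np.linalg.lstsq(A, b, rcond=None)[0])
```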

Thanks