Title of Project
Group Members:
Course name:
Instructor:
Part 1
Introduction
The idea of the multivariate Gaussian distribution is presented. The relation between the covariance matrix and the eigenvalues and eigenvectors of a multivariate distribution is thoroughly demonstrated using figures and specific cases. The important case of a rank-deficient covariance matrix is also presented.
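As a minimal sketch of this relation (assuming NumPy; the covariance values below are illustrative, not the ones used in the report): the eigenvalues of the covariance matrix are the variances along the principal axes of the density, and the eigenvectors are the directions of those axes.

```python
import numpy as np

# Illustrative 2-D covariance matrix (not the report's actual values)
Sigma = np.array([[3.0, 1.0],
                  [1.0, 2.0]])

# Eigendecomposition: eigenvalues are the variances along the
# principal axes; eigenvectors (columns) are those axis directions.
eigvals, eigvecs = np.linalg.eigh(Sigma)
print("eigenvalues :", eigvals)
print("eigenvectors:\n", eigvecs)

# Samples from N(0, Sigma) scatter along the eigenvector directions,
# stretched by the square roots of the corresponding eigenvalues.
rng = np.random.default_rng(0)
samples = rng.multivariate_normal(mean=np.zeros(2), cov=Sigma, size=1000)
print("sample covariance:\n", np.cov(samples.T))  # approximately Sigma
```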
1: Scatter plots for three cases are plotted:
2: In the first matrix, only the variance along the Z axis is large, while in the second matrix the variances along both the Y and Z axes are large, as is evident in the respective figures; see the sketch below.
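A sketch of how the three scatter plots could be generated, assuming diagonal covariances. The numerical values, and the isotropic third case, are stand-ins for the assignment's actual matrices:

```python
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(1)
mean = np.zeros(3)

# Stand-in diagonal covariances for the three cases:
# case 1: only the variance along Z is large;
# case 2: the variances along Y and Z are large;
# case 3: all three variances comparable (assumed isotropic).
cases = {
    "large Z variance":       np.diag([0.1, 0.1, 4.0]),
    "large Y and Z variance": np.diag([0.1, 4.0, 4.0]),
    "isotropic":              np.diag([1.0, 1.0, 1.0]),
}

fig = plt.figure(figsize=(12, 4))
for i, (title, Sigma) in enumerate(cases.items(), start=1):
    pts = rng.multivariate_normal(mean, Sigma, size=500)
    ax = fig.add_subplot(1, 3, i, projection="3d")
    ax.scatter(pts[:, 0], pts[:, 1], pts[:, 2], s=5)
    ax.set_title(title)
    ax.set_xlabel("X"); ax.set_ylabel("Y"); ax.set_zlabel("Z")
plt.tight_layout()
plt.show()
```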
3: The required changes for this part are made in the code and the results are plotted as:
4: The required changes for this part are made in the code and the results are:
Since 0.01 is nearly zero compared to 1, as is also evident in Figure 16 through Figure 19, the effective dimension of the data is 1 in this case, even though the covariance matrix has rank 2.
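A small sketch of this effect, using the eigenvalues 1 and 0.01 from above (the eigenvector basis is an assumed 45-degree rotation, chosen only for illustration):

```python
import numpy as np

# Covariance with eigenvalues 1 and 0.01: full rank (2), but one
# direction carries almost all of the variance.
eigvals = np.array([1.0, 0.01])
theta = np.pi / 4  # assumed eigenvector basis (illustrative rotation)
V = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
Sigma = V @ np.diag(eigvals) @ V.T

print("rank:", np.linalg.matrix_rank(Sigma))            # 2
print("variance along first axis:",
      eigvals[0] / eigvals.sum())                       # ~0.99

# Samples hug the first eigenvector: effectively 1-D data.
rng = np.random.default_rng(2)
X = rng.multivariate_normal(np.zeros(2), Sigma, size=1000)
proj = X @ V[:, 1]          # coordinates along the weak direction
print("std along weak axis:", proj.std())               # ~0.1
```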
Part 2
Introduction
The concept of Principal Component Analysis (PCA) is studied in depth. After loading the training data, each given test image is estimated by adding principal components one by one, each spanning a subspace with less variation than the last; the estimate gets closer to the original image, but beyond a certain point adding further principal components makes little difference.
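A minimal sketch of this reconstruction procedure, assuming the flattened training images are stacked as rows of a matrix X_train and x_test is one flattened test image (all names here are hypothetical, not the report's code):

```python
import numpy as np

def pca_reconstruct(X_train, x_test, ks):
    """Reconstruct x_test using the top-k principal components of
    X_train, for each k in ks. Returns {k: reconstruction}."""
    mu = X_train.mean(axis=0)
    Xc = X_train - mu
    # SVD of the centered training data: rows of Vt are the principal
    # components, ordered by decreasing variance.
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    out = {}
    for k in ks:
        V = Vt[:k].T                  # first k components as columns
        coeffs = (x_test - mu) @ V    # project onto the k-D subspace
        out[k] = mu + coeffs @ V.T    # map back to image space
    return out

# Hypothetical usage: reconstructions with increasing numbers of
# principal components, e.g. up to 360.
# recons = pca_reconstruct(X_train, x_test, ks=[10, 50, 150, 360])
```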
Estimation Results for Different Numbers of Principal Component Vectors
After obtaining the results with 360 principal components for all 40 images, they are all compared and closely inspected to identify the best and the worst estimates. Image no. 33 is found to be estimated best by this estimator. The result for image 33 is shown in the corresponding figure.
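A minimal sketch of how such a comparison could be made numerically, assuming the originals and their 360-component estimates are stored as rows of two arrays (the names originals and recons are hypothetical):

```python
import numpy as np

def rank_reconstructions(originals, recons):
    """Rank estimates by per-image mean-squared error.
    originals, recons: arrays of shape (40, d) holding the test
    images and their reconstructions, one flattened image per row."""
    mse = ((originals - recons) ** 2).mean(axis=1)
    best, worst = np.argmin(mse), np.argmax(mse)
    return best, worst, mse

# best_idx, worst_idx, mse = rank_reconstructions(originals, recons)
# print(f"best estimate: image {best_idx + 1}, MSE {mse[best_idx]:.4f}")
```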
Conclusion
In Part 1, we learned about the multivariate Gaussian distribution and the important concept that the covariance matrix of such a distribution can be well approximated by a rank-deficient covariance matrix. In Part 2, we studied the important concept of Principal Component Analysis and the significance of subspace identification and subspace approximation using PCA.