Curve fitting

63 slides Oct 10, 2015

About This Presentation

Curve fitting (Theory & problems)


Slide Content

Curve fitting (Theory & problems) · Session: 2013-14 · Group no: 05 · Numerical Analysis CEE-149 · Credit 02

Definition. Curve fitting is the process of constructing a curve, or mathematical function, that has the best fit to a series of data points, possibly subject to constraints. It is a statistical technique used to derive coefficient values for equations that express the value of one (dependent) variable as a function of another (independent) variable.

What is curve fitting? Curve fitting is the process of constructing a curve, or mathematical function, that lies as close as possible to a series of data points. Through curve fitting we can construct a mathematical relationship between observed quantities and parameter values. It is highly effective for mathematically modelling natural processes.

Interpolation & Curve fitting. In many application areas, one is faced with the task of describing data, often measured, with an analytic function. There are two approaches to this problem: 1. In interpolation, the data are assumed to be correct and what is desired is some way to describe what happens between the data points. 2. The other approach, called curve fitting or regression, looks for some smooth curve that "best fits" the data but does not necessarily pass through any data points.

Curve fitting. There are two general approaches for curve fitting: Least squares regression: the data exhibit a significant degree of scatter; the strategy is to derive a single curve that represents the general trend of the data. Interpolation: the data are very precise; the strategy is to pass a curve or a series of curves through each of the points.

General approach for curve fitting

Engineering applications of the curve fitting technique. 1. Trend analysis: predicting values of the dependent variable; this may include extrapolation beyond the data points or interpolation between data points.

Some important relevant parameters. In engineering, two types of applications are encountered: Trend analysis: predicting values of the dependent variable, possibly including extrapolation beyond the data points or interpolation between them. Hypothesis testing: comparing an existing mathematical model with measured data.

Data scatter [figures: scatter plots illustrating positive correlation and no correlation]

Mathematical Background. Arithmetic mean: the sum of the individual data points (yi) divided by the number of points (n). Standard deviation: the most common measure of spread for a sample.

Mathematical Background (cont'd). Variance: a representation of spread given by the square of the standard deviation. Coefficient of variation: quantifies the spread of the data relative to the mean.
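These four quantities can be sketched in a few lines of Python (the data values below are illustrative, not from the slides):

```python
# Sample data (illustrative values only)
y = [2.0, 3.5, 4.1, 5.0, 5.4]
n = len(y)

mean = sum(y) / n                                        # arithmetic mean
variance = sum((yi - mean) ** 2 for yi in y) / (n - 1)   # sample variance
std_dev = variance ** 0.5                                # standard deviation
cv = 100 * std_dev / mean                                # coefficient of variation (%)
```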

Least squares method. The method of least squares is a procedure to determine the best-fit line to data; the proof uses simple calculus and linear algebra. The basic problem is to find the best-fit straight line y = ax + b given that, for n ∈ {1, . . . , N}, the pairs (xn, yn) are observed. The method generalizes easily to finding the best fit of the form y = a1 f1(x) + · · · + aK fK(x); it is not necessary for the functions fk to be linear in x. All that is needed is that y is a linear combination of these functions.

Least square method

Least Squares Regression. Linear regression: fitting a straight line to a set of paired observations (x1, y1), (x2, y2), …, (xn, yn): y = a0 + a1x + e, where a1 is the slope, a0 the intercept, and e the error, or residual, between the model and the observations.

Linear Regression: Residual

Linear Regression: Criteria for a "Best" Fit. Minimizing the sum of the residuals does not define a unique line, since positive and negative errors cancel (e.g. two residuals with e1 = -e2 sum to zero for many different lines); instead, the sum of the squared residuals is minimized.

Linear curve fitting (straight line). Given a set of data points (xi, f(xi)), find a curve that best captures the general trend, where g(x) is the approximating function. Try to fit a straight line through the data.

Linear curve fitting (straight line). Let g(x) = a0 + a1x and try to fit this straight line through the data.

Linear curve fitting (straight line). The error E is a function of a0 and a1. For the error E to have an extreme value, its partial derivatives with respect to a0 and a1 must vanish, giving two equations in two unknowns; solve them to get a0 and a1.

Linear Regression: Least Squares Fit. Yields a unique line for a given set of data.

Linear Regression: Least Squares Fit. The coefficients a0 and a1 that minimize Sr must satisfy the conditions ∂Sr/∂a0 = 0 and ∂Sr/∂a1 = 0.

Linear Regression: Determination of a0 and a1. Setting the derivatives to zero gives the normal equations
n·a0 + (Σxi)·a1 = Σyi
(Σxi)·a0 + (Σxi²)·a1 = Σxiyi
2 equations with 2 unknowns, which can be solved simultaneously.

Linear Regression: Determination of a0 and a1 (cont'd). Solving simultaneously gives
a1 = (n·Σxiyi - Σxi·Σyi) / (n·Σxi² - (Σxi)²)
a0 = ȳ - a1·x̄

Error Quantification of Linear Regression. The total sum of squares around the mean for the dependent variable y is St = Σ(yi - ȳ)². The sum of squares of the residuals around the regression line is Sr = Σ(yi - a0 - a1xi)².

Example. The table below gives the temperatures T in °C and resistances R in Ω of a circuit. If R = a0 + a1·T, find the values of a0 and a1.

T:  10    20    30    40    50    60
R:  20.1  20.2  20.4  20.6  20.8  21.0

Solution.

T = xi   R = yi   xi²    xi·yi = T·R   g(xi)
10       20.1     100    201           20.05
20       20.2     400    404           20.24
30       20.4     900    612           20.42
40       20.6     1600   824           20.61
50       20.8     2500   1040          20.80
60       21.0     3600   1260          20.98
Σ: 210   123.1    9100   4341

Solution (cont'd). The normal equations are
6a0 + 210a1 = 123.1
210a0 + 9100a1 = 4341
giving a0 = 19.867 and a1 = 0.01857, so g(x) = 19.867 + 0.01857·T.
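This solution can be verified with a short Python sketch that solves the 2×2 normal system by Cramer's rule:

```python
T = [10, 20, 30, 40, 50, 60]
R = [20.1, 20.2, 20.4, 20.6, 20.8, 21.0]
n = len(T)

sT, sR = sum(T), sum(R)                   # 210, 123.1
sT2 = sum(t * t for t in T)               # 9100
sTR = sum(t * r for t, r in zip(T, R))    # 4341

det = n * sT2 - sT * sT                   # determinant of the 2x2 system
a1 = (n * sTR - sT * sR) / det
a0 = (sR * sT2 - sT * sTR) / det
# a0 ≈ 19.867, a1 ≈ 0.01857
```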

Least Squares Fit of a Straight Line: Example. Fit a straight line to the x and y values in the following table:

xi    yi    xi·yi   xi²
1     0.5   0.5     1
2     2.5   5       4
3     2     6       9
4     4     16      16
5     3.5   17.5    25
6     6     36      36
7     5.5   38.5    49
Σ: 28 24    119.5   140

Least Squares Fit of a Straight Line: Example Y = 0.07142857 + 0.8392857 x

Least Squares Fit of a Straight Line: Example (Error Analysis)

Least Squares Fit of a Straight Line: Example (Error Analysis). The standard deviation, sy = sqrt(St/(n-1)), quantifies the spread around the mean. The standard error of estimate, sy/x = sqrt(Sr/(n-2)), quantifies the spread around the regression line.
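For the 7-point example above, these error measures can be computed directly, using the coefficients found two slides back:

```python
x = [1, 2, 3, 4, 5, 6, 7]
y = [0.5, 2.5, 2.0, 4.0, 3.5, 6.0, 5.5]
n = len(x)
a0, a1 = 0.07142857, 0.8392857               # coefficients from the fit above

ybar = sum(y) / n
St = sum((yi - ybar) ** 2 for yi in y)                       # spread around the mean
Sr = sum((yi - a0 - a1 * xi) ** 2 for xi, yi in zip(x, y))   # spread around the line

sy  = (St / (n - 1)) ** 0.5     # standard deviation ≈ 1.946
syx = (Sr / (n - 2)) ** 0.5     # standard error of estimate ≈ 0.773
r2  = (St - Sr) / St            # coefficient of determination ≈ 0.868
```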

Algorithm for linear regression
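The algorithm's details were on the slide's lost figure; as a minimal sketch, a Python routine implementing the normal-equation approach used throughout these slides might look like:

```python
def linfit(x, y):
    """Least-squares straight line g(x) = a0 + a1*x via the normal equations."""
    n = len(x)
    sx, sy = sum(x), sum(y)
    sxx = sum(v * v for v in x)
    sxy = sum(u * v for u, v in zip(x, y))
    a1 = (n * sxy - sx * sy) / (n * sxx - sx * sx)   # slope
    a0 = (sy - a1 * sx) / n                          # intercept
    return a0, a1

# Usage on the 7-point table from the earlier example
a0, a1 = linfit([1, 2, 3, 4, 5, 6, 7], [0.5, 2.5, 2.0, 4.0, 3.5, 6.0, 5.5])
# a0 ≈ 0.0714, a1 ≈ 0.8393
```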

Linearization of Nonlinear Relationships. Linear regression presumes that the relationship between the dependent and independent variables is linear. However, a few types of nonlinear functions can be transformed into linear regression problems: the exponential equation, the power equation, and the saturation-growth-rate equation.

Linearization of Nonlinear Relationships. 1. The exponential equation y = a·e^(bx). Taking logarithms gives ln y = ln a + bx, which is linear: y* = a0 + a1x with y* = ln y, a0 = ln a, and a1 = b.

Linearization of Nonlinear Relationships. 2. The power equation y = a·x^b. Taking logarithms gives log y = log a + b·log x, which is linear: y* = a0 + a1x* with y* = log y, x* = log x, a0 = log a, and a1 = b.

Linearization of Nonlinear Relationships. 3. The saturation-growth-rate equation y = a3·x/(b3 + x). Inverting gives 1/y = 1/a3 + (b3/a3)(1/x), which is linear: y* = a0 + a1x* with y* = 1/y, x* = 1/x, a0 = 1/a3, and a1 = b3/a3.
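No worked example is given for the saturation-growth-rate case, so here is an illustrative sketch using synthetic data with assumed parameters a3 = 10 and b3 = 2 (these values are not from the slides); fitting in the transformed variables recovers the parameters:

```python
a3, b3 = 10.0, 2.0                             # assumed "true" parameters
x = [1.0, 2.0, 4.0, 8.0, 16.0]
y = [a3 * xi / (b3 + xi) for xi in x]          # exact saturation-growth data

X = [1.0 / xi for xi in x]                     # x* = 1/x
Y = [1.0 / yi for yi in y]                     # y* = 1/y

# Straight-line least-squares fit in the transformed variables
n = len(X)
sx, sy = sum(X), sum(Y)
sxx = sum(v * v for v in X)
sxy = sum(u * v for u, v in zip(X, Y))
a1 = (n * sxy - sx * sy) / (n * sxx - sx * sx)
a0 = (sy - a1 * sx) / n

a_fit = 1.0 / a0        # since a0 = 1/a3
b_fit = a1 * a_fit      # since a1 = b3/a3
# a_fit ≈ 10.0, b_fit ≈ 2.0
```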

Example. Fit the power equation y = a·x^b to the data in the following table:

xi    yi    X* = log xi   Y* = log yi
1     0.5   0.000         -0.301
2     1.7   0.301          0.230
3     3.4   0.477          0.531
4     5.7   0.602          0.756
5     8.4   0.699          0.924
Σ: 15 19.7  2.079          2.141

Example (cont'd).

xi    yi     X* = log xi   Y* = log yi   X*·Y*    X*²
1     0.5    0.0000        -0.3010       0.0000   0.0000
2     1.7    0.3010         0.2304       0.0694   0.0906
3     3.4    0.4771         0.5315       0.2536   0.2276
4     5.7    0.6021         0.7559       0.4551   0.3625
5     8.4    0.6990         0.9243       0.6460   0.4886
Σ: 15 19.700 2.079          2.141        1.424    1.169

Linearization of Nonlinear Functions: Example. The fit gives log y = -0.300 + 1.75·log x, i.e. y = 0.5·x^1.75.

Polynomial Regression. Some engineering data are poorly represented by a straight line. For these cases a curve is better suited to fit the data. The least squares method can readily be extended to fit the data with higher-order polynomials.

Polynomial Regression (cont’d) A parabola is preferable

Polynomial Regression (cont'd). A 2nd-order polynomial (quadratic) is defined by y = a0 + a1x + a2x² + e. The residual between the model and the data is ei = yi - a0 - a1xi - a2xi², and the sum of squares of the residuals is Sr = Σei².

Polynomial Regression (cont'd). 3 linear equations with 3 unknowns (a0, a1, a2), which can be solved.

Polynomial Regression (cont'd). A system of 3×3 equations needs to be solved to determine the coefficients of the polynomial, followed by the standard error and the coefficient of determination.

Polynomial Regression (cont'd). General: for the mth-order polynomial y = a0 + a1x + … + amx^m, a system of (m+1)×(m+1) linear equations must be solved to determine the coefficients. The standard error is sy/x = sqrt(Sr/(n - (m+1))), and the coefficient of determination is r² = (St - Sr)/St.

Polynomial Regression - Example. Fit a second-order polynomial to the data:

xi    yi     xi²   xi³   xi⁴   xi·yi   xi²·yi
0     2.1    0     0     0     0       0
1     7.7    1     1     1     7.7     7.7
2     13.6   4     8     16    27.2    54.4
3     27.2   9     27    81    81.6    244.8
4     40.9   16    64    256   163.6   654.4
5     61.1   25    125   625   305.5   1527.5
Σ: 15 152.6  55    225   979   585.6   2489

2nd-order polynomial example.

xi    fi   xi²   xi³   xi⁴    fi·xi   fi·xi²   g(x)
1     4    1     1     1      4       4        4.505
2     11   4     8     16     22      44       10.15
4     19   16    64    256    76      304      19.43
6     26   36    216   1296   156     936      26.03
8     30   64    512   4096   240     1920     29.95
Σ: 21 90   121   801   5665   498     3208

2nd-order polynomial example (cont'd). The normal equations are
5a0 + 21a1 + 121a2 = 90
21a0 + 121a1 + 801a2 = 498
121a0 + 801a1 + 5665a2 = 3208
giving a0 = -1.81, a1 = 6.65, a2 = -0.335, so the required equation is g(x) = -1.81 + 6.65x - 0.335x².
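As a check on this example, the 3×3 normal system can be solved with a small Gaussian-elimination sketch (plain Python, no pivoting, which is adequate for this well-behaved system):

```python
def solve3(A, b):
    # Naive Gaussian elimination without pivoting, for a 3x3 system
    M = [row[:] + [bv] for row, bv in zip(A, b)]   # augmented matrix
    for k in range(3):
        for i in range(k + 1, 3):
            f = M[i][k] / M[k][k]
            for j in range(k, 4):
                M[i][j] -= f * M[k][j]
    x = [0.0] * 3
    for i in range(2, -1, -1):                     # back substitution
        x[i] = (M[i][3] - sum(M[i][j] * x[j] for j in range(i + 1, 3))) / M[i][i]
    return x

A = [[5, 21, 121], [21, 121, 801], [121, 801, 5665]]
b = [90, 498, 3208]
a0, a1, a2 = solve3(A, b)
# a0 ≈ -1.82, a1 ≈ 6.65, a2 ≈ -0.335
```

Solving exactly gives a0 ≈ -1.82; the slide's -1.81 reflects intermediate rounding.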

Exponential function. Fit y = a·e^(bx) to the data:

x:  1    2    3    4    5
y:  1.5  4.5  6    8.5  11

Solution: taking logarithms, ln y = ln a + bx, i.e. Y = a0 + a1X, where Y = ln y, a0 = ln a, a1 = b, and X = x.

Solution.

X = xi   yi    Y = ln yi   xi²   xi·Yi    g(x)
1        1.5   0.405       1     0.405    2.06
2        4.5   1.504       4     3.008    3.27
3        6     1.791       9     5.373    5.186
4        8.5   2.14        16    8.56     8.22
5        11    2.39        25    11.95    13.03
Σ: 15          8.23        55    29.296

Solution (cont'd). The normal equations are
5a0 + 15a1 = 8.23
15a0 + 55a1 = 29.296
giving a0 = 0.2642 and a1 = 0.4606. Then a = e^0.2642 = 1.3024 and b = 0.4606, so the required equation is g(x) = 1.3024·e^(0.4606x).
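The fit above can be checked in a few lines of Python; the small differences from the slide's a1 = 0.4606 and a = 1.3024 come from the slide rounding ΣY to 8.23 before solving:

```python
import math

x = [1, 2, 3, 4, 5]
y = [1.5, 4.5, 6.0, 8.5, 11.0]
Y = [math.log(v) for v in y]                     # Y = ln y

n = len(x)
sx, sY = sum(x), sum(Y)
sxx = sum(v * v for v in x)
sxY = sum(u * v for u, v in zip(x, Y))

b = (n * sxY - sx * sY) / (n * sxx - sx * sx)    # a1 = b
a = math.exp((sY - b * sx) / n)                  # a = e^a0
# b ≈ 0.4621, a ≈ 1.299
```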

Example. Power function: fit y = a·x^b to the data:

x:  2   2.5   3    3.5    4
y:  7   8.5   11   12.75  15

Solution: taking logarithms, ln y = ln a + b·ln x, i.e. Y = a0 + a1X, where Y = ln y, X = ln x, a0 = ln a, and a1 = b.

Solution.

x     y      X = ln x   Y = ln y   X²       X·Y      g(x)
2     7      0.6931     1.946      0.4804   1.3487   6.868
2.5   8.5    0.9163     2.140      0.8396   1.9608   8.813
3     11     1.098      2.397      1.2056   2.6319   10.806
3.5   12.75  1.252      2.545      1.5675   3.1863   12.838
4     15     1.386      2.708      1.9209   3.7532   14.904
Σ:           5.3454     11.736     6.0136   12.8809

Solution (cont'd). The normal equations are
5a0 + 5.3454a1 = 11.736
5.3454a0 + 6.0136a1 = 12.8809
giving a0 = 1.1521 and a1 = 1.1178. Then a = e^1.1521 = 3.1648 and b = a1 = 1.1178, so the required equation is g(x) = 3.1648·x^1.1178.
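This power-law fit can be checked the same way; the slide's b = 1.1178 and a = 3.1648 come from rounded table sums, so the unrounded values differ slightly:

```python
import math

x = [2.0, 2.5, 3.0, 3.5, 4.0]
y = [7.0, 8.5, 11.0, 12.75, 15.0]
X = [math.log(v) for v in x]                     # X = ln x
Y = [math.log(v) for v in y]                     # Y = ln y

n = len(X)
sX, sY = sum(X), sum(Y)
sXX = sum(v * v for v in X)
sXY = sum(u * v for u, v in zip(X, Y))

b = (n * sXY - sX * sY) / (n * sXX - sX * sX)    # exponent b
a = math.exp((sY - b * sX) / n)                  # coefficient a
# b ≈ 1.117, a ≈ 3.168
```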

Polynomial Regression - Example (cont'd). The system of simultaneous linear equations is
6a0 + 15a1 + 55a2 = 152.6
15a0 + 55a1 + 225a2 = 585.6
55a0 + 225a1 + 979a2 = 2488.8
giving a0 = 2.4786, a1 = 2.3593, a2 = 1.8607.

Polynomial Regression - Example (cont'd).

xi    yi     y_model   ei²       (yi - ȳ)²
0     2.1    2.4786    0.14332   544.42889
1     7.7    6.6986    1.00286   314.45929
2     13.6   14.64     1.08158   140.01989
3     27.2   26.303    0.80491   3.12229
4     40.9   41.687    0.61951   239.22809
5     61.1   60.793    0.09439   1272.13489
Σ: 15 152.6            3.74657   2513.39333

The standard error of estimate is sy/x = sqrt(3.74657/(6-3)) = 1.12, and the coefficient of determination is r² = (2513.39333 - 3.74657)/2513.39333 = 0.99851.

References
Introductory Methods of Numerical Analysis (S. S. Sastry), pp. 126-175
Numerical Methods for Engineers (Steven C. Chapra, Raymond P. Canale), pp. 561-569
https://en.wikipedia.org/wiki/Curve_fitting
http://site.iugaza.edu.ps/iismail/files/Ch17-curve-fitting.ppt
http://caig.cs.nctu.edu.tw/course/NM07S/slides/chap3_1.pdf

http://www.eas.uccs.edu/wickert/ece1010/lecture_notes/1010n6a.PDF
http://www.chee.uh.edu/sites/chbe.egr.uh.edu/files/files/CHEE3334.pdf
Lecture sheet of Dr. Md. Shahidur Rahman, Assistant Professor, CEE, SUST

THE END. THANKS FOR BEING WITH US.