UNIVERSITI TEKNOLOGI MALAYSIA
SKMM 3023 Applied Numerical Methods
Curve-Fitting & Interpolation
ibn ‘Abdullah
Faculty of Mechanical Engineering
ibn.`abdullah@dev.null 2014 SKMM 3023 Applied Numerical Methods Curve-Fitting & Interpolation 1 / 33
Outline
1 Introduction
2 Polynomial Curve Fitting
Direct Fit Polynomials
Lagrange Polynomials
Divided Difference Polynomials
3 Least Square Regression
Least-squares Line Method
Least-squares Parabola Method
Least-squares with Polynomial Order m Method
Multiple Regression
4 Curve Fitting with Matlab
5 Bibliography
Introduction
Field data are often accompanied by noise. Even though all control parameters
(independent variables) are held constant, the measured outcomes (dependent
variables) vary. A procedure for quantitatively estimating the trend of the outcomes,
also known as regression or curve fitting, therefore becomes necessary.
The best-fitting curve can be obtained through various methods. We shall look at approximating functions obtained through

Polynomial Curve Fitting
1 direct fit polynomials,
2 Lagrange polynomials,
3 divided difference polynomials

Least-Squares Regression
1 line method,
2 parabola method,
3 polynomial order m method,
4 multiple regression

which can be applied to both unequally spaced data and equally spaced data.
Polynomial Curve Fitting: Direct Fit Polynomials
Also known as the method of direct fit polynomials. It is based on fitting the data
directly by a polynomial, which is given by

P_n(x) = a_0 + a_1 x + a_2 x^2 + ... + a_n x^n   (1)

where P_n(x) is determined by one of the following methods:

Given N = n + 1 points, [x_i, f(x_i)], determine the exact nth-degree polynomial that passes through the data points.
Given N > n + 1 points, [x_i, f(x_i)], determine the least-squares nth-degree polynomial that best fits the data points (a much more detailed coverage on this later!).
Derivatives are obtained by differentiating the approximating polynomial:

f' ≈ P_n'(x) = a_1 + 2 a_2 x + 3 a_3 x^2 + ...   (2a)

f'' ≈ P_n''(x) = 2 a_2 + 6 a_3 x + ...   (2b)
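The first approach above (N = n + 1, exact fit) can be sketched in code. The lecture's own sessions use Matlab; what follows is an illustrative Python sketch (the function name direct_fit_quadratic is ours) that determines P_2(x) through three points by solving the 3x3 linear system with Gauss elimination:

```python
def direct_fit_quadratic(pts):
    """Fit P2(x) = a0 + a1*x + a2*x^2 exactly through three data points
    by solving the 3x3 linear system with Gauss elimination."""
    n = 3
    # augmented matrix [1, x, x^2 | y] built from each point
    M = [[1.0, x, x * x, y] for x, y in pts]
    for k in range(n):                      # forward elimination, partial pivoting
        p = max(range(k, n), key=lambda r: abs(M[r][k]))
        M[k], M[p] = M[p], M[k]
        for r in range(k + 1, n):
            f = M[r][k] / M[k][k]
            for c in range(k, n + 1):
                M[r][c] -= f * M[k][c]
    a = [0.0] * n                           # back substitution
    for k in range(n - 1, -1, -1):
        a[k] = (M[k][n] - sum(M[k][c] * a[c] for c in range(k + 1, n))) / M[k][k]
    return a  # [a0, a1, a2]

# sanity check: the parabola through (1,1), (2,4), (3,9) is y = x^2
a0, a1, a2 = direct_fit_quadratic([(1.0, 1.0), (2.0, 4.0), (3.0, 9.0)])
```

Once the coefficients are known, Eqs. (2a) and (2b) give the derivative approximations directly from a_1, a_2.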
Polynomial Curve Fitting: Lagrange Polynomials
Can be used for both unequally spaced data and equally spaced data. It is based on
fitting the second-degree Lagrange polynomial, Eq. (3), to a (sub)set of three discrete
data pairs, e.g. (a, f(a)), (b, f(b)), and (c, f(c)), such that

P_2(x) = [(x − b)(x − c)] / [(a − b)(a − c)] f(a) + [(x − a)(x − c)] / [(b − a)(b − c)] f(b) + [(x − a)(x − b)] / [(c − a)(c − b)] f(c)   (3)
Differentiating Eq. (3) yields

f' ≈ P_2'(x) = [2x − (b + c)] / [(a − b)(a − c)] f(a) + [2x − (a + c)] / [(b − a)(b − c)] f(b) + [2x − (a + b)] / [(c − a)(c − b)] f(c)   (4a)

f'' ≈ P_2''(x) = 2 f(a) / [(a − b)(a − c)] + 2 f(b) / [(b − a)(b − c)] + 2 f(c) / [(c − a)(c − b)]   (4b)
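Eqs. (3) and (4a) translate directly into code. Here is an illustrative Python sketch (function names ours; the lecture itself uses Matlab) of the second-degree Lagrange polynomial and its first derivative:

```python
def lagrange_p2(x, a, fa, b, fb, c, fc):
    """Eq. (3): second-degree Lagrange polynomial through (a,fa), (b,fb), (c,fc)."""
    return ((x - b) * (x - c) / ((a - b) * (a - c)) * fa
            + (x - a) * (x - c) / ((b - a) * (b - c)) * fb
            + (x - a) * (x - b) / ((c - a) * (c - b)) * fc)

def lagrange_p2_deriv(x, a, fa, b, fb, c, fc):
    """Eq. (4a): first derivative of the second-degree Lagrange polynomial."""
    return ((2 * x - (b + c)) / ((a - b) * (a - c)) * fa
            + (2 * x - (a + c)) / ((b - a) * (b - c)) * fb
            + (2 * x - (a + b)) / ((c - a) * (c - b)) * fc)

# sanity check with f(x) = x^2 sampled at 0, 1, 2: P2 reproduces x^2 exactly
p = lagrange_p2(1.5, 0.0, 0.0, 1.0, 1.0, 2.0, 4.0)         # expect 1.5^2 = 2.25
dp = lagrange_p2_deriv(1.5, 0.0, 0.0, 1.0, 1.0, 2.0, 4.0)  # expect 2(1.5) = 3.0
```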
Polynomial Curve Fitting: Divided Difference Polynomials
Can be used for both unequally spaced data and equally spaced data. Given the points (x_0, y_0), (x_1, y_1), (x_2, y_2), (x_3, y_3), we construct a divided difference table with entries y_i = f_i^(0) and

First divided differences:
f_0^(1) = (f_1^(0) − f_0^(0)) / (x_1 − x_0)
f_1^(1) = (f_2^(0) − f_1^(0)) / (x_2 − x_1)
f_2^(1) = (f_3^(0) − f_2^(0)) / (x_3 − x_2)

Second divided differences:
f_0^(2) = (f_1^(1) − f_0^(1)) / (x_2 − x_0)
f_1^(2) = (f_2^(1) − f_1^(1)) / (x_3 − x_1)

Third divided difference:
f_0^(3) = (f_1^(2) − f_0^(2)) / (x_3 − x_0)

from which we derive the divided difference polynomial representing these points:

P_n(x) = f_0^(0) + (x − x_0) f_0^(1) + (x − x_0)(x − x_1) f_0^(2) + (x − x_0)(x − x_1)(x − x_2) f_0^(3)   (5)
Polynomial Curve Fitting: Divided Difference Polynomials
Differentiating Eq. (5) yields

f' ≈ P_n'(x) = f_0^(1) + [2x − (x_0 + x_1)] f_0^(2) + [3x^2 − 2(x_0 + x_1 + x_2) x + (x_0 x_1 + x_0 x_2 + x_1 x_2)] f_0^(3)   (6a)

f'' ≈ P_n''(x) = 2 f_0^(2) + [6x − 2(x_0 + x_1 + x_2)] f_0^(3) + ...   (6b)
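The table and Eq. (5) can be built mechanically. A minimal Python sketch (function names ours), assuming the points are given as parallel lists:

```python
def divided_difference_diagonal(xs, ys):
    """Return the top diagonal f0^(0), f0^(1), ..., f0^(n-1) of the
    divided difference table for the points (xs[i], ys[i])."""
    n = len(xs)
    col = list(ys)           # zeroth-order differences f_i^(0)
    diag = [col[0]]
    for k in range(1, n):    # the k-th order column has n - k entries
        col = [(col[i + 1] - col[i]) / (xs[i + k] - xs[i]) for i in range(n - k)]
        diag.append(col[0])
    return diag

def newton_eval(x, xs, diag):
    """Eq. (5): evaluate the divided difference polynomial at x."""
    total, prod = 0.0, 1.0
    for k, coeff in enumerate(diag):
        total += coeff * prod
        if k < len(xs) - 1:
            prod *= (x - xs[k])
    return total

# sanity check with f(x) = x^2: the polynomial reproduces 2.5^2 = 6.25
diag = divided_difference_diagonal([1.0, 2.0, 3.0], [1.0, 4.0, 9.0])
val = newton_eval(2.5, [1.0, 2.0, 3.0], diag)
```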
Polynomial Curve Fitting: Example 1
Problem Statement:
Evaluate the derivatives by numerical differentiation formulae developed by fitting a

direct fit polynomial,
Lagrange polynomial, and
divided difference polynomial

to the following set of discrete data:

x   3.4        3.5        3.6
y   0.294118   0.285714   0.277778

The exact derivatives at x = 3.5 are f'(3.5) = −0.081633... and f''(3.5) = 0.046647...
Polynomial Curve Fitting: Example 1
Solution:
Fit the quadratic polynomial, P_2(x) = a_0 + a_1 x + a_2 x^2, to the three data points:

0.294118 = a_0 + a_1 (3.4) + a_2 (3.4)^2   (7a)
0.285714 = a_0 + a_1 (3.5) + a_2 (3.5)^2   (7b)
0.277778 = a_0 + a_1 (3.6) + a_2 (3.6)^2   (7c)

Solving for a_0, a_1, and a_2 by Gauss elimination results in a_0 = 0.858314, a_1 = −0.245500, a_2 = 0.023400, which are then substituted into P_2(x) = a_0 + a_1 x + a_2 x^2 to yield the solution for the direct fit polynomial:

P_2(x) = 0.023400 x^2 − 0.245500 x + 0.858314   (8a)

Evaluating the first and second derivatives at x = 3.5 yields

P_2'(3.5) = (0.046800)(3.5) − 0.245500 = −0.081700   (8b)
P_2''(3.5) = 0.046800   (8c)
Polynomial Curve Fitting: Example 1
Solution: continued...
Substituting the tabular values into Eq. (3) yields the Lagrange polynomial

P_2(x) = [(x − 3.5)(x − 3.6)] / [(3.4 − 3.5)(3.4 − 3.6)] × 0.294118 + [(x − 3.4)(x − 3.6)] / [(3.5 − 3.4)(3.5 − 3.6)] × 0.285714 + [(x − 3.4)(x − 3.5)] / [(3.6 − 3.4)(3.6 − 3.5)] × 0.277778
       = 14.7059 (x − 3.5)(x − 3.6) − 28.5714 (x − 3.4)(x − 3.6) + 13.8889 (x − 3.4)(x − 3.5)
       = 0.0234 x^2 − 0.2455 x + 0.8583   (9a)

Evaluating Eqs. (4a) and (4b) at x = 3.5:

P_2'(3.5) = [2(3.5) − (3.5 + 3.6)] / [(3.4 − 3.5)(3.4 − 3.6)] (0.294118) + [2(3.5) − (3.4 + 3.6)] / [(3.5 − 3.4)(3.5 − 3.6)] (0.285714) + [2(3.5) − (3.4 + 3.5)] / [(3.6 − 3.4)(3.6 − 3.5)] (0.277778) = −0.081700   (9b)

P_2''(3.5) = 2(0.294118) / [(3.4 − 3.5)(3.4 − 3.6)] + 2(0.285714) / [(3.5 − 3.4)(3.5 − 3.6)] + 2(0.277778) / [(3.6 − 3.4)(3.6 − 3.5)] = 0.046800   (9c)
Polynomial Curve Fitting: Example 1
Solution: continued...
Construct a divided difference table for the tabular data to use the divided difference polynomial:

x_i    f_i^(0)     f_i^(1)      f_i^(2)
3.4    0.294118
                   −0.084040
3.5    0.285714                 0.023400
                   −0.079360
3.6    0.277778

Substituting these values into Eq. (5) yields

P_2(x) = f_0^(0) + (x − x_0) f_0^(1) + (x − x_0)(x − x_1) f_0^(2)
       = 0.294118 − 0.084040 (x − 3.4) + 0.023400 (x − 3.4)(x − 3.5)
       = 0.0234 x^2 − 0.2455 x + 0.8583   (10a)

Substituting these values into Eqs. (6a) and (6b) yields the solution for the divided difference polynomial:

P_2'(3.5) = −0.084040 + [2(3.5) − (3.4 + 3.5)] (0.023400) = −0.081700   (10b)
P_2''(3.5) = 2(0.023400) = 0.046800   (10c)
Polynomial Curve Fitting: Example 1
Solution: continued...
The results obtained by the three procedures are identical since the same three points are used in all three procedures.

The error in f'(3.5) is

Error = P_2'(3.5) − f'(3.5) = −0.081700 − (−0.081633) = −0.000067

The error in f''(3.5) is

Error = P_2''(3.5) − f''(3.5) = 0.046800 − 0.046647 = 0.000153
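The numbers above can be checked numerically. A small Python sketch (variable names ours) rebuilds the divided differences of Example 1 and reproduces the derivative values; the exact derivatives come from the underlying f(x) = 1/x, since the tabulated y values are 1/x rounded to six digits:

```python
xs = [3.4, 3.5, 3.6]
ys = [0.294118, 0.285714, 0.277778]   # y = 1/x, rounded to six digits

# first and second divided differences
f1_0 = (ys[1] - ys[0]) / (xs[1] - xs[0])   # -0.084040
f1_1 = (ys[2] - ys[1]) / (xs[2] - xs[1])   # -0.079360
f2_0 = (f1_1 - f1_0) / (xs[2] - xs[0])     #  0.023400

# Eqs. (6a) and (6b) evaluated at x = 3.5
dP = f1_0 + (2 * 3.5 - (xs[0] + xs[1])) * f2_0   # -0.081700
ddP = 2 * f2_0                                   #  0.046800

# exact derivatives of f(x) = 1/x at x = 3.5
exact_d = -1.0 / 3.5 ** 2    # -0.081633...
exact_dd = 2.0 / 3.5 ** 3    #  0.046647...
err_d = dP - exact_d         # about -0.000067
err_dd = ddP - exact_dd      # about  0.000153
```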
Least Square Regression
The method of least squares assumes that the best-fit curve of a given type is the
curve that has the minimal sum of the squared deviations (least square error) from
a given set of data.

Suppose that the data points are (x_1, y_1), (x_2, y_2), ..., (x_n, y_n), where x is the
independent variable and y is the dependent variable.

The fitting curve f(x) has the deviation (error) d from each data point, i.e.,
d_1 = y_1 − f(x_1), d_2 = y_2 − f(x_2), ..., d_n = y_n − f(x_n). According to the method of
least squares, the best fitting curve has the property that

Π = d_1^2 + d_2^2 + ... + d_n^2 = Σ_{i=1}^{n} d_i^2 = Σ_{i=1}^{n} [ y_i − f(x_i) ]^2 = a minimum

Polynomials are among the most commonly used types of curves in curve fitting. Methods of least-squares curve fitting using polynomials include:

least-squares line (i.e. 1st-degree polynomial) method,
least-squares parabola (i.e. 2nd-degree polynomial) method,
least-squares mth-degree polynomial method,
multiple regression least squares.
Least Square Regression: Least-squares Line Method
To approximate the given set of data, (x_1, y_1), (x_2, y_2), ..., (x_n, y_n) where n ≥ 2, the
least-squares line method uses a straight line

y = a_1 x + a_0   (11)

The best fitting curve f(x) has the least square error, i.e.,

Π = Σ_{i=1}^{n} [ y_i − f(x_i) ]^2 = Σ_{i=1}^{n} [ y_i − (a_1 x_i + a_0) ]^2 = a minimum

where a_0 and a_1 are unknown coefficients while all x_i and y_i are given.
Least Square Regression: Least-squares Line Method
To obtain the least square error, the unknown coefficients a_0 and a_1 must yield zero
first derivatives:

∂Π/∂a_0 = −2 Σ_{i=1}^{n} x_i^0 [ y_i − (a_1 x_i + a_0) ] = 0
∂Π/∂a_1 = −2 Σ_{i=1}^{n} x_i^1 [ y_i − (a_1 x_i + a_0) ] = 0

Expanding the above equations, we have:

Σ_{i=1}^{n} x_i^0 y_i = a_0 Σ_{i=1}^{n} x_i^0 + a_1 Σ_{i=1}^{n} x_i^1
Σ_{i=1}^{n} x_i^1 y_i = a_0 Σ_{i=1}^{n} x_i^1 + a_1 Σ_{i=1}^{n} x_i^2

The unknown coefficients a_0 and a_1 can hence be obtained by solving the above
system of linear equations, or ...
Least Square Regression: Least-squares Line Method
... through statistics, the optimal solution to the least-squares approximation using
a straight line, Eq. (11), is

a_1 = [ n Σ_{i=1}^{n} x_i y_i − ( Σ_{i=1}^{n} x_i )( Σ_{i=1}^{n} y_i ) ] / [ n Σ_{i=1}^{n} x_i^2 − ( Σ_{i=1}^{n} x_i )^2 ]   (12a)

a_0 = (1/n) ( Σ_{i=1}^{n} y_i − a_1 Σ_{i=1}^{n} x_i )   (12b)

where a_1 is the gradient and a_0 the y-intercept.
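Eqs. (12a) and (12b) can be applied directly. An illustrative Python sketch (function name ours; the lecture's sessions use Matlab):

```python
def lstsq_line(xs, ys):
    """Least-squares line y = a1*x + a0 via Eqs. (12a) and (12b)."""
    n = len(xs)
    sx = sum(xs)
    sy = sum(ys)
    sxy = sum(x * y for x, y in zip(xs, ys))
    sxx = sum(x * x for x in xs)
    a1 = (n * sxy - sx * sy) / (n * sxx - sx * sx)   # gradient, Eq. (12a)
    a0 = (sy - a1 * sx) / n                          # y-intercept, Eq. (12b)
    return a0, a1

# sanity check: data lying exactly on y = 2x + 1
a0, a1 = lstsq_line([0.0, 1.0, 2.0, 3.0], [1.0, 3.0, 5.0, 7.0])
```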
Least Square Regression: Least-squares Parabola Method
To approximate the given set of data, (x_1, y_1), (x_2, y_2), ..., (x_n, y_n) where n ≥ 3, the
least-squares parabola method uses a second-order polynomial

y = a_2 x^2 + a_1 x + a_0   (13)

The best fitting curve f(x) has the least square error, i.e.,

Π = Σ_{i=1}^{n} [ y_i − f(x_i) ]^2 = Σ_{i=1}^{n} [ y_i − (a_2 x_i^2 + a_1 x_i + a_0) ]^2 = a minimum

where a_0, a_1 and a_2 are unknown coefficients while all x_i and y_i are given.
Least Square Regression: Least-squares Parabola Method
To obtain the least square error, the unknown coefficients a_0, a_1 and a_2 must yield
zero first derivatives:

∂Π/∂a_0 = −2 Σ_{i=1}^{n} x_i^0 [ y_i − (a_2 x_i^2 + a_1 x_i + a_0) ] = 0
∂Π/∂a_1 = −2 Σ_{i=1}^{n} x_i^1 [ y_i − (a_2 x_i^2 + a_1 x_i + a_0) ] = 0
∂Π/∂a_2 = −2 Σ_{i=1}^{n} x_i^2 [ y_i − (a_2 x_i^2 + a_1 x_i + a_0) ] = 0
Least Square Regression: Least-squares Parabola Method
Expanding the above equations, we have:

Σ_{i=1}^{n} x_i^0 y_i = a_0 Σ_{i=1}^{n} x_i^0 + a_1 Σ_{i=1}^{n} x_i^1 + a_2 Σ_{i=1}^{n} x_i^2
Σ_{i=1}^{n} x_i^1 y_i = a_0 Σ_{i=1}^{n} x_i^1 + a_1 Σ_{i=1}^{n} x_i^2 + a_2 Σ_{i=1}^{n} x_i^3
Σ_{i=1}^{n} x_i^2 y_i = a_0 Σ_{i=1}^{n} x_i^2 + a_1 Σ_{i=1}^{n} x_i^3 + a_2 Σ_{i=1}^{n} x_i^4

The unknown coefficients a_0, a_1, and a_2 can be obtained by solving the above
system of linear equations.
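The three normal equations above form a 3x3 linear system. A minimal Python sketch (function name ours) that assembles and solves it by Gauss elimination:

```python
def lstsq_parabola(xs, ys):
    """Least-squares parabola y = a2*x^2 + a1*x + a0 from the 3x3
    normal equations, solved by Gauss elimination with partial pivoting."""
    S = lambda p: sum(x ** p for x in xs)                    # sum of x_i^p
    T = lambda p: sum(x ** p * y for x, y in zip(xs, ys))    # sum of x_i^p * y_i
    M = [[S(0), S(1), S(2), T(0)],
         [S(1), S(2), S(3), T(1)],
         [S(2), S(3), S(4), T(2)]]
    n = 3
    for k in range(n):                      # forward elimination
        p = max(range(k, n), key=lambda r: abs(M[r][k]))
        M[k], M[p] = M[p], M[k]
        for r in range(k + 1, n):
            f = M[r][k] / M[k][k]
            for c in range(k, n + 1):
                M[r][c] -= f * M[k][c]
    a = [0.0] * n                           # back substitution
    for k in range(n - 1, -1, -1):
        a[k] = (M[k][n] - sum(M[k][c] * a[c] for c in range(k + 1, n))) / M[k][k]
    return a  # [a0, a1, a2]

# sanity check: data lying exactly on y = x^2 + 1
a0, a1, a2 = lstsq_parabola([-2.0, -1.0, 0.0, 1.0, 2.0], [5.0, 2.0, 1.0, 2.0, 5.0])
```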
Least Square Regression: Least-squares Parabola Method
Since we expect a quadratic relationship between x and y, it means that all our
plotted data points should lie on a single parabolic curve, Eq. (13),

y = a_2 x^2 + a_1 x + a_0

In other words, the system of equations

a_2 x_1^2 + a_1 x_1 + a_0 = y_1
a_2 x_2^2 + a_1 x_2 + a_0 = y_2
...
a_2 x_n^2 + a_1 x_n + a_0 = y_n

or, in matrix form,

[ x_1^2  x_1  1 ]             [ y_1 ]
[ x_2^2  x_2  1 ]   [ a_2 ]   [ y_2 ]
[  ...   ...  . ] × [ a_1 ] = [ ... ]   (14)
[ x_n^2  x_n  1 ]   [ a_0 ]   [ y_n ]

should have exactly one consistent solution!

This is unlikely because data measurements are subject to errors. If an exact
solution does not exist, we seek to find the equation of the parabola
y = a_2 x^2 + a_1 x + a_0 which fits our given data best.
Least Square Regression: Least-squares Parabola Method
In general, our problem in Eq. (14) reduces to finding a solution to a system of n
linear equations in m variables, with n > m. Using our traditional notations for
systems of linear equations, we translate our problem into matrix notation. Thus,
we are seeking to solve
Ax = b,
where A is an n × m given matrix (with n > m), x is a column vector with m
variables, and b is a column vector with n given entries.
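Multiplying both sides of A x = b by A^T gives the square m x m system (A^T A) x = A^T b, which is the matrix form of the normal equations. A small Python sketch (function name ours) of that reduction:

```python
def normal_equations(A, b):
    """Reduce the overdetermined system A x = b (n rows, m columns, n > m)
    to the square m x m system (A^T A) x = A^T b."""
    n, m = len(A), len(A[0])
    AtA = [[sum(A[i][r] * A[i][c] for i in range(n)) for c in range(m)]
           for r in range(m)]
    Atb = [sum(A[i][r] * b[i] for i in range(n)) for r in range(m)]
    return AtA, Atb

# three points, straight-line model y = a0 + a1*x: A has columns [1, x]
AtA, Atb = normal_equations([[1, 0], [1, 1], [1, 2]], [0, 1, 2])
```

Solving the reduced square system by Gauss elimination then yields the least-squares coefficients.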
Least Square Regression: Least-squares mth Polynomial Method
To approximate the given set of data, (x_1, y_1), (x_2, y_2), ..., (x_n, y_n) where
n ≥ m + 1, the least-squares method uses an mth-degree polynomial

y = a_m x^m + a_{m−1} x^{m−1} + ... + a_2 x^2 + a_1 x + a_0

The best fitting curve f(x) has the least square error, i.e.,

Π = Σ_{i=1}^{n} [ y_i − f(x_i) ]^2 = Σ_{i=1}^{n} [ y_i − (a_m x_i^m + a_{m−1} x_i^{m−1} + ... + a_2 x_i^2 + a_1 x_i + a_0) ]^2 = a minimum

where a_0, a_1, a_2, ..., and a_m are unknown coefficients while all x_i and y_i are given.
Least Square Regression: Least-squares mth Polynomial Method
To obtain the least square error, the unknown coefficients a_0, a_1, a_2, ..., and a_m must
yield zero first derivatives:

∂Π/∂a_0 = −2 Σ_{i=1}^{n} x_i^0 [ y_i − (a_m x_i^m + a_{m−1} x_i^{m−1} + ... + a_2 x_i^2 + a_1 x_i + a_0) ] = 0
∂Π/∂a_1 = −2 Σ_{i=1}^{n} x_i^1 [ y_i − (a_m x_i^m + a_{m−1} x_i^{m−1} + ... + a_2 x_i^2 + a_1 x_i + a_0) ] = 0
∂Π/∂a_2 = −2 Σ_{i=1}^{n} x_i^2 [ y_i − (a_m x_i^m + a_{m−1} x_i^{m−1} + ... + a_2 x_i^2 + a_1 x_i + a_0) ] = 0
...
∂Π/∂a_m = −2 Σ_{i=1}^{n} x_i^m [ y_i − (a_m x_i^m + a_{m−1} x_i^{m−1} + ... + a_2 x_i^2 + a_1 x_i + a_0) ] = 0
Least Square Regression: Least-squares mth Polynomial Method
Expanding the above equations, we have (all sums taken over i = 1, ..., n):

Σ x_i^0 y_i = a_0 Σ x_i^0 + a_1 Σ x_i^1 + a_2 Σ x_i^2 + ... + a_m Σ x_i^m
Σ x_i^1 y_i = a_0 Σ x_i^1 + a_1 Σ x_i^2 + a_2 Σ x_i^3 + ... + a_m Σ x_i^{m+1}
Σ x_i^2 y_i = a_0 Σ x_i^2 + a_1 Σ x_i^3 + a_2 Σ x_i^4 + ... + a_m Σ x_i^{m+2}
...
Σ x_i^m y_i = a_0 Σ x_i^m + a_1 Σ x_i^{m+1} + a_2 Σ x_i^{m+2} + ... + a_m Σ x_i^{2m}

The unknown coefficients a_0, a_1, a_2, ..., and a_m can be obtained by solving the
above system of linear equations.
Least Square Regression: Multiple Regression
This method estimates the outcomes (dependent variables) which may be affected
by more than one control parameter (independent variable), or where more than
one control parameter is being changed at the same time.

To approximate the given set of data of two independent variables x and y and one
dependent variable z, e.g. (x_1, y_1, z_1), (x_2, y_2, z_2), ..., (x_n, y_n, z_n), where n ≥ 3, in
the linear relationship case

z = a + b x + c y

the best fitting surface f(x, y) has the least square error, i.e.,

Π = Σ_{i=1}^{n} [ z_i − f(x_i, y_i) ]^2 = Σ_{i=1}^{n} [ z_i − (a + b x_i + c y_i) ]^2 = a minimum

where a, b and c are unknown coefficients while all x_i, y_i and z_i are given.
Least Square Regression: Multiple Regression
To obtain the least square error, the unknown coefficients a, b and c must yield zero
first derivatives:

∂Π/∂a = −2 Σ_{i=1}^{n} [ z_i − (a + b x_i + c y_i) ] = 0
∂Π/∂b = −2 Σ_{i=1}^{n} x_i [ z_i − (a + b x_i + c y_i) ] = 0
∂Π/∂c = −2 Σ_{i=1}^{n} y_i [ z_i − (a + b x_i + c y_i) ] = 0
Least Square Regression: Multiple Regression
Expanding the above equations, we have:

Σ_{i=1}^{n} z_i = a n + b Σ_{i=1}^{n} x_i + c Σ_{i=1}^{n} y_i
Σ_{i=1}^{n} x_i z_i = a Σ_{i=1}^{n} x_i + b Σ_{i=1}^{n} x_i^2 + c Σ_{i=1}^{n} x_i y_i
Σ_{i=1}^{n} y_i z_i = a Σ_{i=1}^{n} y_i + b Σ_{i=1}^{n} x_i y_i + c Σ_{i=1}^{n} y_i^2

The unknown coefficients a, b, and c can be obtained by solving the above system
of linear equations.
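The three normal equations translate into a 3x3 solve. A minimal Python sketch (function name ours):

```python
def multiple_regression(xs, ys, zs):
    """Fit z = a + b*x + c*y by solving the 3x3 normal equations
    with Gauss elimination (partial pivoting)."""
    n = len(xs)
    sx, sy, sz = sum(xs), sum(ys), sum(zs)
    sxx = sum(x * x for x in xs)
    syy = sum(y * y for y in ys)
    sxy = sum(x * y for x, y in zip(xs, ys))
    sxz = sum(x * z for x, z in zip(xs, zs))
    syz = sum(y * z for y, z in zip(ys, zs))
    M = [[float(n), sx, sy, sz],
         [sx, sxx, sxy, sxz],
         [sy, sxy, syy, syz]]
    for k in range(3):
        p = max(range(k, 3), key=lambda r: abs(M[r][k]))
        M[k], M[p] = M[p], M[k]
        for r in range(k + 1, 3):
            f = M[r][k] / M[k][k]
            for c in range(k, 4):
                M[r][c] -= f * M[k][c]
    coeffs = [0.0] * 3
    for k in range(2, -1, -1):
        coeffs[k] = (M[k][3] - sum(M[k][c] * coeffs[c] for c in range(k + 1, 3))) / M[k][k]
    return coeffs  # [a, b, c]

# sanity check: data lying exactly on z = 1 + 2x + 3y
a, b, c = multiple_regression([0, 1, 0, 1, 2], [0, 0, 1, 1, 1], [1, 3, 4, 6, 8])
```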
Curve Fitting with Matlab: Example 2: Using polyfit, polyval & polyder Functions
Problem Statement:
An experiment yields the following tabulated x-y pairs:

x   0.000   0.100   0.200   0.300   0.400   0.500   0.600   0.700   0.800   0.900   1.000
y  -0.447   1.978   3.280   6.160   7.080   7.340   7.660   9.560   9.480   9.300   11.20

Perform a curve fit to the data and find the approximating polynomial. Determine the
derivative at x = 0.5.
Curve Fitting with Matlab: Example 2: Using polyfit, polyval & polyder Functions
Solution:
To find a polynomial that fits a set of data we use the command polyfit(x,y,n),
where x is a vector containing x-axis values, y is a vector containing y-axis values and n
is the order of the polynomial that we want to fit.

Matlab Session
>> x = [ 0.000 0.100 0.20 0.30 0.40 0.50 0.60 0.70 0.80 0.90 1.00];
>> y = [-0.447 1.978 3.28 6.16 7.08 7.34 7.66 9.56 9.48 9.30 11.2];
>> n = 2;
>> P = polyfit(x,y,n)

Next we differentiate the polynomial

Matlab Session
>> dP = polyder(P)
Curve Fitting with Matlab: Example 2: Using polyfit, polyval & polyder Functions
Solution: continued...
and compute the slope at x = 0.5

Matlab Session
>> slope_of_poly = polyval(dP,0.5)

We can easily plot the approximating polynomial over the original x-y points to see the effect of the fitting on the
approximation of derivatives:

Matlab Session
>> xi = [0.0:0.01:1.0];
>> yi = polyval(P,xi);
>> plot(x,y,'o',xi,yi,'r-')
Curve Fitting with Matlab: Example 3: Using diff Function
Problem Statement:
An experiment yields unequally spaced data in x and y:

x     y        x     y        x     y
0.00  0.2000   0.36  2.0749   0.64  3.1819
0.12  1.3097   0.40  2.4560   0.70  2.3630
0.22  1.3052   0.44  2.8430   0.80  0.2320
0.32  1.7434   0.54  3.5073   0.81  0.0036

Determine the differences between adjacent elements of both x and y vectors and
compute divided-difference approximations of the derivative.
Curve Fitting with Matlab: Example 3: Using diff Function
Solution:
To differentiate unequally spaced data in x and y we use the commands diff(x) and
diff(y), where x is a vector containing x values and y is a vector containing y values.

Matlab Session
>> diff(x);
>> diff(y);

To compute divided-difference approximations of the derivative we perform element-wise
division of the y differences by the x differences:

Matlab Session
>> dydx = diff(y)./diff(x)
Bibliography
1 Steven C. Chapra & Raymond P. Canale (2009): Numerical Methods for Engineers, 6th ed., ISBN 0-39-095080-7, McGraw-Hill
2 Singiresu S. Rao (2002): Applied Numerical Methods for Engineers and Scientists, ISBN 0-13-089480-X, Prentice Hall
3 David Kincaid & Ward Cheney (1991): Numerical Analysis: Mathematics of Scientific Computing, ISBN 0-534-13014-3, Brooks/Cole Publishing Co.
4 Steven C. Chapra (2012): Applied Numerical Methods with MATLAB for Engineers and Scientists, 3rd ed., ISBN 978-0-07-340110-2, McGraw-Hill
5 John H. Mathews & Kurtis D. Fink (2004): Numerical Methods Using Matlab, 4th ed., ISBN 0-13-065248-2, Prentice Hall
6 William J. Palm III (2011): Introduction to MATLAB for Engineers, 3rd ed., ISBN 978-0-07-353487-9, McGraw-Hill