We often need to fit a straight line to data in order to extract a physical constant. An example is fitting stress and strain data in order to get Young's modulus. Consider, for example, testing of aluminum, and assume that the modulus is E = 10 million psi. If we had exact measurements, we would get the following table of strains vs. stresses. Measuring the strains in millistrains and the stresses in ksi, the relation is

    stress (ksi) = 10 x strain (millistrain)

    Strain (millistrain):   0    1    2    3    4    5
    Stress (ksi):           0   10   20   30   40   50

Because of experimental and measurement errors, the measured values will be slightly different. We typically assume that the errors are normally distributed (that is, they follow the bell curve). We can generate normally distributed random numbers with the randn function in Matlab. It generates numbers with a mean of zero and unit standard deviation.

>> randn(1,6)
ans =
   -0.4326   -1.6656    0.1253    0.2877   -1.1465    1.1909
>> randn(1,6)
ans =
    1.1892   -0.0376    0.3273    0.1746   -0.1867    0.7258

If we want to get the same "random" numbers again and again, we can fix the state of the random number generator.

>> randn('state',0)
>> randn(1,6)
ans =
   -0.4326   -1.6656    0.1253    0.2877   -1.1465    1.1909

If we want to convince ourselves that we are getting the bell curve, we can generate a large number of random numbers and plot a histogram.

>> y=randn(1,10000); hist(y)

[Figure: histogram of the 10,000 samples over the range -4 to 4, using the default 10 bins.]

We can refine the histogram by specifying more bins.

>> x=[-4:0.5:4];
>> hist(y,x)

[Figure: refined histogram with bins of width 0.5 from -4 to 4.]

Let us add the 6 random numbers we generated to the stress data.
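For readers working outside Matlab, the steps above can be sketched in Python with NumPy. This is a hedged equivalent, not the course's code: NumPy's generator is seeded differently from randn('state',0), so the individual numbers will differ from the transcript (the seed value 0 here is an arbitrary choice), but the statistics are the same.

```python
# NumPy sketch of the Matlab steps above: reproducible standard-normal
# noise added to exact stress data. Seed 0 is arbitrary; the draws will
# not match Matlab's randn('state',0) sequence.
import numpy as np

rng = np.random.default_rng(0)       # fix the generator state for repeatability
error_m = rng.standard_normal(6)     # 6 samples, mean 0, unit standard deviation

stress = np.arange(0, 60, 10)        # exact stresses 0, 10, ..., 50 ksi
stress_m = stress + error_m          # "measured" stresses with noise

# For a large sample the mean and standard deviation approach 0 and 1,
# which is the bell-curve check the histogram makes visually.
y = rng.standard_normal(10000)
print(y.mean(), y.std())
```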
>> stress=[0:10:50]
stress =
     0    10    20    30    40    50
>> error_m=randn(1,6)
error_m =
   -0.4326   -1.6656    0.1253    0.2877   -1.1465    1.1909
>> stress_m=stress+error_m
stress_m =
   -0.4326    8.3344   20.1253   30.2877   38.8535   51.1909

We can now fit the stresses and the strains using the Matlab function polyfit.

>> strain=[0:1:5]
strain =
     0     1     2     3     4     5
>> [p s]=polyfit(strain,stress_m,1)
p =
   10.2811   -0.9761
s =
       R: [2x2 double]
      df: 4
   normr: 1.9903

That is, we get the fit

    stress = 10.2811 x strain - 0.9761

>> stress_fit=polyval(p,strain)
stress_fit =
   -0.9761    9.3050   19.5860   29.8671   40.1481   50.4292

We plot the measured stress (circles), true stress (blue), and fitted stress (green).

[Figure: measured (circles), true (blue), and fitted (green) stress vs. strain, 0 to 5 millistrain.]

We see that the fitted stress is closer to the true one than the data. We can also see that by comparing the errors:

error_m =
   -0.4326   -1.6656    0.1253    0.2877   -1.1465    1.1909
>> etrue=stress_fit-stress
etrue =
   -0.9761   -0.6950   -0.4140   -0.1329    0.1481    0.4292

However, we normally do not have the true error, only the error in the fit:

>> efit=stress_fit-stress_m
efit =
   -0.5435    0.9705   -0.5393   -0.4206    1.2946   -0.7617

The standard error, Eq. 13.19, is an estimate of the standard deviation of the error in the data. It is the length of the residual vector divided by the square root of (n-2), where n is the number of points:

>> std_err=norm(efit)/sqrt(4)
std_err =
    0.9952

This is very close to the unit standard deviation we used to generate the random numbers. On the other hand, the root mean square of the true error is

>> rms=sqrt(etrue*etrue'/6)
rms =
    0.5524

By using six data points to fit two coefficients we reduce the measurement error by about a factor of 2.
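The fit and standard error above can be reproduced in Python, since NumPy's np.polyfit plays the same role as Matlab's polyfit. The noisy stresses below are the measured values from the transcript, entered by hand so the numbers match.

```python
# NumPy sketch of the straight-line fit above. stress_m holds the
# measured (noisy) stresses generated in the text.
import numpy as np

strain = np.arange(6.0)                               # 0..5 millistrain
stress_m = np.array([-0.4326, 8.3344, 20.1253,
                     30.2877, 38.8535, 51.1909])      # measured stresses, ksi

p = np.polyfit(strain, stress_m, 1)                   # [slope, intercept]
stress_fit = np.polyval(p, strain)                    # fitted stresses

efit = stress_fit - stress_m                          # residuals of the fit
n = len(strain)
std_err = np.linalg.norm(efit) / np.sqrt(n - 2)       # Eq. 13.19: ||e|| / sqrt(n-2)

print(p)        # approximately [10.2811, -0.9761]
print(std_err)  # approximately 0.9952
```

Note that np.polyfit returns the slope first and the intercept second, matching Matlab's polyfit ordering.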
Besides the standard error, a common measure of the goodness of the fit is the coefficient of determination:

    r^2 = 1 - Sr/St
    St = sum_i (y_i - ybar)^2
    Sr = sum_i (y_i - yfit_i)^2 = sum_i efit_i^2

>> stress_dev=stress_m-mean(stress_m)
stress_dev =
  -25.1591  -16.3921   -4.6012    5.5611   14.1270   26.4644
>> st=stress_dev*stress_dev'
st =
  1.8537e+003
>> sr=efit*efit'
sr =
    3.9614
>> rsquare=1-sr/st
rsquare =
    0.9979

This indicates a very good fit, because it is close to one.
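The same coefficient of determination can be computed in Python directly from the definitions above, again using the measured stresses from the transcript.

```python
# NumPy sketch of the r-squared computation above: St is the total sum
# of squares about the mean, Sr is the residual sum of squares of the fit.
import numpy as np

strain = np.arange(6.0)
stress_m = np.array([-0.4326, 8.3344, 20.1253,
                     30.2877, 38.8535, 51.1909])
stress_fit = np.polyval(np.polyfit(strain, stress_m, 1), strain)

st = np.sum((stress_m - stress_m.mean())**2)   # St: total sum of squares
sr = np.sum((stress_m - stress_fit)**2)        # Sr: residual sum of squares
rsquare = 1 - sr/st

print(rsquare)  # approximately 0.9979
```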
This note was uploaded on 03/27/2012 for the course EGM 3344 taught by Professor Raphaelhaftka during the Spring '09 term at University of Florida.