
Comparison of LAPACK and Jasymca Routines

We calculate the 4th-degree regression polynomial for the following x,y data:
>> x=[1:6],y=x+1
x = [ 1  2  3  4  5  6 ]
y = [ 2  3  4  5  6  7 ]
>> polyfit(x,y,4)
p = 
  5.1958E-14   -9.6634E-13  -2.4727E-12  1   1
The coefficients p(1), p(2), p(3) should vanish, since x and y represent a perfect straight line. This is an ill-conditioned problem, and it can easily be extended, e.g. to higher polynomial degrees, until Jasymca fails completely. In a second attempt we use the LAPACK routine linlstsq:
>> x=[1:6],y=x+1;
>> l=length(x);n=4;
>> X=(x'*ones(1,n+1)).^(ones(l,1)*(n:-1:0))
>> linlstsq(X,y')
ans = 
  -1.6288E-18  
  -7.0249E-17  
  1.0653E-15   
  1            
  1
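The same computation can be reproduced outside Jasymca; a minimal sketch in Python with NumPy (an assumption for illustration, not part of the original toolchain), where numpy.linalg.lstsq hands the overdetermined system to LAPACK's SVD-based least-squares solver, much as linlstsq does:

```python
import numpy as np

# x,y data from the example: a perfect straight line y = x + 1
x = np.arange(1, 7, dtype=float)
y = x + 1

# Vandermonde-style design matrix with columns x^4 ... x^0,
# mirroring X=(x'*ones(1,n+1)).^(ones(l,1)*(n:-1:0))
n = 4
X = x[:, None] ** np.arange(n, -1, -1)

# LAPACK-backed least squares; the last two coefficients come out
# close to 1, the leading three close to machine precision
p, *_ = np.linalg.lstsq(X, y, rcond=None)
print(p)
```

The point is the same as in the transcript above: solving the full least-squares problem with an orthogonal/SVD factorization keeps the spurious leading coefficients far smaller than a naive fit does.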
The coefficients p(1), p(2), p(3) are now significantly smaller. This particular problem can also be solved exactly using Jasymca routines with exact (rational) numbers, which avoids rounding errors entirely:
>> x=rat([1:6]);y=x+1;
>> polyfit(x,y,4)
p = 
  0  0  0  1  1
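The exact-arithmetic route can also be sketched in plain Python using fractions.Fraction (my choice for illustration; Jasymca's rat uses its own exact rational type): build the normal equations X^T X p = X^T y over the rationals and solve them by Gaussian elimination, so no rounding occurs at any step.

```python
from fractions import Fraction

# Exact rational data: y = x + 1 on x = 1..6
xs = [Fraction(i) for i in range(1, 7)]
ys = [x + 1 for x in xs]
n = 4  # polynomial degree

# Design matrix with columns x^4 ... x^0
X = [[x ** k for k in range(n, -1, -1)] for x in xs]

# Normal equations (X^T X) p = X^T y, all entries exact rationals
A = [[sum(X[i][r] * X[i][c] for i in range(len(xs))) for c in range(n + 1)]
     for r in range(n + 1)]
b = [sum(X[i][r] * ys[i] for i in range(len(xs))) for r in range(n + 1)]

# Gaussian elimination with back-substitution, exact throughout
for col in range(n + 1):
    piv = next(r for r in range(col, n + 1) if A[r][col] != 0)
    A[col], A[piv] = A[piv], A[col]
    b[col], b[piv] = b[piv], b[col]
    for r in range(col + 1, n + 1):
        f = A[r][col] / A[col][col]
        A[r] = [a - f * v for a, v in zip(A[r], A[col])]
        b[r] -= f * b[col]

p = [Fraction(0)] * (n + 1)
for r in range(n, -1, -1):
    s = sum(A[r][c] * p[c] for c in range(r + 1, n + 1))
    p[r] = (b[r] - s) / A[r][r]

print(p)  # coefficients are exactly 0, 0, 0, 1, 1
```

Since the data lie exactly on a line, the degree-4 least-squares fit coincides with the interpolating line, and exact rational arithmetic recovers it with no residual noise in the leading coefficients.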



Helmut Dersch
2009-03-15