Curve fitting of two equations to two data sets simultaneously

I have two dependent-variable column vectors (G1, G2) measured at the same independent-variable column vector (x). I want to fit the (x, G1) data with the function f1 (f1 = a*b^2*x^2/(1 + b^2*x^2)) and the (x, G2) data with the function f2 (f2 = a*b*x/(1 + b^2*x^2)) simultaneously, in order to obtain the constants a and b.
Many thanks in advance! Irfan

Answers (3)

Matt J on 30 Mar 2023
Edited: Matt J on 31 Mar 2023
lsqcurvefit will do it.
x=x(:); G1=G1(:); G2=G2(:);   %ensure column vectors
I=logical(x);                 %true where x~=0, so the initial guesses avoid dividing by zero
b0=mean(G1(I)./G2(I)./x(I));  %initial guess for b, since f1./f2 = b*x
a0=mean([G1,G2],'all')./mean( modelFcn([1,b0],x) ,'all');  %initial guess for a
ab=lsqcurvefit(@modelFcn,[a0,b0],x,[G1,G2]);  %fit both data sets at once
function f=modelFcn(p,x)
    a=p(1); b=p(2);
    denom=(1 + b^2*x.^2);
    f1=a*b^2*x.^2./denom;
    f2=a*b*x./denom;
    f=[f1,f2];   %two columns, matching the stacked data [G1,G2]
end
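As a quick sanity check (a sketch only; the values aTrue = 2, bTrue = 0.5, the x grid, and the noise level below are made up, not the asker's data), you can generate G1 and G2 from the model itself and confirm that the code above recovers the parameters. Put these lines before the fitting code, keeping modelFcn at the end of the script:
rng(1)                                              %repeatable fake data
aTrue = 2; bTrue = 0.5;
x  = (0:0.25:10)';
G1 = aTrue*bTrue^2*x.^2./(1 + bTrue^2*x.^2) + 0.02*randn(size(x));
G2 = aTrue*bTrue*x./(1 + bTrue^2*x.^2)      + 0.02*randn(size(x));
%...then run the fitting code; ab should come back close to [aTrue, bTrue]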
  2 Comments
Torsten on 31 Mar 2023
The error message will most probably be:
"length of xdata and ydata must be equal."
Further, it's strange that x=0 as a value of the independent variable will cause a division-by-zero in the calculation of initial values for a and b.
Matt J on 31 Mar 2023
Edited: Matt J on 31 Mar 2023
"length of xdata and ydata must be equal."
There is no requirement that xdata and ydata be of equal length, e.g.,
xdata=[];
ydata=2;
p0=1;
p=lsqcurvefit(@(p,~)p, p0,xdata,ydata )
Local minimum found. Optimization completed because the size of the gradient is less than the value of the optimality tolerance.
p = 2
"x=0 as a value of the independent variable will cause a division-by-zero in the calculation of initial values for a and b."
Yes, I've fixed that.



Walter Roberson on 30 Mar 2023
Calculus and the sum of squared residuals:
format long g
rng(655321)
x = -10:10; %change as appropriate
G1 = sort(rand(size(x))); %change as appropriate
G2 = sort(rand(size(x))); %change as appropriate
syms a b
f1 = a.*b.^2*x.^2./(1 + b.^2.*x.^2);
f2 = a.*b.*x./(1 + b.^2.*x.^2);
residue1 = sum((f1 - G1).^2);
residue2 = sum((f2 - G2).^2);
residue = residue1 + residue2;
a_partial = solve(diff(residue,a),a)
eqn = subs(residue,a,a_partial);
b_partial = vpasolve(diff(eqn,b),b)
b_partial(abs(imag(b_partial)) > 1e-30) = [];
b_partial = real(b_partial);
a_full = double(subs(a_partial, b_partial))
a_full = 2×1
  -2.6390814897826e-41
   0.734664435474116
b_full = double(b_partial)
b_full = 2×1
  -0.106506080674488
   0.220130326363154
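The two rows are two stationary points of the summed residual. One way to pick the better pair (a sketch, continuing the variables above) is to evaluate the residual at each candidate and keep the minimum; here that would presumably reject the near-zero a root, which corresponds to the degenerate fit f1 = f2 = 0:
res = zeros(size(b_full));
for k = 1:numel(b_full)
    res(k) = double(subs(residue, [a b], [a_full(k) b_full(k)]));  %summed squared residual at candidate k
end
[~,kbest] = min(res);
a_best = a_full(kbest)
b_best = b_full(kbest)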

John D'Errico on 30 Mar 2023
Are the values of x the same for the two vectors y1 and y2?
You don't show any data, so it is difficult to help more, as there are several questions I have that would be trivially answered just from the data.
For example, IF x is the same for BOTH vectors y1 and y2, then, assuming there are none of the other issues I can think of, this would hold:
y1./y2 = b*x
If so, then you can trivially estimate b as:
b = x(:)\(y1(:)./y2(:));
Once you know b, then again, it is trivial to solve for a.
a = [b^2*x(:).^2./(1+ b^2*x(:).^2) ; b*x(:)./(1+ b^2*x(:).^2)]\[y1(:);y2(:)];
Can this fail? OF COURSE! If there are regions where y1 and y2 are very near zero, then you will be in a nearly divide-by-zero regime when you compute b. That will make this scheme poorly conditioned. But I cannot know that to be the case, since I don't have your data.
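For what it's worth, a minimal sketch of this two-step estimate on made-up data (the true values and noise level are assumptions, and y1, y2 play the role of the asker's G1, G2):
rng(2)
aTrue = 2; bTrue = 0.5;
x  = (0.5:0.5:10)';                           %avoid x = 0 so y2 stays away from zero
y1 = aTrue*bTrue^2*x.^2./(1 + bTrue^2*x.^2) + 0.01*randn(size(x));
y2 = aTrue*bTrue*x./(1 + bTrue^2*x.^2)      + 0.01*randn(size(x));
b  = x(:)\(y1(:)./y2(:));                     %slope of y1./y2 ~ b*x
a  = [b^2*x(:).^2./(1 + b^2*x(:).^2) ; b*x(:)./(1 + b^2*x(:).^2)]\[y1(:);y2(:)];
[aTrue bTrue; a b]                            %true vs estimated
The estimates can then also serve as starting values for a full nonlinear fit, as noted in the comments below.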
  2 Comments
Torsten on 30 Mar 2023
Doesn't it introduce a bias in b if you determine it by fitting y1/y2 ~ b*x ?
Similar to (but I think even worse than) fitting a and b from
log(y) = a*log(x) + log(b)
instead of
y = b*x^a
?
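A quick synthetic illustration of that effect (all values below are made up; this fits a power law y = b*x^a once through the log transform and once directly):
rng(0)
aTrue = 1.7; bTrue = 0.5;
x = linspace(1,10,100)';
y = bTrue*x.^aTrue + 0.1*randn(size(x));                        %additive noise on y
p_lin = [log(x) ones(size(x))]\log(y);                          %fit log(y) = a*log(x) + log(b)
a_lin = p_lin(1); b_lin = exp(p_lin(2));
p_nl  = lsqcurvefit(@(p,x) p(2)*x.^p(1), [a_lin b_lin], x, y);  %fit y = b*x^a directly
[aTrue bTrue; a_lin b_lin; p_nl]                                %true / log-transform fit / direct fit
With additive noise, the transformed fit typically comes out biased because the transform reweights the errors; a similar caveat applies, to a lesser degree, to estimating b from the ratio y1./y2.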
John D'Errico on 30 Mar 2023
Possibly there may be an issue. But without seeing any data, that is tough to know. If the noise is fairly low, and y2 is not near zero, then there should be little problem. And this solution would then be a very nice scheme to compute decent estimates.
At the very least, this would potentially be a very realistic way to estimate starting values for the parameters.
Again, without seeing any data, all is just a wild guess.

