Initialize several variables at once

I currently have code in which I initialize many parameters that will be solved for later.
I do this by:
C1=params(1);
C2=params(2);
C3=params(3);
C4=params(4);
C5=params(5);
C6=params(6);
C7=params(7);
C8=params(8);
C9=params(9);
C10=params(10);
C11=params(11);
C12=params(12);
C13=params(13);
There's got to be an easier way...
Additionally, I set my initial guesses like:
params0=[0.1,0.1,0.1,0.1,0.1,0.1,0.1,0.1,0.1,0.1,0.1,0.1,0.1];
Is there a way to more easily set this, assuming they all have the same initial guess? Typing them all out that way is tedious, and my code is constantly changing. How can I automate it?

 Accepted Answer

Matt J on 19 Mar 2019
Edited: Matt J on 19 Mar 2019
If all you're trying to do is polynomial surface fits of various orders, I would just use the Curve Fitting Toolbox, if you have it:
f = fit([x,y],z, 'poly55')
See also,

22 Comments

I'm not really sure if this will work. I'm multiplying the polynomial by another function, so the total function being optimized is not just a polynomial. I didn't know if there was an easier way than what I was doing. Perhaps you could understand better if I just post the whole thing. Here is the code, and some data is attached if you want to run it. I'm just trying to get a good fit of this data. Note that the rdf_contact function cannot change.
clear
clc
close all
num=xlsread('C:\example.xlsx',1,'A2:F18110');
eta = num(:,3);
r = num(:,4);
g = num(:,6);
%Do the surface fit
options=optimoptions(@lsqcurvefit,'SpecifyObjectiveGradient',false);
xdata={r,eta};
params0=linspace(0.1, 0.1, 21);
[params, resnorm]=lsqcurvefit(@modelfun,params0,xdata,g,[],[],options)
for i=1:21
C(i)=params(i);
end
xmin = 1;
xmax = 2;
dx = 0.01;
etamin=0.10472;
etamax=0.55;
deta=0.01;
[Xgrid,etagrid]=ndgrid(xmin:dx:xmax,etamin:deta:etamax);
surf(Xgrid,etagrid,modelfun(params,{Xgrid,etagrid}))
zlim([0 8]);
hold on;
scatter3(r(:),eta(:),g(:),'MarkerEdgeColor','none',...
'MarkerFaceColor','b','MarkerFaceAlpha',.5); hold off
xlabel 'x', ylabel '\eta', zlabel 'g(x,\eta)'
function [out,Jacobian]=modelfun(params,xdata)
X=xdata{1};
Y=xdata{2};
for i=1:21
C(i)=params(i);
end
A1 = -0.4194;
A2 = 0.5812;
A3 = 0.6439;
A4 = 0.4730;
eta_c = 0.6452;
PV = 1 + 3*Y./(eta_c-Y)+ A1*(Y./eta_c) + 2*A2*(Y./eta_c).^2 + 3*A3*(Y./eta_c).^3 + 4*A4*(Y./eta_c).^4;
rdf_contact = (PV - 1)./(4*Y);
[I,J]=ndgrid(0:1);
d=I+J;
[orders{1:max(d(:)+1)}]=deal(0);
for k=1:numel(d)
i=I(k); j=J(k);
orders{i+j+1}=orders{i+j+1} + C(k) * X.^i .* Y.^j ;
end
zeroth = C(1) .* (X.^0) .* (Y.^0);
first = C(2) .* (X.^1) .* (Y.^0) + C(3) * (X.^0) .* (Y.^1);
second = C(4) .* (X.^2) .* (Y.^0) + C(5) * (X.^1) .* (Y.^1) + C(6) * (X.^0) .* (Y.^2);
third = C(7) .* (X.^3) .* (Y.^0) + C(8) * (X.^2) .* (Y.^1) + C(9) * (X.^1) .* (Y.^2) + C(10) * (X.^0) .* (Y.^3);
fourth = C(11) .* (X.^4) .* (Y.^0) + C(12) * (X.^3) .* (Y.^1) + C(13) * (X.^2) .* (Y.^2) + C(14) * (X.^1) .* (Y.^3)+ C(15) * (X.^0) .* (Y.^4);
fifth = C(16) .* (X.^5) .* (Y.^0) + C(17) * (X.^4) .* (Y.^1) + C(18) * (X.^3) .* (Y.^2) + C(19) * (X.^2) .* (Y.^3)+ C(20) * (X.^1) .* (Y.^4)+ C(21) * (X.^0) .* (Y.^5);
poly_guess = zeroth+first+second+third+fourth;
out = (poly_guess).*rdf_contact;
if nargout>1
%Jacobian=[X(:), Y(:), ones(size(X(:)))];
end
end
Matt J on 19 Mar 2019
Edited: Matt J on 19 Mar 2019
Still, there are tools on the File Exchange like these that make evaluating 2D polynomial functions and their derivatives easier, and which you could incorporate to ease the coding of your model function. You would just have to conform to the conventions of these tools in terms of the ordering of the polynomial coefficients.
Benjamin Cowen on 19 Mar 2019
Edited: Benjamin Cowen on 20 Mar 2019
But my function is more than just a polynomial. I multiply the polynomial by another function. Also, in my last post, you recommended the lsqcurvefit tool. Why are you suggesting another tool now? What is wrong with my current method? I'm just adding higher order polynomials; why do I need a File Exchange file to do this?
I guess it would help me if you could clarify why you are suggesting another tool now. What is wrong with lsqcurvefit?
How would I change my code to incorporate the poly55? It needs to work in context with the lsq model.
Note that in the code I posted above I included the snippet of the code you sent, but it is not being used since I am not sure how to use it.
Also, the link is just a link to this page.
I guess it would help me if you could clarify why you are suggesting another tool now. What is wrong with lsqcurvefit?
I'm no longer suggesting that you set lsqcurvefit aside, now that I know your model function isn't purely polynomial. What I am suggesting, though, is that the sections of your model function code that are 2D polynomial evaluations can be replaced/simplified with the evaluation tools from the link I gave you. For example, this part,
zeroth = C(1) .* (X.^0) .* (Y.^0);
first = C(2) .* (X.^1) .* (Y.^0) + C(3) * (X.^0) .* (Y.^1);
second = C(4) .* (X.^2) .* (Y.^0) + C(5) * (X.^1) .* (Y.^1) + C(6) * (X.^0) .* (Y.^2);
third = C(7) .* (X.^3) .* (Y.^0) + C(8) * (X.^2) .* (Y.^1) + C(9) * (X.^1) .* (Y.^2) + C(10) * (X.^0) .* (Y.^3);
fourth = C(11) .* (X.^4) .* (Y.^0) + C(12) * (X.^3) .* (Y.^1) + C(13) * (X.^2) .* (Y.^2) + C(14) * (X.^1) .* (Y.^3)+ C(15) * (X.^0) .* (Y.^4);
fifth = C(16) .* (X.^5) .* (Y.^0) + C(17) * (X.^4) .* (Y.^1) + C(18) * (X.^3) .* (Y.^2) + C(19) * (X.^2) .* (Y.^3)+ C(20) * (X.^1) .* (Y.^4)+ C(21) * (X.^0) .* (Y.^5);
poly_guess = zeroth+first+second+third+fourth;
can all be replaced with a single line of code,
poly_guess = polyVal2D(C,X,Y,5,5);
and you can easily play with other polynomial orders as well. The catch, however, is that polyVal2D has its own idea about how the coefficients C(i) are to be ordered, so you would have to conform to that convention.
Note also that by studying the code inside polyVal2D.m, you can find an answer to your originally posted question, as well. The code in polyVal2D.m works for any polynomial order without breaking the array C into separate variables C1,C2,....Cn.
I downloaded polyVal2D.m and put it in the same directory. This is where I downloaded it: https://www.mathworks.com/matlabcentral/fileexchange/41097-polyval2d-and-polyfit2d Then I ran it and got the errors below. If I change it to 2,2 or 3,3, it works. Why does 5,5 not work? 4,4 also does not work. Also, why does 2,2 give me so much better results than when I ran it with my own 2nd order polynomial?
When I ran my 2nd order, I got a resnorm of 1262. This code gives 528. Why is this one better? It is interesting... my 2nd order polynomial would only have 6 coefficients. Why does this one have 9?
Index exceeds array bounds.
Error in polyVal2D (line 64)
g = g.*x+p(mj+ni);
Error in fitting_rdf>modelfun (line 74)
poly_guess = polyVal2D(C,X,Y,5,5);
Error in lsqcurvefit (line 213)
initVals.F = feval(funfcn_x_xdata{3},xCurrent,XDATA,varargin{:});
Error in fitting_rdf (line 17)
[params, resnorm]=lsqcurvefit(@modelfun,params0,xdata,g,[],[],options)
Caused by:
Failure in initial objective function evaluation. LSQCURVEFIT cannot continue.
I'm guessing here a bit, because your latest code is hidden from me.
However, the reason is probably this: the polyVal2D code assumes that when you call
polyVal2D(C,X,Y,m,n);
you are evaluating a polynomial consisting of X.^m .* Y.^n and all lower order monomials. In other words, it expects (m+1)*(n+1) coefficients as input. You appear to be omitting some of the terms and giving a correspondingly shorter C vector than it expects. If there are terms you are certain you don't want to include, a way to handle that is to set the appropriate coefficients to zero.
The reason you are getting lower fitting error for your "2nd order" model is undoubtedly because polyVal2D includes more polynomial terms than your own implementation (9 instead of 6).
Where can I see how the coefficients are set up? How does the polynomial in that code differ from mine?
Matt J on 20 Mar 2019
Edited: Matt J on 20 Mar 2019
In the help doc for polyVal2D, or maybe in polyFit2D. I definitely saw it in there somewhere.
Ah yes, I found it. Thanks. One more question: I keep getting:
Solver stopped prematurely.
lsqcurvefit stopped because it exceeded the function evaluation limit,
options.MaxFunctionEvaluations = 5000 (the default value).
How can I change the default value so that more iterations are done? It currently does 100 * (number of C variables), so 5,000 since I initialized 50. Is there a way I can change this default setting? I notice that resnorm decreases if I allow more variables even if they aren't used, because it allows more iterations.
If I read it correctly, here is a way of generating a binary map of the coefficients that you have been including,
N=5; %order of the polynomial
map=(N:-1:0).'+(N:-1:0)<=N
So for example, with N=5, you get this
map =
6×6 logical array
0 0 0 0 0 1
0 0 0 0 1 1
0 0 0 1 1 1
0 0 1 1 1 1
0 1 1 1 1 1
1 1 1 1 1 1
In other words, polyVal2D expects a 6x6 matrix of coefficients as input and the above map variable shows which of the coefficients in the matrix you have been including.
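If it helps, here is an untested sketch of one way to use that map: keep the 21 coefficients you actually fit in a short vector, and scatter them into the full 6x6 matrix (with the omitted terms left at zero) before calling polyVal2D. Be aware that MATLAB's column-major linear indexing may not match the ordering of your original C(1)...C(21), so check polyVal2D's coefficient convention first.

```matlab
N = 5;                                 % polynomial order
map = (N:-1:0).' + (N:-1:0) <= N;      % true where a term is included
Cfull = zeros(N+1, N+1);               % all 36 slots; missing terms stay 0
Cfull(map) = Cshort;                   % Cshort: your 21 fitted coefficients
poly_guess = polyVal2D(Cfull, X, Y, N, N);
```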
I think you misread my question. How can I manually increase the number of iterations? Right now it is saying the solver stopped prematurely due to the default value of options.MaxFunctionEvaluations.
Add a MaxFunctionEvaluations parameter to your optimoptions call. https://www.mathworks.com/help/optim/ug/lsqcurvefit.html#buuhcjo-options
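For example, mirroring the options call used earlier in this thread (the limit values here are just illustrative):

```matlab
options = optimoptions(@lsqcurvefit, ...
    'SpecifyObjectiveGradient', false, ...
    'MaxFunctionEvaluations', 10000, ...   % raise the evaluation cap
    'MaxIterations', 2000);                % optionally raise iterations too
[params, resnorm] = lsqcurvefit(@modelfun, params0, xdata, g, [], [], options);
```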
No, I did not misread your question. My remark was in reference to earlier comments about the correspondence between your coefficients and polyVal2D's.
To adjust the options, however, you should get familiar with optimoptions. You have seen me use this in earlier examples.
I've tried adding this line in my code (outside of function) and then inside function, and it does not seem to change the MaxFunctionEvaluations
options = optimoptions('patternsearch','MaxFunctionEvaluations',100000);
Why 'patternsearch'?
What I meant was, why 'patternsearch' instead of 'lsqcurvefit'? Are you now trying to use a different solver?
ah ok got it, sorry
Once again, you are summarizing what you see instead of showing us the code you are running. So, once again, I will guess.
My guess is that you have created the options object, but you haven't actually passed it to lsqcurvefit:
[coefficients,resnorm]=lsqcurvefit(@modelfun,params0,{X,Y},g,[],[],options)
Benjamin on 20 Mar 2019
Edited: Benjamin on 20 Mar 2019
Sorry, I got that figured out. You were right, I wasn't passing it correctly. I have a question about the Jacobian: as these polynomials get very large, doesn't this mean I should just let MATLAB estimate the Jacobian? It seems tedious to calculate this for each different polynomial that may be optimizing over 20 different coefficients, right?
Benjamin on 20 Mar 2019
Edited: Benjamin on 20 Mar 2019
How can I constrain the fit? When eta (Y in this case) = 0, I need my function to be 1. It seems like C_00 needs to equal 1 and Ci0 = 0? Any ideas how to add this constraint?
Matt J on 20 Mar 2019
Edited: Matt J on 20 Mar 2019
I would post that as a new question. Our back and forth has already deviated considerably from your originally posted topic.
Yeah, I posted a new question. Thanks. I accepted the answer here; you are right, we have deviated a lot.


More Answers (3)

Matt J on 19 Mar 2019
Edited: Matt J on 19 Mar 2019
Is there a way to more easily set this, assuming they all have the same initial guess?
One way:
params0=linspace(0.1,0.1,13)
Another is:
params0=repelem(0.1,1,13);
There's got to be an easier way...
If you have a very large vector of variables, it does not make sense to assign them to separate variables. You should probably be manipulating them as a vector. But if you must index the variables individually, I would just rename params to something shorter,
C=params;
and then refer to the variables via indexing C(1), C(2),..C(13) throughout the code. Although, it is then unclear why you chose to name the vector params in the first place...
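Putting both suggestions together, a minimal sketch (13 is just the current number of coefficients):

```matlab
n = 13;                      % number of coefficients; change as the model grows
params0 = 0.1*ones(1, n);    % same initial guess for every coefficient
% ... run the solver to obtain params, then:
C = params;                  % refer to C(1), C(2), ..., C(n) by indexing
```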

4 Comments

Hmmm I like the idea... Here is what I have:
zeroth = C1 * (X.^0) .* (Y.^0);
first = C2 * (X.^1) .* (Y.^0) + C3 * (X.^0) .* (Y.^1);
How can I change these to C(1), etc., and quickly create all of them, instead of typing each one individually?
Nothing, I just had to create C. I did it like what I have below and it seems to be working
for i=1:13
C(i)=params(i);
end
Perhaps instead of zeroth, first, etc., you generate a cell array called "orders" like so:
T=[0 0; 1 0; 0 1]; %table of exponent combinations
d=sum(T,2); %total degree of each term
[orders{1:max(d)+1}]=deal(0); %initialize
for k=1:numel(C)
i=T(k,1); j=T(k,2);
orders{i+j+1}=orders{i+j+1} + C(k) * X.^i .* Y.^j ;
end
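As a side note, the exponent table T can be generated for any total order N, so raising the order becomes a one-line change. An untested sketch:

```matlab
N = 5;                            % total polynomial order
[I, J] = ndgrid(0:N, 0:N);        % all exponent pairs up to N in each variable
keep = (I + J) <= N;              % keep only terms with total degree <= N
T = [I(keep), J(keep)];           % each row is one [i j] exponent pair
% size(T,1) is (N+1)*(N+2)/2, e.g. 21 rows for N = 5
```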
I'm currently doing this, and I'm not exactly sure how to replace it with your code. The way I currently have it allows me to check the validity of my fit starting at zeroth order, and then I can individually add in higher orders. How can I use your code, which seems much more efficient, to do what I am doing below, where I just change the order to, say, 4 or 5?
I'm finding that a 5th order polynomial fits my data really well. Unfortunately, the surface extends beyond my data, and I am not sure how to automatically get rid of those values in order to make my surface look nice. I know you said the accuracy calculation is not affected, but my graph would look much better if the surface could be chopped off somehow in regions where there is no data.
zeroth = C(1) .* (X.^0) .* (Y.^0);
first = C(2) .* (X.^1) .* (Y.^0) + C(3) * (X.^0) .* (Y.^1);
second = C(4) .* (X.^2) .* (Y.^0) + C(5) * (X.^1) .* (Y.^1) + C(6) * (X.^0) .* (Y.^2);
third = C(7) .* (X.^3) .* (Y.^0) + C(8) * (X.^2) .* (Y.^1) + C(9) * (X.^1) .* (Y.^2) + C(10) * (X.^0) .* (Y.^3);
fourth = C(11) .* (X.^4) .* (Y.^0) + C(12) * (X.^3) .* (Y.^1) + C(13) * (X.^2) .* (Y.^2) + C(14) * (X.^1) .* (Y.^3)+ C(15) * (X.^0) .* (Y.^4);
fifth = C(16) .* (X.^5) .* (Y.^0) + C(17) * (X.^4) .* (Y.^1) + C(18) * (X.^3) .* (Y.^2) + C(19) * (X.^2) .* (Y.^3)+ C(20) * (X.^1) .* (Y.^4)+ C(21) * (X.^0) .* (Y.^5);
poly_guess = zeroth+first+second+third+fourth;


Matt J on 20 Mar 2019
Edited: Matt J on 20 Mar 2019
One other thing I would point out is the model function you have shown us here is linear in the coefficients. This means that it can be represented as a matrix multiplication and the entire fit can be done with simple linear algebra instead of an iterative solver like lsqcurvefit.
To facilitate this, you could download my func2mat utility and then do as follows:
A=func2mat(@(p) modelfun(p,xdata), params0);
coefficients =A\g(:);
That's it!
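If you'd rather not download func2mat, here is a transparent (if slower) sketch of the same idea for a model that is linear in its parameters: build A one column at a time by probing the model with unit vectors. This illustrates the principle only; it is not necessarily how func2mat is implemented internally.

```matlab
n = numel(params0);                         % number of coefficients
v0 = modelfun(zeros(size(params0)), xdata); % baseline (zero for a purely linear model)
A = zeros(numel(v0), n);
e = zeros(size(params0));
for k = 1:n
    e(k) = 1;                               % probe with the k-th unit vector
    v = modelfun(e, xdata);
    A(:,k) = v(:) - v0(:);                  % k-th column of the linear map
    e(k) = 0;
end
coefficients = A \ g(:);                    % linear least-squares fit
```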

4 Comments

It is better, because it is non-iterative. You don't need to change anything in your model function code (as long as it works, of course).
So what all do I need to change? I feel like my entire code is built around lsqcurvefit. I'm not sure how to change it to use this utility.
Nothing. Whatever modelfun() you were planning to use with lsqcurvefit, you simply use instead in the above 2 lines of code.
Hmmm, I can't seem to figure it out... Here is my code (in all of its glory). What do I need to do to change from lsq to matrix multiplication? Btw, the lsq method seems to be working really well. The fits are looking amazing, and I appreciate all your help with getting that to work!
clear
clc
close all
num=xlsread('C:\example.xlsx',1,'A2:F18110');
eta = num(:,3);
r = num(:,4);
g = num(:,6);
%Do the surface fit
options=optimoptions(@lsqcurvefit,'SpecifyObjectiveGradient',false,'MaxFunctionEvaluations',10000);
xdata={r,eta};
params0=linspace(0.1, 0.1, 50);
[params, resnorm]=lsqcurvefit(@modelfun,params0,xdata,g,[],[],options)
for i=1:50
C(i)=params(i);
end
xmin = 1; xmax = 2; dx = 0.01;
etamin=0.10472; etamax=0.55; deta=0.01;
[Xgrid,etagrid]=ndgrid(xmin:dx:xmax,etamin:deta:etamax);
surf(Xgrid,etagrid,modelfun(params,{Xgrid,etagrid}))
zlim([0 8]);
hold on;
scatter3(r(:),eta(:),g(:),'MarkerEdgeColor','none',...
'MarkerFaceColor','b','MarkerFaceAlpha',.5); hold off
xlabel 'x', ylabel '\eta', zlabel 'g(x,\eta)'
function [out,Jacobian]=modelfun(params,xdata)
X=xdata{1};
Y=xdata{2};
for i=1:50
C(i)=params(i);
end
A1 = -0.4194;
A2 = 0.5812;
A3 = 0.6439;
A4 = 0.4730;
eta_c = 0.6452;
PV = 1 + 3*Y./(eta_c-Y)+ A1*(Y./eta_c) + 2*A2*(Y./eta_c).^2 + 3*A3*(Y./eta_c).^3 + 4*A4*(Y./eta_c).^4;
rdf_contact = (PV - 1)./(4*Y);
poly_guess = polyVal2D(C,X,Y,5,5);
out = (poly_guess.*rdf_contact);
if nargout>1
%Jacobian=[X(:), Y(:), ones(size(X(:)))];
end
end


Just to sum things up here, below is my implementation of the fit, combining all my various recommendations. Both the algebraic method that I proposed and the method using lsqcurvefit are implemented and compared. You will see that the algebraic method is both faster and more accurate (gives a lower resnorm) than lsqcurvefit.
%% Data Set-up
num=xlsread('example.xlsx',1,'A2:F18110');
Np=4; %polynomial order
g = num(:,6);
otherStuff.r = num(:,4);
otherStuff.eta = num(:,3);
otherStuff.g = g;
otherStuff.eta_c = 0.6452;
otherStuff.Np=Np;
otherStuff.A1 = -0.4194;
otherStuff.A2 = 0.5812;
otherStuff.A3 = 0.6439;
otherStuff.A4 = 0.4730;
%% Fitting (2 methods)
%fit using matrix algebra
tic;
A=func2mat(@(p) modelfun(p,otherStuff), ones(Np+1,Np+1));
Cfit1 =A\g(:);
resnorm1=norm(A*Cfit1-g(:));
toc %Elapsed time is 0.124593 seconds.
%fit iteratively with lsqcurvefit
tic;
options=optimoptions(@lsqcurvefit,'MaxFunctionEvaluations',10000);
Cinitial(1:Np+1,1:Np+1)=0.1;
[Cfit2, resnorm2]=lsqcurvefit(@modelfun,Cinitial,otherStuff,g,[],[],options);
toc %Elapsed time is 1.687990 seconds.
%% Display
figure(1); showFit(Cfit1,otherStuff);
figure(2); showFit(Cfit2,otherStuff);
resnorm1,resnorm2,
function ghat=modelfun(C,otherStuff)
r = otherStuff.r;
eta = otherStuff.eta;
eta_c = otherStuff.eta_c;
Np = otherStuff.Np;
A1 = otherStuff.A1;
A2 = otherStuff.A2;
A3 = otherStuff.A3;
A4 = otherStuff.A4;
PV = 1 + 3*eta./(eta_c-eta)+ A1*(eta./eta_c) + 2*A2*(eta./eta_c).^2 +...
3*A3*(eta./eta_c).^3 + 4*A4*(eta./eta_c).^4;
rdf_contact = (PV - 1)./(4*eta);
Cflip=rot90(reshape(C,Np+1,Np+1),2);
poly_guess = polyVal2D(Cflip,r-1,eta/eta_c,Np,Np);
ghat = (poly_guess.*rdf_contact);
end
function showFit(Cfit,otherStuff)
r = otherStuff.r(:);
eta = otherStuff.eta(:);
g = otherStuff.g(:);
ghat=modelfun(Cfit,otherStuff);
tri = delaunay(r,eta);
%% Plot it with TRISURF
h=trisurf(tri, r, eta, ghat);
h.EdgeColor='b';
h.FaceColor='b';
axis vis3d
hold on;
scatter3(r,eta,g,'MarkerEdgeColor','none',...
'MarkerFaceColor','r','MarkerFaceAlpha',.05);
xlabel 'r', ylabel '\eta', zlabel 'g(r,\eta)'
hold off
end
Both methods give a similar looking surface plot:
