Error in lsqnonlin() - "Failure in initial objective function evaluation."
I'm building a script to perform a non-linear least squares estimation. After prepping my data, I was able to generate the correct results into an array with the following function:
fun = @(x)((leis_psi_minus(x)*(mu*cons_phi(x)+(1-mu)*poll_negphi(x)))/(cons_phi_minus(x)*(rho*leis_psi(x)+(1-rho)*poll_negpsi(x))));
for x = 1:1000
    w2(x) = fun(x);
end
where leis_psi_minus, cons_phi, poll_negphi, cons_phi_minus, leis_psi, poll_negpsi are all arrays of data. But when I try to do lsqnonlin(fun,x0), I get:
Error in lsqnonlin (line 196) initVals.F = feval(funfcn{3},xCurrent,varargin{:});
Caused by: Failure in initial objective function evaluation. LSQNONLIN cannot continue.
I haven't used lsqnonlin before, so I'm trying to research if I need to adjust options, but any insight or advice would be greatly appreciated!
Answers (1)
Star Strider
on 13 Aug 2016
For the optimisation functions, the argument of the objective function is the vector of parameters you want to optimise. (Your anonymous objective function will pick up your data vectors from the workspace; it is not necessary to pass them as arguments.)
What function did you start with, and what do you want to do with it?
10 Comments
Philip Newell
on 13 Aug 2016
Star Strider
on 13 Aug 2016
I’m not following.
My observation is that ‘x’ in your function should be the vector of the parameters you want to optimise. You’re treating your data arrays as functions (it seems to me at least), and that’s confusing MATLAB.
MATLAB wants something like this:
t = ... ; % Vector Of Independent Variables
y = ... ; % Vector Of Dependent Variables
f = @(x) sum((y(:) - x(1).*t(:) + x(2)).^2); % Objective Function
This is just an example, not what you're doing; it illustrates the structure I'm trying to communicate.
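One caveat on that sketch: lsqnonlin squares and sums the residual vector internally, so for lsqnonlin specifically the objective usually returns the residuals themselves rather than sum((...).^2). A minimal sketch with made-up data (all values here are illustrative, not from the thread):

```matlab
% Sketch: fitting y = a*t + b with lsqnonlin (hypothetical data).
% The objective returns the residual VECTOR; lsqnonlin squares and
% sums it internally.
t = (0:9)';                         % independent variable
y = 2*t + 1 + 0.1*randn(10,1);      % dependent variable (synthetic)
resid = @(x) y - (x(1).*t + x(2));  % residual vector
x0 = [1; 0];                        % initial parameter guess
xfit = lsqnonlin(resid, x0);        % xfit should be close to [2; 1]
```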
Philip Newell
on 14 Aug 2016
Edited: Star Strider
on 14 Aug 2016
Star Strider
on 14 Aug 2016
My pleasure!
You need to completely vectorise your function, using element-wise operators for all multiplication and division (and exponentiation, although I don’t see any of those in your code), unless you intend to do matrix operations:
fun = @(x)((leis_psi_minus(:).*(x(1).*cons_phi(:)+(1-x(1)).*poll_negphi(:)))./(cons_phi_minus(:).*(x(2).*leis_psi(:)+(1-x(2)).*poll_negpsi(:))));
See if using that version improves your result.
That should do what you want. If it doesn’t, I’ll help as much as I can to get your code to work.
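Putting the pieces together, and assuming w2 holds the observed values of this ratio (that assumption is mine, not stated in the thread), the fit itself would look something like:

```matlab
% Sketch: mu and rho become the parameters x(1) and x(2).
% Assumes the six data arrays and the observed vector w2 are in the workspace.
model = @(x) (leis_psi_minus(:).*(x(1).*cons_phi(:)+(1-x(1)).*poll_negphi(:))) ...
          ./ (cons_phi_minus(:).*(x(2).*leis_psi(:)+(1-x(2)).*poll_negpsi(:)));
resid = @(x) model(x) - w2(:);   % lsqnonlin wants a residual vector
x0    = [0.5; 0.5];
xfit  = lsqnonlin(resid, x0);
```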
Philip Newell
on 14 Aug 2016
Edited: Philip Newell
on 14 Aug 2016
Star Strider
on 14 Aug 2016
You need a vector of two values for ‘x0’, since you have two parameters.
Other than that, run your function with the initial value of ‘x0’:
x0 = [0.5; 0.5];
test_call = fun(x0)
and see what it returns.
Philip Newell
on 14 Aug 2016
Edited: Philip Newell
on 14 Aug 2016
Star Strider
on 14 Aug 2016
Possibly.
I’m flying blind here, so you will have to experiment to get your function to work with lsqnonlin.
I’m out of ideas.
Philip Newell
on 15 Aug 2016
Star Strider
on 16 Aug 2016
Delete the observations with NaN values. That’s what the Statistics and Machine Learning Toolbox functions do.
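One way to do that, keeping all the arrays aligned (a sketch; the variable names are taken from the thread):

```matlab
% Drop every observation where any of the data arrays (or w2) is NaN.
bad = isnan(leis_psi_minus(:)) | isnan(cons_phi(:))  | isnan(poll_negphi(:)) ...
    | isnan(cons_phi_minus(:)) | isnan(leis_psi(:))  | isnan(poll_negpsi(:)) ...
    | isnan(w2(:));
leis_psi_minus = leis_psi_minus(~bad);
cons_phi       = cons_phi(~bad);
poll_negphi    = poll_negphi(~bad);
cons_phi_minus = cons_phi_minus(~bad);
leis_psi       = leis_psi(~bad);
poll_negpsi    = poll_negpsi(~bad);
w2             = w2(~bad);
```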
Providing your own Jacobian function can speed the regression considerably for large data sets. I don’t know if it improves the ability of the regression to find the global minimum if there are many local minima. The Jacobian you supply is not an objective function. You can include it within your objective function, as described in the lsqnonlin documentation for fun.
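For completeness, a sketch of what supplying a Jacobian looks like, using a simple illustrative model (x(1)*exp(x(2)*t), not the model from this thread). The option name varies by release ('SpecifyObjectiveGradient' in newer versions, 'Jacobian','on' in older ones), so check the lsqnonlin documentation for your version:

```matlab
% Sketch: objective returning both residuals and Jacobian.
% Local functions in scripts need R2016b+; otherwise put residJac in its own file.
t = (0:0.1:1)';
y = 3*exp(0.5*t);                 % synthetic data
x0 = [1; 0];
opts = optimoptions('lsqnonlin', 'SpecifyObjectiveGradient', true);
xfit = lsqnonlin(@(x) residJac(x, t, y), x0, [], [], opts);

function [F, J] = residJac(x, t, y)
    F = x(1)*exp(x(2)*t) - y;                        % residual vector
    if nargout > 1                                   % Jacobian requested
        J = [exp(x(2)*t), x(1)*t.*exp(x(2)*t)];      % [dF/dx1, dF/dx2]
    end
end
```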
You can only fit one objective function at a time, but you can fit as many models (objective functions) as you want. There are ways to compare two models to see which is the best fit, but that requires that both models describe the actual process that created your data with reasonable accuracy. (When I did that, I used the likelihood ratio test to determine if there was a significant difference between them. You have to determine if that is appropriate for your application.) Comparing multiple models (more than two) requires that you take the statistics of multiple comparisons into consideration. This varies with the test you’re doing, so I’ll leave that to you to research.