How to make the initial population in genetic algorithm fixed?

I am currently using a genetic algorithm to optimize some variables for a problem. I know that ga uses a creation function that creates a random initial population, but how do I stop it from creating a random initial population? I want to compare the results of each run of the algorithm with each other, and I can't compare them if the initial population is different each time.
% Create optimization variables
Cf17 = optimvar("Cf","LowerBound",1e-9,"UpperBound",1e-2);
Lc15 = optimvar("Lc","LowerBound",1e-9,"UpperBound",1e-2);
Lf16 = optimvar("Lf","LowerBound",1e-9,"UpperBound",1e-2);
Cc14 = optimvar("Cc","LowerBound",1e-9,"UpperBound",1e-2);
f17 = optimvar("f","LowerBound",1e3,"UpperBound",15e4);
% Set initial starting point for the solver
initialPoint16.Cf = repmat(1e-9,size(Cf17));
initialPoint16.Lc = repmat(1e-9,size(Lc15));
initialPoint16.Lf = repmat(1e-9,size(Lf16));
initialPoint16.Cc = repmat(1e-9,size(Cc14));
initialPoint16.f = repmat(1e3,size(f17));
% Create problem
problem = optimproblem;
% Define problem objective
problem.Objective = fcn2optimexpr(@objectiveFcn,Lc15,Cc14,Lf16,Cf17,f17);
% Define problem constraints
problem.Constraints = constraintFcn(Cf17,f17);
% Set nondefault solver options
options8 = optimoptions("ga","ConstraintTolerance",1e-06,"Display","iter",...
% Display problem information
OptimizationProblem :
    Solve for:
        Cc, Cf, Lc, Lf, f
    minimize :
        objectiveFcn(Lc, Cc, Lf, Cf, f)
    subject to :
        (Cf - (380 ./ (((2 .* f) .* 6) .* 14.2857))) == 0
    variable bounds:
        1e-09 <= Cc <= 0.01
        1e-09 <= Cf <= 0.01
        1e-09 <= Lc <= 0.01
        1e-09 <= Lf <= 0.01
        1000 <= f <= 150000
% Solve problem
[solution,objectiveValue] = solve(problem,initialPoint16,"Solver","ga",...
Solving problem using ga.

Single objective optimization:
    5 Variable(s)
    1 Nonlinear equality constraint(s)

Options:
    CreationFcn:   @gacreationuniform
    CrossoverFcn:  @crossoverscattered
    SelectionFcn:  @selectionroulette
    MutationFcn:   @mutationadaptfeasible

                                 Best       Max        Stall
Generation  Func-count    f(x)        Constraint  Generations
     1         111      -248.877      0.007184        0
     2         212      -248.877      0.007184        1
     3         313      -248.877      0.007184        2
     4         414      -248.877      0.007184        3
     5         515      -248.877      0.007184        4
     6         616      -248.877      0.007184        5
     7         717      -248.877      0.007184        6
     8         818      -248.877      0.007184        7
     9         919      -248.877      0.007184        8
    10        1020      -248.877      0.007184        9

Optimization terminated: maximum number of generations exceeded.
% Clear variables
clearvars Cf17 Lc15 Lf16 Cc14 f17 initialPoint16 options8
function objective = objectiveFcn(Lc,Cc,Lf,Cf,f)
% Note: DutyC, R, Vo, and s must be defined in scope for this function to
% run; freqresp below needs an LTI model, so s is presumably the Laplace
% variable, s = tf("s").
dp = 1-DutyC;
w1 = -Lc;
w2 = dp*R;
u1 = Lc*Cc*R;
u2 = Lc;
u3 = (dp^2)*R;
x1 = Lf*Cf*(Lc^2)*dp*R*Cc;
x2 = ((Lf+(Lc*dp*R)-((dp^2)*R*Lf*Cf))*Lc*Cc)+((Lc^2)*Lf*Cf*dp*R);
x3 = ((Lf+(Lc*dp*R)-((dp^2)*R*Lf*Cf))*Lc)+(Lf*Cf*Lc*(dp^3)*(R^2));
x4 = ((Lf+(Lc*dp*R)-((dp^2)*R*Lf*Cf))*(dp^2)*R)-((dp^2)*R*Lc);
x5 = -((dp^4)*(R^2));
y1 = Lc*Cc*Lf*Cf;
y2 = (Lc*Cc) + (Lc*Lf*Cf);
y3 = Lc+Lf+(Lf*Cf*(dp^2)*R);
y4 = R*(dp^2);
Gvd = (Vo/dp)*((s*w1+w2)/((s^2)*u1+(s*u2)+u3))*((((s^4)*x1)+((s^3)*x2)+((s^2)*x3)+(s*x4)+x5)/(((s^3)*y1)+((s^2)*y2)+(s*y3)+y4));
w = 1.542362224647084e+04;
H = freqresp(Gvd,w);
objective = -20*log(abs(H));
end

function constraints = constraintFcn(Cf,f)
% Note: Vo and R must also be defined in scope for this constraint.
V_Ripple = 14.2857;
d = 0.76;
constraints(1) = Cf - (d*Vo)/(2*f*R*V_Ripple) == 0;
end

Accepted Answer

Alan Weiss on 20 Jan 2023
Edited: Alan Weiss on 20 Jan 2023
In addition to what Walter said, you can set the global random seed.
rng default % Or rng(seed)
[x,fval] = solve(problem,"Solver","ga","Options",options);
And to set an initial point using the problem-based approach, follow the instructions in Initial Points for Global Optimization Toolbox Solvers.
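As a minimal sketch of the seeding approach (assuming the `problem`, `initialPoint16`, and `options8` variables from the question are in scope, and noting the `options8` definition is truncated in the question), resetting the generator before each run makes successive solve calls start from the same random state:

```matlab
% Reset the global random stream so ga draws the same random numbers
% each run; any fixed seed works, e.g. rng(42).
rng default
[sol1,fval1] = solve(problem,initialPoint16,"Solver","ga","Options",options8);

% Resetting to the same seed before the second run reproduces the
% first run, so the two results are directly comparable.
rng default
[sol2,fval2] = solve(problem,initialPoint16,"Solver","ga","Options",options8);
```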
Alan Weiss
MATLAB mathematical toolbox documentation
Walter Roberson on 20 Jan 2023
A challenge with using rng() is that you end up controlling not only the initial population matrix but also the evolution of the algorithm, and that evolution can be affected by the order of operations.
Consider, for example, one algorithm that loops making calls to the objective function and only checks the nonlinear constraints if the objective result is better than the previous one, compared to a second algorithm that loops checking nonlinear constraints and only calls the objective function for locations that pass the constraints. (If you imagine that an external program has to be invoked, then the first algorithm has time advantages if the constraints involve calling the external program, whereas the second algorithm has time advantages if the objective involves calling the external program.) But the difference in order of operations would "use up" random numbers in a different order, so even if the algorithms were in some sense exactly as good as each other, you could be led into thinking that one was better than the other because they happened to explore different spaces.
It is true, though, that this is going to be a challenge whether or not you use a fixed rng() seed. In any algorithm that involves randomness, you can end up in a different "basin of attraction" just by pseudo-chance.
Anyhow, using the same rng() seed is not enough to guarantee that the initial populations will be generated the same way. There is no control over the internals of the population generation, and no documentation of how each is done, so if one algorithm generated velocities first and the other algorithm generated positions first, then you would end up with different populations.
For checking how each algorithm works on a fixed population it is therefore better to pass in the exact population, rather than using rng() and hoping that means the algorithms will generate the same populations.
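Passing in the exact population, as suggested above, can be done through ga's InitialPopulationMatrix option, where each row is one individual and each column is one variable. A minimal sketch using the five variables and bounds from the question; the column-to-variable mapping assumed here should be confirmed with varindex(problem) before relying on it:

```matlab
% Draw one fixed population once, then reuse it across all experiments.
% Columns are assumed to be [Cc, Cf, Lc, Lf, f]; check varindex(problem).
popSize = 50;
lb = [1e-9 1e-9 1e-9 1e-9 1e3];
ub = [1e-2 1e-2 1e-2 1e-2 15e4];
rng(1)                                    % reproducible population draw
initPop = lb + rand(popSize,5).*(ub - lb);

opts = optimoptions("ga", ...
    "InitialPopulationMatrix",initPop, ...
    "PopulationSize",popSize);
[sol,fval] = solve(problem,"Solver","ga","Options",opts);
```

Because the population matrix itself is fixed, every run (and every algorithm variant you compare) starts from literally the same individuals, regardless of how each solver consumes random numbers afterwards.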


More Answers (0)




