Out of memory due to plotting
I have a large batch program that should calculate approx. 400 simulations. Each simulation is a function call, say a call to fcn, that draws 22 plots. Outside of fcn there are almost no variables (approx. 100 numbers). Inside fcn there are local data, which are read from files. After a plot is drawn, it is exported to a PNG image. Before each simulation, all figures of the prior call are closed with close all. At the end of the call to fcn, all local data should be freed.
The problem is that the whole program runs out of memory after approx. 40 function calls, i.e. after approx. 800 plots.
Using the Windows Task Manager I can observe that the memory usage grows continuously up to its maximum of 16 GB. I presume this is due to plotting.
Each simulation is a function call that receives some parameters (10 numbers); all other data are read inside the function. So after the function call all local variables should be freed.
Since all figures are closed periodically (with the close all statement), the associated data should be freed as well.
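The driver loop looks roughly like this (the function and parameter names here are simplified placeholders, not the real code):
for k = 1:400
    close all                  % close the 22 figures of the previous simulation
    fcn(parameterSet(k, :));   % reads its own data from files, draws 22 plots,
end                            % and writes each of them to a PNG file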
What can I do?
3 Comments
Jan on 21 Jun 2019 (edited 21 Jun 2019)
"the problem lies in internal data structures of the figure objects" - do you have any evidence for this assumption? If the memory is exhausted, the code can also fail while creating a figure, because that is an expensive operation. But this does not mean that the figure command has a problem.
Please post copies of the complete error messages. A "low-level graphics error" can be caused by an outdated driver of the graphics board or by a problem in the software OpenGL emulation.
That the sum of the memory used by the local variables does not exceed 100 MB does not mean that their creation cannot exhaust the memory manager completely. See:
S = []; for k = 1:1e6, S(k)=k; end
This array needs only 8 MB at the end, but during its creation about 4 terabytes (sum(1:1e6) * 8 bytes) are allocated by the operating system, because the growing array is reallocated and copied in every iteration.
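For contrast, a preallocated version of the same loop requests the 8 MB only once:
S = zeros(1, 1e6);   % allocate the final size up front
for k = 1:1e6
   S(k) = k;         % no reallocation, no copying
end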
So please post some code which reproduces the problem and a copy of the error messages, and mention the hardware and the versions of the operating system and of MATLAB that you use. Then there is a chance that the forum can help you.
Have you already tried saving the data and creating the diagrams afterwards?
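A possible two-pass layout (run_simulation, plot_simulation and the file names are placeholders, and the result is assumed to be a struct):
% Pass 1: run the simulations and store only the numeric results.
for k = 1:400
    result = run_simulation(k);                        % existing calculation code
    save(sprintf('result_%03d.mat', k), '-struct', 'result');
end
% Pass 2: load each result and produce the 22 plots, e.g. in a separate session.
for k = 1:400
    result = load(sprintf('result_%03d.mat', k));
    plot_simulation(result, k);                        % existing plotting code
    close all
end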
If the code is written in a modular way, you can replace the parts for the calculations by creating some random arrays and feed them to the part that produces the graphics (a sketch of such a test follows below). If the code still fails, this would be a stronger hint for a problem in the graphics tools.
I do not get a problem if I open 400 figures one after the other, each with 22 axes objects, with MATLAB R2018b on an 8 GB RAM machine under Windows 10. I have also opened thousands of figures in one MATLAB session when my code runs all the unit tests and some integration tests. In one test there were 9000 figures with 12 subplots, some legends, buttons, menus and text objects, and up to 40 lines per axes, with the OpenGL and Painters renderers. This worked with MATLAB 5.3, 6.5, 2009a, 2011b, 2015b, 2016b and 2018b.
There is a problem in your case, but you have not provided enough details yet to narrow it down.
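A minimal version of such an isolation test could look like this (the figure layout and data sizes are only placeholders):
for k = 1:400
    close all
    fig = figure('Visible', 'off');
    for a = 1:22
        ax = subplot(5, 5, a, 'Parent', fig);
        plot(ax, rand(1, 1e4));                 % random data instead of real results
    end
    saveas(fig, sprintf('dummy_%03d.png', k));  % export a PNG like the real program
end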
Answers (3)
Steven Lord on 21 Jun 2019
"the problem is that the whole program runs out of memory after approx. 40 function calls, i.e. after approx 800 plots."
Does this mean you have 800 figure windows open simultaneously? Or is your maximum on the order of 20 plots open simultaneously?
Even if you end the function call that defined the data and created the plot, thus deleting the variable as the function workspace itself gets deleted, the graphics objects in the figure window still need to store the data. [If you're plotting using plot, use findobj to find the handle of the line it created and check the line's XData and YData properties to see this.] If you don't need to keep all 20 (or all 800?!) figures open simultaneously, don't.
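For example, a rough check of how much data the open figures are still holding (assuming the plots are line plots):
hLines  = findobj(groot, 'Type', 'line');              % all line objects in all open figures
nPoints = sum(arrayfun(@(h) numel(h.XData), hLines));
fprintf('%d line objects holding %d data points\n', numel(hLines), nPoints);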
You mention your data is hundreds of MB in size. How many points is that, and if you were to plot each one as a separate pixel on your screen, would you use a substantial fraction of your pixels? If so, consider whether you need to plot every data point, or whether plotting every other point (or every fifth, or every tenth, etc.) would give you a plot that is accurate enough for your purposes while reducing the memory required to create it. Maybe smoothing your data with smoothdata would help.
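For example, assuming x and y are the vectors being plotted:
step = 10;                               % keep every 10th point (adjust as needed)
plot(x(1:step:end), y(1:step:end));
% or smooth first and plot the smoothed curve instead
plot(x, smoothdata(y, 'movmean', step));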
0 Comments
Martin Mössner on 24 Jun 2019
2 Comments
Image Analyst on 24 Jun 2019
You can explicitly force variables to clear if you want, just to make darn sure they are gone:
clear('myvar1', 'myVar2', 'thirdVariable', 'giganticImage', 'otherUnneededVariable');
Jan on 24 Jun 2019
@Martin: Of course you can start MATLAB from a batch script. Older MATLAB versions used the -r flag to define a script or function to run. You can also let each run start the next one:
function fcn(num)
if nargin == 0
   num = 1;
end
yourCodeToBeExecuted(num);   % run one chunk of the work in this MATLAB session
if num < 20
   % start a fresh MATLAB process for the next chunk, then leave this one
   system(sprintf('matlab -r "fcn(%d)"', num + 1));
   exit;
else
   disp('ready');
end
end
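With this pattern each chunk of the work runs in a fresh MATLAB process, so any memory held by graphics objects is released when the process exits. The chain can be started from the operating-system prompt with something like matlab -r "fcn(1)" (the exact quoting depends on the shell and MATLAB version).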