Out of memory due to plotting

Martin Mössner
Martin Mössner on 21 Jun 2019
Answered: Martin Mössner on 25 Jun 2019
I have a large batch program that runs approximately 400 simulations. Each simulation is a function call, say a call to fcn, that draws 22 plots. Outside of fcn there are almost no variables (approx. 100 numbers). Inside fcn there are local data, which are read from files. After a plot is drawn, it is exported to a PNG image. Before each simulation, all figures of the prior call are closed with close all. At the end of the call to fcn, all local data should be freed.
The problem is that the whole program runs out of memory after approx. 40 function calls, i.e. after approx. 800 plots.
Using the Windows Task Manager I can observe that the memory continuously grows to its maximum of 16 GB. I presume this is due to plotting.
Each simulation is a function call that receives some parameters (10 numbers); all other data are read inside the function call. So after the function call, all local variables should be freed.
Since all figures are closed periodically (close all statement), the associated data should be freed as well.
What can I do?
  3 Comments
Martin Mössner
Martin Mössner on 21 Jun 2019
@Jan the problem is not related to the local data of the function, which I called fcn above. The local data in fcn amount to some 100 MB. In any case, all these data should be released after each function call, because they are LOCAL variables.
The problem lies in internal data structures of the figure objects (e.g. fonts, etc.), which presumably are not released after closing the figure.
In every case where Matlab crashes, it gives messages that hint at the graphics system: 1) out of memory during plotting, 2) low-level graphics error.
Jan
Jan on 21 Jun 2019
Edited: Jan on 21 Jun 2019
"the problem lies in internal data structures of the figure objects" - do you have any evidence for this assumption? If the memory is exhausted, the code can fail during creating a figure also, because this is an expensive job. But this does not mean, that the figure command has a problem.
Please post copies of the complete error messages. A "low level graphics error" can be caused by an outdated driver of the graphics board or a problem in the Software-OpenGL emulation.
That the sum of the memory used by the local variables does not exceed 100MB does not mean, that their creation cannot exhaust the memory manager completely. See:
S = []; for k = 1:1e6, S(k)=k; end
This needs only 8 MB at the end, but during the creation about 4 terabytes (sum(1:1e6) * 8 bytes) are allocated in total by the operating system, because the growing array is reallocated and copied in every iteration.
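For comparison, a minimal sketch of the same loop with preallocation, which performs a single allocation instead of a million reallocations:

```matlab
% Preallocate the full array once: one 8 MB allocation up front,
% then every iteration writes in place instead of growing the array.
S = zeros(1, 1e6);
for k = 1:1e6
    S(k) = k;
end
```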
So please post some code which reproduces the problem and a copy of the error messages, and mention the used hardware and the versions of the operating system and of Matlab. Then there is a chance that the forum can help you.
Did you try already to save the data and to create the diagrams afterwards?
If the code is written in a modular way, you can replace the calculation parts by creating some random arrays to feed the part that produces the graphics. If the code still fails, this would be a stronger hint for a problem in the graphics tools. I do not get a problem if I open 400 figures one after the other, each with 22 axes objects, with Matlab R2018b on an 8 GB RAM machine under Windows 10. I have also opened thousands of figures in a Matlab session, when my code runs all the unit tests and some integration tests. In one test there were 9000 figures with 12 subplots, some legends, buttons, menus and text objects, and up to 40 lines per axes, with the OpenGL and Painters renderers. This worked with Matlab 5.3, 6.5, 2009a, 2011b, 2015b, 2016b and 2018b.
There is a problem in your case, but you did not provide enough details yet to narrow it down.


Answers (3)

Steven Lord
Steven Lord on 21 Jun 2019
"The problem is that the whole program runs out of memory after approx. 40 function calls, i.e. after approx. 800 plots."
Does this mean you have 800 figure windows open simultaneously? Or is your maximum on the order of 20 plots open simultaneously?
Even if you end the function call that defined the data and created the plot, thus deleting the variable in the function workspace as the function workspace itself gets deleted, the graphics object in the figure window needs to store the data. [If you're plotting using plot, use findobj to find the handle of the line plot created and check its XData and YData properties to see this.] If you don't need to keep all 20 (or all 800?!) figures open simultaneously, don't.
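A minimal sketch of the inspection described above, assuming the plot was created with plot (the variable names are illustrative):

```matlab
% plot stores a copy of the data inside the line object of the figure,
% so the memory stays in use even after the originating workspace is gone.
h = plot(1:1000, rand(1, 1000));
hLine = findobj(gca, 'Type', 'line');  % find the line object in the current axes
numel(hLine.XData)  % 1000 -- the figure still holds every x-value
numel(hLine.YData)  % 1000 -- ... and every y-value
```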
You mention your data is hundreds of MB in size. How many points is this, and if you were to plot each one as a separate pixel on your screen would you use a substantial fraction of your pixels? If so consider if you need to plot each data point, or if plotting every other point (or every fifth, or every tenth, etc.) would give you a plot that's accurate enough for your purposes while reducing the memory required to create it. Maybe smoothing your data with smoothdata would help.
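A sketch of such a thinning step, assuming the curve is smooth enough that every tenth sample is still accurate for your purposes (the signal here is made up for illustration):

```matlab
t = linspace(0, 10, 1e6);            % large synthetic data set
s = sin(t);
step = 10;                           % keep every 10th sample
plot(t(1:step:end), s(1:step:end));  % 1e5 points instead of 1e6
% Optionally smooth before thinning, so single outliers are not dropped:
% s = smoothdata(s, 'movmean', step);
```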

Martin Mössner
Martin Mössner on 24 Jun 2019
Thanks to Jan and Steven Lord I got some new insights! Thanks a lot.
First of all, the size of my local variables is not the problem. Each of the 20 plots, which are open simultaneously, draws 3D surfaces by drawing 20 lines in the x-direction and 20 lines in the y-direction. Each line consists of 200 points.
Since you both very insistently urged that plotting is not the problem, I began to search somewhere else. As a first step I made a version of my Matlab program that just reads the calculated data from file and does only the plotting. And - big surprise - after a few hours of plotting the job finished. So plotting is not the problem.
The parts which I commented out were calls to external DLLs which do the number crunching. They do musculoskeletal simulation stuff, implemented with OpenSim. These DLLs must have small memory leaks, which grow large after hundreds of consecutive calls. Maybe it is just bad garbage collection.
Now I understand the cause of the problem, but do not have a solution yet.
As I mentioned above, the whole calculation can be split into 800 computations, which can be done separately. At present I have a Matlab program that does this with a loop that calls a function; I called this function fcn above. In each call, fcn gets some parameters that determine what it should do.
At present I see three possibilities to handle this problem:
1) Call each function by hand ... I do not want to do this, since it is a lot of senseless work.
2) Write a DOS batch program that calls Matlab to execute an m-file, say fcn.m. After finishing fcn.m, Matlab terminates. For the sake of simplicity (as a first step), fcn.m can get its input arguments by reading a configuration file.
3) In the scenario above, when performing the loop over fcn(...), I could free the whole workspace after each single call of fcn. The problem is that I would have to free memory that was assigned to Matlab by calling an external DLL.
My question would be: Is there a possibility to do 2) or 3)?
Sincerely, Martin
  2 Comments
Image Analyst
Image Analyst on 24 Jun 2019
You can explicitly force variables to clear if you want, just to make darn sure they are gone:
clear('myvar1', 'myVar2', 'thirdVariable', 'giganticImage', 'otherUnneededVariable');
Jan
Jan on 24 Jun 2019
@Martin: Of course you can start Matlab from a batch script. Older Matlab versions used the -r flag to define a script or function to run. You can also start the next iteration from within the function itself:
function fcn(num)
if nargin == 0
    num = 1;
end
yourCodeToBeExecuted(num);
if num < 20
    % Launch a fresh Matlab instance for the next iteration, then quit this one.
    % Quoting the -r argument avoids trouble with the parentheses on the command line.
    system(sprintf('matlab -r "fcn(%d)"', num + 1));
    exit;
else
    disp('ready');
end
end



Martin Mössner
Martin Mössner on 25 Jun 2019
Thanks again to Jan, I think I have a workaround for my problem.
Calling Matlab from a batch file with the -r option works, but I have to shut down Matlab after each function call and then start it again for the next one. This has to be done to get rid of the huge amount of allocated data, which is not freed.
My solution runs like this:
Dos Batch file: mk.bat
@echo off
matlab -nosplash -wait -r fcn(1)
timeout /t 6
matlab -nosplash -wait -r fcn(2)
Matlab File: fcn.m
function fcn( i )
format compact
get(0,'DefaultFigurePosition')
set(0,'DefaultFigurePosition',[1320,558,560,420])
nam0 = sprintf('tmp%1i_diary',i)
diary(nam0)
diary on
t = (1:0.1:2)'
s = sin(i*t)
tmp = [t,s]
figure(1)
line(t,s), grid on
nam1 = sprintf('fig%1i.png',i)
print(nam1,'-dpng')
nam2 = sprintf('data%1i.dat',i)
save(nam2,'tmp','-ascii')
diary off
exit
end
The Matlab file does some calculations and writes the results to files, which are indexed by the parameter i. In my application I additionally make the figures invisible. The diary file is for documentation.
The exit call at the end of the function is important: it shuts down Matlab after the work is done.
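Making the figures invisible, as mentioned above, can be sketched like this (the data here are illustrative; printing to PNG works for figures that never appear on screen):

```matlab
t = (1:0.1:2)';
s = sin(t);
fig = figure('Visible', 'off');  % create the figure without showing a window
line(t, s), grid on
print(fig, 'fig1.png', '-dpng')  % export works for invisible figures too
close(fig)                       % release the figure explicitly
```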
