MATLAB Answers

Memory usage very high

179 views (last 30 days)
Julia Rhyins
Julia Rhyins on 22 Nov 2019
Commented: tianyuan wang on 30 Jul 2020 at 11:18
I always have problems with MATLAB (R2019b) using too much memory (far more than the variables I have saved). Currently I'm running a function to extract data from a number of structures. I paused the function because the amount of RAM being used just doesn't make any sense. Task Manager says that MATLAB is using 4.7 GB of memory, even though I'm not running anything right now. The total size of all the variables in my workspace is ~0.055 GB and I have no figure windows open. The only two programs running on my computer are MATLAB and Task Manager. Is there any reason MATLAB would be using so much memory, and is there a way for me to reduce it?



Answers (2)

Jose Sanchez
Jose Sanchez on 28 Jan 2020
I am having a similar issue while running on an HPC cluster.
My university cluster allows me to use up to 520 workers, where each HPC node (4 workers) has 8 GB of RAM. I verified that the RAM consumed inside my parfor loop was no higher than 500 MB. However, when I run on the cluster using 100 parallel processes, the cluster crashes with an "Out of Memory" error.
I then ran a test locally on my PC (32 GB RAM), and I can see clearly that every worker consumes over 2 GB of RAM, several times the amount consumed within each parfor iteration.
In my opinion, MATLAB is clearly doing something that is not working as expected! I did not notice this issue with MATLAB R2017a on our HPC, despite using the cluster very often.
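One pragmatic workaround, sketched below under the assumption of a 'local'-style cluster profile (substitute your own profile name), is to request fewer workers per node so that the ~2 GB per-worker overhead plus the parfor working set still fits in a node's 8 GB:

```matlab
% Sketch: trade parallelism for memory headroom. With ~2 GB of overhead
% per worker, 4 workers no longer fit in an 8 GB node, but 2 might.
c = parcluster('local');   % substitute your cluster profile name
c.NumWorkers = 2;          % hypothetical reduced worker count per node
pool = parpool(c, c.NumWorkers);
parfor k = 1:100
    % ... work that stays under ~500 MB per iteration ...
end
delete(pool);
```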


mustafa mete
mustafa mete on 19 May 2020
Hey, were you able to solve this problem? I now have the same problem and don't know how to deal with it.
Mohammad Sami
Mohammad Sami on 20 May 2020
R2020a introduced a new thread-based parallel pool. It has some limitations compared with the process-based pool, but if your code is compatible with thread-based pools, you can use one instead to reduce the memory overhead of process-based workers.
You can read more details here
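As a minimal sketch (assuming your code uses only features supported by thread-based pools), switching is a one-line change:

```matlab
% Thread-based workers live inside the main MATLAB process and share its
% memory, avoiding the per-process overhead of a process-based pool
pool = parpool('threads');
parfor k = 1:100
    % ... thread-compatible work ...
end
delete(pool);
```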
tianyuan wang
tianyuan wang on 30 Jul 2020 at 11:18
I have the same problem. If I run MATLAB on the control node of our HPC (one control node, 17 compute nodes, distributed memory) to process a very large matrix, can the new version of MATLAB solve the problem of insufficient memory on a single node?


Jan on 22 Nov 2019
How do you observe the memory consumption? Task Manager displays the memory reserved for MATLAB. If MATLAB allocates memory and releases it afterwards, it is not necessarily handed back to the OS immediately. As long as no other application asks for that memory, it is efficient to keep it reserved.
Does it cause any trouble that the OS reserves 4.7 GB of RAM for MATLAB? Why do you say that this is "too much" memory?
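To see MATLAB's own accounting of its memory use (as opposed to the working set the OS has reserved for the process), the memory command reports this on Windows:

```matlab
% Windows only: MATLAB's own view of its memory use, which can be much
% lower than the working set shown for the process in Task Manager
m = memory;
fprintf('Used by MATLAB arrays:  %.2f GB\n', m.MemUsedMATLAB / 2^30);
fprintf('Largest possible array: %.2f GB\n', m.MaxPossibleArrayBytes / 2^30);
```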
Although the current memory consumption is small, growing arrays can temporarily need much more memory. Example:
x = [];
for k = 1:1e6
    x(k) = k;   % grows x by one element on every iteration
end
Although the final array x occupies only 8 MB of RAM (plus about 100 bytes for the header), the intermediate need for RAM is much higher: the cumulative amount allocated over all iterations is sum(1:1e6)*8 bytes, about 4 terabytes. Explanation: if x = [1], the next step x(2) = 2 duplicates the former array and appends a new element. Although the intermediately used memory is released, there is no guaranteed time limit for the freeing.
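For comparison, a sketch of the same loop with preallocation, which performs the 8 MB allocation once instead of copying the array on every iteration:

```matlab
x = zeros(1, 1e6);   % allocate the full 8 MB up front
for k = 1:1e6
    x(k) = k;        % fill in place; no copy-and-grow
end
```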
Can you post some code which reproduces the problem?


Julia Rhyins
Julia Rhyins on 25 Nov 2019
Someone pointed out to me that .mat files are compressed, so a file may be larger when loaded than what I see in the file explorer. I think this may be my problem, because each file that I load contains a couple of figure handles. In terms of the error, I think there is some initial memory warning, but it is quickly buried in the Command Window by continuous printing of 'Warning: Error updating line. Update failed for unknown reason'.
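One way to check this (with 'data.mat' as a placeholder filename) is whos, which reports the uncompressed in-memory size of each variable in a MAT-file without loading it:

```matlab
% List each variable's in-memory (uncompressed) size; compare the total
% with the compressed file size shown in the file explorer
info = whos('-file', 'data.mat');   % 'data.mat' is a hypothetical name
for k = 1:numel(info)
    fprintf('%-20s %12d bytes\n', info(k).name, info(k).bytes);
end
```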
Walter Roberson
Walter Roberson on 25 Nov 2019
The total size of all the variables in my workspace is ~0.055gb and I have no figure windows open.
each file that I load contains a couple of figure handles.
There is a contradiction there. The only way to load figure handles is to create figure windows from them. Those figures might not be visible but they are open. And if you do not close those figures after you are finished with them then the memory for the (possibly not visible) figures will add up.
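A minimal sketch of releasing that memory, assuming (hypothetically) that the MAT-file stores a figure handle in a variable named fig:

```matlab
s = load('results.mat');   % hypothetical file containing a figure handle;
                           % loading recreates the (possibly invisible) figure
% ... extract whatever data you need from the figure ...
close(s.fig);              % destroy the recreated figure window
clear s                    % drop the remaining references
```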
tom3w on 24 Jun 2020
Jan mentions that Task Manager reports the memory reserved for MATLAB.
Is there a way to know the effective amount of memory used by MATLAB, via Task Manager, Resource Monitor, or any other means (preferably not from within MATLAB)?
For now, I can't find any memory metric showing an effective use much lower than 3.7 GB (with an empty MATLAB session)...
