Maximum Array size limit

276 views (last 30 days)
lucksBi
lucksBi on 16 Mar 2017
Answered: Muniba Ashfaq on 11 Sep 2020
Hi, I am using R2016a and getting this error: "Requested 95319x84602 (60.1GB) array exceeds maximum array size preference. Creation of arrays greater than this limit may take a long time and cause MATLAB to become unresponsive. See array size limit or preference panel for more information."
I have already changed the maximum array size in Preferences to 10000 (the maximum allowed value). Any solution for this? Thanks in advance.
  2 Comments
KSSV
KSSV on 16 Mar 2017
Why do you need to load such a huge amount of data at once?
lucksBi
lucksBi on 16 Mar 2017
I am working on a dataset of size 841372x3 and need to perform some calculations on it.


Answers (4)

Adam
Adam on 16 Mar 2017
Any solution for what? The fact that you want a 60 GB array in memory? Do you even have 60 GB of RAM? What do you want to do with the array? Even if you could find enough memory for it, any calculations done on it would likely require still more memory.
doc matfile
can be used to create and access arrays too large to fit into memory, if you are sure you really need an array this big.
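A minimal sketch of the matfile approach (the file name, variable name, and block size below are hypothetical examples, not anything from the original question):

```matlab
% Create a MAT-file-backed variable without holding it in RAM.
% 'bigdata.mat', the variable name A, and the sizes are assumptions.
m = matfile('bigdata.mat', 'Writable', true);
m.A(95319, 84602) = 0;        % assigning past the end grows A on disk

% Process the array one block of columns at a time.
blockSize = 1000;
nCols = 84602;
for c = 1:blockSize:nCols
    cols  = c:min(c + blockSize - 1, nCols);
    block = m.A(:, cols);     % only this block is loaded into memory
    % ... compute on block here, then write results back if needed:
    m.A(:, cols) = block;
end
```

The key point is that indexing into `m.A` reads or writes only the requested part of the file, so peak memory stays at roughly one block rather than 60 GB.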

Jan
Jan on 16 Mar 2017
Edited: Jan on 16 Mar 2017
You set the preference to 10000 what? MB or GB? As far as I remember, this preference is set as a percentage of available RAM, so I'd expect something like 100%, but not "10'000".
Concerning your comment: 841372x3 means a 20 MB array (of doubles), while 95319x84602 is a 64 GB array. That is a remarkable difference.
Do you have 64 GB of free RAM, or better yet twice that, which is typically required when you actually want to work with such arrays?
Please read the documentation concerning tall arrays.
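A rough sketch of the tall-array workflow (note that tall arrays require R2016b or later, and the file name and variable name here are assumptions):

```matlab
% Build a tall array from a large delimited text file.
% 'hugefile.csv' and the column name Var1 are hypothetical.
ds  = tabularTextDatastore('hugefile.csv');
t   = tall(ds);                    % data stays on disk
avg = mean(t.Var1, 'omitnan');     % operations are only queued up
result = gather(avg);              % gather triggers the deferred computation
```

Until `gather` is called, nothing is actually read into memory in full; MATLAB streams the file in chunks behind the scenes.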
  1 Comment
Royi Avital
Royi Avital on 30 Mar 2017
Is there a command to set it (like with groot or something)?
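In newer releases (roughly R2018a onward) there is a settings API that appears to expose this preference; the exact setting names below are from memory, so please verify them against the documentation for your release:

```matlab
% Assumed setting names - check "Workspace and Variable Preferences" docs.
s = settings;
s.matlab.desktop.workspace.ArraySizeLimitEnabled.PersonalValue = false; % disable the check
% or raise the limit (interpreted as a percentage of RAM):
s.matlab.desktop.workspace.ArraySizeLimit.PersonalValue = 100;
```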



eun joo yoon
eun joo yoon on 6 Feb 2020
I have a similar problem.
I want to work with 500x500 m scale global data in MATLAB (an 80150x34036 array).
I converted the TIFF to ASCII in ArcMap.
But it will not open because of a lack of memory.
I wonder how other people deal with global data such as MODIS data.
  2 Comments
Rik
Rik on 6 Feb 2020
Have a read here and here. It will greatly improve your chances of getting an answer.
I'm unsure where to move your comment, so I will give you the opportunity to post it in the appropriate comment section or use it in your separate question. This answer will be deleted later today.
Steven Lord
Steven Lord on 6 Feb 2020
If you want to work with Big Data (data that's larger than you can store in memory) consider using one or more of the tools listed on this documentation page. For ASCII data that large, setting up a datastore for the file or files and using that datastore to create a tall array would probably be my first attempt.
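One way to sketch the datastore approach for a large ASCII export (the file name, column choice, and chunk size are all hypothetical):

```matlab
% Stream a large delimited ASCII file in chunks instead of loading it whole.
% 'global_grid.txt' and "column 1 holds the values" are assumptions.
ds = tabularTextDatastore('global_grid.txt');
ds.ReadSize = 10000;                 % rows per chunk

total = 0;
n = 0;
while hasdata(ds)
    chunk = read(ds);                % a table with up to ReadSize rows
    vals  = chunk{:, 1};
    total = total + sum(vals, 'omitnan');
    n     = n + sum(~isnan(vals));
end
meanVal = total / n;                 % e.g. a global mean, computed chunkwise
```

Wrapping the same datastore in `tall(ds)` gives the deferred-evaluation version of this, where MATLAB manages the chunking for you.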
Rik, while this question doesn't seem directly related to the original question it's at least in the same general space. I'm not sure it should be deleted.



Muniba Ashfaq
Muniba Ashfaq on 11 Sep 2020
I was also getting this error. I deleted all unnecessary variables from the workspace to free up memory, keeping only the variables needed for my MATLAB code to execute.
This worked amazingly.
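For reference, this cleanup can be done with a few standard commands (the variable names below are placeholders):

```matlab
% Inspect and free workspace memory; tmp, rawData, A, B are hypothetical names.
whos                      % list all variables with their sizes in bytes
clear tmp rawData         % remove specific variables you no longer need
clearvars -except A B     % or remove everything except the ones you still need
```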
