Matlab slowing down while reading netCDF

Andrew Yool on 16 Aug 2011
Commented: shunping chen on 23 Feb 2023
I'm experiencing a problem when I try to read data from a succession of netCDF files. After a certain point (~10 loops of variable y in the example below), the reading operation suddenly slows to a crawl (from a few seconds per read to hundreds or thousands of seconds per read) and never recovers (though it "happily" runs overnight if I let it). The way I work around it is to save my read-in data frequently, and just kill and then restart Matlab to pick up from where it got itself into a tizz.
I am currently using customised libraries to read netCDF-4, but I've experienced the same problem in the past with netCDF-3. In the deeper past, and with a much older version of Matlab, I had a faintly similar problem that stemmed from me not closing the files I was reading from, but that doesn't seem to apply here (I'm not even sure if there's a way to close open files now).
About the only thing I've been able to establish is that it appears to be a function of the number of read operations (19 per file below) rather than the number of files being read (336 in total below).
Anyway, is there some basic rule of file management in Matlab that I'm ignorantly flouting? It wouldn't be the first time.
Andrew.
-----
fld2d = zeros([292 362 28 19]);
for y = 1:28
    for m = 1:12
        iname = sprintf('EXP500/%d/ORCA1_%dm%dD.nc', 1965+y, 1965+y, m);
        for f = 1:19
            t1 = nc_varget(iname, fld(f,1:flen(f)));
            fld2d(:,:,y,f) = fld2d(:,:,y,f) + (t1 / 12);
        end
    end
end
Ashish Uthama on 16 Aug 2011
If you are in a position to upgrade, R2010b onwards supports netCDF-4 out of the box; no additional setup required.
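With the native interface, opening and closing the file is also explicit, which might address your file-handle worry. A minimal sketch (the file name follows your sprintf pattern; the variable name 'votemper' is just a placeholder):
ncid  = netcdf.open('EXP500/1966/ORCA1_1966m1D.nc', 'NC_NOWRITE');
varid = netcdf.inqVarID(ncid, 'votemper');
t1    = netcdf.getVar(ncid, varid);   % read the whole variable
netcdf.close(ncid);                   % the file handle is released explicitly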
Andrew Yool on 17 Aug 2011
I'm afraid that's not an option just yet. We're stuck here at v2009a, and I'm a little worried that the precarious financial situation may persuade our IT services to skip upgrading for a bit. Anyway, thanks very much for letting me know about v2010b - I'll be prodding my IT group to see if I can convince them to adopt it.


Answers (5)

Andrew Yool on 13 Feb 2012
For those who come across this archaic question while looking for an answer to a similar problem, a solution (or workaround) that I've found is to use the command ...
clear functions
I'm not sure exactly what this is doing, but it appears to flush whatever internal buffer of Matlab's is causing the problem, and it does so while leaving variables, etc., all in place. In the loop from my question it would slot in as sketched below.
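For example (clearing once per file is just a guess at a sensible frequency):
for y = 1:28
    for m = 1:12
        iname = sprintf('EXP500/%d/ORCA1_%dm%dD.nc', 1965+y, 1965+y, m);
        for f = 1:19
            t1 = nc_varget(iname, fld(f,1:flen(f)));
            fld2d(:,:,y,f) = fld2d(:,:,y,f) + (t1 / 12);
        end
        clear functions   % flush Matlab's cached functions/buffers after each file
    end
end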
Andrew.
shunping chen on 23 Feb 2023
I also tried clear mex, but it didn't get any better. It looks like clear makes no difference for me.
I tried profile, but I'm not sure I can get any improvements from the reports.



Kelly Kearney on 16 Aug 2011
What version of Matlab are you running? I don't have a clear answer for you, but I experienced some rather dramatic slowdown of the snctools functions when I upgraded to a version of Matlab with native netcdf support. That was more of an across-the-board slowdown, though, rather than the gradual slowing you describe.
Andrew Yool on 17 Aug 2011
We (that's the institutional "we") bought it to do very specific tasks that involved large memory overheads. Basically high resolution ocean modelling (where "high" = 1/4 or 1/12 degree global resolution). The downside is that, while fast, its speed is not as impressive as its memory. So if too many people use it at once (not unusual), it can be slow even while it's more happily managing a vast memory overhead. Oh, and no, it's not free. ;-)
Jerry Gregoire on 12 Jun 2012
If you are not using 64-bit Matlab on a 64-bit machine, your 200 GB of RAM will be worthless: Matlab will only see about 2 GB for workspace use.
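On Windows you can check what Matlab actually sees with the memory command, e.g.:
[userview, systemview] = memory;   % Windows only
userview.MaxPossibleArrayBytes     % largest single array Matlab could allocate
userview.MemAvailableAllArrays     % total memory available for data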



Ashish Uthama on 17 Aug 2011
I really have no clue; wild guesses:
  • Instead of having to restart MATLAB, try issuing a clear mex after a read (see the sketch after this list); how does this impact the next read?
  • Does the order of the files being read make a difference? Maybe the second file is 'different' in some way?
  • Could you elaborate on what you mean by 'customised libraries'? It might help some of us try to replicate this.
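The first suggestion would look something like this in your inner loop (how often to clear is a guess):
t1 = nc_varget(iname, fld(f,1:flen(f)));
clear mex   % unload MEX-files after the read; does the next read stay fast?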
Andrew Yool on 17 Aug 2011
I'll try your first suggestion when I'm in Matlab next (should be later this PM). It certainly "feels" like Matlab has hit up against some limit, but I just can't see what this might be.
I don't think the file order matters at all. If I kill Matlab then start it up again, the "second" file is read just as fast as the "first" was on the previous instance. Usually, I'm quite a few files down before I hit up against any problems. And a restart of Matlab always solves them (... at first).
By "customised libraries" all I meant was that our third party toolbox for reading netCDF (which we started using before native netCDF was an option) was recompiled with netCDF-4 libraries instead of the usual netCDF-3 ones. I'm afraid my grasp on the specifics doesn't extend much further than that.
Thanks again for your help. I'll report back once I've tried your first suggestion.



Jan on 17 Aug 2011
Please use the profiler to find out which lines need the most time. Perhaps there is a missing pre-allocation in the function nc_varget, which you did not show?
Jan on 17 Aug 2011
See "help profile"... E.g. you can type this in the command line:
profile on
<start your code>
profile report
profile off
Ashish Uthama
Ashish Uthama on 17 Aug 2011
http://www.mathworks.com/help/techdoc/matlab_env/f9-17018.html



Kevin on 14 Jun 2013
I have had similar problems, and this worked for me. At the end of your loop (whatever it does that accesses files and plots figures), add the following line just before the end statement that returns to the top of the loop:
close all force
This helps!
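i.e. something like this (nfiles and the loop body are placeholders):
for k = 1:nfiles
    % ... open file k, read data, plot figures ...
    close all force   % force-close every open figure before the next pass
end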
