How to avoid memory leaks with py.pandas.DataFrame objects
I was thrilled to see that pandas DataFrames now let us pass a MATLAB table to Python and read pandas DataFrames back into MATLAB. However, these objects lead to memory leaks that grow out of control when dealing with large amounts of data.
For example, the code below causes memory use to increase and never go back down:
for ii = 1:1000
    T = array2table(rand(100000,10));
    pyT = py.pandas.DataFrame(T);
    clear pyT
end
I've tried this to no avail:
for ii = 1:1000
    T = array2table(rand(100000,10));
    pyT = py.pandas.DataFrame(T);
    clear pyT
    py.gc.collect();
end
I've also tried this:
for ii = 1:1000
    T = array2table(rand(100000,10));
    pyT = py.pandas.DataFrame(T);
    delete(pyT)
end
I know this is still a very fresh release and it's a new feature, but does anyone have ideas on how to use DataFrames in MATLAB without causing memory issues? When I close MATLAB, the memory is deallocated just fine.
2 Comments
cui,xingxing on 3 Apr 2024
I reproduced your problem. One workaround is to create the data directly in Python, which avoids the unbounded memory growth:
for ii = 1:1000
    % T = array2table(rand(100000,10));
    T = py.numpy.random.rand(int32(100000), int32(10)); % create the data in Python instead
    pyT = py.pandas.DataFrame(T);
end
As for why passing large arrays from MATLAB to Python leaks memory, I'd ask MathWorks technical support. If you learn anything new, please post it below.
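If you really do need to convert MATLAB tables, another idea worth trying is to run Python out of process and periodically terminate the interpreter, which forces the operating system to reclaim all of its memory. This is a minimal sketch, not tested against this exact leak: terminate(pyenv) only applies to the out-of-process execution mode, and the interpreter restart on the next py.* call adds overhead, so terminating every iteration would be slow.
% Sketch only: the execution mode must be set before Python is
% first loaded in the MATLAB session.
pyenv("ExecutionMode", "OutOfProcess");

for ii = 1:1000
    T = array2table(rand(100000,10));
    pyT = py.pandas.DataFrame(T);
    clear pyT
    if mod(ii, 100) == 0
        terminate(pyenv);  % kill the Python process to release its memory
        % the interpreter restarts automatically on the next py.* call
    end
end
Note that terminating also discards any Python state (imported modules, variables held on the Python side), so this only fits workflows where each batch is self-contained.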
Answers (0)