
Unable to import large data txt file due to memory

Hi,
I have been struggling to import the data I need to process in MATLAB. It contains 3 fundamental cycles of 20 ms each, and I just need the last 20 ms. However, MATLAB will still not allow me to import that data section. I have tried to increase the Java heap size, but it is still not enough. Is there a way I can process my data without losing too many valuable data points?
The file is 18,142,928 KB, and each variable in the set has 101,799,346 values. The data goes from 0 to 60 ms, so I guess I only need 2/3 of the dataset, and I only need 3 variables. Again, I have tried to take just one variable at a time, but 2/3 of a variable is still too large.

Answers (1)

Venkat Siddarth Reddy on 6 May 2024
Edited: Venkat Siddarth Reddy on 6 May 2024
Hi,
You can try using "datastore", which is designed for working with data that is too large to fit into memory. It lets you read the data in smaller portions that do fit in memory, i.e., it facilitates the incremental import of data so you can access and process it piece by piece; a sketch is shown below.
To learn more, refer to the documentation for "datastore" in MATLAB.
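For example, here is a minimal sketch of a chunked read, assuming the file is a delimited text file with a header row. The file name "mydata.txt", the variable names "Time", "Va", "Vb", "Vc", and the assumption that time is stored in seconds are placeholders; adjust them to match your data.

% Point a tabular text datastore at the file (placeholder name).
ds = tabularTextDatastore("mydata.txt");

% Import only the variables you actually need (hypothetical names).
ds.SelectedVariableNames = {'Time','Va','Vb','Vc'};

% Read a manageable number of rows per call instead of the whole file.
ds.ReadSize = 1e6;

lastCycle = table();                        % accumulate only the last 20 ms
while hasdata(ds)
    chunk = read(ds);                       % one in-memory chunk at a time
    keep = chunk.Time >= 40e-3;             % keep rows from 40 ms onward
    lastCycle = [lastCycle; chunk(keep,:)]; %#ok<AGROW>
end

If the rows you keep are still too large to hold at once, you can instead process each chunk inside the loop (e.g., downsample or compute what you need from it) rather than accumulating everything into one table.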
I hope it helps!

Release

R2022b
