A faster way to read a large text file
Hi,
I am reading a large text file (~3 GB) in chunks. Each time I open the file, I read the header lines, then read a chunk of, let's say, 1000 lines, then I close the file before I process my data. The next time I want to read another chunk, I re-open the file and step through it with "fgetl" line by line until I reach the line where I stopped previously, and only then grab the new chunk of data. I was wondering if there is a faster way to grab chunks of lines instead of going through the file from the beginning, line by line, every single time.
Any help with this will be highly appreciated.
Best wishes
AA
Answers (2)
Jonathan Sullivan
on 2 Jul 2012
Edited: Jonathan Sullivan
on 2 Jul 2012
Before you close the file, store your current position in it with ftell; then, when you reopen it, fseek straight back to that position (it is a byte offset, not a line number, so the jump is instantaneous).
Right before you close:
file_position = ftell(fid);
Then after you open it:
fseek(fid,file_position,'bof');
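Putting both pieces together, a complete chunk-reading loop might look like the sketch below. The file name, number of header lines, and chunk size are placeholders for illustration; adjust them to your data. The key point is that ftell is called just before fclose and fseek right after fopen, so the header and previously read lines are never scanned again.
% Minimal sketch of chunked reading with ftell/fseek
% (file name, header count, and chunk size are assumptions)
fname      = 'mydata.txt';
n_header   = 5;        % header lines to skip
chunk_size = 1000;     % lines per chunk
% First open: skip the header and remember where the data starts
fid = fopen(fname,'r');
for k = 1:n_header
    fgetl(fid);
end
file_position = ftell(fid);       % byte offset of the first data line
fclose(fid);
while true
    % Re-open and jump straight to where the last chunk ended
    fid = fopen(fname,'r');
    fseek(fid,file_position,'bof');
    % Read one chunk of lines
    chunk = cell(chunk_size,1);
    n = 0;
    while n < chunk_size
        tline = fgetl(fid);
        if ~ischar(tline), break; end   % reached end of file
        n = n + 1;
        chunk{n} = tline;
    end
    chunk = chunk(1:n);
    file_position = ftell(fid);       % remember where we stopped
    fclose(fid);
    if isempty(chunk), break; end
    % ... process this chunk of lines here ...
end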