Writing files to .csv
Hi, I am curious to know about the execution speed of a task I want to perform. I have a lot of data (variables) to store that is generated at every time step. Currently, I am dumping the output to a .csv file at each timestep. Something like:
%demo
fname = 'C:\Users\Desktop\data\demo.csv';
fid = fopen(fname,'w');   % fid is the file identifier returned by fopen
for i = 1:10000
    % say these are the variables that I need to store
    x2 = rand(5,2);
    x3 = rand(6,3);
    fprintf(fid,'%d \t', i);                       % step index
    fprintf(fid,'%10.3e \t', x2(:,1)', x3(:,1)');  % data columns
    fprintf(fid,'\n');
end
fclose(fid);
Is this the fastest way? Or should I let MATLAB store all the values for all the variables in memory and then, after the run finishes, dump them to a .csv? Will either approach impact the execution speed of the actual model code which generates the data?
5 Comments
matquest
on 9 Apr 2020
Typically writing to a file is a very slow operation, and the more items you have to write, the longer it takes. I would personally keep the values stored in variables and write them all at the end because it's cleaner; however, that likely won't reduce the total write time, because you'll still have to loop over the same number of items.
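A minimal sketch of the collect-then-write approach, assuming the same variable shapes as the example above (5-element and 6-element columns) and using writematrix, which is available from R2019a:

```matlab
% Sketch: accumulate everything in a preallocated matrix, write once at the end.
nSteps = 10000;
nCols  = 1 + 5 + 6;              % step index + x2 column + x3 column
out = zeros(nSteps, nCols);      % preallocate to avoid growing the array

for i = 1:nSteps
    x2 = rand(5,2);
    x3 = rand(6,3);
    out(i,:) = [i, x2(:,1)', x3(:,1)'];
end

% Single write after the loop; on releases before R2019a, dlmwrite works too.
writematrix(out, 'demo.csv');
```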
Walter Roberson
on 9 Apr 2020
I agree with matquest, writing all at once is the most efficient.
An exception to that would be if you cannot reasonably estimate the number of values you will have, so you cannot pre-allocate -- and growing arrays incrementally is very inefficient. If you don't know how many values you will have and can't make a guess, or if you are going to be processing a continuous stream of data for some unknown time, then writing in batches of at least 8 kilobytes is suggested.
A second exception would be that you should never delay writing output for longer than you can afford to lose. For example if you can afford to lose 5 minutes of output, then you can afford to write every 5 minutes but you cannot afford to delay indefinitely.
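A sketch of the batched approach described above, assuming the same 12 columns per row as the original example (the batch size of 500 is an illustrative choice; tune it so each flush is at least a few kilobytes):

```matlab
% Sketch: buffer rows and flush in batches instead of one fprintf per step.
batchSize = 500;                      % illustrative; size so each flush >= ~8 KB
buf = zeros(batchSize, 12);           % 1 index column + 5 + 6 data columns
fid = fopen('demo_batched.csv', 'w');
row = 0;
for i = 1:10000
    x2 = rand(5,2);
    x3 = rand(6,3);
    row = row + 1;
    buf(row,:) = [i, x2(:,1)', x3(:,1)'];
    if row == batchSize
        % fprintf consumes a matrix in column order, so transpose to emit rows
        fprintf(fid, ['%d \t' repmat('%10.3e \t', 1, 11) '\n'], buf');
        row = 0;
    end
end
if row > 0                            % flush any leftover partial batch
    fprintf(fid, ['%d \t' repmat('%10.3e \t', 1, 11) '\n'], buf(1:row,:)');
end
fclose(fid);
```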
Angshuman Podder
on 10 Apr 2020
Walter Roberson
on 10 Apr 2020
Suppose you are logging data from an hour-long experiment. Suppose that if you lost 5 minutes of data, you could still get reasonable results, but losing the entire hour would be expensive. In such a case, every (say) 4 minutes you could take the data you had accumulated in the last 4 minutes and try to write it out, and if the write failed you would still have most of a minute to raise an alarm so that the people running the experiment could take appropriate action. Likewise, if you were writing every 4 minutes, then if the power failed or the system rebooted for some reason, you would lose at most 4-and-a-bit minutes' worth of data.
Collecting all of the data first and then writing it all out at once is the most efficient from the point of view of time and system resources used, but that kind of efficiency is not always the most important factor.
At the same time, if you do not absolutely need every value to be logged every time, then it is not advisable to log every result immediately, because that is costly in system resources.
Typically the most efficient hard disk usage is if you write out multiples of 8 kilobytes at a time, with the operating system overhead being least if you write out a number of those at a time, such as a megabyte.
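The time-bounded flushing described above can be sketched with tic/toc; the interval and column count here are assumptions carried over from the earlier examples:

```matlab
% Sketch: flush accumulated rows whenever flushInterval seconds have elapsed,
% bounding how much data a crash or power loss could cost.
flushInterval = 240;                  % seconds (the 4 minutes in the example above)
fid = fopen('demo_timed.csv', 'w');
pending = zeros(0, 12);               % rows accumulated since the last flush
lastFlush = tic;
for i = 1:10000
    x2 = rand(5,2);
    x3 = rand(6,3);
    pending(end+1,:) = [i, x2(:,1)', x3(:,1)']; %#ok<AGROW> buffer stays small
    if toc(lastFlush) >= flushInterval
        fprintf(fid, ['%d \t' repmat('%10.3e \t', 1, 11) '\n'], pending');
        pending = zeros(0, 12);
        lastFlush = tic;              % restart the interval timer
    end
end
if ~isempty(pending)                  % final flush of whatever remains
    fprintf(fid, ['%d \t' repmat('%10.3e \t', 1, 11) '\n'], pending');
end
fclose(fid);
```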
Angshuman Podder
on 10 Apr 2020
Answers (0)