This is a general question about whether anyone has seen this behavior before; I'm not trying to get complex code picked apart here.
I have a data handling program I wrote that needs to be extremely fast, and a lot of care has gone into its performance. Over the course of several versions I have achieved roughly 100x timing gains.
I then started using a parallel for, which gave me a 4x speedup, which was wonderful. However, when I ask it to chew on a larger version of my test files it just dies: it locks up my computer, and nothing responds until I force a reboot.
My smaller test files are identical except that they have fewer lines to parse.
When I use a 10,000-line test CSV, I get great performance. When I use a 50,000-line CSV, it falls over dead.
Has anyone seen this before? I have a fairly powerful machine to run this on, so I don't think it's a resource problem, although in Task Manager I do see about 90% memory usage.
Any general thoughts?