Hi
I am writing a piece of software that creates a starting guess for a simulation of Ginzburg-Landau theory, and I have the following problem:
When I create a small system, with say 100*100*100*7 data points, everything works fine, but when I create a big system (300*300*300*7 points), it segfaults.
I am not storing the data in any array; rather, I compute the value for one point, write it to file, then move on to the next.
After about 70,000,000 values the program segfaults. So I am wondering: is there a limit on file size? The file is about 700 MB when it crashes. Do I need to split the output into multiple files, or is the problem elsewhere?
I write the data using:
ofstream out;
out << someData;
The code is compiled 64-bit with icc. I have tried it under OS X and Linux, on machines with 4-8 GB of RAM.
Best
Carlis