terminate called after throwing an instance of 'std::bad_alloc'
I have written a program in which I have used nested vectors:

    vector< vector<double> > arr;
    arr.resize(x, vector<double>(x, 0));

x is a variable read from a very big text file (> 64 MB). The first line of my code is cout << "\n\nWait Running..."; the program takes a text file as input, processes its data, and generates an output text file. The code runs fine for small data (tried up to x = 10), but when I run it with large data, i.e. x = 5,000,000 approx, it gives an error; even the first line of output is not displayed. NOTE: the variable is declared global, but its size is set in main. The error I am getting after approx 2-3 minutes is:

    terminate called after throwing an instance of 'std::bad_alloc'
      what(): std::bad_alloc
    Wait Running...Aborted (core dumped)

Please help, and thanks in advance.
Quote:
The long story: what you're doing is making x vectors, and each one of them gets its own copy of a vector with enough storage for x doubles. On my machine a double is 8 bytes, so 5,000,000 doubles take 8 * 5,000,000 = 40,000,000 bytes = 39,062.5 kB = 38.1 MB. That's the size (the absolute minimum size) of each vector<double>. Make 5,000,000 of those and that's 38.1 MB * 5,000,000 = 190,500,000 MB = 186,035.1 GB = 181.7 TB.

I'd bet it's actually impossible for you to have that much hard disk space, and I know it's physically impossible for you to have that much RAM.

The short story: use a smaller x.
Quote:
Can you tell me one thing: is the size of dynamically allocated memory directly proportional to the RAM size or to the HDD size?
Quote:
What entity is directly proportional to what other entity? Why did you mention HDD size?
Did you read 'man top' and the manpages of the other utilities mentioned at the end of it?
Quote:
The simple answer could have been that what you were trying is impossible with current hardware. But JohnGraham apparently didn't want to just say "impossible" without considering extreme hardware that might bring it to borderline possible.

Certainly CERN and Google and several other such organizations have no trouble making 200 TB of disk space directly accessible to one computer. That still doesn't get you to the point of being able to take such a crude approach to working on nearly 200 TB of data. CERN and Google and others certainly do attack problems that involve data sets that large, but not so crudely.

For ordinary computer users, that is still an impossible scale of problem.