LinuxQuestions.org


mayankladoia 07-13-2010 08:39 PM

terminate called after throwing an instance of 'std::bad_alloc'
 
I have written code in which I use vectors:


vector< vector<double> > arr;         // 2-D array of doubles
arr.resize(x, vector<double>(x, 0));  // x rows, each holding x zero-initialized doubles

x is a variable read from a very big text file (> 64 MB).


The first line of my code is:
cout<<"\n\nWait Running...";


My code takes a text file as input, reads its data, and generates an output text file.


The code runs fine for small data (tried up to x = 10),
but when running with large data, i.e. x = 5,000,000 approx., it gives an error.
Even the first line of the code is not displayed.
NOTE: the variable is declared globally, but its size is set in main.

The error that I get after approx. 2-3 minutes is:

terminate called after throwing an instance of 'std::bad_alloc'
what(): std::bad_alloc
Wait Running...Aborted (core dumped)


-----------------------------------------------------------------------
Please help, and thanks in advance.

JohnGraham 07-15-2010 08:08 AM

Quote:

Originally Posted by mayankladoia (Post 4032448)
arr.resize(x,vector<double>(x,0));

but when running with large data, i.e. x = 5,000,000 approx., it gives an error.

I'll bet it is.



The long story:

What you're doing is making x vectors, and each one of them gets a different copy of a vector that has enough storage space for x doubles.

On my machine, a double is 8 bytes. 5,000,000 doubles is 8 * 5,000,000 = 40,000,000 bytes = 39,062.5 kB ≈ 38.1 MB. That's the size (the absolute minimum size) of each vector<double>. Make 5,000,000 of those and the total is 40,000,000 bytes * 5,000,000 = 2 * 10^14 bytes ≈ 190,734,863 MB ≈ 186,264.5 GB ≈ 181.9 TB.

I'd bet it's actually impossible for you to have that much hard disk space, and I know it's physically impossible for you to have that much RAM.
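
As a sanity check, here's a back-of-the-envelope version of that arithmetic in C++ (my sketch, not from the thread; x is hardcoded for illustration):

#include <cstdio>

int main() {
    const double x = 5000000.0;        // hypothetical matrix dimension
    const double bytes = x * x * 8.0;  // x*x doubles at 8 bytes each
    std::printf("%.1f TB\n", bytes / (1024.0 * 1024.0 * 1024.0 * 1024.0));
    // prints "181.9 TB", the minimum footprint of the full x-by-x matrix
    return 0;
}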



The short story:

Use a smaller x.
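
And if x really does come from user input, it's better to fail gracefully than to let std::bad_alloc abort the program. A minimal sketch (mine, not the OP's code; the 4 GiB budget and the hardcoded x are arbitrary examples):

#include <cstddef>
#include <iostream>
#include <new>        // std::bad_alloc
#include <vector>

int main() {
    std::size_t x = 5000000;  // in the real program this is read from the input file

    // Refuse obviously impossible requests before touching the allocator.
    const double needed = static_cast<double>(x) * x * sizeof(double);
    if (needed > 4.0 * 1024 * 1024 * 1024) {  // arbitrary 4 GiB budget
        std::cerr << "x = " << x << " needs ~" << needed / (1u << 30) << " GiB; refusing\n";
        return 1;
    }

    std::cout << "Wait Running..." << std::flush;  // without the flush, buffered
                                                   // output may not appear before a crash
    try {
        std::vector< std::vector<double> > arr(x, std::vector<double>(x, 0.0));
    } catch (const std::bad_alloc&) {
        std::cerr << "allocation failed for x = " << x << "\n";
        return 1;
    }
    return 0;
}

This is also why "Wait Running..." didn't appear immediately: cout output is buffered, and without a flush it can sit in the buffer until the program exits.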

Sergei Steshenko 07-15-2010 08:29 AM

Quote:

Originally Posted by JohnGraham (Post 4033908)
...
The short story:

Use a smaller x.

The story to begin with: watch memory consumption in 'top' or similar.
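
If you'd rather log it from inside the program than watch 'top' by hand, here's a Linux-specific sketch (my addition, not Sergei's) that prints the process's resident set size (the RES column in 'top') from /proc/self/status:

#include <fstream>
#include <iostream>
#include <string>

// Print this process's current resident memory, e.g. "VmRSS:   123456 kB".
void print_rss() {
    std::ifstream status("/proc/self/status");
    std::string line;
    while (std::getline(status, line))
        if (line.rfind("VmRSS:", 0) == 0)  // keep only the line starting with "VmRSS:"
            std::cout << line << '\n';
}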

mayankladoia 07-16-2010 03:08 PM

Quote:

Originally Posted by Sergei Steshenko (Post 4033941)
The story to begin with: watch memory consumption in 'top' or similar.

Thanks a lot for your reply, it was truly very helpful.
Can you tell me one thing: is it the RAM size or the HDD size to which the size of dynamically allocated memory is directly proportional?

Sergei Steshenko 07-16-2010 03:40 PM

Quote:

Originally Posted by mayankladoia (Post 4035680)
...
Can you tell me one thing: is it the RAM size or the HDD size to which the size of dynamically allocated memory is directly proportional?

I don't understand your question.

What entity is directly proportional to what other entity?

Why did you mention HDD size?

Sergei Steshenko 07-16-2010 03:42 PM

Did you read 'man top' and the man pages of the other utilities mentioned at the end of it?

johnsfine 07-16-2010 07:54 PM

Quote:

Originally Posted by Sergei Steshenko (Post 4035720)
Why did you mention HDD size?

If you were using a CPU architecture that allowed that much virtual address space, and you had that much HDD space and you had configured that HDD space as swap partitions, then in theory you could allocate a data structure that big (though not in a reasonable amount of time).

The simple answer could have been that what you were trying is impossible with current hardware. But JohnGraham apparently didn't want to just say "impossible" without considering extreme hardware that might make it borderline possible.

Certainly CERN and Google and several other such organizations have no trouble making 200 TB of disk space directly accessible to one computer. That still doesn't get you to the point of being able to take such a crude approach to working on nearly 200 TB of data. CERN and Google and others certainly do attack problems that involve data sets that large, but not so crudely. For ordinary computer users that is still an impossible scale of problem.
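
For what it's worth, the ceiling described above (RAM plus configured swap, within the CPU's virtual address space) can be queried at runtime. A Linux-specific sketch using sysinfo(2) (my addition, not from the thread):

#include <sys/sysinfo.h>

#include <iostream>

int main() {
    struct sysinfo si;
    if (sysinfo(&si) != 0)
        return 1;
    const double unit = si.mem_unit;  // bytes per unit for the fields below
    std::cout << "RAM:  " << si.totalram  * unit / (1u << 30) << " GiB\n"
              << "Swap: " << si.totalswap * unit / (1u << 30) << " GiB\n";
    return 0;
}

Together these bound how large a single in-memory data structure can get; disk space that isn't configured as swap doesn't count.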

