LinuxQuestions.org

LinuxQuestions.org (/questions/)
-   Programming (https://www.linuxquestions.org/questions/programming-9/)
-   -   dynamic mem allocation, how to change heap size? (https://www.linuxquestions.org/questions/programming-9/dynamic-mem-allocation-how-to-change-heap-size-614617/)

parv 01-18-2008 05:43 PM

dynamic mem allocation, how to change heap size?
 
In my program, I was trying to allocate a large chunk of
memory and then assign values to the array:
int *array = new(MAX);
for (....)
*(array+i) = some value;

but a segmentation fault occurred, and I figured out that it was because
the heap size was not enough. It always exits when i = 33790, or,
if it is an array of doubles, when i = 33790/2 = 16895.

So, I want to know if there is a way to increase the system heap size while
compiling the program with gcc. I guess using a vector could probably
avoid the problem, but I am curious: if vector indeed works out
perfectly, how does it handle dynamic memory allocation? In other words,
what is a good way to avoid the segmentation fault?

BTW, if I use an array directly, such as int array[MAX],
I can declare a larger array, but a segmentation fault still happens when MAX = 4*80k.
Here, 4*80k is not necessarily a threshold value like 33790 in the case above.

How do I figure out the system heap size, data memory size, stack size, and so on?

Thanks very much.

Shautieh 01-19-2008 03:57 PM

Do you really need an array this big? I'd be interested to know whether it works with a vector, by the way, as it shouldn't (a priori).
As for resizing the heap, I don't know either.

parv 01-19-2008 04:14 PM

Quote:

Originally Posted by Shautieh (Post 3028491)
Do you really need an array this big? I'd be interested to know whether it works with a vector, by the way, as it shouldn't (a priori).
As for resizing the heap, I don't know either.

I wanted to sort an array and demonstrate the run-time differences among
various algorithms, so a large array is necessary to show the difference.
I know an STL vector would work, but there are cases where the STL is not
available or not wanted. And before the STL came into this world, how
did people sort a large array?

ntubski 01-19-2008 05:00 PM

Code:

// this doesn't allocate an array;
// it allocates ONE int and initializes it with the value MAX
int *array = new(MAX);
// you want this
int *array = new int[MAX];

Quote:

How to figure out system heap size, data memory size, stack size and so on?
ulimit maybe?

parv 01-19-2008 06:25 PM

Quote:

Originally Posted by ntubski (Post 3028545)
Code:

// this doesn't allocate an array;
// it allocates ONE int and initializes it with the value MAX
int *array = new(MAX);
// you want this
int *array = new int[MAX];

ulimit maybe?

yes, new int[MAX]; that was a typo.
More details on ulimit?
I tried running ulimit -s with a much bigger number than what ulimit -a shows,
but it seemed to make no difference.

paulsm4 01-19-2008 08:04 PM

Hi -

I totally disagree with your conclusion that the problem is necessarily insufficient stack space or insufficient heap. You simply haven't given enough information to determine what the actual problem is yet.

Please do the following:
1. Give specific OS and compiler versions.

2. Run "ulimit" and cut/paste the results

3. Try this test program (or cut/paste the equivalent code)
Code:

#include <iostream>
#include <new>
using namespace std;

int
main ()
{
  unsigned long start = 0x8000L;

  for  (int i=0; i < 10; i++)
  {
    try {
      int* p;
      unsigned long size = (start << i);
      cout << "Allocating " << size << " int's...";
      p = new int[size];
      cout << "OK!" << endl;
      delete[] p;
    }
    catch( bad_alloc &ba) {
      cout << "ALLOC ERROR:" << endl << ba.what( ) << endl;
    }
  }
  return 0;
}

For whatever it's worth, here's my output:
Quote:

$ g++ -g -Wall -pedantic -o test test.cpp
$ ./test
Allocating 32768 int's...OK!
Allocating 65536 int's...OK!
Allocating 131072 int's...OK!
Allocating 262144 int's...OK!
Allocating 524288 int's...OK!
Allocating 1048576 int's...OK!
Allocating 2097152 int's...OK!
Allocating 4194304 int's...OK!
Allocating 8388608 int's...OK!
Allocating 16777216 int's...OK!

parv 01-19-2008 10:38 PM

What's the purpose of including #include <new>?
Other than that, I have already figured out what I did wrong.
Many thanks.

Quote:

Originally Posted by paulsm4 (Post 3028692)
Hi -

I totally disagree with your conclusion that the problem is necessarily insufficient stack space or insufficient heap. You simply haven't given enough information to determine what the actual problem is yet.

Please do the following:
1. Give specific OS and compiler versions.

2. Run "ulimit" and cut/paste the results

3. Try this test program (or cut/paste the equivalent code)
Code:

#include <iostream>
#include <new>
using namespace std;

int
main ()
{
  unsigned long start = 0x8000L;

  for  (int i=0; i < 10; i++)
  {
    try {
      int* p;
      unsigned long size = (start << i);
      cout << "Allocating " << size << " int's...";
      p = new int[size];
      cout << "OK!" << endl;
      delete[] p;
    }
    catch( bad_alloc &ba) {
      cout << "ALLOC ERROR:" << endl << ba.what( ) << endl;
    }
  }
  return 0;
}

For whatever it's worth, here's my output:


ntubski 01-19-2008 10:55 PM

Quote:

Originally Posted by parv (Post 3028611)
More details on ulimit?
I tried running ulimit -s with a much bigger number than what ulimit -a shows,
but it seemed to make no difference.

Code:

$ cat array.cc
const int GB = 1024*1024*1024, MAX = GB;
int main() {
    char array[MAX];

    array[0] = 43;
    array[MAX-1] = 42;

    return 0;
}
$ g++ -g -Wall -ansi -pedantic array.cc -o array
$ ulimit -s
8192
$ ./array
Segmentation fault
$ ulimit -s unlimited
$ ./array
$

I found that at about 1 GB + 110 MB it segfaults anyway. Limitations of the 32-bit address space, maybe?

Note that I used a char array so that MAX would be in bytes.

I believe #include <new> declares the bad_alloc exception class.

parv 01-19-2008 11:51 PM

got it.
Thanks very much for everybody's help.

Quote:

Originally Posted by ntubski (Post 3028790)
Code:

$ cat array.cc
const int GB = 1024*1024*1024, MAX = GB;
int main() {
    char array[MAX];

    array[0] = 43;
    array[MAX-1] = 42;

    return 0;
}
$ g++ -g -Wall -ansi -pedantic array.cc -o array
$ ulimit -s
8192
$ ./array
Segmentation fault
$ ulimit -s unlimited
$ ./array
$

I found that at about 1 GB + 110 MB it segfaults anyway. Limitations of the 32-bit address space, maybe?

Note that I used a char array so that MAX would be in bytes.

I believe #include <new> declares the bad_alloc exception class.


