LinuxQuestions.org
Welcome to the most active Linux Forum on the web.
Programming: This forum is for all programming questions.
The question does not have to be directly related to Linux and any language is fair game.

Old 03-15-2011, 01:14 AM   #1
anoopbt
LQ Newbie
 
Registered: Feb 2011
Posts: 8

Rep: Reputation: 0
What is the maximum heap size for a process?


Hi,

Is there any maximum limit to heap memory allocation?
My program is in Perl and I am running it on a Solaris system. When I ran "pmap pid" (pid = my process ID), it showed a number of heap mappings, several of them gigabytes in size. This single process is eating up most of the physical memory.
Is this normal, and is there any way to get the heap memory size down?

Thanks in advance,
Anoop.
 
Old 03-15-2011, 09:06 AM   #2
johnsfine
LQ Guru
 
Registered: Dec 2007
Distribution: Centos
Posts: 5,286

Rep: Reputation: 1195
Quote:
Originally Posted by anoopbt View Post
I am using a Solaris system. When I did "pmap pid" (pid = my process ID), it showed a number of heap mappings, all of them gigabytes in size. This single process is eating up most of the physical memory.
Are you sure it is eating that much physical memory?

It is common for heap mappings in a process to be mostly "demand zero". That means the process has requested that address space for heap, but hasn't used it yet. Until the process actually uses each page, that page has no physical existence in either RAM or swap.
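The distinction is easy to demonstrate. The sketch below is Linux-specific (it reads /proc/self/status, whereas the thread is about Solaris), but the demand-zero behaviour is the same on both: reserving a large anonymous mapping barely changes resident memory until the pages are actually touched.

```python
import mmap
import re

def resident_kb():
    # Read this process's resident set size from /proc/self/status (Linux-specific).
    with open("/proc/self/status") as f:
        return int(re.search(r"VmRSS:\s+(\d+) kB", f.read()).group(1))

SIZE = 64 * 1024 * 1024           # reserve 64 MB of anonymous memory
m = mmap.mmap(-1, SIZE)           # demand-zero mapping: address space, but no pages yet

before = resident_kb()
for off in range(0, SIZE, 4096):  # touch one byte per page...
    m[off] = 1                    # ...so each page now gets backed by real RAM
after = resident_kb()

print(after - before)             # grows by roughly SIZE/1024 kB
```

Tools like pmap report the whole mapping either way, which is why virtual size alone can look alarming while physical memory use is modest.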

Quote:
Is it normal
I have no knowledge regarding your Perl program, so I won't guess at its normal heap use.

Quote:
and is there any way to get the heap memory size down?
I'm not sure about Solaris, but I think it has something similar to ulimit that could be set to prevent the process from taking too much heap. In most cases, preventing a process from taking the heap that it wants would crash the process rather than make the process function with less heap.
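For illustration, here is roughly what "the process fails rather than runs with less heap" looks like once an address-space limit is in place. This sketch uses Python's resource module on Linux (RLIMIT_AS); the limit values are arbitrary, and Solaris's own ulimit/resource-control mechanisms are analogous but not shown.

```python
import resource

# Cap this process's total address space at 1 GB (Linux RLIMIT_AS; the
# values here are arbitrary, chosen only for the demo).
soft, hard = resource.getrlimit(resource.RLIMIT_AS)
resource.setrlimit(resource.RLIMIT_AS, (1 << 30, hard))

try:
    big = bytearray(2 << 30)   # ask for ~2 GB of heap, more than the limit allows
    outcome = "allocated"
except MemoryError:
    outcome = "MemoryError"    # Python surfaces the failed allocation as an
                               # exception; a C program would typically just die
resource.setrlimit(resource.RLIMIT_AS, (soft, hard))  # restore the original limit

print(outcome)
```

The allocation simply fails at the limit; nothing makes the program function with less heap than it asked for.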

There might be some configuration for Perl that makes it preallocate heap less aggressively; if the heap is mostly demand zero, there might be a way to reduce that.

But with both of the above half-baked ideas, reducing the heap size is only possible when it is pointless to do so. If the heap is mainly demand zero, there is no reason to shrink it. The part of the heap that isn't demand zero could only be restricted by forcing the process to fail.
 
Old 03-16-2011, 04:43 AM   #3
anoopbt
LQ Newbie
 
Registered: Feb 2011
Posts: 8

Original Poster
Rep: Reputation: 0
Thank you John for your reply.

Quote:
Originally Posted by johnsfine View Post
Are you sure it is eating that much physical memory?
Yes, I am quite sure it's taking most of my memory, because it slows down my system, and when I kill the process, memory usage drops drastically.

Quote:
It is common for heap mappings in a process to be mostly "demand zero". That means the process has requested that address space for heap, but hasn't used it. Until the process actually uses each page, that page has no physical existence in either ram or swap.
I don't think it's "demand zero", because I am using a lot of dynamic memory inside a loop. I 'undef' everything after every iteration to free that space, but the memory usage still keeps increasing.
I am using hashes, and I have read that hashing demands more memory. I am deleting the hashes and arrays I use after every iteration, like this:

@array = ();   # empty the array ( @array = [] would instead leave a single arrayref element )
%hash  = ();   # empty the hash (note the % sigil; "@hash = ()" clears an array, not a hash)
undef @array;  # release the array's storage
undef %hash;   # release the hash's storage

Quote:
I'm not sure about Solaris, but I think it has something similar to ulimit that could be set to prevent the process from taking too much heap. In most cases, preventing a process from taking the heap that it wants would crash the process rather than make the process function with less heap.
In my program there is no point in reducing heap space, since it genuinely needs that memory. My issue is that the heap keeps growing after each iteration. I free memory after each iteration, so as I understand it the Perl program should not need more memory. (Even though the memory is not shown as free from the OS's point of view, it's free for the Perl program to reuse, right?)


--

Cheers
Anoop.
 
Old 03-18-2011, 12:12 AM   #4
anoopbt
LQ Newbie
 
Registered: Feb 2011
Posts: 8

Original Poster
Rep: Reputation: 0
Hi,

I got the problem solved. I was building an XML file in the program for inserting values into the database. As the XML file grew, it occupied most of the memory. Now I generate a .sql file and load it directly into the DB. The change was drastic: memory utilization came down from 80% to 3%.

So the point is: don't build a large XML file in memory when the data set is big.
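The general pattern behind this fix is streaming: write each record out as it is produced instead of accumulating the whole document in memory. A minimal sketch of the pattern in Python (the table name and file name are made up; the thread's actual program was Perl):

```python
import os
import tempfile

# Generate rows lazily: only one statement exists in memory at a time.
rows = (f"INSERT INTO t VALUES ({i});" for i in range(100_000))

path = os.path.join(tempfile.mkdtemp(), "load.sql")
with open(path, "w") as out:
    for stmt in rows:        # memory use stays flat: write each row, then drop it
        out.write(stmt + "\n")

with open(path) as f:
    n = sum(1 for _ in f)
print(n)  # 100000
```

Building a DOM-style XML document forces the whole data set to live in memory at once; appending plain statements to a file keeps memory use constant regardless of row count.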


--

Cheers,
Anoop.
 
  


