what is the maximum heap size for a process
Hi,
Is there any maximum limit to heap memory allocation? My program is written in Perl and runs on a Solaris system. When I ran "pmap pid" (pid = my process ID), it showed a number of heap mappings, all of them gigabytes in size. This single process is eating up most of the physical memory. Is this normal, and is there any way to get the heap size down? Thanks in advance, Anoop.
It is common for heap mappings in a process to be mostly "demand zero". That means the process has requested that address space for heap but hasn't used it. Until the process actually touches each page, that page has no physical existence in either RAM or swap.
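To see how much of those "GB sizes" is reserved address space, you can total the heap mappings that pmap reports. This is only a minimal sketch, assuming Solaris-style `pmap <pid>` output where each mapping line carries a size field like `8192K` and heap mappings are labelled `[ heap ]`; the exact labels vary by platform:

```perl
#!/usr/bin/perl
# Sum the sizes of heap mappings in pmap-style output. Most of this total
# may be demand-zero address space rather than resident memory.
use strict;
use warnings;

sub heap_kb {
    my @lines = @_;
    my $total = 0;
    for my $line (@lines) {
        next unless $line =~ /heap/i;       # keep only heap mappings
        $total += $1 if $line =~ /(\d+)K/;  # size field, e.g. "8192K"
    }
    return $total;
}

my $pid = shift // $$;                      # default: inspect ourselves
print "heap address space for pid $pid: ", heap_kb(qx(pmap $pid)), "K\n";
```

Comparing this reserved total against what your system reports as resident (e.g. `ps -o rss`) shows how much of the heap is actually backed by physical memory.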
There might be some configuration for Perl that makes it preallocate heap less aggressively, so if the heap is mostly demand zero, there might be a way to reduce that. But with both of these half-baked ideas, it would only be possible to reduce the heap size in cases where doing so is pointless: if the heap is mainly demand zero, there is no reason to shrink it, and the part of the heap that isn't demand zero could only be restricted by forcing the process to fail.
Thank you, John, for your reply.
I am using hashes, and I have read somewhere that hashing demands more memory. I am clearing the hash and arrays I use after every loop, like this: @array = (); %hash = (); undef @array; undef %hash;
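For reference, here is a sketch of that cleanup with two corrections to the snippet as originally posted: `@array = []` stores a single array reference instead of emptying the array, and a hash is cleared with `%hash = ()`, not `@hash = ()`:

```perl
#!/usr/bin/perl
use strict;
use warnings;

my @array = (1 .. 1000);
my %hash  = map { $_ => $_ * 2 } 1 .. 1000;

@array = ();    # empty the array (correct; @array = [] leaves one element)
%hash  = ();    # empty the hash  (correct sigil: %, not @)
undef @array;   # also release the array's internal buffer
undef %hash;    # likewise for the hash

# Often simpler: declare containers with "my" inside the loop, so each
# iteration's data goes out of scope and Perl can reuse the memory.
for my $i (1 .. 3) {
    my %per_loop = ( iteration => $i );   # reusable after each pass
}
```

Note that memory released this way goes back to Perl's allocator for reuse within the process; the heap size that pmap reports usually does not shrink.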
-- Cheers, Anoop.
Hi,
I got the problem solved. I was building an XML file in the program for inserting values into the database. As the XML file grew, it occupied most of the memory. Now I am writing a .sql file and inserting it directly into the DB. The change was drastic: memory utilization came down from 80% to 3%. So the point is, you should not use an XML file if the file is large. -- Cheers, Anoop.
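The approach described above can be sketched as streaming one INSERT statement per row straight to a .sql file, so no large document accumulates in memory; the table name, columns, and sample rows here are hypothetical:

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Hypothetical stand-in for rows read from the real data source.
my @rows = ( [1, 'alpha'], [2, "o'brien"] );

open my $sql, '>', 'load.sql' or die "cannot open load.sql: $!";
for my $row (@rows) {
    my ($id, $name) = @$row;
    $name =~ s/'/''/g;    # naive SQL quote escaping, for the sketch only
    # Each row is written and then discarded, so memory use stays flat.
    print {$sql} "INSERT INTO items (id, name) VALUES ($id, '$name');\n";
}
close $sql or die "cannot close load.sql: $!";
```

For real data, DBI with placeholders (prepared statements) is safer than escaping quotes by hand.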