Linux - Newbie: This Linux forum is for members that are new to Linux. Just starting out and have a question? If it is not in the man pages or the how-tos, this is the place!
06-15-2012, 09:43 AM | #1
Member | Registered: Sep 2011 | Location: /dev/null | Distribution: ubuntu 64bits | Posts: 135
how to set memory limit for egrep command
Hi,
I have a problem with an egrep command that is run by an application I did not write.
Even though I have 32 GB of RAM, it is apparently not enough for egrep, because it wants more: about 60 GB of virtual memory. So it is swapping, the computer becomes unusable, and there is no reaction to the keyboard, or it is extremely slow.
Is it possible to set a maximum memory limit for egrep?
(Bonus: or for any other command running in a terminal, not for GUI applications.)
Thank you very much for any kind of solution; I am willing to try anything.
Kind Regards,
Martin
06-15-2012, 02:48 PM | #2
Senior Member | Registered: May 2009 | Location: center of singularity | Distribution: Xubuntu, Ubuntu, Slackware, Amazon Linux, OpenBSD, LFS (on Sparc_32 and i386) | Posts: 2,906
In bash there is the "ulimit -v kkkkkk" command, where the value is in kilobytes. Execute it in a subshell so it affects only that one command:
( ulimit -v 4194304 && egrep ... )
This example sets the virtual memory limit to 4 GB.
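A slightly fuller sketch of the same idea (the pattern and log file below are made-up placeholders):
Code:
# Cap egrep at ~4 GB of virtual memory; the limit dies with the subshell,
# so your interactive shell keeps its normal limits.
( ulimit -v 4194304 && egrep 'some-pattern' /var/log/big.log )
echo "egrep exit status: $?"   # non-zero if it failed or hit the limit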
1 member found this post helpful.
06-16-2012, 01:14 PM | #3
Member (Original Poster) | Registered: Sep 2011 | Location: /dev/null | Distribution: ubuntu 64bits | Posts: 135
Quote:
Originally Posted by Skaperen
In bash there is the "ulimit -v kkkkkk" command, where the value is in kilobytes. Execute it in a subshell so it affects only that one command:
( ulimit -v 4194304 && egrep ... )
This example sets the virtual memory limit to 4 GB.
Thanks a lot for this very useful hint; I am going to implement it in many scripts run from crontab.
(Concerning this problem, I have already created a crontab script that runs every 60 seconds and kills any process that consumes more than 30 GB of RAM.)
One thing: is it possible to set up such a limit if I have no possibility of executing the command in a subshell?
(Would it be possible to use /etc/security/limits.conf in some way for this purpose, or something else?)
Thank you,
regards,
Martin
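If limits.conf can do this, I assume the entry would look roughly like this sketch (the username is just an example; the "as" item limits address space, in kilobytes, and applies to processes started through PAM logins):
Code:
# /etc/security/limits.conf
# <domain>  <type>  <item>  <value in KB>
martin      hard    as      4194304    # cap address space at ~4 GB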
06-16-2012, 02:53 PM | #4
Member (Original Poster) | Registered: Sep 2011 | Location: /dev/null | Distribution: ubuntu 64bits | Posts: 135
Hi,
I temporarily changed ulimit -v in a terminal, so ulimit -a now reports:
virtual memory (kbytes, -v) 1000
and commands now fail:
Code:
$ sudo locate aaaa
Segmentation fault
$ free -m
Segmentation fault
$ ls
bash: /usr/local/bin/ls: Argument list too long
Pretty ugly. What is happening?
06-18-2012, 08:41 PM | #5
LQ Guru | Registered: Aug 2004 | Location: Sydney | Distribution: Rocky 9.x | Posts: 18,441
Pretty much what it says: about 1 MB of virtual address space is not enough for those commands to run with those parameters. ulimit -v caps the entire address space, including the shared libraries a command must map before it even reaches main(), so with a limit that low the commands fail at load time.
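You can see this for yourself without crippling your interactive shell, e.g. (a sketch; exact sizes and library lists vary by system):
Code:
# Even a tiny command must map several shared libraries at startup,
# and ulimit -v caps the *total* virtual address space:
ldd "$(command -v free)"          # libraries free needs to map
( ulimit -v 1000; free -m )       # ~1 MB: likely fails before main()
( ulimit -v 200000; free -m )     # ~200 MB: should run normally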
06-19-2012, 04:16 AM | #6
Member (Original Poster) | Registered: Sep 2011 | Location: /dev/null | Distribution: ubuntu 64bits | Posts: 135
:-) Yes indeed. (I could not imagine why 50000*1024 bytes would not be enough for some simple commands. Where are the days when I had to be happy with 64 kbytes for the code segment, stack segment, data segment, and extra segment all together? :-))
(I have just moved all my activities to my cloned OS on RAID 0 (4 partitions) on an SSD RevoDrive; it is really extremely fast.)
1. Moving to the SSD helped decrease the latency of all this when it suddenly happened.
2. I have a daemon script that kills processes with extremely large memory consumption.
3. I changed my swap file to 1 gigabyte, moved it to the SSD RevoDrive, and set swappiness=1 (see the sketch at the end of this post).
4. More playing with ulimit for my daemons, especially for online rsyncing/cloning of my OS.
5. Set up a 7 GB tmpfs for /tmp in memory (also sketched below).
I hope these measures will help if it happens again in the future.
I still do not know what application is starting this egrep with its extreme memory consumption; top and htop show just egrep, and I do not know how to find it.
Going to mark this as solved. Thank you for your help.
Does something like ulimit exist for swap consumption per process? (ulimit does not have it, if I did not overlook something.)
Kind regards,
M.
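For items 3 and 5, roughly what I ran (a sketch; adjust paths and sizes for your system):
Code:
# Item 3: prefer RAM over swap (takes effect immediately)
sudo sysctl -w vm.swappiness=1
# persist across reboots:
echo 'vm.swappiness=1' | sudo tee -a /etc/sysctl.conf
# Item 5: a 7 GB in-memory /tmp; the /etc/fstab line:
# tmpfs  /tmp  tmpfs  defaults,size=7g  0  0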
06-19-2012, 07:46 PM | #7
LQ Guru | Registered: Aug 2004 | Location: Sydney | Distribution: Rocky 9.x | Posts: 18,441
As a short-term thing, look up the ulimit options in http://linux.die.net/man/1/bash, and also /etc/security/limits.conf.
In the long term, track down the cause: start with top and/or ps and check the owner and the PID and PPID values.
See http://linux.die.net/man/1/pstree
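For example, while the runaway egrep is running (a sketch; adjust the pattern to match the actual command line):
Code:
# List matching processes with their parent PIDs; the [e] trick
# keeps the grep itself out of the results:
ps -eo pid,ppid,user,vsz,args | grep '[e]grep'
# Show the full ancestor chain of the oldest matching egrep:
pstree -ps "$(pgrep -o egrep)"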