Old 10-20-2005, 03:46 PM   #1
lowpro2k3
Member
 
Registered: Oct 2003
Location: Canada
Distribution: Slackware
Posts: 340

Rep: Reputation: 30
How to simulate out of memory errors?


Are there any tools designed to do this? I want to test some calls to realloc(), specifically what would happen when realloc fails.
 
Old 10-20-2005, 04:26 PM   #2
jailbait
LQ Guru
 
Registered: Feb 2003
Location: Virginia, USA
Distribution: Debian 12
Posts: 8,337

Rep: Reputation: 548
"Are there any tools designed to do this? I want to test some calls to realloc(), specifically what would happen when realloc fails."

You might take a look at Electric Fence.

http://www.die.net/doc/linux/man/man3/efence.3.html

-----------------------
Steve Stites
 
Old 10-20-2005, 04:43 PM   #3
Hko
Senior Member
 
Registered: Aug 2002
Location: Groningen, The Netherlands
Distribution: Debian
Posts: 2,536

Rep: Reputation: 111
Or just try to realloc() a really huge amount of memory.
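For example, a minimal sketch (whether a given size fails immediately depends on the platform, the address-space size, and the overcommit settings discussed later in this thread):
Code:
/* minimal sketch: force realloc() to fail by requesting an absurd size */
#include <stdio.h>
#include <stdlib.h>
#include <stdint.h>

int main(void)
{
    char *p = malloc(16);
    char *q = realloc(p, SIZE_MAX);   /* no system can satisfy SIZE_MAX bytes */

    if (q == NULL)
        fprintf(stderr, "realloc failed as expected\n");
    else
        p = q;                        /* only reached if it somehow succeeded */

    free(q != NULL ? q : p);
    return 0;
}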
 
Old 10-20-2005, 06:05 PM   #4
lowpro2k3
Member
 
Registered: Oct 2003
Location: Canada
Distribution: Slackware
Posts: 340

Original Poster
Rep: Reputation: 30
I wrote this little program to try and make the system run out of memory - it doesn't seem to work, because my system is still 100% usable and 'ps aux' only shows memory usage as 0.1% for the process.

Code:
/* keep calling realloc() until it runs out of memory */
#include <stdio.h>
#include <stdlib.h>
#include <unistd.h>

int main()
{
 char *ptr;
 size_t len;
 len = 1;

 ptr = (char*) malloc(len);
 if(ptr == NULL)
  return 1;

 /* doubling len eventually wraps around to 0, which ends the loop */
 while(len *= 2)
 {
  /* assigning realloc() straight back to ptr leaks the old block on
     failure, but that is acceptable in a throwaway test like this */
  if((ptr = (char*) realloc(ptr, len)) == NULL)
   fprintf(stderr, "realloc failed trying to allocate %zu bytes\n", len);

  if(len > 67108864) /* 64 MB */
  {
   fprintf(stdout, "going to sleep\n");
   sleep(45);
  }
 }

 return 0;
}
Before I added the 64MB check, realloc seemed to fail when integer overflow happened, not when the system ran out of memory.

Basically I'm spending time worrying about this for my little shell program. I'm starting to wonder whether it's worth checking whether realloc() failed when I allocate space for a line of arbitrary length, and I'm really starting to doubt it...

I initially malloc() LINE_MAX bytes, and if that's not enough I call realloc() as necessary.
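The usual safe pattern for that kind of grow-on-demand buffer keeps the old pointer around until the new one has been checked, so a failed realloc() doesn't leak the buffer or lose the line read so far. A rough sketch (grow_line and cap are just illustrative names):
Code:
/* sketch: double a line buffer's capacity without losing it on failure */
#include <stdlib.h>

char *grow_line(char *line, size_t *cap)
{
    size_t new_cap = *cap * 2;
    char *tmp = realloc(line, new_cap);

    if (tmp == NULL)
        return NULL;     /* 'line' is still valid; the caller can report the error */

    *cap = new_cap;
    return tmp;
}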

Last edited by lowpro2k3; 10-20-2005 at 06:09 PM.
 
Old 10-20-2005, 06:11 PM   #5
jailbait
LQ Guru
 
Registered: Feb 2003
Location: Virginia, USA
Distribution: Debian 12
Posts: 8,337

Rep: Reputation: 548
"I wrote this little program to try and make the system run out of memory - it doesn't seem to work, because my system is still 100% usable and 'ps aux' only shows memory usage as 0.1% for the process."

You can make the problem a little easier by getting rid of swap during the experiment. Stop all swap with:
swapoff -a

Start swap again with:
swapon -a

---------------------
Steve Stites
 
Old 10-20-2005, 10:43 PM   #6
rsheridan6
Member
 
Registered: Mar 2003
Location: Kansas City
Distribution: Debian unstable
Posts: 57

Rep: Reputation: 22
I haven't actually tried this, but I think using setrlimit to reduce RLIMIT_DATA would help. That would tell the kernel not to give your process so much heap memory.
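A minimal sketch of that idea (the 1 MB limit is arbitrary, and allocators that satisfy large requests with mmap() may not be constrained by RLIMIT_DATA on some systems):
Code:
/* sketch: shrink the data segment limit so the heap runs out quickly */
#include <stdio.h>
#include <stdlib.h>
#include <sys/resource.h>

int main(void)
{
    struct rlimit rl;
    int i;

    rl.rlim_cur = 1024 * 1024;            /* 1 MB soft limit */
    rl.rlim_max = 1024 * 1024;            /* 1 MB hard limit */
    if (setrlimit(RLIMIT_DATA, &rl) != 0) {
        perror("setrlimit");
        return 1;
    }

    /* small chunks are served from the heap proper, so they hit the limit */
    for (i = 0; i < 100; i++) {
        if (malloc(64 * 1024) == NULL) {
            fprintf(stderr, "malloc failed after about %d KB\n", i * 64);
            break;
        }
    }

    return 0;
}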
 
Old 10-21-2005, 03:25 PM   #7
jlliagre
Moderator
 
Registered: Feb 2004
Location: Outside Paris
Distribution: Solaris 11.4, Oracle Linux, Mint, Debian/WSL
Posts: 9,789

Rep: Reputation: 492
Allocating memory is not enough; it only reserves pages of virtual memory and does nothing else.

If you really want to simulate a memory shortage, you need to get each of those pages mapped to physical memory. The simplest way is to read or write one byte per page.
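A minimal sketch of that (the 64 MB block size is arbitrary; note that with overcommit enabled this tends to end with the OOM killer stepping in rather than with malloc() returning NULL):
Code:
/* sketch: allocate blocks and write one byte per page so they are really backed by RAM */
#include <stdio.h>
#include <stdlib.h>
#include <unistd.h>

int main(void)
{
    size_t block_size = 64 * 1024 * 1024;        /* 64 MB per block */
    long   page_size  = sysconf(_SC_PAGESIZE);   /* typically 4096 bytes */
    size_t i;

    for (;;) {
        char *block = malloc(block_size);
        if (block == NULL) {
            fprintf(stderr, "malloc of %zu bytes failed\n", block_size);
            break;
        }
        /* touching each page forces the kernel to commit physical memory */
        for (i = 0; i < block_size; i += (size_t)page_size)
            block[i] = 1;
    }

    return 0;
}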
 
Old 10-21-2005, 04:32 PM   #8
primo
Member
 
Registered: Jun 2005
Posts: 542

Rep: Reputation: 34
"Unfortunately", it depends on the malloc implementation. From the malloc manpage:
Quote:
By default, Linux follows an optimistic memory allocation strategy. This means that when malloc() returns non-NULL there is no guarantee that the memory really is available. This is a really bad bug. In case it turns out that the system is out of memory, one or more processes will be killed by the infamous OOM killer. In case Linux is employed under circumstances where it would be less desirable to suddenly lose some randomly picked processes, and moreover the kernel version is sufficiently recent, one can switch off this overcommitting behavior using a command like
# echo 2 > /proc/sys/vm/overcommit_memory
See also the kernel Documentation directory, files vm/overcommit-accounting and sysctl/vm.txt.
There are malloc implementations out there that use mmap(2) to get file-backed memory in /tmp. It's hard to make robust implementations fail, but this OOM killer is something I turn off. Try with RLIMIT_DATA as rsheridan6 pointed out. Try setting malloc() options too.

If you're trying to make your program robust, then check NULLs and -1's. Installing a signal handler for SIGSEGV doesn't make sense. Note that not every malloc implementation sets errno. I usually create a wrapper like this:
Code:
#include <errno.h>
#include <stdlib.h>

/* malloc() wrapper: guarantees errno is set when the allocation fails */
void *emalloc(size_t size)
{
     void *addr;

     errno = 0;
     addr = malloc(size);
     if (addr == NULL && !errno)
         errno = ENOMEM;

     return addr;
}
The same goes for realloc() and friends; for example:
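A matching wrapper, sketched along the same lines (the name erealloc is just illustrative), might be:
Code:
/* realloc() wrapper in the same style: errno is always set on failure */
#include <errno.h>
#include <stdlib.h>

void *erealloc(void *ptr, size_t size)
{
     void *addr;

     errno = 0;
     addr = realloc(ptr, size);
     if (addr == NULL && !errno)
         errno = ENOMEM;

     return addr;
}
As with any realloc(), the caller should keep the old pointer until the return value has been checked, otherwise the original block leaks on failure.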

Last edited by primo; 10-21-2005 at 04:37 PM.
 
Old 10-21-2005, 04:47 PM   #9
Dark_Helmet
Senior Member
 
Registered: Jan 2003
Posts: 2,786

Rep: Reputation: 374
OK, I may be way off, but what exactly is being tested here? The system as a whole, or a program running in a constrained memory environment?

If it's the latter, I'd just write wrappers for realloc() and free(). Some pseudocode:
Code:
function realloc_wrapper( old_block, old_size, request_size )
{
  if ( currently_allocated_memory - old_size + request_size > memory_allowed )
    return NULL;
  else
  {
    realloc_ptr = realloc( old_block, request_size );
    if( realloc_ptr != NULL )
      currently_allocated_memory += request_size - old_size;
    return realloc_ptr;
  }
}

function free_wrapper( memory_size, memory_block )
{
  currently_allocated_memory -= memory_size;
  free( memory_block );
}
Then either set memory_allowed to some static value with a #define or come up with some fancy-schmancy way of real-time modification to simulate requests made by other processes.
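For illustration, a compilable sketch of that wrapper idea (the 1 MB budget and the names MEMORY_ALLOWED and currently_allocated are just placeholders):
Code:
/* sketch: realloc()/free() wrappers that fail once a self-imposed budget is exceeded */
#include <stdlib.h>

#define MEMORY_ALLOWED (1024 * 1024)   /* pretend only 1 MB is available */

static size_t currently_allocated;

void *realloc_wrapper(void *old_block, size_t old_size, size_t request_size)
{
    void *new_block;

    /* old_size must be what was previously accounted for this block */
    if (currently_allocated - old_size + request_size > MEMORY_ALLOWED)
        return NULL;

    new_block = realloc(old_block, request_size);
    if (new_block != NULL)
        currently_allocated = currently_allocated - old_size + request_size;
    return new_block;
}

void free_wrapper(void *block, size_t size)
{
    currently_allocated -= size;
    free(block);
}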

EDIT:
Didn't mean to rehash what primo said... I think we were both headed in the same direction.

Last edited by Dark_Helmet; 10-22-2005 at 06:10 PM.
 
  

