Programming: This forum is for all programming questions. The question does not have to be directly related to Linux and any language is fair game.
When I run Valgrind memcheck on my C application, it reports that I have a memory leak on an opendir command:
Code:
if( (dp = opendir( inputDir )) == NULL )
{
logError( "Could not open directory path." );
return -1;
}
The inputDir variable is a char * that is freed when the program terminates. However, this application is designed to run continuously for very long periods without exiting. This is the only memory leak Valgrind reports, and I do not understand why it is being reported or how to fix it. I am running this under Red Hat WS 4 with the Portland Group compilers.
Distribution: Solaris 11.4, Oracle Linux, Mint, Debian/WSL
Posts: 9,789
I read elsewhere that you are developing on both Red Hat and Solaris.
If you are compiling on SPARC hardware, I would suggest an alternative method of finding memory leaks on the Solaris side: the Sun Studio 11 compiler and dbx RTC.
Sometimes Valgrind reports memory leaks in library functions because some library functions do not actually free memory but keep it in a pool. Might this be the case?
opendir()
opens the directory dirname and associates a directory
stream with it. opendir() returns a pointer used to
identify the directory stream in subsequent operations.
opendir() uses malloc(3C) to allocate memory.
This is from the man page for dirent, which you should read.
Add a free(dp) to release the DIR struct when you are done with it.
opendir returns a pointer to a DIR stream, so free(dp) wouldn't make any sense. He does close the stream properly. So if there is a problem with malloc inside the opendir function, then it's a problem with the standard library's opendir. If that is the case, try to locate the source for the library and see if it properly frees the memory. I'm guessing this is going to be encapsulated with the rest of the similar calls.
closedir() is better, just doing free() may leak a file handle. Also, never, ever free FILE* and DIR* directly because those are managed by the system.
For the OP issue, what happens in valgrind if you have a trivial program that simply opens and closes a directory?
I am attempting to respond to multiple questions at once, so this is a rather long post!
Quote:
Originally Posted by jlliagre
I read elsewhere you were developing on both Red-Hat and Solaris.
If you are compiling on SPARC hardware, I would suggest you to use an alternative method to find out memory leaks on the Solaris side by using Sun Studio 11 compiler and dbx RTC.
I am using Sun Studio 10 on the Solaris side; however, I have not run the code through Studio 10 to check for memory leaks on Solaris, since this application will primarily run on RHEL4.
Quote:
Originally Posted by lorebett
Sometimes Valgrind reports memory leaks in library functions because some library functions do not actually free memory but keep it in a pool. Might this be the case?
This may be why my application's memory usage is increasing very quickly. Is there a C function call to force RHEL to free the memory from the pool? The reason I believe this may be the problem is that when I stop the application and continue to watch the memory, it very slowly drops by about 50MB or so, and then holds at that value.
Quote:
Originally Posted by tuxdev
closedir() is better, just doing free() may leak a file handle. Also, never, ever free FILE* and DIR* directly because those are managed by the system.
For the OP issue, what happens in valgrind if you have a trivial program that simply opens and closes a directory?
I ran a quick test using opendir/closedir (please excuse any syntax/semantic errors in the code below; I typed it in from memory, and I have a bad memory!).
Valgrind did not report any memory issues with the test code, even when I wrapped the opendir/closedir code in a for statement. I am wondering if the memory leak is occurring when a child process does not "finish" before closedir() is called. The child processes are not using anything dealing with DIR or dirent pointers; the filenames are copied before fork is called.
Code:
if( (dp = opendir( inputDir )) == NULL ) <========Valgrind reports error on this line
{
logError( "Could not open directory path." );
return -1;
}
while( (dirp = readdir( dp )) != NULL )
{
if( (strcmp( dirp->d_name, "." ) != 0) &&
    (strcmp( dirp->d_name, ".." ) != 0) )
{
// check if valid file using dirp->d_name
// Copy dirp using snprintf adding the path and new prefix
// fork child process to handle the file contents
// Child process does not interact with the parent process at all
// Parent process moves to the next file in the list
}
}
if( closedir( dp ) < 0 )
{
logError( "Error closing directory." );
}
return numProcessed;
I do not have direct access to the code right now; it is not on this machine, and the printer is down. If needed, I will try to get more code.
Quote:
Originally Posted by lorebett
Sometimes Valgrind reports memory leaks in library functions because some library functions do not actually free memory but keep it in a pool. Might this be the case?
By the way, if this is the case (and for library functions it usually is), you can simply suppress this leak warning using Valgrind's suppression options.
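For reference, a Memcheck suppression entry for a leak reported inside opendir() might look like the fragment below (the frame names here are a guess; running valgrind with --gen-suppressions=all prints the exact entry to copy). Save it to a file and pass it with --suppressions=FILE:

```
{
   opendir_pool_leak
   Memcheck:Leak
   fun:malloc
   fun:opendir
}
```

The first line inside the braces is just a label of your choosing; the fun: lines must match the call stack Valgrind actually printed for the leak.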
Quote:
Originally Posted by Rayven
I am using Sun Studio 10 on the Solaris side; however, I have not run the code through Studio 10 to check for memory leaks on Solaris, since this application will primarily run on RHEL4.
Testing an application on different platforms is a good thing, as it often helps identify bugs faster.
Quote:
This may be why my application's memory usage is increasing very quickly. Is there a C function call to force RHEL to free the memory from the pool? The reason I believe this may be the problem is that when I stop the application and continue to watch the memory, it very slowly drops by about 50MB or so, and then holds at that value.