...or rather memory floods!
My (server) application has an infinite loop around a blocking accept(), and when accept() returns it calls pthread_create() to spawn a thread to deal with the new connection.
However, for some reason each time I accept a connection the memory usage of my program goes up by 8 MB! Eventually, pthread_create() fails with error 12, which I think is ENOMEM.
However, when I don't use threads, there is no memory leak at all. To demonstrate:
Code:
while (1)
{
    curreq = accept(localsock, (struct sockaddr *) &cli_addr, &clilen);
#ifdef __THREAD
    /* the (intptr_t) cast needs <stdint.h> */
    rval = pthread_create(&thread, NULL,
                          (void *(*)(void *)) tunnel_thread,
                          (void *) (intptr_t) curreq);
    if (rval != 0) {
        /* pthread_create() returns the error code directly */
        fprintf(stderr, "pthread_create() failed with %i\n", rval);
        exit(1);
    }
#else
    tunnel_thread(curreq);
#endif
}
And tunnel_thread looks like this:
Code:
void tunnel_thread(int curreq)
{
    // do work
    close(curreq);
    pthread_exit(NULL);
}
Now, if __THREAD is defined, my program leaks worse than a cane basket, consuming up to 2 GB of virtual memory and servicing about 30 connections before pthread_create() starts failing.
If __THREAD is undefined, my program works perfectly (apart from only being able to service one connection at a time) and doesn't leak.
What am I doing wrong? I'll post more code if you need it; I just thought I'd keep the problem general and simple.