Linux - Server: This forum is for the discussion of Linux software used in a server-related context.
I have a box that serves approximately 18 sites. Starting a few months ago, memory usage seems to climb every day. I've checked my cron jobs, and I don't have any that run daily (only an hourly cron that checks an external server's availability via wget). The only new thing I've added is a backup script that dumps the MySQL databases, which runs weekly. (I can post the script if needed.)
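One way to pin down what is actually climbing is to snapshot memory hourly from cron and compare the entries day over day; a minimal sketch (the log path and hourly cadence are arbitrary choices):

```shell
#!/bin/sh
# Hourly memory snapshot: comparing entries day over day shows whether
# it is the page cache or a specific process that keeps growing.
LOG=/var/tmp/memwatch.log
{
  echo "== $(date '+%F %T') =="
  grep -E 'MemTotal|MemFree|SwapFree' /proc/meminfo
  # Top 5 resident processes, read straight from /proc so this works
  # even without procps installed:
  for d in /proc/[0-9]*; do
    rss=$(awk '/^VmRSS/ {print $2}' "$d/status" 2>/dev/null)
    [ -n "$rss" ] && echo "$rss kB $(cat "$d/comm" 2>/dev/null)"
  done | sort -rn | head -n 5
} >> "$LOG"
```

Run from cron (e.g. `0 * * * * /usr/local/bin/memwatch.sh`) and diff the log against itself a day later.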
And lately things are getting worse. Every time I run this:
Code:
${wget} -O /dev/null -t 2 -T 10 ${host}
at least one of my sites comes back with a timeout. When I visit the site in my browser it is actually reachable, but it takes about 20 seconds to load (my wget uses a 10-second timeout, so it is bound to time out). Each time, one or two sites go down at random, but two or three small sites dominate: their chance of being slowed seems higher than that of the other 15 sites.
Is there any way I can detect the problem? I really don't have any idea...
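To see which of the 18 sites is slow, rather than only which one trips the 10-second limit, the check can record elapsed time per host. A minimal sketch (the host list is hypothetical; the wget flags mirror the check above):

```shell
#!/bin/sh
# time_ms CMD...: print the wall-clock runtime of CMD in milliseconds.
time_ms() {
  start=$(date +%s%N)
  "$@" > /dev/null 2>&1
  end=$(date +%s%N)
  echo $(( (end - start) / 1000000 ))
}

# Per-host loop (hosts are placeholders, same wget flags as the cron check):
#   for host in http://site1.example http://site2.example; do
#     echo "$host: $(time_ms wget -O /dev/null -t 2 -T 10 $host) ms"
#   done
echo "sleep 0.2 took $(time_ms sleep 0.2) ms"
```

Logging these numbers over a few days would show whether the slow sites are always the same ones and at what times of day they degrade.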
You don't post any real details: what kind of box this is (CPU, total memory), what hard drives (what type? how big? what kind of controller?), what network connection (one NIC? two? bonded? multiple outbound paths?), what the sites are written in (Java-heavy? PHP? Ruby on Rails?), and whether you've updated anything on the box before this started happening. Any or all of those could be the issue.
AMD Athlon(tm) 64 X2 Dual Core Processor 3800+
4GB RAM and 8GB swap
one NIC, connected directly to the internet
All sites written in PHP
the same box also serves MySQL and Tomcat for itself
no major processing
Everything was (almost) OK until a few months ago, and AFAIK we did no updates on the box; the only thing I've added is this script (csh):
Code:
#!/bin/csh
set mySQL_User = "myUser"
set mySQL_Passwd = "myPassword"
set Title = "Backup $HOST on account $USER for $mySQL_User at `date +%Y`/`date +%m`/`date +%d`"
set Day = "`date +%d | sed 's/^0//'`" # Today's day of month, leading zero stripped (csh's @ treats "08"/"09" as invalid octal)
set WeekDays = 7 # Looping interval
set Today = "`date +%Y``date +%m``date +%d`"
@ WeekNumber = 0 #initialize
@ WeekNumber = (( $Day / $WeekDays ) + 1 )
set DB_List_File = "~/bin/my-site.db_list" # file which contains db list
set DB_List = "" #initialize # Array that'll save db list
set Ignore_List_File = "~/bin/my-site.ignore_list" # file which contains ignored table
set Ignore_List = "" #initialize # Array that'll save ignored table
set Backup_Root = "/backup.local/$USER.$mySQL_User" # Backup folder
set Backup_Dir = $Backup_Root"/WEEK"$WeekNumber
set Backup_File = "" #initialize
set Log_File = ~/$USER.$mySQL_User.backup.log
set Counter = 0 #initialize
set Error_Num = "" #initialize
set Exit_Code = 0 #initialize
set Email_List = "me@my-site.net"
echo "=========================================================================" >> $Log_File
echo "***** Starting backup procedure at `date +%c` *****" >> $Log_File
if (! -e $Backup_Root) then
echo " ... Directory $Backup_Root not found." >> $Log_File
mkdir $Backup_Root
set Error_Num = $status # csh keeps the exit status in $status
if ($Error_Num != "0") then
echo " ...... [ERROR] Unable to create $Backup_Root (Error Number: $Error_Num)." >> $Log_File
set Exit_Code = 1
goto Ending
else
echo " ...... Directory $Backup_Root created." >> $Log_File
endif
else
echo " ... Directory $Backup_Root found." >> $Log_File
endif
if (! -e $Backup_Dir) then
echo " ... Directory $Backup_Dir not found." >> $Log_File
mkdir $Backup_Dir
set Error_Num = $status
if ($Error_Num != "0") then
echo " ...... [ERROR] Unable to create $Backup_Dir (Error Number: $Error_Num)." >> $Log_File
set Exit_Code = 2
goto Ending
else
echo " ...... Directory $Backup_Dir created." >> $Log_File
endif
else
echo " ... Directory $Backup_Dir found." >> $Log_File
endif
# get database list
if (-e $DB_List_File) then
echo " ... Database List File ($DB_List_File) found." >> $Log_File
set DB_List = `grep '' $DB_List_File`
else
echo " ... [ERROR] Database List File ($DB_List_File) not found." >> $Log_File
set Exit_Code = 3
goto Ending
endif
# get ignored table list
if (-e $Ignore_List_File) then
echo " ... Ignored Table List File ($Ignore_List_File) found." >> $Log_File
set Ignore_List = `grep '' $Ignore_List_File`
else
echo " ... Ignored Table List File ($Ignore_List_File) not found; continuing without an ignore list." >> $Log_File
set Ignore_List = ""
endif
@ Counter = 0
foreach DB_Name ($DB_List)
if ("$DB_Name" != "") then
@ Counter = $Counter + 1
echo " ... Dumping $DB_Name at `date +%c`" >> $Log_File
set Backup_File = $Backup_Dir"/"$DB_Name.$Today
set Ignore_Temp = ""
foreach Ignore_Name ($Ignore_List)
if ($Ignore_Name != "") then
set Ignore_Temp = "$Ignore_Temp --ignore-table=$DB_Name.$Ignore_Name "
endif
end
mysqldump -u $mySQL_User -p"$mySQL_Passwd" $Ignore_Temp $DB_Name > $Backup_File.sql
set Error_Num = $status
if ($Error_Num != "0") then
echo " ...... [ERROR] Dumping $DB_Name failed at `date +%c` (Error Number: $Error_Num)." >> $Log_File
rm $Backup_File.sql
echo " ...... $Backup_File.sql file removed" >> $Log_File
set Exit_Code = 101
else if (! -e $Backup_File.sql) then
echo "[ERROR] Dump file $Backup_File.sql not found" >> $Log_File
set Exit_Code = 102
else
echo " ...... Finished dumping $DB_Name to $Backup_File at `date +%c`" >> $Log_File
echo " ...... Deleting old $Backup_Dir/$DB_Name* files" >> $Log_File
find $Backup_Dir/ -type f -name "$DB_Name.*.sql" ! -name "$DB_Name.$Today.sql" -exec rm -f {} +
endif
endif
end
if ($Counter < 1) then
echo " ...... [ERROR] Database List File ($DB_List_File) is empty." >> $Log_File
set Exit_Code = 4
goto Ending
else
goto Ending
endif
Ending:
# Erase previous symbolic link
echo "Removing symbolic link $Backup_Root/download" >> $Log_File
rm -f $Backup_Root/download
if ($Exit_Code != 0) then
set Title = "[ERROR] $Title"
else
# If the backup succeeded, create a new symbolic link to the new backup dir
echo "Creating symbolic link $Backup_Root/download from $Backup_Dir" >> $Log_File
ln -s $Backup_Dir $Backup_Root/download
endif
echo "Showing backup folder content(s) ($Backup_Dir)" >> $Log_File
echo "" >> $Log_File
ls -l $Backup_Dir >> $Log_File
echo "" >> $Log_File
echo "***** Backup procedure completed at `date +%c` (Exit code: $Exit_Code) *****" >> $Log_File
echo "=========================================================================" >> $Log_File
/bin/mail -s "$Title" $Email_List < $Log_File
exit $Exit_Code
Still a lot of mystery. The backup script wouldn't do it, unless it was still running the next day (or the next time you ran it...).
You say "one NIC"... how big are the sites? How much of your pages is graphics? Updated the site graphics lately? And you say all the sites are written in PHP. Have you added new PHP features, like graph generation via the GD libraries? And did you update the max memory and max runtime parameters in the php.ini file, to reflect the additional processing time?
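For reference, the two php.ini directives in question are `memory_limit` and `max_execution_time`; the values below are illustrative defaults, not recommendations:

```ini
; php.ini, per-script limits
memory_limit = 128M        ; max memory a single PHP script may allocate
max_execution_time = 30    ; max seconds a script may run before being killed
```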
The first place I'd start would be to look at the amount of traffic running through the NIC, to see if you're saturating that pipe. The next place would be to look at what each site is doing, and see if there have been any changes in those pages. What's the database size? Have you added any new complex joins, etc., that would make things run slower?
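A rough way to measure traffic through the NIC without extra tools is to sample the byte counters in /proc/net/dev; a sketch (the 2-second interval is arbitrary, and the MySQL query in the comment assumes working credentials):

```shell
#!/bin/sh
# Rough throughput check: sample the receive-byte counter twice and
# divide by the interval. Interface is auto-detected, falling back to lo.
iface=$(awk -F: 'NR>2 {gsub(/^ +/,"",$1); if ($1 != "lo") {print $1; exit}}' /proc/net/dev)
[ -z "$iface" ] && iface=lo
rx_bytes() { awk -v i="$iface:" '$0 ~ i {sub(/.*:/,""); print $1}' /proc/net/dev; }
r1=$(rx_bytes); sleep 2; r2=$(rx_bytes)
echo "$iface: $(( (r2 - r1) / 2 )) bytes/sec received"

# Database sizes per schema (run manually; credentials omitted):
#   mysql -e "SELECT table_schema, ROUND(SUM(data_length+index_length)/1048576) AS MB
#             FROM information_schema.tables GROUP BY table_schema;"
```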
Quote:
The backup script wouldn't do it, unless it was still running the next day (or the next time you ran it...).
The backup script only runs for about half an hour to an hour.
Quote:
You say "one nic"...how big are the sites?? How much of your pages are graphics?
My box hosts one big (not huge) PHP site, and the other medium-to-small sites use Drupal. Mostly we serve text; images are only used to decorate the sites.
Quote:
Updated site graphics lately?
I'll check with the web people. Mostly they only swap images, though...
Quote:
And you say all the sites are written in PHP. Have you added new PHP features, like graph-generation via GD libraries,
No. But speaking of image generation, I think we used to use a captcha. It's been a while, though.
Quote:
and did you update the max memory and max runtime parameters in the php.ini file, to reflect the additional processing time?
No... I don't even know what the "max memory" and "max runtime" parameters in the php.ini file are.
Quote:
would be to look at the amount of traffic running through the NIC. See if you're saturating that pipe