Hello. Can someone recommend Linux software to do an automatic backup to a partition or an FTP server? What I want to do is back up some important files on my Linux box. It would be good if I could access the software through a browser.
Write a simple batch script which will regularly back up the files and folders of your choice and compress them well (tar.gz, for example).
I do automatic daily backups of my office work, and monthly backups of the entire system to the second HD.
Thank you all for the replies. One last question for linuxfond: I want to make something like what you wrote. How can I write that script? Do you have a copy, or a place where I can get it? I am using NetMax 4.5; I don't know if you guys have heard of it (netmax.com). Thank you.
Now I'd suggest making a folder called "scripts" or something like that, and saving your new batch file, say "office_daily" or "system_monthly", in that folder. Don't forget to make your batch file executable.
Test whether it works the way you want. Go to the shell, issue ./office_daily, and see that it doesn't abort (it will abort if you did not chmod your folders 755 and did not make your files readable).
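In concrete terms, that setup might look like this (the folder and file names are just the ones suggested above):
# make a folder for your scripts and put the batch file in it
mkdir ~/scripts
# make the script executable
chmod 755 ~/scripts/office_daily
# run it by hand once to test
cd ~/scripts
./office_daily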
If it runs OK, check the created archive. If it opens and everything is fine, great. If not, check that you did not leave an empty line at the top of your batch script. Also check that your archive is readable on all the PCs at hand.
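One quick way to check the archive is to list its contents without extracting it (the file name here is just a placeholder):
# list the archive contents; errors here mean a broken archive
tar tzf /mnt/backup/home-2003-11-09.tar.gz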
What's left to do is to schedule this script to run daily or monthly at a certain time. See what your distro has available for scheduling.
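On most distros that means cron; a pair of crontab entries along these lines would do it (the times and paths are just examples, adjust to your setup):
# run the daily backup at 02:30 every night
30 2 * * * /home/you/scripts/office_daily
# run the monthly backup at 03:00 on the first of each month
0 3 1 * * /home/you/scripts/system_monthly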
Here is the script:
# Create a compressed backup of all the directories specified and put the resulting file in a directory of your choice. This will back up everything in your /home.
# This backup adds the date to the file name, so your backups are not overwritten. Modify it if you want your backup to be overwritten every time.
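A minimal sketch of a script matching those comments (BACKUP_DIR is a placeholder; point it at the partition of your choice):
#!/bin/sh
# where to put the archive -- a placeholder, set it to your backup partition
BACKUP_DIR=/mnt/backup
# date stamp, e.g. 2003-11-09, so older archives are not overwritten
DATE=`date +%Y-%m-%d`
# compress everything under /home into one dated archive
tar czf $BACKUP_DIR/home-$DATE.tar.gz /home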
My /home dir in tar.bz2 form is over a gig. That means I can not write it to a CD-ROM. How do I enable multivolume (-M) support in that batch above, and how do I tell it to back up only modified and new files?
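For what it's worth, GNU tar supports both. A sketch, reusing the placeholder variables from the script above; note that GNU tar cannot compress multi-volume archives, so the first example drops the z flag:
# multi-volume archive in roughly CD-sized chunks; --tape-length is
# in units of 1024 bytes, and tar prompts when each volume is full
tar --create --multi-volume --tape-length=650000 \
    --file=$BACKUP_DIR/home-$DATE.tar /home
# incremental run: only files new or changed since the last run,
# tracked in the snapshot file home.snar
tar --create --gzip --listed-incremental=$BACKUP_DIR/home.snar \
    --file=$BACKUP_DIR/home-$DATE.tar.gz /home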
I couldn't follow the other links given, so maybe it's mentioned there, but of late the so-called "hard-link" solution has seen very good reviews, and I have been using it for about a year with good results. It's just a few lines of script.
The trick is to rsync the directory tree to be backed up to another disk. When that is done, make a hard link to each and every file in another directory tree, e.g.
cp -al . <some name with the date in it>
The next day, rsync to the same place again. When rsync sees a modified file, it "removes" (more precisely, unlinks) the file in the backup area and creates a new one. Yesterday's version still has its hard link in the other directory tree, so it is still around, and you can get it from yesterday's directory. Today's hard-link operation (the cp -al) will hard-link the new file.
If no files ever get modified, you just keep adding hard links to already existing files, using up very little space. Each time a file is modified or a new one is created, it gets onto the backup disk and then accumulates more hard links until it is modified again.
So each day you have a true snapshot of the file system, but only the files that were actually modified or newly created land on the backup. In a system where files change only sporadically, this is a very elegant solution.
It does not work so well where large files are modified each day, e.g. mail files. It still works, but you keep copying new versions to your backup.
But the real advantage is that if you mount the backup file system back for user access read-only, each user can retrieve a given file on his or her own, and you don't have to deal with rolling back incremental tar files if a user inadvertently deletes an important file.
Thank you all for your help. One more question: if I don't want to compress the files, and just want to keep them exactly the way they are, what do I need to change in the script that linuxfond made?
If you don't want compression, you should try the rsync script. Here's mine. It backs up from a location held in the variable FROM; set the value accordingly.
/backup is my backup file system. Each night we rsync to /backup/current, then do the hard-linking to a dated directory, e.g. 2003-11-09 for tonight.
#!/bin/sh
FROM=<set this to your file system you want to back up>
TO=/backup/current
# the log directory is an assumption -- point it wherever you keep logs
LOG=/backup/log
DATE=`date +%Y-%m-%d`
# rsync the file system to the "current" directory
/usr/bin/rsync -a $FROM $TO > $LOG/$DATE.log 2>&1
# now cd there - we need a relative path
cd $TO
# now construct a "daily" directory name
DAILY_DIR=../$DATE
# make this directory if it doesn't exist
[ -d $DAILY_DIR ] || mkdir $DAILY_DIR
# now make mirror of hardlinks in the "daily" directory
/bin/cp -al . $DAILY_DIR/ >> $LOG/$DATE.log 2>&1
# that's it. Run this from cron once a night or so.
Here is an example of a file that hasn't changed in a while, named monitoring/c2.ps.
You can see that this file exists only once on disk, and that all the entries in the dated directories are links to the same inode, 6701091.
Here's the .login file of the same account, which was changed on 9/17, then again on 9/21, and has been stable since. We see the inode, and with it the actual file, change on those dates, while on the other days the entries continue to be links to the same file.
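To see this yourself, list the file across the dated snapshot directories and compare the inode numbers in the first column of ls -li (the paths follow the /backup layout used above):
# identical inode numbers across the dated directories mean the
# entries are hard links to one and the same file
ls -li /backup/2003-*/monitoring/c2.ps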