backup script
I'm interested in writing a script (I've done PHP, ActionScript, and JavaScript before, but never Perl or shell scripts...) to back up files from a directory name that I supply as an argument.
What I want to achieve is this:

1) The script should examine every file in the directory (and its children) and get the file size of that file.
2) Add the file size to a running total.
3) Stop at 650MB total and write the list of files examined to a file. That list needs to be fed to tar to create an archive of those files.
4) If there are more files in the path to do, repeat the three steps above and save a 2nd, 3rd, 4th, etc. file of filenames, until the path has been thoroughly examined.

I'm using bash. How do I start? PLEASE, tell me if I need to correct my thinking before I begin. Thanks! |
Take a look at stat:

man stat

You can get the file size using: stat -c %s FILENAME |
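To show how that fits together, here is a small sketch that sums the sizes of all regular files under a directory with stat -c %s (GNU coreutils). The demo directory and file names are made up for illustration:

```shell
#!/bin/bash
# Create a throwaway directory with two files of known size.
dir=$(mktemp -d)
printf 'aaaaaaaaaa' > "$dir/a"   # 10 bytes
printf 'bbbbb'      > "$dir/b"   # 5 bytes

# Walk every regular file and accumulate the byte count.
total=0
while IFS= read -r f; do
    size=$(stat -c %s "$f")
    total=$((total + size))
done < <(find "$dir" -type f)

echo "$total"   # 15
```

The while-read loop (instead of `for f in $(find ...)`) keeps filenames with spaces intact; the process substitution `< <(...)` is a bashism, which is fine since you're using bash.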
Not the fastest script around, but it will get you started. If you have a lot of large files, you may get more onto the CD if you use a best-fit type of algorithm.
Code:
#!/bin/sh |
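To flesh that out, here is a minimal sketch of the chunking approach from the question: walk every file, keep a running size total, start a new list file whenever the next file would push the total past the limit, then feed each list to tar with -T. The names (filelist.N, backup-N.tar), the tiny demo data, and the shrunken demo limit are my own choices; assumes GNU find/stat/tar, newline-free filenames, and no single file larger than the limit:

```shell
#!/bin/bash
# Demo data: three 400 KB files (replace with your real directory).
dir=$(mktemp -d)
head -c 400000 /dev/zero > "$dir/one"
head -c 400000 /dev/zero > "$dir/two"
head -c 400000 /dev/zero > "$dir/three"

limit=$((650 * 1024))   # 650 KB for the demo; use $((650 * 1024 * 1024)) for CDs
work=$(mktemp -d)       # where the lists and archives go
chunk=1
total=0
: > "$work/filelist.$chunk"

while IFS= read -r f; do
    size=$(stat -c %s "$f")
    # If this file would overflow the current chunk, start a new one.
    if [ "$total" -gt 0 ] && [ $((total + size)) -gt "$limit" ]; then
        chunk=$((chunk + 1))
        total=0
        : > "$work/filelist.$chunk"
    fi
    printf '%s\n' "$f" >> "$work/filelist.$chunk"
    total=$((total + size))
done < <(find "$dir" -type f)

# One archive per list, via tar's -T (read filenames from a file).
for l in "$work"/filelist.*; do
    n=${l##*.}
    tar -cf "$work/backup-$n.tar" -T "$l"
done

echo "chunks: $chunk"
```

With three 400 KB files and a 650 KB limit, no two files fit in one chunk, so the demo produces three lists and three archives. A first-fit pass like this can waste space at the end of each chunk; that is the gap a best-fit algorithm would close.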