I need to back up several folders and files on my Linux box. I have read about lftp, but every example I find covers local/path to remote/path: ONE folder and its SUBfolders.
rsync is your friend. lftp will not be secure and does not offer the flexibility that rsync provides. Use the right tool for the job; rsync is the tool for this job.
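For example, a typical rsync run can take several source folders at once (a minimal sketch; the paths and the remote host here are placeholders, not from this thread):
Code:
# Copy several local directories to a remote Linux host over SSH.
# /etc, /home/user/docs and backup@remotehost:/backups/ are hypothetical.
rsync -av /etc /home/user/docs backup@remotehost:/backups/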
The thing is that, for the moment, we have to do the backup to an FTP server running Windows; that is why I would like to use lftp. In the past I have used rsync, but with a second Linux server.
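For what it's worth, lftp's mirror -R (reverse mirror) uploads a whole directory tree, and several folders can be handled with one mirror line per folder; a sketch, with made-up credentials and paths:
Code:
# backup.lftp -- run with: lftp -f backup.lftp
# user, pass, serverip and all the paths are placeholders.
open -u user,pass serverip
mirror -R /etc/apache2 backup/apache2
mirror -R /home/user/docs backup/docs
bye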
In the log file lftp.out, only the successful transfers are registered, if there are any; but if there is an error because of the path or something else, it doesn't show the failed transfer.
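One way around that (a sketch, assuming a script file like the backup.lftp above) is to capture lftp's stderr, where the error messages go, and check its exit status:
Code:
# Send both stdout and stderr to the log so errors are recorded too.
lftp -f backup.lftp > lftp.out 2>&1
if [ $? -ne 0 ]; then
    echo "lftp reported an error, check lftp.out" >&2
fi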
If this directory is big enough that compressing it takes seconds or minutes, will the script wait until tar finishes before uploading, or will it try to upload the tar file while it is still being compressed?
It will wait. bash is linear; in other words, it goes line by line, in order, from top to bottom, and it will not move on to the next line until the last one has completed. A few other things to keep in mind if you are doing this for backups: you might want to both encrypt the file and create an md5sum to be checked before/after each of the steps, as well as verify the archive after compression.
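As a sketch of that sequencing (the paths and filenames here are hypothetical), each line below only starts once the previous one has finished:
Code:
# Compress; tar must finish before the next line runs.
tar -czf /tmp/docs.tar.gz /home/user/docs
# Verify the archive is readable before shipping it anywhere.
tar -tzf /tmp/docs.tar.gz > /dev/null || exit 1
# Record a checksum to compare against after the transfer.
md5sum /tmp/docs.tar.gz > /tmp/docs.tar.gz.md5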
Things are starting to get a bit larger, but if you wish to do that, I can help to an extent; I've done a few of those types of scripts in the past. Typically, though, I'll just compress, verify the compression, encrypt, and transmit; that is, if I am backing up daily. If I am only backing up weekly or less often, I'll add the md5sum check after encryption and verify it after transmission.
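That pipeline might look something like this (a sketch only; the passphrase file and filenames are made up, and gpg's symmetric mode is just one way to do the encryption step):
Code:
tar -czf docs.tar.gz /home/user/docs           # compress
tar -tzf docs.tar.gz > /dev/null || exit 1      # verify the compression
# Symmetric encryption; /root/backup.pass is a hypothetical passphrase file.
gpg --batch --yes --symmetric --cipher-algo AES256 \
    --passphrase-file /root/backup.pass -o docs.tar.gz.gpg docs.tar.gz
md5sum docs.tar.gz.gpg > docs.tar.gz.gpg.md5    # checksum after encryption
# ...transmit both files, then run 'md5sum -c' on the far side.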
I think it won't go further. Let's see how this works. Maybe I could try using Samba to share the folders we need to back up.
thanks
Instead of Samba, create the share on the MS system and mount it locally on the Linux system as a cifs mount point. You can move the files around much more easily that way.
Remember, Samba is not native to Linux. cifs is not much better, but if you mount it locally, the Linux box will treat it as a local drive, not a network drive.
Then you would be able to use rsync; but look up in the man page how to point rsync at a local /tmp instead of the remote temporary files it would typically create.
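A rough sketch of that setup (the share name, mount point, and credentials are all placeholders): mount the Windows share with cifs, then keep rsync's temporary files on local disk with --temp-dir:
Code:
# Mount the Windows share locally; //winserver/backup is hypothetical.
mount -t cifs //winserver/backup /mnt/backup -o username=user,password=pass
# Sync to the mount, keeping rsync's temp files on the local disk.
rsync -av --temp-dir=/tmp /home/user/docs /mnt/backup/docs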
I would like to keep some old backups on the server, so the idea is to create a variable in the file and use it to create several directories on the FTP server, e.g. backup_552013, backup_542013, ...
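Something along these lines might work (a sketch; the date format, credentials, and paths are guesses, not taken from the thread):
Code:
# Build a dated directory name, e.g. backup_05042013.
bkp_dir="backup_$(date +%m%d%Y)"
# Create the directory on the FTP server and upload into it.
lftp -u user,pass serverip <<EOF
mkdir -p $bkp_dir
mirror -R /home/user/docs $bkp_dir/docs
bye
EOF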
I'm just running script.sh, not invoking it with lftp -f...
But the issue now is removing folders older than X days..
Quote:
rm_date=`/bin/date +%m-%d-%y -d '5 days ago'`
lftp -u xx,yy serverip
rm -r -f test/${rm_date}
and it doesn't work...
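One likely cause (not confirmed in the thread): once lftp connects, it takes over the terminal, so the rm on the next line is never fed to lftp at all; it runs in the local shell after lftp exits. Passing the command to lftp directly, for instance with -e, avoids that; a sketch using the same placeholder credentials:
Code:
rm_date=$(/bin/date +%m-%d-%y -d '5 days ago')
# -e hands the command string to lftp itself; 'bye' closes the session.
lftp -u xx,yy -e "rm -r -f test/${rm_date}; bye" serverip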
The other issue is: if I connect directly to the FTP server, I can list the folder with ls, then type rm -rf folderName, and it deletes some of the files but not ALL. Only some of them are hidden files, but others aren't... not sure why it doesn't delete ALL...
The find command will continue to execute until it runs out of items to remove. Instead of removing first (FYI, rm -rf is the same as rm -r -f), you might want to try it with an ls -laF:
Code:
find test/ -mtime +5 -exec ls -laF '{}' \;
This way you can print out the files and see if there are any that meet the date restriction. If that works, then replace ls -laF with rm -rf.
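In other words (note this runs against a local path, e.g. the cifs mount, not over the FTP connection):
Code:
# Removes anything under test/ not modified in the last 5 days.
# find may print warnings for directories it has already removed.
find test/ -mtime +5 -exec rm -rf '{}' \;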