Linux - Server: This forum is for the discussion of Linux software used in a server-related context.
But the disk space usage is high on the remote server. To reduce space, I need to zip or tar all the files.
I need to rsync the files first and then remotely zip or tar all the folders.
I am using the command below to compress the files remotely:
Code:
ssh test@192.168.1.1 gzip /tmp/Backup/$DATE/*
but I do not know how to zip the folders remotely. Also, the zip or tar should remove the existing folders, so that only the zip or tar archive remains on the remote server.
None of these tools will remove files mid-operation; the source files can be deleted only after the tar/gzip/whatever has completed successfully.
If you want to rsync gzipped files, you need a version of gzip that supports the --rsyncable option.
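For instance (a sketch with a hypothetical sample file; --rsyncable requires a gzip built with that option, e.g. GNU gzip 1.7 or later, and -k keeps the original):
Code:
```shell
# Compress with rsync-friendly block boundaries, so unchanged regions
# of the file still match between runs (sample path is hypothetical).
rm -f /tmp/rsync_demo.txt /tmp/rsync_demo.txt.gz
echo "sample data" > /tmp/rsync_demo.txt
gzip --rsyncable -k /tmp/rsync_demo.txt
# The resulting /tmp/rsync_demo.txt.gz can then be synced, e.g.:
#   rsync -av /tmp/rsync_demo.txt.gz test@192.168.1.1:/tmp/Backup/
```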
I do not need all the folders to be zipped; only selected folders should be zipped/tarred.
If I tar locally while excluding some folders, I would be left with the tar file on the local side, and I would need a separate cron job just to remove it, which I don't want. So I thought of compressing them remotely in the same script.
Quote:
but I do not know how to zip the folders remotely. Also the zip or tar should remove the existing folders and only the zip or tar must be present in the remote server
If you use zip rather than gzip, you could have it remove everything, e.g.
The zip archive /tmp/Backup/${DATE}.zip would contain everything that was in /tmp/Backup/$DATE previously. After the zip command is finished /tmp/Backup/$DATE and its subdirectories and files would be removed.
Alternatively if you just want to use gzip and do not care about everything being in a single archive, you could:
Code:
ssh test@192.168.1.1 "gzip -r /tmp/Backup/$DATE"
For better space savings, use xz instead:
Code:
ssh test@192.168.1.1 "find /tmp/Backup/$DATE -type f -exec xz {} \;"
EDIT: Actually, given this is a backup I would use bzip2 rather than gzip or xz. Although the result will not be as small as xz compression, bzip2 has better recovery tools (in the form of bzip2recover), should any of the files get damaged or corrupted in the future.
Code:
ssh test@192.168.1.1 "find /tmp/Backup/$DATE -type f -exec bzip2 {} \;"
P.S. I'd also recommend lbzip2 instead of bzip2 for a quicker result.