Linux - General
This Linux forum is for general Linux questions and discussion. If it is Linux related and doesn't seem to fit in any other forum, then this is the place.
Hey, I just finished setting up a server and want it to start backing up the computers on the network. It's a Linux server, of course, with about 1.1TB of usable storage for backups, and I want to automate the backup process as much as possible. Most of the systems on the network share their entire drives and partitions with password access, so there are a lot of files to move. However, I only want the initial full copy to take forever once. After that, every time I run another backup it should automatically check each file's last-modified date and decide whether the copy on the workstation is newer. So basically, whenever I run this backup script in the future, I want it to copy over files that don't exist on the server yet because they are new, and overwrite old files on the server with the newer versions from the workstations.
Are there any scripts out there that just need slight modifications to work? Currently, I am just moving files over like this:
mount -t smbfs -o username=userforpc //pcname/c$ /mnt/pcname/C_Drive
password:
So pretty much, I want to be able to customize the source and destination for certain folders, and then have it copy only the files necessary, that is, new or modified files.
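(For the record, one way to avoid the interactive password prompt when automating a mount like the one above is a credentials file. A minimal sketch, assuming a hypothetical credentials file at /root/.smbcred-pcname and the same mount point as before:

# /root/.smbcred-pcname, readable by root only (chmod 600), containing:
#   username = userforpc
#   password = yourpassword

mount -t smbfs -o credentials=/root/.smbcred-pcname //pcname/c$ /mnt/pcname/C_Drive

With that in place the mount line can run from a script without anyone typing the password.)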
Just took a peek at rsync. Does it absolutely need an SSH pipe? The server is one of the very few systems in the office running Linux, and the rest of the workstations are a mixture of Windows NT4 and Windows XP systems.
No, you can mount the remote share and run rsync locally. That does reduce your security slightly, and remember that it is also possible to run an SSH server on Windows.
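A minimal sketch of what that could look like, assuming the share is already mounted at /mnt/pcname/C_Drive and the backups live under a hypothetical /backup/pcname directory:

#!/bin/sh
# Hypothetical paths; adjust per workstation.
SRC=/mnt/pcname/C_Drive/
DEST=/backup/pcname/

# -a preserves times and permissions, -u skips files that are already
# newer on the destination, -v lists what was transferred. After the
# first full run, only new or changed files get copied.
rsync -auv "$SRC" "$DEST"

Dropping a line like that into a script and running it from cron would take care of the "automatic" part.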