I have 100 GB of space on a shared web server, i.e. a machine on which I only have an FTP account and no shell access. This machine hosts my new package repository. Unfortunately I have no rsync access, so keeping the remote file tree in sync with the local one is a bit of a headache.
To make things worse, I have a slow DSL connection with a ridiculous upload bandwidth of about 40 kB/s. That's the price you pay for living in the southern French countryside.
So far, gFTP and FileZilla haven't behaved the way they should. I guess the solution will be a short script using lftp, ncftp, or whatever command-line client is apt for the job.
Ideally, the script should behave as follows:
- If a remote file is identical to the local copy, proceed to the next file.
- If a remote file is present but its content differs from the local file, re-upload it, overwriting the remote copy.
- If a remote file exists with no local equivalent, delete the remote file.
In short, the whole thing should be equivalent to:
$ rsync -av --delete [localstuff] [remote_server]
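For what it's worth, lftp's mirror command in reverse mode looks like the closest match. A minimal sketch, assuming a hypothetical login myuser, host ftp.example.com, local tree /home/me/repo and remote directory /public_html (adjust all four to your setup):

$ lftp -u myuser,mypass -e "mirror -R --delete --verbose /home/me/repo /public_html; quit" ftp.example.com

Here -R reverses the mirror (upload instead of download), --delete removes remote files that no longer exist locally, and --verbose logs each transfer. One caveat: plain FTP exposes no checksums, so lftp judges "identical" by file size and modification time rather than actual content; a file changed in place without its size or mtime changing would be skipped.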