
leeko 08-14-2008 04:32 PM

auto-ftp to my website?
Hi everyone. I have a slightly convoluted question, and I have just enough Linux knowledge to know roughly how to do it (but not enough to figure out exactly how!).

I use Zim desktop wiki on my Asus EeePC (xandros) as my catch-all program (note-taking, manuscript preparation, todo list, etc). It's really useful, and I've just figured out how to make it export to webpages from the command-line, as follows:


$ zim --export format=html,template=/usr/share/zim/templates/Presentation.html,dir=/home/user/My\ Documents/ZimExport,index=index ~/My\ Documents/Zim

Now, what I'd like to do is make zim do this periodically, then auto-ftp the resulting folder to my web server, so that I can access all of my notes from any computer (read-only, of course).

If I can password-protect the resulting website, that would be even better (though not absolutely necessary).

I'm guessing that I need to use a script and crontab to do this, but my knowledge level doesn't extend that far. Here are my sticking-points:

1. I'm not sure of the syntax for the script, or where it should be put.
2. I'm not sure whether EeePC xandros has a built-in commandline ftp program (similar to sendmail for email), that I can incorporate into my script. If not, can anyone suggest a lightweight option?
3. I'm not sure of the syntax for crontab. I'd like it to do this maybe twice a day. If the laptop is not on when it's due to happen, can it be forced to do it on the next boot?

Thanks for any advice you can offer. I'm loving how flexible this wee beastie is!


edit: Forgot one more question - if the script runs when I don't have an internet connection, what happens? I'd prefer not to have dialogue boxes popping up every time it fails. Can I redirect error messages to a log, or is that the default behaviour? Thanks!

ilikejam 08-15-2008 10:24 AM


I'd use 'lftp' for this (don't know if it's included on the Eee, but it's fairly standard on Linux).

Your script (I'm calling it 'zimupload', and putting it in '/home/user/My Documents/bin') would look something like:

zim --export format=html,template=/usr/share/zim/templates/Presentation.html,dir=/home/user/My\ Documents/ZimExport,index=index ~/My\ Documents/Zim

cd "/home/user/My\ Documents/ZimExport"

lftp -c 'open -u UserName,Password HostName; cd /remote/side/destination/directory; mirror -R -e -L'

Your cron entry would look something like:
32 10,22 * * * /home/user/My\ Documents/bin/zimupload

To summarise all this, the cron entry given above would run the script at 10:32 and 22:32 every day.
The script first exports your zim wiki, then mirrors the export directory ('/home/user/My Documents/ZimExport') to the '/remote/side/destination/directory' directory on the FTP site.
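For reference, the five cron fields in that entry break down like this (a commented sketch of the same line; install it with 'crontab -e' and check it with 'crontab -l'):

```
# minute  hour   day-of-month  month  day-of-week  command
# 32      10,22  *             *      *            (run at 10:32 and 22:32 every day)
32 10,22 * * * /home/user/My\ Documents/bin/zimupload
```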

Be careful with the lftp script details - the way it's set up '/remote/side/destination/directory' will end up a direct copy of '/home/user/My Documents/ZimExport', so if you get it wrong, you can clobber stuff you didn't want to.

If it runs when there's no Internet connection, the zim export will succeed, but the lftp session will fail quietly - cron jobs run non-interactively, so you shouldn't see any dialogue boxes pop up.
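On the logging question from your edit: cron normally mails or discards a job's output rather than showing it, but you can redirect everything to a log yourself. A minimal sketch (the log path is just an example, and 'false' stands in for a failing lftp session):

```shell
# Redirecting on the cron line itself would look like:
#   32 10,22 * * * /home/user/My\ Documents/bin/zimupload >> /tmp/zimupload.log 2>&1
# The same redirection inside a script, demonstrated with a failing command:
log=/tmp/zimupload.log
{
    echo "upload starting"
    false                      # stand-in for an lftp session with no network
    echo "exit status: $?"
} >> "$log" 2>&1
```

Nothing reaches the terminal; both lines (and the failure's exit status) end up in the log.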

Hit me back if none of that made any sense.


leeko 08-15-2008 01:52 PM

You, sir, are a star. :D

This is exactly what I'm looking for. I've hit a couple of minor snags, though:

1. The second line of the script has a small syntax error - I had to remove the quotes to make it work (I think maybe it didn't like the space after the first quote?). It works now.

2. I tested the third line, with the following command:


lftp -c 'open -u USER,PASSWORD 192.168.XXX.XX; cd website/Zim; mirror -R -e -L'
It appeared to stall with the message:


cd `/website/Zim/.zim' [Delaying before reconnect: 23]
It counts down from 40 seconds, tries again, then repeats. When I check the remote directory, it has copied all the html files from the base directory, but none of the subdirectories or the files contained in them.

From the lftp man page, it looks like the command should mirror the local directory recursively without any changes. I'm not sure what's stopping it from doing so.

Any ideas?

Thanks again,
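(On snag 1: the culprit is combining double quotes with a backslash escape. Inside double quotes the backslash before the space is kept literally, so the shell looks for a directory actually named 'My\ Documents'. Either quoting style alone works; a quick illustration with a throwaway path:)

```shell
mkdir -p "/tmp/demo/My Documents"        # a directory whose name contains a space
cd "/tmp/demo/My Documents"   && echo "double quotes: ok"
cd /tmp/demo/My\ Documents    && echo "backslash escape: ok"
cd "/tmp/demo/My\ Documents" 2>/dev/null || echo "both together: no such directory"
```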


ilikejam 08-15-2008 02:09 PM


Can you do:
$ ftp 192.168.XXX.XX
cd /website/Zim
ls -la
cd .zim

and post the output?


leeko 08-15-2008 02:18 PM

Hi Dave, thanks for the quick reply. The EeePC doesn't have an ftp program by default. Here's the output from lftp:


/home/user> lftp
lftp :~> open -u website,XXXXXXX
lftp website@> cd /website/Zim/
cd ok, cwd=/website/Zim
lftp website@> ls -la
-rwxrw----  1 website  everyone    2544 Aug 15 18:25 Computer.html
-rwxrw----  1 website  everyone    3102 Aug 15 18:25 Home.html
-rwxrw----  1 website  everyone    2498 Aug 15 18:25 Misc.html
-rwxrw----  1 website  everyone    2443 Aug 15 18:25 Personal.html
-rwxrw----  1 website  everyone    2395 Aug 15 18:25 TODO.html
-rwxrw----  1 website  everyone    2426 Aug 15 18:25 WishList.html
-rwxrw----  1 website  everyone    2543 Aug 15 18:25 Work.html
-rwxrw----  1 website  everyone    3523 Aug 15 18:25 index.html
lftp website@> cd .zim
lftp website@> quit

When I try to cd to .zim, it does the same thing (counts down from 40). I had to press ctrl+c to break it (that's why it says "interrupt").

I'm not sure why it's trying to cd to .zim, though. I don't have a .zim directory on the local side...



ilikejam 08-15-2008 02:55 PM

I can only imagine the FTP server is set up not to allow dot-directories, or maybe it's set up not to allow _any_ directory creation.

Try doing 'rm -rf /home/user/My\ Documents/ZimExport/.zim', then try the lftp mirror command again, see if it gets any further.


leeko 08-15-2008 08:02 PM

Yup, that did the trick!

I added in that line to the script (rm -rf .zim), and now it works a charm :)

Thanks for all your help!
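(Pulling the thread together, the finished zimupload script presumably ended up something like this sketch - USER, PASSWORD and HOST are placeholders for the server details, and the remote path follows the example used above:)

```
#!/bin/sh
# Export the wiki to HTML (the zim command from earlier in the thread)
zim --export format=html,template=/usr/share/zim/templates/Presentation.html,dir=/home/user/My\ Documents/ZimExport,index=index ~/My\ Documents/Zim

# Quotes removed per the fix above; the backslash escapes the space
cd /home/user/My\ Documents/ZimExport || exit 1

# The FTP server rejects the dot-directory, so drop it before mirroring
rm -rf .zim

# USER, PASSWORD and HOST are placeholders
lftp -c 'open -u USER,PASSWORD HOST; cd /website/Zim; mirror -R -e -L'
```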

