Hi. There is a server on the net with the IP 3.3.3.3, and I only have https access to it with a username and password. The site is simply a directory listing with a few folders (A, B, ..., Z). I only need 2 of them (A and D), which I want to pull to a local machine with IP 192.168.1.5 and serve through apache. The remote server gets updated at random times, more than 4 times a day, and the files are >1GB. It deletes the old files and keeps only the latest ones.
Right now we fetch the folders and their files manually with wget. I would like to automate it with a script that runs every 30 minutes and checks for differences. If it finds a newer file in folder A, it should delete all the local contents of A and download the new files, then repeat the same process for folder D.
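I was thinking of calling the script from cron every 30 minutes, something like this (the path /usr/local/bin/sync_ad.sh is just where I would put it, not anything that exists yet):
Code:
*/30 * * * * /usr/local/bin/sync_ad.sh >> /var/log/sync_ad.log 2>&1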
At the moment I have:
Code:
#!/bin/bash
wget -N --no-check-certificate --user=atux --password=null -r -np https://3.3.3.3/D -P /var/www/html
wget -N --no-check-certificate --user=atux --password=null -r -np https://3.3.3.3/A -P /var/www/html
There are two problems:
- on the local server it downloads everything under /var/www/html/3.3.3.3/A and /var/www/html/3.3.3.3/D respectively. I would like to have it under /var/www/html/A and /var/www/html/D, without the remote IP 3.3.3.3 in the path.
- it keeps downloading the newer files without deleting the older ones.
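What I have in mind is roughly the sketch below, but I am not sure the wget options or the rm are the right approach (the -nH flag and the blunt rm -rf are guesses on my part, not tested):
Code:
#!/bin/bash
# rough sketch only - wipe the old local copies, then mirror A and D again
USER=atux
PASS=null
BASE=https://3.3.3.3
DEST=/var/www/html

for dir in A D; do
    # throw away the old local copy so only the latest files remain
    rm -rf "$DEST/$dir"
    # -nH should drop the 3.3.3.3/ part from the local path,
    # -np keeps wget from wandering up into the other folders
    wget -N --no-check-certificate --user="$USER" --password="$PASS" \
         -r -np -nH -P "$DEST" "$BASE/$dir/"
done

Wiping A and D before every run would also cover the deletion of the old files, but it means re-downloading everything every 30 minutes, which may be too heavy with files >1GB, so maybe there is a smarter way to compare first.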
Could someone help me out with the script please?