Shell Script for FTP and gzip
I want to be able to write a shell script for downloading files (only *.tar extension) from multiple folders (the sub folder's names may vary) in a FTP site and be able to untar them and then gzip them and then move them to the real folder. Any help appreciated.
If I am not wrong, what you want is:
1.) Download the tar files from an FTP site.
2.) Untar them.
3.) Compress them again with gzip and save them to a particular location.
I am new to Linux but will still try. Please do post whether my approach was right or wrong; that will help me understand the concept better. Put this in a script (vim test.sh):
Code:
wget http://ftp.domainname.com/filename.tar
Naman
Thanks for trying to answer my question.
The FTP site requires a login and password (no anonymous access). Also, I do not know the file names or the number of files: I want to find and download all the *.tar files in a directory structure whose folder names are not constant either. Once I untar the files, the contents have a .Z extension (compressed files), which I have to uncompress with gzip -d <file name>.
Try this then. A plain wget ftp://ftp.domainname.com/*.tar.gz won't fit your case: you want *.tar (not *.tar.gz), the login is not anonymous, and the files sit in varying sub-folders. Use recursive retrieval with an accept filter instead (-r recurses into the sub-folders, -nd drops the remote directory structure, -A keeps only *.tar; fill in your own user name and password):
Code:
wget -r -nd -A '*.tar' --ftp-user=youruser --ftp-password=yourpass ftp://ftp.domainname.com/