Old 02-05-2011, 02:38 AM   #1
fkasmani
Member
 
Registered: Dec 2007
Posts: 178

Rep: Reputation: 17
Transferring files online


Hi,

I signed up for an a/c with Mozy, a data backup service, and backed up about 40GB of my data with them. Somehow, their system has marked 38GB of that data for deletion, and it can be deleted at any time.

The backup was done while I was on a working holiday for 5 months. Before I left, I uninstalled the Mozy software from the laptop there, formatted the hard drive and returned the laptop to the friend I'd borrowed it from, so I have no copies of the data I had uploaded to Mozy.

However, they say they cannot unmark my files from deletion; all I can do is try my luck and restore my files before they get deleted.

I have set up a restore request with them; it lasts for 7 days, during which I have to complete my restore.

The problem is, I'm back home, where I'm using the best of the worst internet ('cos that's all that's available here): a 512k downlink that goes down 20 times a day (and Mozy's restore doesn't support resume). On top of that, we keep getting power outages.

Is there any free online service that would host the files for me and let me transfer them directly from my Mozy restore (the 38GB has been packaged into 82 self-extracting .exe files of about 400-500MB each) without my having to download them onto my PC first?
I tried filefactory, but it was giving me some URL error. From my understanding, I don't need to be logged into my Mozy a/c to download the restore files.

In case it helps, I have a web hosting a/c with Hostgator.com (I highly recommend them - they're good). I asked them about this, but they were not sure if it would work; they did, however, suggest using something like cURL or wget, and they even set it up for me, but it was a bit too technical for me to understand and I have only 6 days or less remaining.
 
Old 02-05-2011, 05:38 AM   #2
unSpawn
Moderator
 
Registered: May 2001
Posts: 29,415
Blog Entries: 55

Rep: Reputation: 3600
Quote:
Originally Posted by fkasmani View Post
I have no copies of the data I had uploaded to Mozy.
I hope you have learned from that experience that one should not run software or use services without a basic understanding of what one is using, and that making backups is essential.


Quote:
Originally Posted by fkasmani View Post
The problem is, (..) 512k downlink and keeps going down 20 times a day (..) we keep getting power outages.
One way out could be to ask family, a friend or a colleague (or even rent a laptop and a motel room for a night at a place) with a decent connection and D/L from there. (And I don't really care for practicalities like logistics or cost aspects unless it just proves your data isn't worth all the hassle.)


Quote:
Originally Posted by fkasmani View Post
I tried filefactory, but it was giving me some URL error.
Too vague. Fixing technical errors requires posting commands used, URI's accessed and exact error messages.


Quote:
Originally Posted by fkasmani View Post
(..)I don't need to be logged into my Mozy a/c to download the restore files.
That would imply that everyone could D/L your files...


Quote:
Originally Posted by fkasmani View Post
(..)I have a web hosting a/c (..) they even set it up for me but it was a bit too technical to understand
Too vague again. Post what they wrote to you in their email or give a factual description of what they set up for you, post commands, scripts, et cetera. If this works it's probably easier to accomplish than storing it elsewhere on-line, creating another SPOF in the process.
 
1 member found this post helpful.
Old 02-05-2011, 06:36 AM   #3
fkasmani
Member
 
Registered: Dec 2007
Posts: 178

Original Poster
Rep: Reputation: 17
Quote:
Originally Posted by unSpawn View Post
One way out could be to ask family, a friend or a colleague (or even rent a laptop and a motel room for a night at a place) with a decent connection and D/L from there. (And I don't really care for practicalities like logistics or cost aspects unless it just proves your data isn't worth all the hassle.)
I'm already using the best of what is available here (as I said earlier, I'm using "the best of the worst").



Quote:
Originally Posted by unSpawn View Post
Too vague. Fixing technical errors requires posting commands used, URI's accessed and exact error messages.
Filefactory gives the error, "Invalid URL - HTTP Error Code: 0 - id:5012778"


Quote:
Originally Posted by unSpawn View Post
That would imply that everyone could D/L your files...
Yep, I think so too, and that's why I think they keep this valid for only 7 days.



Quote:
Originally Posted by unSpawn View Post
Too vague again. Post what they wrote to you in their email or give a factual description of what they set up for you, post commands, scripts, et cetera. If this works it's probably easier to accomplish than storing it elsewhere on-line, creating another SPOF in the process.
This is the reply I got from Hostgator:
Quote:
Although I am not 100% sure, I believe that you would download the files using a command line tool like the command curl (or wget, or GET). You would have to use an ssh client on your machine to connect to your account on the server, then on the command line use curl followed by the correct flags and URL that you want downloaded.

First the easy part. To do this you need SSH access to your account and access to the tools named curl, wget, and GET. I have enabled both of these options for your main reseller account.

Second, you have to connect to your server using your reseller account name and password. You can find instructions on how to do this at the following support article:
http://support.hostgator.com/article...use-ssh-access

Third, and here is the hard part. You have to figure out which of the three tools (curl, wget, GET) would work for what you are doing, and figure out all of the flags and options that would download the file for you. My suggestion is that you try using curl (sometimes spelled with caps, as cURL).
Here is the "man" (or manual) page for curl:
http://www.unix.com/man-page/Linux/1/curl/
I don't mind spending some time to familiarize myself with this, but that's exactly the problem - I'm racing against time.

Isn't there any URL-to-FTP software with which I could upload these Mozy restore segments directly into my Hostgator a/c straight from Mozy? I believe this would be the fastest way, as both Mozy and Hostgator would have good data pipes to the internet backbone.
 
Old 02-05-2011, 11:15 AM   #4
unSpawn
Moderator
 
Registered: May 2001
Posts: 29,415
Blog Entries: 55

Rep: Reputation: 3600
Quote:
Originally Posted by fkasmani View Post
I don't mind spending some time to familiarize myself with this
OK. Ensure you can log in to your account as your hosting provider suggested. Ensure you have a file system with 40GB or preferably more free space left ('df -mh' will show). Create a directory, say "mozy": 'mkdir mozy' and go there: 'cd mozy'. Now create a list of files to download (complete URL's, not just the name) and save it in the "mozy" directory as "files.txt". (At this point I would suggest running 'screen' so you can D/L in one window and tail the log file in another but I don't feel like explaining all commands. Try 'yum -y install screen; screen man screen' if you want to.) Now run wget with "files.txt" as input, retry D/L 1 time and output verbose info to a log file: 'wget -t 1 -i ./files.txt -v -a ./wget.log &'. The ampersand makes 'wget' carry on working in the background so you can run 'tail -f ./wget.log' now to keep tabs on the D/L process. (Also see 'man nohup'.)

* If you don't want to find out after downloading 38GB that something went wrong you'll use a "files.txt" containing only 1 entry for testing purposes. Just kill the D/L ('pkill -9 wget') after a few minutes, check the log if it looks OK and only then let it download the whole backup with the right input file.
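Put together, the whole session would look roughly like this (an untested sketch; the example URL is only a placeholder for whatever your Mozy restore page actually links to):
Code:
# check free space on the file system you will download to
df -mh
# create the download directory and go there
mkdir mozy && cd mozy
# files.txt holds one complete URL per line, for example:
#   https://restore.example-placeholder.com/backup-part-01.exe
# start the download in the background: 1 retry, verbose output appended to a log
wget -t 1 -i ./files.txt -v -a ./wget.log &
# follow the log
tail -f ./wget.log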
 
1 member found this post helpful.
Old 02-05-2011, 11:40 AM   #5
fkasmani
Member
 
Registered: Dec 2007
Posts: 178

Original Poster
Rep: Reputation: 17
So firstly I need to have wget on my local PC, right?

Quote:
Originally Posted by unSpawn View Post
OK. Ensure you can log in to your account as your hosting provider suggested.
I'm wondering about this. I just checked how to connect to my Hostgator a/c via SSH using PuTTY, and Hostgator says, "...To access SSH, download winscp or PuTTY. Enter your IP address and port 2222....." Now how will I connect? I'm behind my ISP's NAT and my IP address is 192.168.xxx.xxx.
Checking on whatsmyip.com gives my ISP's NAT address, through which tens of other customers like me are connected.
Quote:
Originally Posted by unSpawn View Post
Ensure you have a file system with 40GB or preferably more free space left ('df -mh' will show).
I have an a/c with 50GB at hostgator.
Quote:
Originally Posted by unSpawn View Post
Create a directory, say "mozy": 'mkdir mozy' and go there: 'cd mozy'. Now create a list of files to download (complete URL's, not just the name) and save it in the "mozy" directory as "files.txt". (At this point I would suggest running 'screen' so you can D/L in one window and tail the log file in another but I don't feel like explaining all commands. Try 'yum -y install screen; screen man screen' if you want to.)
I hope I'm on a linux system with hostgator, otherwise.....

Quote:
Originally Posted by unSpawn View Post
Now run wget with "files.txt" as input, retry D/L 1 time and output verbose info to a log file: 'wget -t 1 -i ./files.txt -v -a ./wget.log &'. The ampersand makes 'wget' carry on working in the background so you can run 'tail -f ./wget.log' now to keep tabs on the D/L process. (Also see 'man nohup'.)
Quote:
Originally Posted by unSpawn View Post
* If you don't want to find out after downloading 38GB that something went wrong you'll use a "files.txt" containing only 1 entry for testing purposes. Just kill the D/L ('pkill -9 wget') after a few minutes, check the log if it looks OK and only then let it download the whole backup with the right input file.
it's really very important that nothing goes wrong - I can't afford to lose my data.
 
Old 02-06-2011, 08:09 AM   #6
unSpawn
Moderator
 
Registered: May 2001
Posts: 29,415
Blog Entries: 55

Rep: Reputation: 3600
Quote:
Originally Posted by fkasmani View Post
So firstly I need to have wget on my local PC, right?
No. You'll D/L files to the remote system only.


Quote:
Originally Posted by fkasmani View Post
I just checked out how to connect to my hostgator a/c via SSH using putty and hostgator says, "...To access SSH, download winscp or PuTTY. Enter your IP address and port 2222....." Now how will I connect
You will connect from your local PC to your hosting provider. So you need the IP address you reach your server account at. If you have configured SSH key authorization in your cPanel, downloaded your private key (.ppk extension!) and noted the TCP/2222 port then go test if you can connect and issue commands like 'wget --help' to see if it's installed.
If you have any client-side problems configuring or establishing a SSH connection do post commands, steps, error messages and log excerpts. If you have any server-side problems configuring or making SSH work posting commands, steps, errors and logs may help but in the end your hosting provider should support you (that's what you're paying for).
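If you have a command-line SSH client handy, the connection test would look something like this (the user and host names are placeholders for your own; in PuTTY you enter the same host and port 2222 in the GUI and point it at your .ppk key):
Code:
# connect on the non-standard port HostGator uses
ssh -p 2222 yourusername@yourserver.hostgator.com
# once logged in, confirm the download tools are available
wget --help | head -n 1
which wget curl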


Quote:
Originally Posted by fkasmani View Post
I have an a/c with 50GB at hostgator.
Cool but you still need to issue 'df -mh' to see on which part of the file system you'll create the D/L directory and files.


Quote:
Originally Posted by fkasmani View Post
I hope I'm on a linux system with hostgator, otherwise.....
...otherwise you'll use this but I doubt that'll be the case.


Quote:
Originally Posted by fkasmani View Post
it's really very important that nothing goes wrong - I can't afford to lose my data.
Note I gave you another option. Another one could be to ask if you could pay them to ship some DVD's or a tape your way with the backup files on it. Determining what (effort) your backup is actually worth, understanding and (wisely) choosing your options, getting to know the system and commands, testing the D/L, it's all your choice and responsibility.
 
1 member found this post helpful.
Old 02-07-2011, 02:04 PM   #7
fkasmani
Member
 
Registered: Dec 2007
Posts: 178

Original Poster
Rep: Reputation: 17
Quote:
Originally Posted by unSpawn View Post
'wget -t 1 -i ./files.txt -v -a ./wget.log &'
Thanks, it seems to be working. As you suggested, I first tried with just one file listed in files.txt. At first the log file recorded an error to do with the security certificate and told me to add '--no-check-certificate', so I tried the command
Code:
wget --no-check-certificate -t 1 -i ./files.txt -v -a ./wget.log &
A few minutes after giving this command, I ran
Code:
tail -f ./wget.log
and it kept listing the transfer progress. Once it was all done, the log confirmed completion, and I verified the size of the transferred file against the source; all seemed OK.

I then replaced files.txt with a new file called urllist.txt, which lists all 83 files, and issued the command
Code:
wget --no-check-certificate -t 1 -i ./urllist.txt -v -a ./wget.log &
but a few minutes after issuing this command I ran
Code:
tail -f ./wget.log
and it didn't show anything, so I checked by logging into the server via FTP and it seemed to be doing the job - 1 file had completed and the 2nd was partway through the transfer. But when I logged out of the FTP session, the SSH session closed down as well, so I've started the process again.

However, after issuing the tail command again, still nothing is happening, so I really don't know how the transfer is going - whether it's even running or not.

THIS IS AN UPDATE AFTER ABOUT 30MIN INTO THE JOB:

Surprisingly, I've just noticed that the PuTTY window has disappeared, and this time I hadn't even logged into FTP or anything. Any idea? Also, is there anything I can do so the job does not get interrupted if something happens at my end - e.g. a power or internet outage?

Last edited by fkasmani; 02-07-2011 at 02:37 PM. Reason: UPDATE
 
Old 02-07-2011, 04:01 PM   #8
unSpawn
Moderator
 
Registered: May 2001
Posts: 29,415
Blog Entries: 55

Rep: Reputation: 3600
Quote:
Originally Posted by fkasmani View Post
is there anything I can do so the job does not get interrupted if something happens at my end
There are a few ways. Three common ones would be to:
- run the whole command as an 'at' job, or
- use 'nohup', or
- use 'screen'.

Here is a generic script that can be used with all three. Replace the value "/mozy" with the FULL path to the download directory, save the script in your download directory as "dl.sh", then make it executable with 'chmod 0700 dl.sh':
Code:
#!/bin/bash --
DLDIR="/mozy"
[ -d "${DLDIR}" ] || { echo "No such dir \"${DLDIR}\", exiting."; exit 1; } && cd "${DLDIR}"
wget --no-check-certificate -t 1 -i "${DLDIR}/files.txt" -v -a "${DLDIR}/wget.log"
exit 0
At job
Check 'man at' for a basic understanding of what the command does. Check if the 'at' service runs with 'service atd status' and if it doesn't run: 'service atd start'. The command to use is '/usr/bin/at -f /path/to/dl.sh now'. Replace "/path/to/" with the FULL path to the download directory. That's all. Once issued the command runs in the background (check with 'atq').

nohup
Check 'info nohup' for a basic understanding of what the command does. The command to use is 'nohup /path/to/dl.sh &'. Replace "/path/to/" with the FULL path to the download directory. That's all. Once issued the command runs in the background (check with 'pgrep -lf wget; tail -f ~/nohup.out').

screen
See if 'screen' is installed with 'which screen' or 'rpm -qi screen'. If screen is not installed issue 'yum -y install screen'. Check 'man screen' for a basic understanding of what the command does. The command to use is 'screen /path/to/dl.sh'. Replace "/path/to/" with the FULL path to the download directory. Once issued the command runs in the foreground inside 'screen'. 'screen' uses a default attention command key combo of the control plus the "a" key which you will press at the same time. Any single char typed after that means a command 'screen' understands. I will abbreviate CTRL+A as CA:
- open up a new window by using the key combo "CA c". Now type 'tail -f /path/to/wget.log' and notice the log getting filled.
- Now type "CA 0" to return back to the first window where you started the 'wget' command.
- Now type "CA 1", then type "CA d" to log out of screen. Now log out of your SSH session.
- Reestablish an SSH connection with the server (and I hope by ${deities} you do not log in as root), log back in and issue 'screen -DDRR' (it's overkill, but just make a habit of using those args, OK?). Notice how you are now back inside screen, looking at your wget.log. Always exit screen with "CA d". Do NOT use CTRL+Z, CTRL+D or "exit" unless you really want to suspend or exit.

* With all three examples you can tail the wget log file and you can log out after issuing the command. If you log back in the wget command will be running until completion or until an unrecoverable error (up the retries value to 3?) occurs.
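For instance, the 'nohup' variant end to end would look roughly like this (the path is an example; substitute the FULL path to your own download directory):
Code:
cd /home/yourusername/mozy
chmod 0700 dl.sh
nohup /home/yourusername/mozy/dl.sh &
# verify it is still running and watch progress
pgrep -lf wget
tail -f /home/yourusername/mozy/wget.log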
 
1 member found this post helpful.
Old 02-07-2011, 04:43 PM   #9
fkasmani
Member
 
Registered: Dec 2007
Posts: 178

Original Poster
Rep: Reputation: 17
'screen' was not working, so I wrote to Hostgator and this is the reply:
Quote:
I apologize for the inconvenience, but screen is not permitted by the shared account servers shell.
 
Old 02-07-2011, 06:44 PM   #10
unSpawn
Moderator
 
Registered: May 2001
Posts: 29,415
Blog Entries: 55

Rep: Reputation: 3600
One option down, two left to try...
 
Old 02-08-2011, 03:22 AM   #11
fkasmani
Member
 
Registered: Dec 2007
Posts: 178

Original Poster
Rep: Reputation: 17
I understand that HostGator cannot install screen for me as it's a server-wide thing while I have a shared a/c with them. Knowing how helpful HostGator have always been, I'm sure they would have installed it for me if it was possible.

Quote:
Originally Posted by unSpawn View Post
One option down, two left to try...
unSpawn, I ran the following commands: 'man at' and 'info nohup'; both responded, so I have access to them.

Let's say I go with the script you've made. Where do I place the script and how do I execute it from the SSH shell prompt?

But I still don't understand two things with the 'wget --no-check-certificate -t 1 -i ./files.txt -v -a ./wget.log &' command I'm giving:
  1. why does the ssh terminal out-of-the-blue disappear and disconnect the SSH session along with the transfer?
  2. why is it that now when I'm giving the tail command, it doesn't display anything?
unSpawn, thanks so much for going out of your way to help me.
 
Old 02-08-2011, 12:55 PM   #12
fkasmani
Member
 
Registered: Dec 2007
Posts: 178

Original Poster
Rep: Reputation: 17
Quote:
Originally Posted by unSpawn View Post
Replace the value "/mozy" with the FULL path to the download directory and save the script in your download directory as "dl.sh", then make it executable with 'chmod 0700 dl.sh':
Code:
#!/bin/bash --
DLDIR="/mozy"
[ -d "${DLDIR}" ] || { echo "No such dir \"${DLDIR}\", exiting."; exit 1; } && cd "${DLDIR}"
wget --no-check-certificate -t 1 -i "${DLDIR}/files.txt" -v -a "${DLDIR}/wget.log"
exit 0
I don't seem to be getting the path right - I keep getting the error,
Code:
No such dir "/mozy", exiting.
Let me outline how I see the directory layout. Firstly, I'm on a shared server. Secondly, when I log into the SSH terminal and do a dir command, I get what is shown in the attached snapshot (terminal_dir.png). I assume this is the 'root'. In here I have made a directory called mozy; the files.txt is in this mozy folder, and this is the folder into which I would like the files downloaded. So I placed dl.sh in the mozy folder and executed it, but got the error. Then I placed dl.sh in this root folder and executed it, but got the error. There's also the public_html folder, in which I've made another folder called mozy (just in case it doesn't work elsewhere, I don't mind making the mozy in public_html the download folder), so I even tried executing dl.sh from the mozy in public_html, but no luck - same error. (BTW, the mozy folder in public_html also has a copy of files.txt.) What should I do?
Attached Thumbnails: terminal_dir.png
 
Old 02-08-2011, 01:52 PM   #13
unSpawn
Moderator
 
Registered: May 2001
Posts: 29,415
Blog Entries: 55

Rep: Reputation: 3600
Quote:
Originally Posted by fkasmani View Post
why does the ssh terminal out-of-the-blue disappear and disconnect the SSH session along with the transfer?
To know why PuTTY disappears you need to enable logging (it's somewhere in the options). The D/L gets disconnected because it runs in your current shell session. If the session terminates, it takes with it all the active foreground tasks still attached to it, which won't happen if you hand the D/L command off to 'at' or 'nohup' (or 'disown').
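If you want to keep the exact command you have been typing, the 'disown' route would look roughly like this (run from inside your mozy directory, with the same files.txt as before):
Code:
wget --no-check-certificate -t 1 -i ./files.txt -v -a ./wget.log &
# detach the background job so a closed session does not send it SIGHUP
disown
# you can now log out; check later with: pgrep -lf wget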


Quote:
Originally Posted by fkasmani View Post
why is it that now when I'm giving the tail command, it doesn't display anything?
You haven't posted any command you run and no (error) messages: give me factual information.


Quote:
Originally Posted by fkasmani View Post
I don't seem to be getting the path right - I keep getting the error,
Code:
No such dir "/mozy",exiting.
That's because you didn't change the path?


Quote:
Originally Posted by fkasmani View Post
when I log into the SSH terminal and I do a dir command, I get as per the attached snapshot (terminal_dir.png). I assume this is the 'root'.
If you log in as root (${deities} forfend!) then yes, it will be /root. If not (applause) then it will be /home/${LOGNAME}. BTW, you know what assuming makes ;-p One way to figure it out would be to run 'readlink -f ~/' or 'echo $HOME', or run 'pwd' while inside that directory.
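For example, if you are not root you would see something like this (the user name is just a placeholder):
Code:
$ echo $HOME
/home/yourusername
$ cd ~/mozy && pwd
/home/yourusername/mozy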


Quote:
Originally Posted by fkasmani View Post
In here I have made a directory called mozy and in this mozy folder, is the files.txt and this is the folder in which I would like to have the files downloaded.
The safest way is to just set the value to "~/mozy":
Code:
#!/bin/bash --
DLDIR="~/mozy"
[ -d "${DLDIR}" ] || { echo "No such dir \"${DLDIR}\", exiting."; exit 1; } && cd "${DLDIR}"
wget --no-check-certificate -t 1 -i "${DLDIR}/files.txt" -v -a "${DLDIR}/wget.log"
exit 0
 
1 member found this post helpful.
Old 02-08-2011, 02:35 PM   #14
fkasmani
Member
 
Registered: Dec 2007
Posts: 178

Original Poster
Rep: Reputation: 17
Quote:
Originally Posted by unSpawn View Post
You haven't posted any command you run and no (error) messages: give me factual information.
That's because the tail command doesn't produce any message or display of any kind - just as if it's waiting for something.

Quote:
Originally Posted by unSpawn View Post
If you log in as root (${deities} forfend!) then yes, it will be /root. If not (applause) then it will be /home/${LOGNAME} BTW you know what assuming makes ;-p One way to figure out would be to 'readlink -f ~/' or 'echo $HOME' or run 'pwd' when inside that directory.

Quote:
Originally Posted by unSpawn View Post
The safest way is to just set the value to "~/mozy":
Code:
#!/bin/bash --
DLDIR="~/mozy"
[ -d "${DLDIR}" ] || { echo "No such dir \"${DLDIR}\", exiting."; exit 1; } && cd "${DLDIR}"
wget --no-check-certificate -t 1 -i "${DLDIR}/files.txt" -v -a "${DLDIR}/wget.log"
exit 0
Running with the "~/mozy" gives the same error.
I ran 'echo $HOME' while in the mozy directory and it displayed '/home/my_username', and 'pwd' displayed '/home/my_username/mozy', so I set it to 'DLDIR="/home/my_username/mozy"', then did 'cd ../' from the mozy directory, uploaded the script, set the permissions and executed it. It seems to work. But again, when I run the tail command, the response is nothing (please see the attached snapshot). Could it be because I'm running the tail command from a directory that doesn't contain the wget.log?
Since I couldn't tell whether the transfer was taking place, I logged into the mozy folder with my FTP client, saw one of the restore files there and kept refreshing; the sizes of both the restore file (being copied - this is just the first of 83) and the wget.log file were increasing.
Does this now mean that the transfer will continue even if I shut down my PC?

THIS IS AN UPDATE AFTER ABOUT 15MIN
Again the SSH (PuTTY) terminal disappeared; I logged into the mozy folder via my FTP client and see that the transfer has stopped. I keep refreshing there but see no changes.

GREAT, I JUST REALISED MY MISTAKE
I was just executing ./dl.sh instead of running it through the nohup command. Really sorry about that. I guess it's working now: I closed the SSH/PuTTY session, then logged into the mozy folder via my FTP client, kept refreshing and saw the file sizes increasing. I'll keep posting updates.

THIS IS AN UPDATE AFTER ABOUT 12HRS
I'm noticing from the FTP client that it transfers no more than 1 complete file. It does the first file successfully, starts on the second one and halfway through the transfer stops. I've made three attempts and experienced the same problem. I've even changed the listing in the files.txt so that a different file is now listed second, but no luck.
Attached Thumbnails: terminal_dir.png

Last edited by fkasmani; 02-09-2011 at 06:46 AM. Reason: IMPORTANT UPDATE
 
Old 02-09-2011, 09:37 AM   #15
fkasmani
Member
 
Registered: Dec 2007
Posts: 178

Original Poster
Rep: Reputation: 17
I've decided to go the manual way. I've made 83 individual .txt files, each containing just one URL, uploaded them into the download folder, and am executing them one at a time like this:
Code:
wget --no-check-certificate -t 1 -i ./file4.txt -v -a ./wget4.log &
wget --no-check-certificate -t 1 -i ./file5.txt -v -a ./wget5.log &
wget --no-check-certificate -t 1 -i ./file6.txt -v -a ./wget6.log &
wget --no-check-certificate -t 1 -i ./file7.txt -v -a ./wget7.log &
wget --no-check-certificate -t 1 -i ./file8.txt -v -a ./wget8.log &
So far so good - the 3rd file is being transferred.
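In case it helps anyone trying the same thing: a small wrapper in the spirit of unSpawn's dl.sh could presumably run these one-file lists one after the other under nohup, so a dropped session doesn't stop the whole batch (an untested sketch; the script name and the fileN.txt numbering are just illustrative):
Code:
#!/bin/bash --
# dl-all.sh - run as: nohup /home/yourusername/mozy/dl-all.sh &
cd /home/yourusername/mozy || exit 1
for n in $(seq 1 83); do
    wget --no-check-certificate -t 1 -i "./file${n}.txt" -v -a "./wget${n}.log"
done
exit 0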
 
  

