Quote:
Originally Posted by fedora4002
You can let each wget get one file and there should be no conflict. In fact, you can write a simple script to do it. curl can also be used for this.
In the script:
1) get the list of files that you want to back up.
2) download the files with each file using one connection. You can have several wget running at the same time.
I think you misunderstood what I mean ..
I have ten logfiles of 4 GB each. I know I can run ten wget instances now to fetch those 4 GB files in parallel. However, since the server limits each connection to 128 kB/s, I would like ONE wget session to download ONE file over, say, TEN connections.
Just like a download manager such as GetRight.
To make it more concrete :
Such a download manager (GetRight) splits the file into 4 parts. One connection fetches the part 0-1 GB, another 1-2 GB, etc. of the same file, and the parts are merged afterwards ...
BUT : Since I am on the command line, I need exactly the same with either wget or curl ..
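For what it's worth, curl accepts `-r START-END` to request a single byte range, so one way to approximate this on the command line is to start several curl processes in the background, each fetching a different range of the same file, then concatenate the parts in order. A minimal sketch, with my own assumptions: the URL is passed as the first argument, the part count is 10, the part file names are made up, and the server must honour HTTP Range requests:

```shell
#!/bin/sh
# Sketch of a GetRight-style segmented download using curl's -r (byte range).

# byte_ranges SIZE N : print N "start-end" byte ranges covering 0 .. SIZE-1
byte_ranges() {
    size=$1 n=$2
    chunk=$(( (size + n - 1) / n ))          # round up so N ranges cover everything
    i=0
    while [ "$i" -lt "$n" ]; do
        start=$(( i * chunk ))
        end=$(( start + chunk - 1 ))
        [ "$end" -ge "$size" ] && end=$(( size - 1 ))
        echo "$start-$end"
        i=$(( i + 1 ))
    done
}

URL=${1:-}                                   # pass the file's URL as $1
if [ -n "$URL" ]; then
    # Ask the server for the total size via a HEAD request.
    SIZE=$(curl -sI "$URL" | awk 'tolower($1)=="content-length:"{print $2+0}')
    i=0
    for range in $(byte_ranges "$SIZE" 10); do
        curl -s -r "$range" -o "part.$i" "$URL" &   # one connection per part
        i=$(( i + 1 ))
    done
    wait                                     # let all ten connections finish
    cat part.0 part.1 part.2 part.3 part.4 \
        part.5 part.6 part.7 part.8 part.9 > merged.out
fi
```

Each backgrounded curl is its own connection, so a per-connection speed cap applies to every part separately rather than to the whole file. wget has no dedicated range flag, but it can fake one range per process with `--header='Range: bytes=START-END'`; if installing extra software is an option, aria2c does exactly this natively (e.g. `aria2c -x 10 URL`).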