Old 06-14-2006, 12:16 PM   #1
utw-mephisto
Member
 
Registered: Apr 2005
Posts: 93

Rep: Reputation: 16
Multiple Connections using wget


Is that possible? A server I need to make some backups from limits each connection to 128k but allows multiple connections. Is there a way?

Or can you suggest any other command-line solution?
 
Old 06-14-2006, 01:08 PM   #2
fedora4002
Member
 
Registered: Mar 2004
Posts: 135

Rep: Reputation: 15
Quote:
Originally Posted by utw-mephisto
Is that possible? A server I need to make some backups from limits each connection to 128k but allows multiple connections. Is there a way?

Or can you suggest any other command-line solution?

Can you just run multiple instances of wget?
 
Old 06-14-2006, 01:18 PM   #3
utw-mephisto
Member
 
Registered: Apr 2005
Posts: 93

Original Poster
Rep: Reputation: 16
I sure can, but I need to transfer HUGE files, and running wget several times to download one file would not do the trick...
 
Old 06-14-2006, 01:35 PM   #4
fedora4002
Member
 
Registered: Mar 2004
Posts: 135

Rep: Reputation: 15
You can let each wget instance fetch one file, and there should be no conflict. In fact, you can write a simple script to do it; curl can also be used for this.
In the script:
1) Get the list of files that you want to back up.
2) Download the files, one connection per file. You can have several wget processes running at the same time, as in the sketch below.
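
A minimal sketch of such a script, assuming the URLs sit one per line in a hypothetical urls.txt and a cap of 5 parallel downloads:

Code:
#!/bin/bash
# Fetch every URL listed in urls.txt, one wget process per file,
# keeping at most MAX_JOBS downloads running at once.
# urls.txt and MAX_JOBS are assumptions, not anything server-specific.
MAX_JOBS=5

while read -r url; do
    wget -q "$url" &                        # one connection per file
    # throttle: wait while MAX_JOBS downloads are already running
    while [ "$(jobs -r -p | wc -l)" -ge "$MAX_JOBS" ]; do
        sleep 1
    done
done < urls.txt

wait   # block until the last downloads finish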

Last edited by fedora4002; 06-14-2006 at 01:43 PM.
 
Old 06-14-2006, 01:44 PM   #5
fedora4002
Member
 
Registered: Mar 2004
Posts: 135

Rep: Reputation: 15
Someone else might have a better solution.
 
Old 06-16-2006, 12:10 PM   #6
utw-mephisto
Member
 
Registered: Apr 2005
Posts: 93

Original Poster
Rep: Reputation: 16
Quote:
Originally Posted by fedora4002
You can let each wget instance fetch one file, and there should be no conflict. In fact, you can write a simple script to do it; curl can also be used for this.
In the script:
1) Get the list of files that you want to back up.
2) Download the files, one connection per file. You can have several wget processes running at the same time.
I think you misunderstood what I mean...

I have ten logfiles of 4 GB each. I know I can use ten wget instances to get those ten files. However, since the server only allows 128k per connection, I would like ONE wget session to download ONE file over something like TEN connections.

Like a download manager such as GetRight.

To make it clearer:

This download manager (GetRight) splits the file into 4 parts. One connection gets the range 0-1 GB, another gets 1-2 GB, etc. of the same file, and it merges the parts at the end...

BUT: since I am on the command line, I need exactly the same with either wget or curl...
 
Old 01-29-2009, 04:43 AM   #7
disclaimer
LQ Newbie
 
Registered: Jan 2009
Posts: 1

Rep: Reputation: 2
You might use aria2c, which supports multiple connections.
Usage is like:
aria2c -s number-of-connections URL
(-s splits a single download across that many connections; -j only limits how many separate downloads run at once.)
And of course you can simply man aria2c.
Its package name on Debian is aria2, and it is installed as /usr/bin/aria2c.
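
For example, to pull one big logfile over ten connections (the URL below is just a placeholder; -x is needed on newer aria2 releases, where connections per server are capped separately):

Code:
# -s 10 : split the file into 10 segments, one connection each
# -x 10 : allow up to 10 connections to the same server
aria2c -x 10 -s 10 http://example.com/logs/access.log.1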
 
1 member found this post helpful.
Old 02-02-2009, 04:50 PM   #8
utw-mephisto
Member
 
Registered: Apr 2005
Posts: 93

Original Poster
Rep: Reputation: 16
Quote:
Originally Posted by disclaimer
You might use aria2c, which supports multiple connections.
Usage is like:
aria2c -s number-of-connections URL
And of course you can simply man aria2c.
Its package name on Debian is aria2, and it is installed as /usr/bin/aria2c.
I call that a bump

Cheers mate - that does indeed work a treat.

I normally get about 200k from this particular server, but using aria2c gives me... well... a tad more

 
1 member found this post helpful.
Old 07-16-2012, 02:49 AM   #9
prumble
LQ Newbie
 
Registered: Jul 2012
Posts: 1

Rep: Reputation: Disabled
Try using pcurl

Try using pcurl. It is a simple shell script that uses curl to download the file in 10 segments and then re-joins them at the end.

You can find it on SourceForge.
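
For anyone who wants the idea without the extra script, here is a rough sketch in the same spirit using curl's --range option; the URL, output name, and segment count are all assumptions, not taken from pcurl itself:

Code:
#!/bin/bash
# Segmented download with curl, pcurl-style: fetch N byte ranges
# in parallel, then concatenate them in order.
# URL, OUT and SEGMENTS are placeholders.
URL="http://example.com/logs/access.log.1"
OUT="access.log.1"
SEGMENTS=10

# Total size in bytes, read from the Content-Length header of a HEAD request
# (assumes the server reports it).
SIZE=$(curl -sI "$URL" | awk 'tolower($1) == "content-length:" {print $2}' | tr -d '\r')
CHUNK=$(( (SIZE + SEGMENTS - 1) / SEGMENTS ))

# Fetch each byte range in its own background curl process.
for i in $(seq 0 $((SEGMENTS - 1))); do
    START=$((i * CHUNK))
    END=$((START + CHUNK - 1))
    [ "$END" -ge "$SIZE" ] && END=$((SIZE - 1))
    curl -s -r "$START-$END" -o "$OUT.part$i" "$URL" &
done
wait

# Re-join the parts in order and clean up.
cat $(seq -f "$OUT.part%g" 0 $((SEGMENTS - 1))) > "$OUT"
rm -f "$OUT".part*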
 
  

