LinuxQuestions.org
Old 11-16-2015, 11:10 AM   #1
HyKurtis
LQ Newbie
 
Registered: Nov 2015
Posts: 1

Rep: Reputation: Disabled
I'm trying to download a file from the web to my Linux server using wget!


Hey guys, I'm trying to download a 133GB .zip file from the web. I'm using wget to do this, and my buddy said to use this command:

wget link-to-download.zip serverdirectory

So I went ahead and typed:

wget https://www.****.***/*****/pudgecraft.zip /home (the *'s are to block out the real link to the download)

and it returned this error at 13%:

"cannot write to "pudgecraft.zip" (Success)."

Any idea why this is happening? Even better, does anyone know what command I should type and how I should format it?

Thanks for your help!
 
Old 11-16-2015, 11:16 AM   #2
HMW
Member
 
Registered: Aug 2013
Location: Sweden
Distribution: Debian, Arch, Red Hat, CentOS
Posts: 773
Blog Entries: 3

Rep: Reputation: 369
Hi!

Hard to tell. But the most likely explanation is that you don't have enough space to download a file that big (133GB is BIG!).

Or... it could be a timeout issue: http://stackoverflow.com/questions/2...s-wget-timeout
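
A quick way to check both possibilities before retrying (the timeout and retry values below are only examples):
Code:
# How much space is left on the filesystem the download is landing on
# (by default, the current directory)?
df -h .

# Retry with an explicit network timeout and a few extra attempts
wget --timeout=60 --tries=5 https://www.****.***/*****/pudgecraft.zip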

Best regards,
HMW
 
Old 11-16-2015, 03:57 PM   #3
coltree
Member
 
Registered: Nov 2003
Location: Jacobs Well, Queensland AU
Distribution: OpenBSD
Posts: 102
Blog Entries: 1

Rep: Reputation: 34
Have you talked with the people at pudgecraft.com ?
So far this month they have only raised 67 of funds!
If you help them, they will help you.

Last edited by coltree; 11-16-2015 at 03:59 PM. Reason: correction
 
Old 11-16-2015, 04:28 PM   #4
Habitual
LQ Veteran
 
Registered: Jan 2011
Location: Abingdon, VA
Distribution: Catalina
Posts: 9,374
Blog Entries: 37

Rep: Reputation: Disabled
I'd contact them. They may not want you downloading 133GB...
 
Old 11-16-2015, 05:43 PM   #5
berndbausch
LQ Addict
 
Registered: Nov 2013
Location: Tokyo
Distribution: Mostly Ubuntu and Centos
Posts: 6,316

Rep: Reputation: 2002
Quote:
Originally Posted by HyKurtis View Post

wget https://www.****.***/*****/pudgecraft.zip /home (the *'s are to block out the real link to the download)

and it returned this error at 13% :

"cannot write to "pudgecraft.zip" (Success)."
The /home parameter is a bit strange - what is it supposed to mean?
Also, the error message indicates a local problem. Could it be that you want to save this file to /home, but it really goes to the current directory, where there is no space?
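
For example (just to illustrate; adjust the paths to wherever you actually ran wget), you could compare the free space where wget is writing with the place you meant to put the file:
Code:
# space on the filesystem of the current directory (where wget writes by default)
df -h .
# space on the filesystem that holds /home
df -h /home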
 
Old 11-16-2015, 05:47 PM   #6
Habitual
LQ Veteran
 
Registered: Jan 2011
Location: Abingdon, VA
Distribution: Catalina
Posts: 9,374
Blog Entries: 37

Rep: Reputation: Disabled
Quote:
Originally Posted by berndbausch View Post
The /home parameter is a bit strange - what is it supposed to mean?
Good catch.
Only root can write to /home.

We can guess all day long.
How about output of
Code:
df -hT
from the OP?

I'd try
Code:
wget https://www.****.***/*****/pudgecraft.zip $HOME
and see if any progress is made.

Last edited by Habitual; 11-16-2015 at 05:49 PM.
 
Old 11-16-2015, 07:56 PM   #7
berndbausch
LQ Addict
 
Registered: Nov 2013
Location: Tokyo
Distribution: Mostly Ubuntu and Centos
Posts: 6,316

Rep: Reputation: 2002
Quote:
Originally Posted by Habitual View Post
Good catch.
only root can write to /home.
...
Code:
wget https://www.****.***/*****/pudgecraft.zip $HOME
Actually, I don't think that the syntax is correct:
Code:
$ wget --help|more
GNU Wget 1.16, a non-interactive network retriever.
Usage: wget [OPTION]... [URL]...
...
/home will be interpreted as another download URL. On my system (Raspbian), this produces an error message along the lines of "missing scheme" - wget wants file:/home instead of /home.
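
If the intent was to save the file under /home, the wget option for that is -P (--directory-prefix), or -O for an explicit output filename - assuming you actually have permission to write there:
Code:
# download into /home instead of the current directory
wget -P /home https://www.****.***/*****/pudgecraft.zip

# or name the output file explicitly
wget -O /home/pudgecraft.zip https://www.****.***/*****/pudgecraft.zip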
 
Old 11-16-2015, 11:09 PM   #8
coltree
Member
 
Registered: Nov 2003
Location: Jacobs Well, Queensland AU
Distribution: OpenBSD
Posts: 102
Blog Entries: 1

Rep: Reputation: 34
If you use wget -c file.zip, it will continue from where it stopped.

At the end of your command you had /home.

wget [option]... [URL]...

It would treat /home as another source [URL], not the destination.
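
So a resume attempt would look something like this (run it from the directory that already holds the partial pudgecraft.zip, since -c looks for the local file there - the path is just a placeholder):
Code:
cd /path/to/partial/download
wget -c https://www.****.***/*****/pudgecraft.zip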


Bernd, love the bit about ho ho ho and a bottle of rum
I'm with you on the URL

Last edited by coltree; 11-16-2015 at 11:12 PM.
 
Old 11-16-2015, 11:45 PM   #9
berndbausch
LQ Addict
 
Registered: Nov 2013
Location: Tokyo
Distribution: Mostly Ubuntu and Centos
Posts: 6,316

Rep: Reputation: 2002
Quote:
Originally Posted by coltree View Post
Bernd, love the bit about ho ho ho and a bottle of rum
It was a comment in the HPUX initialization routine that selected one of the CPUs as the "monarch" CPU. I wonder if it's still there. About the same level as the so-called bug "you can tune a filesystem but you can't tuna fish" in the UNIX tunefs man page (sadly, that great contribution to fun in computing has been removed).
 
Old 11-17-2015, 02:59 AM   #10
Habitual
LQ Veteran
 
Registered: Jan 2011
Location: Abingdon, VA
Distribution: Catalina
Posts: 9,374
Blog Entries: 37

Rep: Reputation: Disabled
Quote:
Originally Posted by berndbausch View Post
Actually, I don't think that the syntax is correct:
Ya, I may have crossed scp with wget.

Code:
wget https://www.****.***/*****/pudgecraft.zip
 
Old 12-11-2015, 03:50 PM   #11
cesarbergara
Member
 
Registered: Feb 2012
Location: Buenos Aires, Argentina
Distribution: Debian, Suse, Mandrake,
Posts: 92

Rep: Reputation: Disabled
Hi. wget -c is useful to resume an interrupted download.

But the problem may be that your filesystem type (ext2, ext3, ext4, reiserfs, ntfs) can't hold a file of this size (take a look at the mkfs manual).
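
You can see the filesystem type (and free space) of the target directory with df. With the common 4K block size, ext3 and ext4 allow files far larger than 133GB, while FAT32, for example, tops out at 4GB per file:
Code:
# show filesystem type and free space for the download target
df -Th /home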

Have a nice day.
 
Old 12-15-2015, 12:07 AM   #12
aaronbrad
LQ Newbie
 
Registered: Dec 2015
Posts: 2

Rep: Reputation: Disabled
If there is enough space to accommodate the 133GB file on your machine, you can try to resume the previous download with the command wget -c https://www.****.***/*****/pudgecraft.zip.
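
Pulling the thread together, a minimal check-then-resume sequence might look like this (the target directory is only a placeholder):
Code:
df -h /path/to/target          # confirm there is comfortably more than 133GB free
cd /path/to/target             # the directory holding the partial pudgecraft.zip
wget -c https://www.****.***/*****/pudgecraft.zip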
 
  

