LinuxQuestions.org > Forums > Linux Forums > Linux - Newbie
Old 11-04-2016, 10:37 AM   #1
johnnyreb
LQ Newbie
 
Registered: Aug 2014
Location: Outside Mena, Arkansas in the Ouachita mountains (hills really).
Distribution: Ubuntu Studio 16.04 LTS
Posts: 9

Rep: Reputation: Disabled
Cron, Command line, Internet, Downloading


I live in the boonies in Arkansas. I run Ubuntu Studio 16.04. My Internet provider is Exede (satellite). I like to tinker with Linux distributions and listen to old time radio shows. Both of these require large downloads. Exede punishes those who exceed (no pun intended) their usage limits with glacial speeds and/or $10.00 per gig prices. However, they will allow one to download FREE in the wee hours. But my wife, the cat, and our two dogs object to my getting up and blundering around in the middle of the night to download stuff.

I have used cron in years past while supporting Novell networks in Texas. I haven't used the command line much because I could do what I needed to without it until now. I know nothing about using the command line in an internet context and not a lot otherwise. So, how do I educate myself with a view to writing scripts that will allow me to indulge myself?

Thanks for listening (reading)
Johnnyreb (YeeeHah!)
 
Old 11-04-2016, 01:22 PM   #2
grail
LQ Guru
 
Registered: Sep 2009
Location: Perth
Distribution: Arch
Posts: 10,017

Rep: Reputation: 3196
cron - see man pages and internet

download - search for wget or curl

Let us know how you get on
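For anyone following along, here is a minimal sketch of the two downloaders mentioned; the URL and filenames are placeholders, not anything from this thread:

```shell
# Placeholder URL; keep it quoted, since real download links can contain
# spaces and '&' characters that the shell would otherwise interpret.
wget -O show.zip "https://example.com/show.zip"     # -O sets the output filename
curl -L -o show.zip "https://example.com/show.zip"  # -L follows redirects, -o sets the filename
```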
 
1 member found this post helpful.
Old 11-04-2016, 01:43 PM   #3
szboardstretcher
Senior Member
 
Registered: Aug 2006
Location: Detroit, MI
Distribution: GNU/Linux systemd
Posts: 4,278

Rep: Reputation: 1694
Cron/Wget download scheduler how to

http://www.howtogeek.com/54124/build...ramming-skill/
 
1 member found this post helpful.
Old 11-04-2016, 02:00 PM   #4
Turbocapitalist
LQ Guru
 
Registered: Apr 2005
Distribution: Linux Mint, Devuan, OpenBSD
Posts: 7,484
Blog Entries: 3

Rep: Reputation: 3810
The full manual page for "wget" is worth reading; it's got a ton of options. If you're just scheduling one-off jobs, then "at" is another option for you.
 
1 member found this post helpful.
Old 11-08-2016, 10:56 AM   #5
johnnyreb
LQ Newbie
 
Registered: Aug 2014
Location: Outside Mena, Arkansas in the Ouachita mountains (hills really).
Distribution: Ubuntu Studio 16.04 LTS
Posts: 9

Original Poster
Rep: Reputation: Disabled
Onward and Upward through the fog

LQers,

Thanks for the replies.

Cron works like a charm, but I have a problem with wget. The URL I am trying to download is: https://archive.org/compress/The_Bob...am/formats=VBR MP3&file=/The_Bob_Hope_Program.zip. All the download links on this site use this syntax. Wget says it can't resolve "MP3" (there is a space in the URL after "VBR"). Is there a workaround? I've read the man pages for wget and curl, but I don't understand a lot of what is there. I didn't find any mention of URLs with spaces.

The URL is what shows at the bottom of the Firefox window when I put the mouse on the download button. If I run the URL straight from Firefox, the download starts.

Thanks for your help.
Charlie
 
Old 11-08-2016, 11:09 AM   #6
suicidaleggroll
LQ Guru
 
Registered: Nov 2010
Location: Colorado
Distribution: OpenSUSE, CentOS
Posts: 5,573

Rep: Reputation: 2142Reputation: 2142Reputation: 2142Reputation: 2142Reputation: 2142Reputation: 2142Reputation: 2142Reputation: 2142Reputation: 2142Reputation: 2142Reputation: 2142
You need to put quotes around the URL to prevent the shell from using the space to split the URL into two separate arguments.
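The splitting is easy to see without touching the network; this POSIX-shell sketch uses a placeholder URL with the same kind of embedded space:

```shell
#!/bin/sh
# An unquoted variable expansion undergoes word splitting, so a URL with a
# space reaches wget as two separate arguments; quoting keeps it whole.
url='https://example.org/formats=VBR MP3&file=/show.zip'

set -- $url           # unquoted: the shell splits on the space
echo "unquoted: $# arguments"

set -- "$url"         # quoted: one argument survives intact
echo "quoted: $# arguments"
```

Running this prints "unquoted: 2 arguments" then "quoted: 1 arguments". Percent-encoding the space as %20, which archive.org links also accept, sidesteps the problem as well.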
 
1 member found this post helpful.
Old 11-08-2016, 11:19 AM   #7
Sefyir
Member
 
Registered: Mar 2015
Distribution: Linux Mint
Posts: 634

Rep: Reputation: 316
Quote:
Originally Posted by Turbocapitalist View Post
The full manual page for "wget" is worth reading; it's got a ton of options. If you're just scheduling one-off jobs, then "at" is another option for you.
Seconding this; the OP is downloading a specific link, not something rolling like "latest".

The OP doesn't say exactly what "wee hours" means, but assuming 2 am:

Code:
# Example: echo 'command "argument"' | at time
echo 'wget "https://archive.org/compress/The_Bob_Hope_Program/formats=VBR%20MP3&file=/The_Bob_Hope_Program.zip"' | at 0200
This will run the wget command the next time 2 am rolls around.
Also check whether your computer goes into sleep mode to save power; a sleeping machine won't run the job.

Personally, if you plan on doing this more frequently, I'd suggest getting something like a Raspberry Pi that can run 24/7, so your main computer doesn't have to (and doesn't drain too much power).
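If the nightly download becomes routine, a crontab entry is the recurring counterpart to the one-off "at" job; this is a hypothetical sketch (edit with crontab -e), and the path and URL are placeholders:

```shell
# Hypothetical crontab entry: every day at 2:00 AM, fetch quietly (-q)
# into ~/downloads (-P sets the download directory). Keep the URL quoted.
# min hour day month weekday  command
0 2 * * *  wget -q -P "$HOME/downloads" "https://example.com/show.zip"
```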
 
1 member found this post helpful.
Old 11-08-2016, 01:50 PM   #8
johnnyreb
LQ Newbie
 
Registered: Aug 2014
Location: Outside Mena, Arkansas in the Ouachita mountains (hills really).
Distribution: Ubuntu Studio 16.04 LTS
Posts: 9

Original Poster
Rep: Reputation: Disabled
Thanks everybody!

Helpful LQers,

You have made a mistake because I WILL be back.

I will mark it solved right now.

Thanks and well done.
Charlie Bishop
 
  

