LinuxQuestions.org

LinuxQuestions.org (/questions/)
-   Linux - Newbie (https://www.linuxquestions.org/questions/linux-newbie-8/)
-   -   Cron, Command line, Internet, Downloading (https://www.linuxquestions.org/questions/linux-newbie-8/cron-command-line-internet-downloading-4175592891/)

johnnyreb 11-04-2016 10:37 AM

Cron, Command line, Internet, Downloading
 
I live in the boonies in Arkansas. I run Ubuntu Studio 16.04. My Internet provider is Exede (satellite). I like to tinker with Linux distributions and listen to old time radio shows. Both of these require large downloads. Exede punishes those who exceed (no pun intended) their usage limits with glacial speeds and/or $10.00 per gig prices. However, they will allow one to download FREE in the wee hours. But my wife, the cat, and our two dogs object to my getting up and blundering around in the middle of the night to download stuff.

I have used cron in years past while supporting Novell networks in Texas. I haven't used the command line much because I could do what I needed to without it until now. I know nothing about using the command line in an internet context and not a lot otherwise. So, how do I educate myself with a view to writing scripts that will allow me to indulge myself?

Thanks for listening (reading)
Johnnyreb (YeeeHah!)

grail 11-04-2016 01:22 PM

cron - see man pages and internet

download - search for wget or curl

Let us know how you get on :)
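To sketch how the two fit together: cron can run wget unattended during the free window. This is only an illustration with a made-up URL and paths (wget's -c resumes interrupted downloads, -P sets the target directory):

```
# crontab entry: run at 2:00 AM every day
# quote the URL in case it contains spaces or '&'
0 2 * * * wget -c -P /home/charlie/downloads "https://example.org/some-show.zip" >> /home/charlie/wget-cron.log 2>&1
```

Edit your crontab with `crontab -e`; the log file lets you check the next morning whether the download succeeded.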

szboardstretcher 11-04-2016 01:43 PM

Cron/Wget download scheduler how to

http://www.howtogeek.com/54124/build...ramming-skill/

Turbocapitalist 11-04-2016 02:00 PM

The full manual page for "wget" is worth reading, it's got a ton of options. If you're just scheduling one-off jobs, then "at" is another option for you.

johnnyreb 11-08-2016 10:56 AM

Onward and Upward through the fog
 
LQers,

Thanks for the replies.

Cron works like a charm, but I have a problem with wget. The URL I am trying to download is: https://archive.org/compress/The_Bob...am/formats=VBR MP3&file=/The_Bob_Hope_Program.zip. All the download links on this site use this syntax. Wget says it can't resolve MP3 (there is a space after VBR). Is there a workaround? I've read the man pages for wget and curl, but I don't understand a lot of what is there. I didn't find any mention of URLs with spaces.

The URL is what shows at the bottom of the Firefox page when I put the mouse on the download button. If I run the URL straight from Firefox, the download starts.

Thanks for your help.
Charlie

suicidaleggroll 11-08-2016 11:09 AM

You need to put quotes around the URL to prevent the shell from using the space to split the URL into two separate arguments.
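To illustrate with a hypothetical URL standing in for the archive.org one: without quotes the shell splits the URL at the space, so wget receives two separate arguments; with quotes it receives one.

```shell
#!/bin/sh
# Hypothetical URL containing a space, like the archive.org download links
url='https://example.org/formats=VBR MP3&file=/show.zip'

# Helper that just reports how many arguments it was given
count_args() { echo $#; }

count_args $url      # unquoted: split at the space -> prints 2
count_args "$url"    # quoted: one argument, which is what wget needs -> prints 1
```

So the fix is simply `wget "https://...VBR MP3&file=..."` with the whole URL in quotes.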

Sefyir 11-08-2016 11:19 AM

Quote:

Originally Posted by Turbocapitalist (Post 5627009)
The full manual page for "wget" is worth reading, it's got a ton of options. If you're just scheduling one-off jobs, then "at" is another option for you.

Second this; the OP is downloading a specific link, not referring to something like "latest", so a one-off "at" job fits well.

The OP doesn't define what "wee hours" means, but assuming 2am:

Code:

# Example: echo 'command "argument"' | at time
echo 'wget "https://archive.org/compress/The_Bob_Hope_Program/formats=VBR%20MP3&file=/The_Bob_Hope_Program.zip"' | at 0200

This will run the wget command the next time 2am rolls around.
Also watch out for your computer going into sleep mode to save power; a sleeping machine won't run the job.

Personally, if you plan on doing this more frequently, I'd suggest getting something like a Raspberry Pi so that you can run it 24/7; that way your main computer doesn't have to stay on (and drain power).

johnnyreb 11-08-2016 01:50 PM

Thanks everybody!
 
Helpful LQers,

You have made a mistake because I WILL be back.

I will mark it solved right now.

Thanks and well done.
Charlie Bishop
