Cron, Command line, Internet, Downloading
I live in the boonies in Arkansas. I run Ubuntu Studio 16.04. My Internet provider is Exede (satellite). I like to tinker with Linux distributions and listen to old time radio shows. Both of these require large downloads. Exede punishes those who exceed (no pun intended) their usage limits with glacial speeds and/or $10.00 per gig prices. However, they will allow one to download FREE in the wee hours. But my wife, the cat, and our two dogs object to my getting up and blundering around in the middle of the night to download stuff.
I have used cron in years past while supporting Novell networks in Texas. I haven't used the command line much because I could do what I needed without it, until now. I know nothing about using the command line in an Internet context, and not a lot otherwise. So, how do I educate myself with a view to writing scripts that will let me indulge myself? Thanks for listening (reading). Johnnyreb (YeeeHah!) |
cron - see man pages and internet
download - search for wget or curl Let us know how you get on :) |
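For the cron half, a minimal sketch of a crontab entry (the time, URL, and paths below are made-up examples, not from the thread):

```shell
# Hypothetical crontab entry (install with `crontab -e`):
# run wget at 2:30am every day, resuming partial downloads (-c),
# saving into ~/downloads (-P) and logging to a file (-o).
30 2 * * * wget -c -o "$HOME/wget.log" -P "$HOME/downloads" "https://example.com/show.zip"
```

The five fields are minute, hour, day-of-month, month, and day-of-week; `-c` lets a job that gets cut off resume the next night instead of starting over.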
|
The full manual page for "wget" is worth reading, it's got a ton of options. If you're just scheduling one-off jobs, then "at" is another option for you.
|
Onward and Upward through the fog
LQers,
Thanks for the replies. Cron works like a charm, but I have a problem with wget. The URL I am trying to download is: https://archive.org/compress/The_Bob...am/formats=VBR MP3&file=/The_Bob_Hope_Program.zip. All the download links on this site use this syntax. Wget says it can't resolve MP3 (there is a space after VBR). Is there a workaround? I've read the man pages for wget and curl, but I don't understand a lot of what is there, and I didn't find any mention of URLs with spaces. The URL is what shows at the bottom of the Firefox page when I put the mouse on the download button. If I run the URL straight from Firefox, the download starts. Thanks for your help. Charlie |
You need to put quotes around the URL to prevent the shell from using the space to split the URL into two separate arguments.
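To see what the shell does with that space, here's a small demonstration (the URL is shortened and hypothetical, and `count_args` is just a helper for the demo, not a real tool):

```shell
# A URL containing a space, like the archive.org download links
url='https://archive.org/compress/show?formats=VBR MP3&file=/show.zip'

# Helper: report how many arguments it was called with
count_args() { echo $#; }

# Unquoted: the shell splits on the space, so wget would see TWO arguments
count_args $url    # prints 2

# Quoted: the whole URL stays ONE argument
count_args "$url"  # prints 1
```

That second, split-off argument ("MP3&file=...") is what wget was trying and failing to resolve as a hostname.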
|
OP doesn't define what "wee hours" means, but assuming 2am:
Code:
# Example: echo 'command "argument"' | at 2am
Also watch to see whether your computer goes into sleep mode to conserve power. Personally, if you plan on doing this more frequently, I'd suggest getting something like a Raspberry Pi so that you can run it 24/7, so that your main computer doesn't have to (and drain too much power). |
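Putting the quoting and the scheduling together, a sketch (the URL, path, and time are assumptions, not from the thread): build the wget command as a string, inspect it with echo, and only then hand it to at(1).

```shell
# Build the command, keeping the URL inside quotes so the space survives
cmd='wget -c "https://example.com/show with space.zip" -P "$HOME/downloads"'

# Inspect it first
echo "$cmd"

# When it looks right, schedule it for 2am:
#   echo "$cmd" | at 02:00
```

The single quotes around the whole string keep the inner double quotes intact, so at's shell sees the URL as one argument when the job runs.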
Thanks everybody!
Helpful LQers,
You have made a mistake, because I WILL be back. I will mark it solved right now. Thanks and well done. Charlie Bishop |