@ levian -- I sympathize with you regarding trying to download large files and having them go wonky on you. I have had it happen MANY more times than I care to recall.
However, I just wanted to point out that FTP transfers can indeed be resumed in most cases (and so can HTTP transfers, as far as I know), unless the server is specifically configured to disallow it.
I have found that it depends mostly on what download manager (or other means) you are using. For example, downloading directly with a browser is about the worst way to go. If the transfer stops or you lose power at your location, the download is usually not recoverable.
But using something like KDE's KGet downloader lets you start and stop the transfer at will; you can reboot, or even lose power, and still resume the transfer later.
Also, as choogendyk mentioned, rsync is a cool way to download stuff, if the server offers that option. Rsync works file by file, so if you are pulling down a directory tree instead of one giant ISO and you kill it partway through, the next run starts again at the file it was on, rather than back at the very beginning of everything. Just be careful what options you give rsync -- you want to make sure it does not empty the destination folder's contents before beginning!
At this very moment, I have been downloading a 4.1 GiB DVD ISO image since last night, using wget. It was 92% finished when the thing suddenly stopped :O
I was getting worried that I would lose the whole thing, but wget restarted on its own after about 10 minutes.
Still, I don't want to find out how it would go if I had to shut the computer off; I suspect it would be toast..
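For what it's worth, wget does have a resume switch, so a shutdown should not have to mean toast -- rerunning with `-c` / `--continue` picks up from the bytes already on disk. A sketch (the URL is hypothetical):

```shell
# Resume a partially downloaded ISO after a crash or reboot:
# -c tells wget to continue from the existing partial file
# instead of starting over from byte zero.
wget -c https://mirror.example.org/isos/dvd-image.iso

# --tries=0 retries forever on a flaky connection, and
# --waitretry caps the pause between retries (in seconds).
wget -c --tries=0 --waitretry=30 https://mirror.example.org/isos/dvd-image.iso
```

This only works if the server supports ranged requests, which most mirrors do; if not, wget falls back to restarting from scratch.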
Best of success,
Sasha