Using wget for parallel downloading
I have this idea of downloading a file using multiple wget processes (one for each piece).
I am using Tiny Core, which is a lightweight distro, and I find wget reliable as well as ubiquitous.
Say we know that the file is exactly 100 MB.
The first wget would download the first 10 MB.
Then I would tell another wget to get from the (10M+1)th byte up to 20 MB.
And so on, for 10 processes.
Is that possible?
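Here is a rough sketch of what I have in mind, assuming the server honours HTTP Range requests (answers 206 Partial Content) and that our wget supports the --header option; the URL and file names are made up. (Tiny Core's default wget is the busybox applet, which may or may not accept --header, so the full GNU wget might be needed.)

    #!/bin/sh
    URL="http://example.com/file.bin"   # hypothetical URL
    SIZE=104857600                      # the known size: 100 MB (taken as 100 MiB here)
    PARTS=10
    CHUNK=$((SIZE / PARTS))             # 10 MiB per piece

    i=0
    while [ "$i" -lt "$PARTS" ]; do
        START=$((i * CHUNK))
        END=$((START + CHUNK - 1))
        # Each wget asks the server for one byte range of the file.
        wget --header="Range: bytes=$START-$END" -O "part$i" "$URL" &
        i=$((i + 1))
    done
    wait    # block until all ten background wgets have finished

    # Reassemble the pieces in order.
    cat part0 part1 part2 part3 part4 part5 part6 part7 part8 part9 > file.bin

Whether this actually gains any speed depends on the server and the link, but mechanically that is the idea.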
If we cannot control the length of each download directly, then I suppose we would have to kill the first wget once the part1 file exceeds 10 MB.
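One way I can think of to do that without watching the file and killing by hand is to pipe wget's output through head, which exits after the byte limit and leaves wget to die on the broken pipe (names made up again):

    # Stream the file and keep only the first 10 MiB; once head exits,
    # wget gets SIGPIPE and stops on its own.
    wget -q -O - "http://example.com/file.bin" | head -c 10485760 > part1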