Originally Posted by haojam
Each file is big size so wget command is too slow to download all 14 files.
Could you be a little more specific about this? I don't think wget itself "is slow": it downloads as fast as the connection allows (in theory up to the point where disk I/O starts to slow things down, which will probably not be a problem here). If you mean that your network connection is fast but each individual connection to the server is slow (i.e., the bottleneck is a per-connection speed limit on the server side), then you could try downloading several files in parallel. This speeds things up only if each parallel connection can run at full single-connection speed, i.e., the limit is per connection rather than over all your connections to the server combined.

However, if it is not absolutely necessary, I recommend not doing this: there is probably a reason for the speed limit, and one should always respect the limits set on remote servers.
If you need an example of how wget (other programs would be fine too) can be used for parallel downloads, see for example this. Basically, it fetches a list of what to download, then pipes that to (e)grep and xargs, which pick out the files and build (argument) lists for multiple wget commands, which then start the downloads simultaneously.
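To make that a bit more concrete, here is a minimal sketch of the same idea. The server URL, the file-name pattern, and the function names are all assumptions for illustration, not the poster's actual setup; adjust them to match your server. The actual network commands are shown commented out in the usage section so the sketch itself runs offline.

```shell
#!/bin/sh
# Sketch: build a URL list from an index page, then download it in
# parallel. URLs and file names below are made up for the example.

# Read index HTML on stdin, write full URLs one per line.
# grep -o keeps only the matching names, sort -u drops duplicates,
# sed prepends the base URL.
extract_urls() {
    base=$1
    grep -o 'file[0-9][0-9]*\.tar\.gz' | sort -u | sed "s|^|$base/|"
}

# Feed the URL list to xargs, which keeps up to 4 wget processes
# running at once (-P 4), one URL per wget invocation (-n 1).
parallel_fetch() {
    xargs -n 1 -P 4 wget -q < "$1"
}

# Usage (commented out so the sketch runs without network access):
# wget -q -O - http://example.com/data/ \
#     | extract_urls http://example.com/data > urls.txt
# parallel_fetch urls.txt
```

Note that `-P` controls the number of simultaneous processes; keep it modest, for the same reason mentioned above about respecting server limits.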
If I misunderstood your question, please ignore my answer (unless it's helpful in some way) and perhaps rephrase your post to get better/correct answers.