I've been learning Linux from books, and I just got my first self-managed VPS from ovh.com (https://www.ovh.com/us/vps/vps-ssd.xml).
I am configuring everything according to a guide I found online, and I will be working on a program based on picture galleries. There could be 10k to 50k separate galleries, with around 15 image files in each.
These images get downloaded from xxx sponsor sites (something like NaughtyRevenue.com) by a program called Mech Bunny TGP script. Some of the image files turn out to be large (uncompressed, around 1.5 MB each), and it can't stay that way because of bandwidth costs and download times.
For this reason, I was thinking of downloading them in bulk to my Windows hard drive and compressing them there, but that would take quite a lot of time: I would need a download manager, something to compress the files in bulk, and then everything would have to be uploaded back to the server and processed by the program. Normally I can just add a list of URLs, and the program downloads the images and puts them where they are needed.
Since I am using command-line Linux now, I was wondering if I could just download everything straight to the server and compress it selectively from the command line...
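To show what I mean, here is roughly what I'm imagining, assuming a tool like jpegoptim (which I'd install myself) can recompress JPEGs in place; the directory name is just made up:

```shell
#!/bin/sh
# Sketch only: recompress every .jpg under a gallery root, in place.
# Assumes jpegoptim is installed (e.g. apt-get install jpegoptim);
# "$1" would be something like /var/www/galleries.
compress_galleries() {
    # --max=80 re-encodes any image saved above JPEG quality 80,
    # which should shrink the uncompressed ~1.5 MB files a lot
    find "$1" -type f -name '*.jpg' -exec jpegoptim --max=80 {} +
}
```

Something like `compress_galleries /var/www/galleries` run after each bulk download is what I have in mind, rather than round-tripping the files through Windows.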
Is Linux image compression (I can install and configure whatever program is needed, obviously) as good as what I would get on Windows? Even if it were somewhat worse, it would still make sense to use, because it would save me a lot of time.
Can you tell me your opinion on this? Will it work, and what tools would I use?