Linux - Newbie
This Linux forum is for members that are new to Linux.
Just starting out and have a question?
If it is not in the man pages or the how-to's this is the place!
I have to compress all the files in a folder; the size is 1.10 GB. The compression is to be done on a Windows NT machine.
I have a folder called /folder1/. It contains around 200 files of size 1.10 GB. I want to compress those files into a single zip file. The compression needs to be an automated process, because after the files are compressed I have to download them.
Do you mean the files are already on NT, or are they on Linux and you want to compress them and then download to NT?
On Linux, gzip and bzip2 have parameters to change the amount of compression; more compression == longer time to compress.
Text (usually) compresses well, binaries not so much...
Need more detail, but HTH.
> Do you mean the files are already on NT, or are they on Linux and you want to compress them and then download to NT? On Linux, gzip and bzip2 have parameters to change the amount of compression; more compression == longer time to compress. Text (usually) compresses well, binaries not so much... Need more detail, but HTH.
The files are uploaded to the NT machine (which acts as an FTP site) by our client. I have to download those files from that machine, and this should be an automated process. I already got your suggestions for the download side (in the thread "Cron Jobs of FTP Download"). I tried shell scripts that downloaded the files directly using the MGET command, but that failed due to our internet bandwidth and the file sizes. So I have decided to compress the files first, and then start the download through my scripts on the Linux machine. The download will run on the Linux machine; the compression will run on the Windows NT machine.
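For the Linux side, once the files are bundled into a single archive on the NT box, a non-interactive ftp fetch could replace the interactive MGET. A sketch; the host, user, password, and archive name below are all placeholders to fill in:

```shell
#!/bin/sh
# Fetch one pre-built archive instead of mget-ing 200 files.
# HOST, USER, PASS and the archive name are placeholders;
# substitute real values before running.
HOST=ftp.example.com
USER=ftpuser
PASS=secret

fetch_archive() {
    # -i: no per-file prompts, -n: no auto-login, -v: verbose
    ftp -inv "$HOST" <<EOF
user $USER $PASS
binary
get folder1.tar.bz2
bye
EOF
}

# Uncomment once the placeholders above are real:
# fetch_archive
```

The `binary` command matters: without it, some ftp clients transfer in ASCII mode and corrupt compressed archives. A script like this can then be driven from cron, as in the earlier thread.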
I'm no MS expert, and to be honest, I'm surprised NT can handle 220GB of address space. Separate drive names?
However, the 'Linuxy' solution would be to install the Cygwin toolset on the NT box and then use gzip or bzip2 as mentioned. IIRC, at the highest compression levels bzip2 is more efficient, but check that first.
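With Cygwin (or any Unix-like toolset) on the NT box, the whole folder can be bundled and compressed in one step. A sketch, using a small stand-in folder in place of the OP's /folder1:

```shell
#!/bin/sh
# Build a small stand-in for the OP's /folder1:
mkdir -p folder1
echo "file one" > folder1/a.txt
echo "file two" > folder1/b.txt

# tar bundles the ~200 files into one stream; bzip2 -9 compresses it:
tar -cf - folder1 | bzip2 -9 > folder1.tar.bz2

# Later, on the receiving Linux machine, unpack with:
# bzip2 -dc folder1.tar.bz2 | tar -xf -
```

Producing one archive also sidesteps the MGET problem entirely: the download side only ever has to fetch a single file.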
Alternately, WinZip files can (usually) be read by gzip. Try a test file.
Edit: continued ... here's a radical idea: how about rsync? It does compression on the fly, although that may only be during transfer, i.e. you may still end up with uncompressed files on the final target box.