Linux - Server: This forum is for the discussion of Linux software used in a server-related context.
I have to download a log file from my remote server to my desktop. The file is a database dump of around 35GB. I have gzipped it down to around 5GB, but the estimated download time is still around 39 hours.
I found that there is a split command, but from the man page of split I am unable to work out how to split the file into pieces. For example, I want the file to be broken into pieces of 512MB. How can I do that?
You can split the zipped file using the command: split -b 500k filename prefix, but unfortunately you cannot unzip the small files independently; you have to cat all the small pieces back together first.
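The size argument is whatever suits you; for the 512MB pieces asked about, a rough sketch (the file names here are just placeholders):

# cut the compressed dump into 512MB chunks: dump.sql.gz.part_aa, _ab, ...
split -b 512M dump.sql.gz dump.sql.gz.part_

# on the desktop, stitch the pieces back together in name order, then unzip
cat dump.sql.gz.part_* > dump.sql.gz
gunzip dump.sql.gz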
I've had some bad experiences splitting up a compressed file. You might want to do the database dump and pipe it through split by line count, then compress the pieces for the copy.
When you split a database dump by size, you can cut it off right in the middle of an SQL statement. If you split by line count it at least stays intact, and when you copy the split dump files over there's really no need to put them back together; just make sure to reimport them in the correct order, or whatever you need to do with them.
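Something along these lines, assuming a MySQL dump (mysqldump and the database name are placeholders; use whatever actually produces your dump):

# pipe the dump straight into split, 1,000,000 lines per piece;
# the - tells split to read from standard input
mysqldump mydb | split -l 1000000 - dump_part_

# compress each piece before copying it over
gzip dump_part_*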
You can also split by the number of lines. In that case you need to tell split to read from standard input by passing - as the file name:
zcat dbpedia-data-0.nq.gz | split -l 2000000 - dbpedia0
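From there, a possible follow-up (the piece names come from the command above; the scp target is a placeholder):

# compress the pieces so the transfer stays small
gzip dbpedia0*

# copy them to the desktop; each piece is a valid gzip file on its own
scp dbpedia0*.gz user@desktop:/some/dir/

# if the single original file is ever needed again, concatenating the
# pieces in name order reproduces it
zcat dbpedia0*.gz > dbpedia-data-0.nq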