Old 09-22-2005, 09:39 PM   #1
Swakoo
Member
 
Registered: Apr 2005
Distribution: Red Hat / Fedora / CentOS
Posts: 508

Rep: Reputation: 30
mysqldump: Can I split the dump into files of 2GB max each?


As some of you may have noticed, I am working with mysqldump to maintain a daily backup.

The problem is that my live database servers are all 64-bit machines (which I understood to mean a 4GB-per-file limit). My backup server, which will store the backup files and, if need be, act as a stand-in database server, is a 32-bit machine, which I believe means a 2GB maximum per file.


Most of my databases have no issue except for one: it is 3.8GB before gzip, and compresses down to 1.2GB before I transfer it over.

Here's the question:

How do I run mysqldump so that the dump splits itself into files of at most 2GB each (before gzip)?

A side question: I tried decompressing the 1.2GB file on my backup server (32-bit), and it expanded to 3.8GB. The OS didn't report any errors (I thought it would)?

Then I tried to restore from that file, and it always ends up reporting: ERROR 1062 (23000) at line 6433: Duplicate entry 'swensen?fs' for key 2

I did a comparison and all the tables are in, but I can't tell for sure whether all the data is there, as the database is highly active.

Someone suggested that the live database server's MySQL configuration may be different and that this could be the cause (and it took almost a day to restore the mysqldump onto the 32-bit machine). Can anyone point me to the MySQL config? Is it only /etc/my.cnf?

THANKS people!!!
 
Old 09-23-2005, 02:38 AM   #2
addy86
Member
 
Registered: Nov 2004
Location: Germany
Distribution: Debian Testing
Posts: 332

Rep: Reputation: 31
Maybe split(1) is what you need.
 
Old 09-23-2005, 04:35 AM   #3
Swakoo
Member
 
Registered: Apr 2005
Distribution: Red Hat / Fedora / CentOS
Posts: 508

Original Poster
Rep: Reputation: 30
Can it be used in conjunction with my current command?

My current command is

mysqldump --opt -uxxxx -pyyyy -B dbname | gzip -9 > /home/user/db_files/dbname-$date.sql.gz


I read the help. So if I specify the size in bytes (which would be 1000000000 for 1GB?), I need to add:
split --bytes=1000000000 filename.sql ?
 
Old 09-23-2005, 06:46 AM   #4
addy86
Member
 
Registered: Nov 2004
Location: Germany
Distribution: Debian Testing
Posts: 332

Rep: Reputation: 31
Quote:
Originally posted by Swakoo
Can it be used in conjunction with my current command?
You mean, in a pipe? Yes, but only at the end of a pipe, since it doesn't output to stdout (wouldn't make sense).

You would do it like:
mysqldump options | split -b 1000000000 - dbname-$date.sql-;
for f in dbname-$date.sql-* ; do gzip -9 $f ; done
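
For reference, a minimal sketch of the full round trip with this approach. The credentials, database name, and paths are placeholders rather than anything from this thread, and the restore step relies on gunzip accepting concatenated gzip files:

# dump, split into ~1GB pieces, then compress each piece
mysqldump --opt -uUSER -pPASS -B dbname | split -b 1000000000 - dbname-$date.sql-
for f in dbname-$date.sql-* ; do gzip -9 "$f" ; done

# restore: stream the pieces back together in order and feed them into mysql
cat dbname-$date.sql-*.gz | gunzip | mysql -uUSER -pPASS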
 
Old 10-05-2005, 11:59 PM   #5
Swakoo
Member
 
Registered: Apr 2005
Distribution: Red Hat / Fedora / CentOS
Posts: 508

Original Poster
Rep: Reputation: 30
Quote:
Originally posted by addy86
You mean, in a pipe? Yes, but only at the end of a pipe, since it doesn't output to stdout (wouldn't make sense).

You would do it like:
mysqldump options | split -b 1000000000 - dbname-$date.sql-;
for f in dbname-$date.sql-* ; do gzip -9 $f ; done
I don't understand... do you mean the whole line is one command, including the for statement?

What do you mean by "only at the end of a pipe, since it doesn't output to stdout"? Sorry, just trying to clarify, as this is a live server and I don't have another big database to test with...


How does the pipe work?
 
Old 10-12-2005, 05:06 AM   #6
Swakoo
Member
 
Registered: Apr 2005
Distribution: Red Hat / Fedora / CentOS
Posts: 508

Original Poster
Rep: Reputation: 30
The command works now; I played around with it a bit.

I didn't know you could incorporate for loops into a command... interesting!
 
Old 10-12-2005, 05:15 PM   #7
KimVette
Senior Member
 
Registered: Dec 2004
Location: Lee, NH
Distribution: OpenSUSE, CentOS, RHEL
Posts: 1,794

Rep: Reputation: 46
What filesystem are you using where the max file size is 4GB? Or, is this to save the files to DVD once backed up?

Remember: file size limitations have nothing to do with the instruction size your processor uses. I run a 32-bit Pentium 4, and yet on ReiserFS (or NTFS if I boot to Windows) I can generate files many gigabytes in size.

With that said, to answer your question: I'd pipe the dump through tar (or better yet, dar) and specify that the files be created in 3.9GB volumes.

Just FYI: 32-bit systems are not limited to 2GB file sizes, and 64-bit systems are not limited to 4GB file sizes. The "bitness" of file allocation is completely independent of the "bitness" of the CPU's instruction set, the CPU's address lines, and so forth.
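
For reference, a minimal sketch of the multi-volume idea with dar. The paths, slice size, and database name below are placeholders (not anything from this thread), and the exact option spellings should be checked against dar(1) on your system:

# dump to a plain file first
mysqldump --opt -uUSER -pPASS -B dbname > /backup/dbname-$date.sql

# archive it with dar, gzip-compressed, cut into slices of roughly 1.9GB
dar -c /backup/dbname-$date -R /backup -g dbname-$date.sql -s 1900M -z

# restore the sliced archive into another directory later
dar -x /backup/dbname-$date -R /restore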
 
Old 10-13-2005, 08:41 PM   #8
Swakoo
Member
 
Registered: Apr 2005
Distribution: Red Hat / Fedora / CentOS
Posts: 508

Original Poster
Rep: Reputation: 30
Quote:
Originally posted by KimVette
What filesystem are you using where the max file size is 4GB? Or, is this to save the files to DVD once backed up?

Remember: file size limitations have nothing to do with the instruction size your processor uses. I run a 32-bit Pentium 4, and yet on ReiserFS (or NTFS if I boot to Windows) I can generate files many gigabytes in size.

With that said, to answer your question: I'd pipe the dump through tar (or better yet, dar) and specify that the files be created in 3.9GB volumes.

Just FYI: 32-bit systems are not limited to 2GB file sizes, and 64-bit systems are not limited to 4GB file sizes. The "bitness" of file allocation is completely independent of the "bitness" of the CPU's instruction set, the CPU's address lines, and so forth.
Hi there,

My impression had always been exactly that: 32-bit = 2GB, 64-bit = 4GB...

So you are recommending using tar rather than gzip?
 
Old 10-14-2005, 05:36 AM   #9
scuzzman
Senior Member
 
Registered: May 2004
Location: Hilliard, Ohio, USA
Distribution: Slackware, Kubuntu
Posts: 1,851

Rep: Reputation: 47
Quote:
Originally posted by Swakoo
So you are recommending using tar rather than gzip?
Tar contains no compression - just the ability to archive and split files. Tar literally stands for Tape ARchiver. You can use, in conjunction, tar and gzip to create a .tar.gz of your database, if compression is what you're after.
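
As a small sketch of that combination, reusing the placeholder file names from the earlier examples:

# bundle the split dump pieces into one compressed archive
tar -czf dbname-$date.tar.gz dbname-$date.sql-*

# unpack it again later
tar -xzf dbname-$date.tar.gz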
 
Old 10-15-2005, 01:31 PM   #10
KimVette
Senior Member
 
Registered: Dec 2004
Location: Lee, NH
Distribution: OpenSUSE, CentOS, RHEL
Posts: 1,794

Rep: Reputation: 46
I recommend tar since tar is an archive program and gzip is purely compression (a bit different from pkzip/winzip, which offer integrated archival/compression capability).

Also, I'd recommend bzip2 over gzip, because with bzip2 you can still get at your data even if there are minor errors; gzip requires an error-free file in order to decompress any of it.
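
A rough sketch of what that might look like; the database name and credentials are placeholders, and bzip2recover is the salvage tool that ships with the bzip2 package:

# compress the dump with bzip2 instead of gzip
mysqldump --opt -uUSER -pPASS -B dbname | bzip2 -9 > dbname-$date.sql.bz2

# if the archive is later found to be damaged, salvage the intact blocks
bzip2recover dbname-$date.sql.bz2
bzip2 -t rec*dbname-$date.sql.bz2    # test which recovered pieces are usable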
 
Old 10-17-2005, 04:13 AM   #11
Swakoo
Member
 
Registered: Apr 2005
Distribution: Red Hat / Fedora / CentOS
Posts: 508

Original Poster
Rep: Reputation: 30
hi guys...

Comparing tar to split, what should I know about the difference?
Currently I have it working using split + gzip.

I was considering bzip2... I can't remember what difference made me go with gzip... I think it was the compression... but what you said about bzip2 still working even if there are some errors (compression errors, I presume, or CRC errors?) is worth considering. I will take note...
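
For completeness, a sketch of the pipeline with bzip2 swapped in for gzip (placeholder names as before). Here the output is compressed first and then split, so the pieces only need to be concatenated back in order before decompressing:

mysqldump --opt -uUSER -pPASS -B dbname | bzip2 -9 | split -b 1900000000 - dbname-$date.sql.bz2-

# restore by reassembling the pieces in order and streaming into mysql
cat dbname-$date.sql.bz2-* | bunzip2 | mysql -uUSER -pPASS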
 
  

