LinuxQuestions.org
Old 12-21-2011, 01:52 PM   #1
newbie14
Member
 
Registered: Sep 2011
Posts: 470

Rep: Reputation: Disabled
Is sed a solution to this problem for loading data back into mysql db?


Dear All,
We backed up a table into a text file on one server using the SELECT ... INTO OUTFILE method. Then, on another server, we tried to load it with: LOAD DATA LOCAL INFILE '/usr/local/backUp.txt' INTO TABLE data1 FIELDS TERMINATED BY ' ' LINES TERMINATED BY '\n'. It runs, but only the first column gets filled; the rest are all empty. Any solution to this problem? A sample of the data is below.

Code:
308589422 3.10882 101.54966 23 0 215 0.9 2011-08-01 00:00:00 1 00402020F028 00 000 0 00 109.6 13033 4D \N \N \N \N \N \N 
308590571 5.00517 102.22741 20 0 340 0 2011-08-01 00:00:00 1 00000000 27.3 012 0 04.0 154856 BB1012 \N \N \N 01 0100 0 
308596673 2.94121 101.76445 29 0 62 0 2011-08-01 00:00:00 1 00000000 25.4 000 0 04.1 56870.5 AB0569 \N \N \N 02 0000 F 
308596708 3.99147 101.05772 15 0 105 0 2011-08-01 00:00:00 1 00000000 12.6 01B 0 04.1 42562.9 AB0829 \N \N \N 02 0000 F 
308589424 5.4142 100.55539 26 0 126 1 2011-08-01 00:00:01 1 01000028F820 57 3FF 0 F5 1082.31 13276 75 \N \N \N \N \N \N 
308589482 4.95754 102.07095 3 96 235 0 2011-08-01 00:00:01 1 00000000 27.0 000 0 04.1 10278.9 BB1788 \N \N \N 01 0100 F 
308589649 1.8189 102.95381 15 0 0 0 2011-08-01 00:00:01 1 00000000 25.3 2D6 0 04.1 15223 BB1972 \N \N \N 02 0000 F 
308590471 2.63557 103.65955 14 0 66 50 2011-08-01 00:00:01 4 014000203028 44 FF 0 56 102958 15370 13 \N \N \N \N \N \N 
308590497 2.63557 103.65955 14 0 66 50 2011-08-01 00:00:01 4 014000203028 44 FF 0 56 102958 15370 13 \N \N \N \N \N \N 
308590498 2.63557 103.65955 14 0 66 50 2011-08-01 00:00:01 4 014000203028 44 FF 0 56 102958 15370 13 \N \N \N \N \N \N 
308589430 1.66043 103.85198 13 0 101 0.8 2011-08-01 00:00:02 1 01400020FC28 57 FF 0 00 0 14110 2D \N \N \N \N \N \N 
308589460 2.20039 102.23996 8 46 304 0 2011-08-01 00:00:02 1 00000000 14.5 000 0 04.1 12777.8 AB0550 \N \N \N 01 0100 F 
308589490 3.43023 101.58937 7 0 8 0 2011-08-01 00:00:02 1 00000000 25.7 01B 0 04.0 86577.1 AB0731 \N \N \N 02 0000 F 
308589498 2.54766 101.80599 27 0 283 0 2011-08-01 00:00:02 1 00000000 12.3 01B 0 04.1 66774.7 AB0728 \N \N \N 02 0000 F 
309099219 1.72879 103.42198 6 63 125 0 2011-08-01 00:00:02 1 00000000 27.1 000 0 04.1 23969.4 BB1891 \N \N \N 01 0100 F 
308589432 2.79064 102.29934 16 68 235 0.8 2011-08-01 00:00:03 1 01400028F028 58 FF 0 58 264552 00601 5C \N \N \N \N \N \N 
308589725 4.81002 100.71297 14 24 146 0.8 2011-08-01 00:00:03 1 01400020F028 BF FF 0 AD 81911.8 12154 6D \N \N \N \N \N \N 
308589964 3.30702 101.55428 11 0 312 0 2011-08-01 00:00:03 1 00000000 27.7 012 0 04.0 34818.2 BB1465 \N \N \N 01 0100 F

Last edited by Tinkster; 12-21-2011 at 07:11 PM. Reason: fixed closing code tag /. not \ ...
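[Editor's note: SELECT ... INTO OUTFILE writes tab-separated fields by default, so a LOAD DATA that splits on a literal space finds no separator and puts the whole line in column one. A minimal shell sketch of that mismatch, using sample values from the first row of the dump above:]

```shell
#!/bin/sh
# SELECT ... INTO OUTFILE writes tab-separated fields by default.
# Sample values taken from the first row of the dump above.
line=$(printf '308589422\t3.10882\t101.54966')

# Split on a literal space ([ ] forces awk to treat the space
# literally, like FIELDS TERMINATED BY ' '): one big field.
printf '%s\n' "$line" | awk -F'[ ]' '{print NF}'   # prints 1

# Split on the actual tab delimiter: all three fields appear.
printf '%s\n' "$line" | awk -F'\t' '{print NF}'    # prints 3
```

[If that is the cause here, re-running LOAD DATA with FIELDS TERMINATED BY '\t' (the INTO OUTFILE default) should fill every column, and the \N tokens are read back as NULLs.]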
 
Old 12-21-2011, 07:14 PM   #2
Tinkster
Moderator
 
Registered: Apr 2002
Location: in a fallen world
Distribution: slackware by choice, others too :} ... android.
Posts: 23,066
Blog Entries: 11

Rep: Reputation: 910
That's an odd dump. If \n is meant to be the line terminator, why are there
so many \N tokens inside the individual rows? And is the original dump from
a Linux, a Windows, or a Mac box?



Cheers,
Tink
 
Old 12-21-2011, 09:21 PM   #3
newbie14
Member
 
Registered: Sep 2011
Posts: 470

Original Poster
Rep: Reputation: Disabled
Dear Tinkster,
The original dump was made on Linux, and I am now loading it back on Linux as well. Any resolution to this?


 
Old 12-21-2011, 09:34 PM   #4
Tinkster
Moderator
 
Registered: Apr 2002
Location: in a fallen world
Distribution: slackware by choice, others too :} ... android.
Posts: 23,066
Blog Entries: 11

Rep: Reputation: 910
May I ask why you're not using the mysqldump utility instead?
Chances are the results will be more predictable.


Cheers,
Tink
 
Old 12-21-2011, 09:35 PM   #5
newbie14
Member
 
Registered: Sep 2011
Posts: 470

Original Poster
Rep: Reputation: Disabled
Dear Tinkster,
mysqldump was too slow, and the original data have already been deleted, so we are left with only these text files. What is the best way to recover them?


 
Old 12-21-2011, 09:46 PM   #6
Tinkster
Moderator
 
Registered: Apr 2002
Location: in a fallen world
Distribution: slackware by choice, others too :} ... android.
Posts: 23,066
Blog Entries: 11

Rep: Reputation: 910
Hard to say ... what's the actual table structure you're trying to
push this into? Looking at the file w/ awk, counting columns, the
result isn't promising for any import, no matter what tool you'd
use ....
Code:
awk '{print NF}' mysql.dump
24
23
23
23
24
23
23
24
24
24
24
23
23
23
23
24
24
23
Any respectable SQL database server will barf on that (even MySQL
does, apparently).



Cheers,
Tink
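[Editor's note: the per-line listing above can be condensed into a tally of how many lines have each field count. A minimal sketch, using a small stand-in sample file in place of the thread's mysql.dump:]

```shell
#!/bin/sh
# Tally how many lines have each whitespace-separated field count.
# /tmp/dump_sample.txt stands in for mysql.dump; in the thread the
# counts were 23 vs 24, here 3 vs 4 for brevity.
printf 'a b c\na b c d\na b c\n' > /tmp/dump_sample.txt
awk '{print NF}' /tmp/dump_sample.txt | sort | uniq -c
```

[The output pairs a count of lines with each field width, which quickly shows how badly mixed the file is.]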
 
Old 12-21-2011, 09:48 PM   #7
newbie14
Member
 
Registered: Sep 2011
Posts: 470

Original Poster
Rep: Reputation: Disabled
Dear Tinkster,
I do not cause we generate it via the select into file method which we had no control over the fields either why is like difficult for me to answer as that is the results of mysql?

 
Old 12-21-2011, 09:54 PM   #8
Tinkster
Moderator
 
Registered: Apr 2002
Location: in a fallen world
Distribution: slackware by choice, others too :} ... android.
Posts: 23,066
Blog Entries: 11

Rep: Reputation: 910
Quote:
Originally Posted by newbie14 View Post
I do not cause we generate it via the select into file method which we had no control over the fields either why is like difficult for me to answer as that is the results of mysql?
Che? Can you please rephrase that?

And can you please stop top-posting? If all you do is quote my entire
post, you might as well save the precious space and not quote at all.


Cheers,
Tink
 
Old 12-21-2011, 09:57 PM   #9
newbie14
Member
 
Registered: Sep 2011
Posts: 470

Original Poster
Rep: Reputation: Disabled
Dear Tinkster,
What I was trying to say is that the file was produced by SELECT ... INTO OUTFILE, so I cannot explain why some rows have 23 fields and others 24; we did not build the file by hand. So what solution can you suggest?
 
Old 12-22-2011, 01:10 AM   #10
chrism01
LQ Guru
 
Registered: Aug 2004
Location: Sydney
Distribution: Centos 6.8, Centos 5.10
Posts: 17,240

Rep: Reputation: 2324
Unless you've got the original command (and even then), it's going to be tricky to say why the file is the way it is.
So, focusing on the solution, as asked above: tell us what table structure you intend to insert this stuff into and we'll be able to give you some pointers.
The main thing is to match the data fields you've got with the number of columns in the target table.
 
Old 12-22-2011, 03:11 AM   #11
newbie14
Member
 
Registered: Sep 2011
Posts: 470

Original Poster
Rep: Reputation: Disabled
Dear All,
I have managed it by removing some fields and inserting dummy fields, so it looks OK for now. For future backups of millions of records like this, what is the best advice? When I SELECT ... INTO OUTFILE, should I specify delimiters?
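[Editor's note: the workaround described above, padding short rows with dummy fields, can be sketched as a one-line awk pass. The NULL marker \N matches MySQL's text-dump convention; the target field count (4 here, 24 for the dump in this thread) is an assumption to be matched to your table:]

```shell
#!/bin/sh
# Pad every row out to a fixed field count with \N (MySQL's NULL
# marker in text dumps) so LOAD DATA sees a uniform column count.
# Target count of 4 is illustrative; use 24 for the dump above.
printf 'a b c\nw x y z\n' | awk '{ while (NF < 4) $(NF+1) = "\\N"; print }'
```

[Rows already at the target width pass through unchanged; shorter rows gain trailing \N fields.]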
 
Old 12-22-2011, 03:43 PM   #12
Tinkster
Moderator
 
Registered: Apr 2002
Location: in a fallen world
Distribution: slackware by choice, others too :} ... android.
Posts: 23,066
Blog Entries: 11

Rep: Reputation: 910
I recommend using the methods provided by the DB for such purposes;
"select into" doesn't seem to be good enough. If you can modify the
select to insert delimiters, how come you can't use the appropriate
tool for the database, mysqldump?


Cheers,
Tink
 
Old 12-22-2011, 07:00 PM   #13
chrism01
LQ Guru
 
Registered: Aug 2004
Location: Sydney
Distribution: Centos 6.8, Centos 5.10
Posts: 17,240

Rep: Reputation: 2324
I also recommend using mysqldump if moving between MySQL DBs; it'll handle the fiddly bits for you.
If you want to use the other method, you really need to read the relevant docs pages carefully and know your data very well.
A third option for moving data to other formats generally is to write a program (personally, I've done a lot with Perl) and create the output exactly as you want it.
 
Old 12-23-2011, 07:20 AM   #14
newbie14
Member
 
Registered: Sep 2011
Posts: 470

Original Poster
Rep: Reputation: Disabled
Dear All,
Actually, I forgot to mention the main reason I used SELECT ... INTO OUTFILE: I did not want to dump the whole table, just the data within a date range, e.g. a particular month. For that case, what is best? Is it mysqldump, or some other mechanism?
 
Old 12-23-2011, 08:09 AM   #15
Cedrik
Senior Member
 
Registered: Jul 2004
Distribution: Slackware
Posts: 2,140

Rep: Reputation: 242
Try the --where option of mysqldump:
Code:
mysqldump -u<user> -p<passwd> \
--where="EXTRACT(MONTH FROM date_field_name) = <month num>" \
<database name> <table name> > dump.sql.txt

Last edited by Cedrik; 12-23-2011 at 08:11 AM.
 