Linux - Software: This forum is for Software issues.
Having a problem installing a new program? Want to know which application is best for the job? Post your question in this forum.
Hi guys, I have a small issue with the mysqldump command of MySQL. Consider I have two servers, Server 1 (web server: 192.168.1.1) and Server 2 (database server: 192.168.1.2), both Red Hat Linux 9 servers. Is it possible to dump two tables from a particular database on Server 2 to a particular directory on Server 1 in .xls or .csv format? Any help will be highly appreciated. Thanks in advance.
Don't know exactly how to do this directly with mysqldump, but my ISP hosts phpMyAdmin (heard of it?), which does exactly this. I can export a table (or the results of a query) from my online server to my local machine's working directory. You might want to take a look at phpMyAdmin - you should be able to figure out from its source how to get a .csv dump in the format you want. Then all you need is an NFS or Samba connection between your two servers, and you should be able to dump tables from the DB server to a directory on the web server in .csv format.
Thanks for the reply, rylan. Yes, I am aware of phpMyAdmin. In fact, both these servers are client servers and on the net. The data in these tables amounts to almost 2 GB. The client does not want phpMyAdmin installed on their server for security reasons; that's why I'm opting for mysqldump. I have managed to get the command syntax for mysqldump but am having a privileges issue connecting to the database. This is the command that I will execute on Server 1: mysqldump -C -h server2/ipaddress -u root -p test test > outfile.sql. Now I'm getting the error "Access denied for root@192.168.1.220" when trying to connect. Are you aware of this problem? Any help will be appreciated.
Check the root permissions in the database (database mysql).
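Checking and adjusting those permissions might look something like the following. This is only a sketch: the client IP and the password are example values, and rather than opening up remote root access it is safer to create a dedicated dump user with just the privileges mysqldump needs.

```sql
-- Run in the mysql client on Server 2 (the database server).
-- List which user/host combinations are allowed to connect:
SELECT user, host FROM mysql.user;

-- Allow a dump user to read the 'test' database from the web server's IP
-- (example IP and password -- adjust to your setup):
GRANT SELECT, LOCK TABLES ON test.* TO 'dumpuser'@'192.168.1.1' IDENTIFIED BY 'secret';
FLUSH PRIVILEGES;
```

You would then run mysqldump on Server 1 with -u dumpuser instead of -u root.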
In my own LAMP configuration, I don't want external access to any DB (neither for root nor for normal users), so I've removed all access from the outside world. Access to the databases can only be achieved from the machine itself (localhost).
So if I need to do anything, I either sit behind the machine or I login remotely and issue the commands.
Something similar might be causing your 'problem'.
PS I guess that normal root access is also not possible.
Last edited by Wim Sturkenboom; 06-06-2006 at 04:45 AM.
Is the client's machine located inside or outside of your LAN?
192.168.x.x addresses are reserved for internal LAN use only; they can never be reached from outside the LAN.
Also, try establishing a connection (from the client) via telnet to the MySQL port (3306, if my memory serves me right) on your server.
If you can telnet to the open port, then you can connect to the MySQL daemon (mysqld, running on your server) from the client, and mysqldump should work.
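That check can also be scripted without telnet. A small sketch, assuming bash (the /dev/tcp redirection is a bash feature, not plain sh); the remote IP in the comment is the example DB server from this thread:

```shell
#!/bin/bash
# Return success if a TCP connection to host $1, port $2 can be opened.
check_port() {
    (exec 3<>"/dev/tcp/$1/$2") 2>/dev/null
}

# For the servers in this thread you would run: check_port 192.168.1.2 3306
# Demo against the local machine (connection refused returns immediately):
if check_port 127.0.0.1 3306; then
    echo "MySQL port is open"
else
    echo "MySQL port is closed or unreachable"
fi
```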
Thanks for the replies, guys. I have managed to use mysqldump remotely from my machine and can get the structure and data in SQL command format. Now I want the same data in .csv format. I have tried all sorts of commands, permutations and combinations, to get the result, but have failed. Timmeke, the 192.168.x.x addresses are my local machines on which I'm testing this mysqldump command; if it executes successfully, I will test on the client machines. The following error is the one that is persistent:
[mysqldump: Got error: 1: Can't create/write to file '/exportfiles/test1.txt' (Errcode: 13) when executing 'SELECT INTO OUTFILE']
I have also changed the rights and ownership of the target folder exportfiles to mysql, on the basis of a Google search. This is the command I executed to get the .csv result.
Is the syntax right, or am I missing something? How do I overcome the "Can't create/write" error? If you need anything else, please let me know. I really need a solution urgently. Thanks a ton for the help and replies.
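For what it's worth, Errcode 13 is a filesystem "permission denied": the SELECT ... INTO OUTFILE file is written by the mysqld server process (usually running as the mysql user), not by the user invoking mysqldump, so that process needs write access to the target directory. A minimal sketch of opening up a test directory (the path is an example, and world-writable permissions are for testing only):

```shell
# Create the export directory and make it writable by everyone,
# so the mysqld server process can write its OUTFILE there.
mkdir -p /tmp/exportfiles
chmod 777 /tmp/exportfiles

# With the directory writable, a comma-separated dump run on the DB host
# itself (the -T/--tab option only works locally) would look like:
#   mysqldump -u root -p --tab=/tmp/exportfiles \
#       --fields-terminated-by=',' test test1
```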
The man page of my mysqldump utility says that the -T (or --tab) option only works if you're running mysqldump on the same host (server) as mysqld.
But if you get the results as SQL insert statements, it isn't that difficult to convert them to csv.
Basically, an insert statement generated from mysqldump looks like:
insert into <some_table> values (val1),(val2),...
where <some_table> is the table name of course and
val1, val2, ... represent the column values for records 1, 2, etc., separated by commas.
So, a simple script should do the trick:
* grep for "insert" SQL statements;
* use sed to replace ")" by "\n" and ",(" by "".
* use sed once more to go from "insert into <some_table> values" to just "<some_table>\n".
This should give you a text file with following format:
table_name
record1
record2
...
where each record consists of csv values.
Example code:
Code:
grep -i 'insert into' dumpfile > tmpfile1
sed -e 's/)/\n/g' tmpfile1 > tmpfile2 #you can also use "tr" for this.
sed -e 's/,(//g' tmpfile2 > tmpfile3
sed -e 's/INSERT INTO//' tmpfile3 > tmpfile4
sed -e 's/VALUES//' tmpfile4 > yourfile.csv
One call to "awk" can replace all the "sed" commands.
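For instance, something along these lines (a sketch: it assumes one-line INSERT statements in the uppercase form mysqldump normally emits, and it does not handle parentheses or commas inside quoted string values; the table and file names are made up for the demo):

```shell
# Sample input in mysqldump's extended-insert style (hypothetical table):
cat > dumpfile <<'EOF'
INSERT INTO mytable VALUES (1,'a'),(2,'b');
EOF

awk '/^INSERT INTO/ {
    table = $3                 # table name is the third field
    sub(/^[^(]*\(/, "")        # strip everything up to the first "("
    sub(/\);?$/, "")           # strip the trailing ");"
    gsub(/\),\(/, "\n")        # one record per line
    print table
    print
}' dumpfile > yourfile.csv
```

This produces the table name on its own line, followed by one comma-separated record per line, in a single pass.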
Thanks Timmeke, your approach was useful. But I was wondering if there is any possibility of eliminating so many steps to convert to .csv. Honestly speaking, I'm pretty elementary at Linux, so I don't know how to use a lot of commands, but I suppose there must be a simpler way. I mean, if you are backing up more than one table, performing these steps can be kind of tedious, isn't it? I would appreciate it if you could let me know of any other solution for this crazy "Can't create/write" issue, or a way to directly convert to .csv using mysqldump.
I provided the steps for use in either a simple shell script or for testing.
You could also, for instance, pipe all the commands together, i.e.:
Code:
mysqldump ... | grep -i 'insert into' | sed -e 's/)/\n/g' | sed -e 's/,(//g' | sed -e 's/INSERT INTO//' | sed -e 's/VALUES//'
And you can also replace all the sed's with just one awk script or write your custom Perl script to do this.
Personally, I prefer Perl over awk, but that's just my opinion.
Example Perl code:
Code:
#!/usr/local/bin/perl
# If you don't want to use STDIN, put an "open()" call here.
while (<>)
{
    chomp;
    if (/^insert into/i)
    {
        # Line format: INSERT INTO <table> VALUES (v1,v2),(v3,v4),...;
        (undef, undef, $table, undef, $records) = split(/ /, $_, 5);
        $records =~ s/;$//;                   # drop the trailing semicolon
        @recArray = split(/\),\(/, $records); # split on "),(" between records
        print "$table\n";                     # print the table name
        foreach $rec (@recArray)
        {
            # $rec may still carry a leading or trailing parenthesis =>
            # strip them to get plain csv
            $rec =~ s/[()]//g;
            print "$rec\n";
        }
    }
}
My Perl may be a little rusty, but you get the idea...
Quote:
command syntax for mysqldump but am having some privileges issue connecting to the database. This is the command that I will execute on Server1: mysqldump -C -h server2/ipaddress -u root -p test test > outfile.sql. Now am getting an error Access denied for root@192.168.1.220 when trying to connect. Are you aware of this problem?? Any help will be appreciated.
You might scream at me, but did you try
mysqldump -C -h server2/ipaddress -u root -p test test > outfile.sql
or did you try
mysqldump -C -h server2/ipaddress -u root -ptest test > outfile.sql
i.e., without a space between the -p and the password?
Hi Rylan, don't worry, I ain't screaming... In the command mysqldump -C -h server2/ipaddress -u root -p test test > outfile.sql, the first test is the database name and the other test is the table name. I didn't put any password after -p. However, the moment I hit Enter, it prompts for the password, and everything is smooth after that.
Thanks Timmeke, for the code. Understanding the Perl script was slightly tedious, though. Anyway, I will do some more research on the net to figure out if there are any other, simpler ways to resolve the issue. Thanks, guys, for all your help and consideration. Cheers!
That was really very nice of you, Timmeke. I really appreciate your willingness to help. Honestly speaking, I don't understand even the P of Perl, so going through the code was almost Greek and Latin to me. However, as I told you earlier, I have managed to successfully download the tables in SQL format; at least I can still import those files into their respective tables at the client side. I will also try your Perl code to parse my exported .sql file and convert it to .csv. Let's see how it works. I will definitely update this thread in case of any issues or successes. Just one more thing on the personal front: could you give me some website links or posts to help me learn and understand Perl better, from elementary to advanced level? It would be of great help. PHP along with Perl can be pretty dynamic. Thanks again for everything. Cheers and smiles!