LinuxQuestions.org

LinuxQuestions.org (/questions/)
-   Linux - Virtualization and Cloud (https://www.linuxquestions.org/questions/linux-virtualization-and-cloud-90/)
-   -   pipe mysqldump to S3 (https://www.linuxquestions.org/questions/linux-virtualization-and-cloud-90/pipe-mysqldump-to-s3-4175539231/)

kha.t.nguyen 04-09-2015 11:36 AM

pipe mysqldump to S3
 
I'm using s3cmd version 1.5.2 which I understand now accepts stdin to send files to s3.

Google returned this link. I tested s3cmd with stdin and the 'echo hello' command works for me:
echo "hello" | s3cmd put - s3://my-bucket/folder/hello.txt --verbose

Now that I'm past that, I'm trying to do something similar with mysqldump. I would like to be able to send the *.gz file directly to S3 so that I do not use any disk space on the local machine:
mysqldump -uroot -pubuntu --databases test_database | gzip > backupdb.sql.gz | s3cmd put - s3://mybucket/folder/backupdb.sql.gz --verbose

The mysqldump command results in the following message:

INFO: Compiling list of local files...
INFO: Running stat() and reading/calculating MD5 values on 1 files, this may take some time...
INFO: Summary: 1 local files to upload
INFO: Forwarding request to us-east-1
ERROR: S3 error: The XML you provided was not well-formed or did not validate against our published schema


I'm not sure what I'm doing wrong. Any ideas are greatly appreciated.

Thanks!
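For reference, the error above comes from the shell redirection, not from s3cmd itself: `gzip > backupdb.sql.gz` diverts gzip's output into the file, so the next pipe stage gets an empty stdin and s3cmd ends up sending a malformed request. A minimal way to see the behavior with no MySQL or S3 involved:

```shell
# `>` diverts gzip's stdout into the file, so nothing reaches the next stage:
printf 'data' | gzip > /tmp/backup.gz | wc -c    # prints 0

# Drop the redirection and the compressed bytes flow on down the pipeline:
printf 'data' | gzip | wc -c                     # prints a non-zero byte count
```

The same applies to the mysqldump pipeline: either redirect to a file and upload the file, or keep piping all the way to `s3cmd put -`, but not both at once.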

Habitual 04-09-2015 11:52 AM

I don't think the "-" in "s3cmd put -" is going to do it (could be wrong, though).
I used
Code:

mysqldump <db> -uroot -pubuntu | gzip > backupdb.sql.gz | s3cmd put backupdb.sql.gz s3://mybucket/folder/ --verbose
and that worked fine.

kha.t.nguyen 07-07-2015 11:14 PM

Turns out the dash in "s3cmd put" is supported by the latest version of s3cmd. I ended up using this to back up all the databases:
for x in `mysql -BN -uroot -pubuntu -e 'SHOW DATABASES'`; do mysqldump -u'root' -p'ubuntu' --single-transaction --databases $x --events --opt | gzip | s3cmd put - s3://my-bucket/mysql_backup_$x.sql.gz; done
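A slightly hardened variant of that loop (the system-schema filter and the quoting are additions, not from the post above, and the bucket name is only a placeholder) skips MySQL's internal databases and quotes the database name:

```shell
# Sketch: dump every non-system database straight to S3 (bucket name assumed)
for db in $(mysql -BN -uroot -pubuntu -e 'SHOW DATABASES' \
            | grep -vE '^(information_schema|performance_schema|mysql|sys)$'); do
    mysqldump -uroot -pubuntu --single-transaction --events --opt --databases "$db" \
        | gzip \
        | s3cmd put - "s3://my-bucket/mysql_backup_${db}.sql.gz"
done
```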

Habitual 07-08-2015 08:08 AM

Quote:

Originally Posted by kha.t.nguyen (Post 5388659)
Turns out the dash in "s3cmd put" is supported by the latest version of s3cmd.

As is usually the case, no? Good job and well done.

parmarth 08-31-2018 05:07 AM

Change to aws s3 cp:
 
mysqldump --host=$HOST --user=$USER --password=$PASSWORD $DB_NAME --routines --single-transaction | gzip -9 | aws s3 cp - s3://bucket/database/filename.sql.gz

This will stream the dump directly to S3.
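One caveat with any of these one-liners: a pipeline's exit status is that of its last command, so a failed mysqldump can still leave a truncated object in S3 while the script reports success. In bash, `set -o pipefail` surfaces the failure; a sketch using the same variables as the command above:

```shell
#!/bin/bash
set -o pipefail    # pipeline now fails if any stage fails, not just the last
mysqldump --host="$HOST" --user="$USER" --password="$PASSWORD" "$DB_NAME" \
        --routines --single-transaction \
    | gzip -9 \
    | aws s3 cp - "s3://bucket/database/filename.sql.gz" \
    || echo "backup failed" >&2
```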

