pipe mysqldump to S3
I'm using s3cmd version 1.5.2 which I understand now accepts stdin to send files to s3.
Google returned this link. I tested s3cmd with stdin, and the 'echo hello' test works for me:

Code:
echo "hello" | s3cmd put - s3://my-bucket/folder/hello.txt --verbose

Now that I'm past that, I'm trying to do something similar with mysqldump. I would like to send the *.gz file directly to S3 so that I do not use any disk space on the local machine:

Code:
mysqldump -uroot -pubuntu --databases test_database | gzip > backupdb.sql.gz | s3cmd put - s3://mybucket/folder/backupdb.sql.gz --verbose

The mysqldump command results in the following messages:

Code:
INFO: Compiling list of local files...
INFO: Running stat() and reading/calculating MD5 values on 1 files, this may take some time...
INFO: Summary: 1 local files to upload
INFO: Forwarding request to us-east-1
ERROR: S3 error: The XML you provided was not well-formed or did not validate against our published schema

I'm not sure what I'm doing wrong. Any ideas are greatly appreciated. Thanks! |
I don't think the "-" in "s3cmd put -" is going to do it (I could be wrong, though).
I used Code:
mysqldump <db> -uroot -pubuntu | gzip > backupdb.sql.gz && s3cmd put backupdb.sql.gz s3://mybucket/folder/ --verbose |
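If a short-lived local file is acceptable, a sketch of the same two-step approach that also removes the archive once the upload succeeds could look like this (the database name, the /tmp path, the bucket, and the credentials are all placeholders):

Code:
# Sketch only: dump and compress to a temporary file, upload it, then delete it.
# Database name, /tmp path, bucket and credentials are placeholders.
mysqldump -uroot -pubuntu test_database | gzip > /tmp/backupdb.sql.gz \
  && s3cmd put /tmp/backupdb.sql.gz s3://mybucket/folder/ --verbose \
  && rm -f /tmp/backupdb.sql.gz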
Turns out the dash in "s3cmd put -" is supported by the latest version of s3cmd. I ended up using this to back up all the databases:

Code:
for x in `mysql -BN -uroot -pubuntu -e 'SHOW DATABASES'`; do mysqldump -u'root' -p'ubuntu' --single-transaction --databases $x --events --opt | gzip | s3cmd put - s3://my-bucket/mysql_backup_$x.sql.gz; done |
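For a scheduled job, the same loop can be written out as a script. The sketch below adds a date stamp to the object key and skips MySQL's internal schemas; both of those are additions not present in the original one-liner, and the bucket name and credentials are placeholders:

Code:
#!/bin/bash
# Sketch of the all-databases backup as a cron-friendly script.
# The date-stamped key and the internal-schema filter are additions.
STAMP=$(date +%F)
for db in $(mysql -BN -uroot -pubuntu -e 'SHOW DATABASES'); do
    case "$db" in
        information_schema|performance_schema|sys) continue ;;
    esac
    mysqldump -uroot -pubuntu --single-transaction --events --opt --databases "$db" \
        | gzip \
        | s3cmd put - "s3://my-bucket/mysql_backup_${db}_${STAMP}.sql.gz"
done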
Change to aws s3 cp:

Code:
mysqldump --host=$HOST --user=$USER --password=$PASSWORD $DB_NAME --routines --single-transaction | gzip -9 | aws s3 cp - s3://bucket/database/filename.sql.gz

This streams the dump directly to S3 without writing a local file. |
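One caveat when streaming through aws s3 cp: the CLI cannot know the object's size up front, and its documentation notes that streams larger than roughly 50 GB need --expected-size so the multipart upload is partitioned correctly. A sketch, with the byte count purely illustrative:

Code:
# Same streamed upload, hinting the approximate final size (here ~60 GB) for very large dumps
mysqldump --host=$HOST --user=$USER --password=$PASSWORD $DB_NAME --routines --single-transaction | gzip -9 | aws s3 cp - s3://bucket/database/filename.sql.gz --expected-size 64424509440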