Linux - Virtualization and Cloud
This forum is for the discussion of all topics relating to Linux Virtualization and Linux Cloud platforms. Xen, KVM, OpenVZ, VirtualBox, VMware, Linux-VServer and all other Linux Virtualization platforms are welcome. OpenStack, CloudStack, ownCloud, Cloud Foundry, Eucalyptus, Nimbus, OpenNebula and all other Linux Cloud platforms are welcome. Note that questions relating solely to non-Linux OS's should be asked in the General forum.
I'm using s3cmd version 1.5.2 which I understand now accepts stdin to send files to s3.
Google returned this link. I tested s3cmd with stdin and the 'echo hello' command works for me:
echo "hello" | s3cmd put - s3://my-bucket/folder/hello.txt --verbose
Now that I'm past that, I'm trying to do something similar with mysqldump. I would like to be able to send the *.gz file directly to S3 so that I do not use any disk space on the local machine:
mysqldump -uroot -pubuntu --databases test_database | gzip > backupdb.sql.gz | s3cmd put - s3://mybucket/folder/backupdb.sql.gz --verbose
The mysqldump command results in the following message:
INFO: Compiling list of local files...
INFO: Running stat() and reading/calculating MD5 values on 1 files, this may take some time...
INFO: Summary: 1 local files to upload
INFO: Forwarding request to us-east-1
ERROR: S3 error: The XML you provided was not well-formed or did not validate against our published schema
I'm not sure what I'm doing wrong. Any ideas are greatly appreciated.
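One likely cause (my reading of the pipeline, not something the error text confirms): the `> backupdb.sql.gz` redirect sends gzip's output into a local file, so the final pipe hands s3cmd an empty stream, and s3cmd then attempts an upload with no body. The redirect-plus-pipe effect can be reproduced with plain shell commands:

```shell
# gzip's stdout is redirected to a file, so the pipe carries nothing
# and wc counts zero bytes:
printf 'hello' | gzip > /tmp/demo.gz | wc -c    # prints 0

# drop the redirect and the compressed bytes flow through the pipe:
printf 'hello' | gzip | wc -c                   # prints a non-zero byte count
```

Removing the redirect (`mysqldump ... | gzip | s3cmd put - ...`) keeps the compressed data in the pipe all the way to s3cmd.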
It turns out the dash (read from stdin) argument is supported by the latest version of s3cmd. I ended up using this to back up all the databases:
for x in $(mysql -BN -uroot -pubuntu -e 'SHOW DATABASES'); do mysqldump -u'root' -p'ubuntu' --single-transaction --databases "$x" --events --opt | gzip | s3cmd put - "s3://my-bucket/mysql_backup_${x}.sql.gz"; done
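For restoring, the same streaming idea can run in reverse. This is a sketch under assumptions: the object name mirrors the backup loop above, the credentials are the same, and this s3cmd version accepts `-` as the destination of `get` to write to stdout:

```shell
# Hypothetical restore: stream one backup out of S3, decompress it,
# and feed it straight to mysql without using local disk space.
# Bucket path, database name, and credentials mirror the backup examples.
s3cmd get s3://my-bucket/mysql_backup_test_database.sql.gz - | gunzip | mysql -uroot -pubuntu
```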