Linux - Server: This forum is for the discussion of Linux software used in a server-related context.
I've got a server online and I'd like to get daily MySQL backups. I think the way cPanel does those backups is great, so I'd like to implement something similar; I have full access to the server.
The way I used to do it with cPanel was just a wget of a web page like this, and the .sql file was created somehow in the background:
The .sql.gz extension suggests that mysqldump was used to dump the full database contents to a .sql file, which was subsequently compressed with gzip. Try looking up "mysqldump" to get started.
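For example, a minimal sketch of such a dump (the user, database name, and output path are placeholders; this assumes the password is kept in ~/.my.cnf rather than on the command line):

  # Dump the database and compress the output in one step.
  mysqldump --user=backupuser mydb | gzip > /var/backups/mydb-$(date +%F).sql.gz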
A strategy could be to
1) run a cron job on the machine running MySQL to dump the database to a file and compress it (as a "backup" of the database)
2) run a cron job on a second machine to copy that backup file over for safe(r) keeping
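For step 1, a crontab entry on the MySQL host might look like this (the schedule, user, database name, and path are placeholders; note that % must be escaped as \% inside crontab):

  # m h dom mon dow  command -- dump and compress the database nightly at 02:30
  30 2 * * * mysqldump --user=backupuser mydb | gzip > /var/backups/mydb-$(date +\%F).sql.gz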
Step 2 can, but does not have to, be done with wget, which is typically used for web (HTTP) or FTP downloads.
It depends on how your MySQL machine is set up for remote access by the second machine, i.e. whether it is
configured for web access (Apache), FTP access (vsftpd), scp/sftp access (ssh), file sharing (NFS, Samba)...
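If the MySQL machine runs sshd, for instance, step 2 could be a cron job on the second machine pulling the file with scp (hostnames, schedule, and paths are placeholders; the job is scheduled for after the dump should have finished):

  # Fetch last night's backup at 03:30, once the dump job is done.
  30 3 * * * scp backupuser@dbhost:/var/backups/mydb-$(date +\%F).sql.gz /srv/db-backups/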
The thing about step 1 is that the file will be created and left somewhere. I want the backup to be created when I access the link, downloaded at the same time, and deleted once the download finishes. I was thinking it could even be some kind of pipe, so the file would never end up on the disk; I'm not sure if that's possible.
I hope I've explained myself, but if not let me know and I'll give an example.
Streaming it to the client browser could work.
MySQL dump files are plain text, so an appropriate Content-Type (or Content-Disposition) header is in order.
The only problem I can see is the size of the dump file.
A Content-Disposition: attachment header ("save to disk") might be a good idea, to keep the browser from
trying to display the entire file (which could be slow once it is several megabytes in size).
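Something like the pipe you describe is possible: a small CGI script can stream mysqldump straight through gzip to the client, so nothing is ever written to disk. A minimal sketch, assuming Apache with CGI enabled (the script name, user, and database are placeholders; the password should live in ~/.my.cnf, and the URL should be protected, e.g. with HTTP auth over HTTPS):

  #!/bin/bash
  # dump.cgi -- stream a compressed dump of the database to the client.
  # Nothing touches the disk: mysqldump is piped through gzip to stdout.
  echo "Content-Type: application/gzip"
  echo "Content-Disposition: attachment; filename=mydb-$(date +%F).sql.gz"
  echo ""
  mysqldump --user=backupuser mydb | gzip

You could then fetch it with wget, much as you did with cPanel, and the "file" exists only for the duration of the download.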