Question
I want to set up a cron job that will automatically back up my MySQL database while the database is running, and then FTP that backup to my backup server.
I assume I can do this using a bash script.
Anyone know of a good way to accomplish this?
Thanks in advance.
Answer 1:
This is a very simple approach using the lftp command-line FTP client:
backup.sh:
mysqldump -f [database] | gzip > /backup/[database].dump.gz
lftp -f /backup/lftp.script
lftp.script:
open backup.ftp.example.com
user [username] [password]
cd /backup
# rotate the previous dumps, then upload the new one
mv [database].dump.gz.8 [database].dump.gz.9
mv [database].dump.gz.7 [database].dump.gz.8
mv [database].dump.gz.6 [database].dump.gz.7
mv [database].dump.gz.5 [database].dump.gz.6
mv [database].dump.gz.4 [database].dump.gz.5
mv [database].dump.gz.3 [database].dump.gz.4
mv [database].dump.gz.2 [database].dump.gz.3
mv [database].dump.gz.1 [database].dump.gz.2
mv [database].dump.gz [database].dump.gz.1
put /backup/[database].dump.gz
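Since the goal is a cron job, an entry along the following lines could schedule backup.sh nightly. This is a minimal sketch: the 02:00 schedule, the /backup/backup.sh path and the log file are assumptions, not part of the original answer.
# Assumed crontab entry: run the backup script every night at 02:00 and log its output
0 2 * * * /backup/backup.sh >> /backup/backup.log 2>&1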
Note: This approach has a number of issues:
- FTP is unencrypted, so anyone able to sniff the network can see both the password and the database data. Piping the dump through gpg -e [key] can be used to encrypt it, but the FTP password stays unencrypted (sftp and scp are better alternatives; see the sketch after this list).
- if someone hacks the database server, they can use the credentials in this script to access the FTP server and, depending on permissions, delete the backups (this has happened in the real world: http://seclists.org/fulldisclosure/2009/Jun/0048.html)
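As a rough illustration of those alternatives, the dump could be encrypted with gpg and copied over SSH with scp instead of plain FTP. This is only a sketch: the script name, key id, host and paths are placeholders, not part of the original answer.
backup-encrypted.sh:
#!/bin/sh
# Encrypt the compressed dump for the recipient key so the data is unreadable in transit and at rest
mysqldump -f [database] | gzip | gpg -e -r [key] > /backup/[database].dump.gz.gpg
# Copy over SSH (scp) so neither credentials nor data cross the network in cleartext
scp /backup/[database].dump.gz.gpg [username]@backup.example.com:/backup/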
Answer 2:
If the database is very large, the backup file may not fit on the server's hard drive. In that case I suggest the following approach, which uses pipes and ncftpput:
mysqldump -u <db_user> -p<db_password> <db_name> | gzip -c | ncftpput -u <ftp_user> -p <ftp_password> -c <ftp_host> <remote_file_name>
It works fine for me.
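To turn this into the cron job the question asks for, the pipeline can go straight into the crontab. The schedule and the date-stamped remote filename below are assumptions; note that % is a special character in crontab entries and has to be escaped as \%.
# Assumed crontab entry: stream the dump to the FTP server every night at 03:00
# (\% because cron treats an unescaped % as a newline / stdin marker)
0 3 * * * mysqldump -u <db_user> -p<db_password> <db_name> | gzip -c | ncftpput -u <ftp_user> -p <ftp_password> -c <ftp_host> <db_name>-$(date +\%F).sql.gz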
Source: https://stackoverflow.com/questions/1583481/how-to-create-cron-job-to-backup-mysql-and-ftp-backup-to-my-backup-server