Case 1: shell + cron to automatically back up MySQL and delete backups older than N days
#!/bin/sh
DUMP=/usr/local/mysql/bin/mysqldump
OUT_DIR=/home/ldl/xxx/backup/
LINUX_USER=ldl
DB_NAME=ldl
DB_USER=ldl
DB_PASS=xxx
# How many days of backups to keep at most
DAYS=1
#Core of script
cd $OUT_DIR
DATE=`date +%Y_%m_%d`
OUT_SQL="$DATE.sql"
TAR_SQL="mysql_$DATE.tar.gz"
$DUMP --default-character-set=utf8 --opt -u$DB_USER -p$DB_PASS $DB_NAME > $OUT_SQL
tar -czf $TAR_SQL ./$OUT_SQL
rm $OUT_SQL
chown $LINUX_USER:nobody ./$TAR_SQL
find ./ -name "mysql*" -type f -mtime +$DAYS -exec rm {} \;
+++++++++++++++++++++++++++++++++++++++++++++++++++++
mysqldump itself needs no further explanation. Note that -uuser -ppass is enough; there is no space between the option letter and its value.
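Putting the password in -p on the command line works, but it is visible in the process list. If you prefer, a minimal sketch of reading the credentials from an option file instead (the file path below is only an example you would create yourself) looks like this:
# Option file, e.g. /home/ldl/.my_backup.cnf with mode 600, containing:
#   [client]
#   user=ldl
#   password=xxx
# --defaults-extra-file must come before the other options:
$DUMP --defaults-extra-file=/home/ldl/.my_backup.cnf --default-character-set=utf8 --opt $DB_NAME > $OUT_SQL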
The part worth explaining is -mtime: with -mtime +5, for example, find matches files last modified more than 5 days ago, and -exec rm deletes them all in one pass. Here the script uses -mtime +$DAYS.
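If you want to see which backups the script would delete before it actually removes them, you can first run the same find with -print instead of -exec rm (a check of my own, not part of the original script):
# Only list backups older than $DAYS days; nothing is deleted
find ./ -name "mysql*" -type f -mtime +$DAYS -print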
My operating system is CentOS 5.4. Just copy the script to /etc/, give it the same permissions as the other scripts there, and it will be executed without adding a crontab entry.
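If you would rather schedule it yourself than rely on the /etc/ script directories, a minimal crontab sketch would be the following (the path /home/ldl/mysql_backup.sh is only a placeholder for wherever you saved the script):
# crontab -e, then add a line like this to run the backup every day at 03:00
0 3 * * * /bin/sh /home/ldl/mysql_backup.sh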
Case 2: Automatically backup the scripts of websites and databases under CentOS and upload them to FTP
Suppose the website directory on this server is /home/www, the database program path is /usr/local/mysql/bin, the database name is level, and the FTP server is ftphost. Let's first look at the complete automatic backup script (the script itself is saved under /home/):
#!/bin/bash
cd /home
WebBakName=web_$(date +%y%m%d).tar.gz
tar zcvf $WebBakName www
SqlBakName=sql_$(date +%y%m%d).tar.gz
# export the database to an intermediate dump file (level.sql is just the name used here)
/usr/local/mysql/bin/mysqldump -uusername -ppassword level > level.sql
tar zcvf $SqlBakName level.sql
ftp -nv ftphost << EOF
user ftpname ftppass
put $WebBakName
put $SqlBakName
quit
EOF
rm -f $WebBakName $SqlBakName
Let me explain it step by step: first, the script enters the /home directory and defines the WebBakName variable as the file name for the website backup, in the format web_date; the SqlBakName variable is the file name for the database backup, in the format sql_date. It then packs the whole website directory www into the WebBakName archive, uses mysqldump to export the specified database, and packages that dump into the SqlBakName archive. The local backup work ends here. If you do not have enough remote FTP space, you can simply download the backup files to your own machine, but I still recommend backing up directly to FTP space on another server to achieve a fully automatic backup. In that case, replace ftphost, ftpname and ftppass in the script with your own FTP details, and the whole backup process is complete.
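Before the FTP upload it can be worth checking that both archives are readable; a small guard you could insert just above the ftp block (my own addition, under the assumption that an unreadable archive should abort the run) is:
# Stop before uploading or deleting anything if either archive fails a test read
tar tzf $WebBakName > /dev/null || exit 1
tar tzf $SqlBakName > /dev/null || exit 1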
Then run chmod +x on the script to give it execute permission, and run crontab -e to set when the task runs automatically. For example, I enter:
00 05 * * 1 /home/
followed by the backup script's file name; this means the automatic backup runs at 5:00 a.m. once a week.
After all this is done, you can first change the scheduled time to a few minutes from now to check that the automatic backup script works properly. Once the test passes, you no longer have to worry about losing data if something goes wrong with this server. Of course, if your data is updated frequently, it is recommended to change the automatic backup schedule to daily.
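To test immediately instead of waiting for cron, you can also run the script once by hand with tracing enabled and then confirm the schedule was saved (the script path below is only a placeholder):
sh -x /home/backup.sh   # run the backup once, printing each command as it executes
crontab -l              # verify the scheduled entry exists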