Backing up a MySQL database in a production environment is a periodic, repetitive task, so it is usually automated with scripts, and those backup scripts are run on a schedule by crond.
mysqldump backup scheme:
Full backup at 1:00 a.m. every Sunday
Incremental backup every 4 hours, Monday through Saturday
Set up crontab entries to run the backup scripts on schedule:
# crontab -e
# Run the full backup script every Sunday at 1:00 AM
0 1 * * 0 /root/mysqlfullbackup.sh >/dev/null 2>&1
# Run the incremental backup script every 4 hours, Monday through Saturday
0 */4 * * 1-6 /root/mysqldailybackup.sh >/dev/null 2>&1
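As a quick sanity check (a minimal sketch, assuming the scripts live in /root as in the crontab above), make the scripts executable and confirm the entries are registered:
# Make the backup scripts executable
chmod +x /root/mysqlfullbackup.sh /root/mysqldailybackup.sh
# List the installed cron entries for the current user
crontab -l
# Confirm the crond service is running (the exact command may differ by distribution)
service crond status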
Contents of mysqlfullbackup.sh:
[root@localhost ~]# cat mysqlfullbackup.sh
#!/bin/sh
# Name: mysqlfullbackup.sh
# Define the MySQL installation directory
mysqlDir=/usr/local/mysql
# Define the user name and password used for the backup
user=root
userpwd=123456
dbname=test_db
# Define the backup directory
databackupdir=/opt/mysqlbackup
[ ! -d $databackupdir ] && mkdir $databackupdir
# Define the mail body file
emailfile=$databackupdir/email.txt
# Define the notification email address
email=root@localhost.localdomain
# Define the backup log file
logfile=$databackupdir/mysqlbackup.log
DATE=`date -I`
echo "" > $emailfile
echo $(date +"%y-%m-%d %H:%M:%S") >> $emailfile
cd $databackupdir
# Define the backup file names
dumpfile=mysql_$DATE.sql
gzdumpfile=mysql_$DATE.sql.tar.gz
# Use mysqldump to back up the database; adjust the options to suit your environment
$mysqlDir/bin/mysqldump -u$user -p$userpwd --flush-logs -x $dbname > $dumpfile
# Compress the backup file
if [ $? -eq 0 ]; then
    tar czf $gzdumpfile $dumpfile >> $emailfile 2>&1
    echo "BackupFileName:$gzdumpfile" >> $emailfile
    echo "DataBase Backup Success! " >> $emailfile
    rm -f $dumpfile
else
    echo "DataBase Backup Fail! " >> $emailfile
fi
# Append to the log file
echo "--------------------------------------------------------" >> $logfile
cat $emailfile >> $logfile
# Send the email notification
cat $emailfile | mail -s "MySQL Backup" $email
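For reference, restoring from one of these full backups is straightforward (a minimal sketch based on the file names the script above produces; the date in the file name is only an example):
cd /opt/mysqlbackup
# Unpack the compressed dump created by mysqlfullbackup.sh
tar xzf mysql_2024-01-07.sql.tar.gz
# Load the dump back into the target database
/usr/local/mysql/bin/mysql -uroot -p test_db < mysql_2024-01-07.sql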
Contents of mysqldailybackup.sh:
[root@localhost ~]# cat mysqldailybackup.sh
#!/bin/sh
# Name: mysqldailybackup.sh
# Define the MySQL installation directory and data directory
mysqldir=/usr/local/mysql
datadir=$mysqldir/data
# Define the user name and password used for the backup
user=root
userpwd=123456
# Define the backup directory; daily backups go to $databackupdir/daily
databackupdir=/opt/mysqlbackup
dailybackupdir=$databackupdir/daily
[ ! -d $dailybackupdir ] && mkdir -p $dailybackupdir
# Define the mail body file
emailfile=$databackupdir/email.txt
# Define the notification email address
email=root@localhost.localdomain
# Define the backup log file
logfile=$databackupdir/mysqlbackup.log
echo "" > $emailfile
echo $(date +"%y-%m-%d %H:%M:%S") >> $emailfile
# Flush the logs so the server switches to a new binary log file
$mysqldir/bin/mysqladmin -u$user -p$userpwd flush-logs
cd $datadir
# Get the list of binary log files
filelist=`cat mysql-bin.index`
icounter=0
for file in $filelist
do
    icounter=`expr $icounter + 1`
done
nextnum=0
ifile=0
for file in $filelist
do
    binlogname=`basename $file`
    nextnum=`expr $nextnum + 1`
    # Skip the last binary log (the one the server is currently writing to)
    if [ $nextnum -eq $icounter ]; then
        echo "Skip latest! " > /dev/null
    else
        dest=$dailybackupdir/$binlogname
        # Skip binary log files that have already been backed up
        if [ -e $dest ]; then
            echo "Skip existing $binlogname! " > /dev/null
        else
            # Copy the binary log file to the backup directory
            cp $binlogname $dailybackupdir
            if [ $? -eq 0 ]; then
                ifile=`expr $ifile + 1`
                echo "$binlogname backup success! " >> $emailfile
            fi
        fi
    fi
done
if [ $ifile -eq 0 ]; then
    echo "No Binlog Backup! " >> $emailfile
else
    echo "Backup $ifile File(s). " >> $emailfile
    echo "Backup MySQL Binlog OK! " >> $emailfile
fi
# Send the email notification
cat $emailfile | mail -s "MySQL Backup" $email
# Append to the log file
echo "--------------------------------------------------------" >> $logfile
cat $emailfile >> $logfile
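Note that the incremental script relies on binary logging being enabled; if mysql-bin.index does not exist in the data directory, enable it in my.cnf and restart the server. Applying the incrementals on top of a restored full backup is done with mysqlbinlog (a minimal sketch under these assumptions; the binlog file names below are only examples):
# In my.cnf, under [mysqld], binary logging must be enabled, e.g.:
#   log-bin=mysql-bin
# Replay the backed-up binary logs against the restored database
cd /opt/mysqlbackup/daily
/usr/local/mysql/bin/mysqlbinlog mysql-bin.000012 mysql-bin.000013 | /usr/local/mysql/bin/mysql -uroot -p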