This article explains how to implement full and incremental MySQL backups with shell scripts. The incremental backup copies the mysql-bin.00000* binary log files to a dedicated directory at 3 a.m. from Monday through Saturday, while the full backup exports all databases with mysqldump every Sunday at 3 a.m. and deletes the mysql-bin.00000* files from the previous week. Every backup operation is recorded in the bak.log file, which ends up looking roughly like this:
Start: 2013-05-02 15:10:57 End: 2013-05-02 15:12:16 20130502.sql.tgz successful!!!   (written by DBFullyBak.sh, run once a week)
mysql-bin.000001 copying
mysql-bin.000002 skip!
2013-05-02 16:53:15 Backup successful!   (written by DBDailyBak.sh, run once a day)
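After the two scripts have been running for a while, the backup directory would look roughly like the listing below. The file names are only illustrative, following the naming conventions used in the scripts that follow:
/home/mysql/backup/
    20130505.sql.tgz        full dump taken on Sunday
    bak.log                 log of every backup run
    daily/
        mysql-bin.000001    binary-log copies made Monday through Saturday
        mysql-bin.000002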
Implementation:
1. Full backup script
# vim /root/DBFullyBak.sh    # contents are as follows
#!/bin/bash
# Program
# use mysqldump to Fully backup mysql data per week!
# History
# 2013-04-27 guo first
# Path
# ....
BakDir=/home/mysql/backup
LogFile=/home/mysql/backup/bak.log
Date=`date +%Y%m%d`
Begin=`date +"%Y-%m-%d %H:%M:%S"`
DumpFile=$Date.sql
GZDumpFile=$Date.sql.tgz
if [ ! -d "$BakDir" ]; then
mkdir -p $BakDir
fi
cd $BakDir
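# Options passed to mysqldump below:
#   --quick               stream rows one at a time instead of buffering whole tables in memory
#   --all-databases       dump every database on the server
#   --flush-logs          rotate to a new binary log when the dump starts
#   --delete-master-logs  purge the binary logs written before the dump (their contents are in the dump)
#   --single-transaction  take a consistent snapshot of InnoDB tables without locking them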
/usr/local/mysql/bin/mysqldump -uroot -p123456 --quick --all-databases --flush-logs --delete-master-logs --single-transaction > $DumpFile
/bin/tar czvf $GZDumpFile $DumpFile
/bin/rm $DumpFile
Last=`date +"%Y-%m-%d %H:%M:%S"`
echo Start:$Begin End:$Last $GZDumpFile successful!!! >> $LogFile
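# clear last week's binary-log copies from the daily directory; the full dump just taken covers them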
cd $BakDir/daily && /bin/rm -f *
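Restoring from these backups is not covered by the scripts, but under this scheme a recovery would roughly consist of loading the latest Sunday dump and then replaying the binary logs copied since then. The following is only a minimal sketch, assuming the illustrative file names and the credentials used in the scripts above:
# cd /home/mysql/backup
# tar xzvf 20130505.sql.tgz                    # unpacks 20130505.sql
# mysql -uroot -p123456 < 20130505.sql         # restore the full dump
# mysqlbinlog daily/mysql-bin.000001 daily/mysql-bin.000002 | mysql -uroot -p123456   # replay the incremental logs in order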
2. Incremental backup script
# cat /root/DBDailyBak.sh    # contents are as follows
#!/bin/bash
# Program
# use cp to backup mysql data everyday!
# History
# 2013-05-02 guo first
# Path
# ....
BakDir=/home/mysql/backup/daily
BinDir=/data/mysql
LogFile=/home/mysql/backup/bak.log
BinFile=/data/mysql/mysql-bin.index
if [ ! -d "$BakDir" ]; then
mkdir -p $BakDir
fi
/usr/local/mysql/bin/mysqladmin -uroot -p123456 flush-logs
#flush-logs closes the current binary log and starts a new mysql-bin.00000* file, so every log listed before the new one is complete and safe to copy
Counter=`wc -l $BinFile |awk '{print $1}'`
NextNum=0
#The loop below compares $NextNum with $Counter to decide whether an entry is the current (active) binary log, and whether it has already been copied.
for file in `cat $BinFile`
do
base=`basename $file`
#basename strips the leading path from each entry in mysql-bin.index, e.g. ./mysql-bin.000005 becomes mysql-bin.000005
NextNum=`expr $NextNum + 1`
if [ $NextNum -eq $Counter ]
then
echo $base skip! >> $LogFile
else
dest=$BakDir/$base
if [ -e "$dest" ]
#if the binary log has already been copied to $BakDir, only record "exist!" in $LogFile
then
echo $base exist! >> $LogFile
else
cp $BinDir/$base $BakDir
echo $base copying >> $LogFile
fi
fi
done
echo `date +"%Y-%m-%d %H:%M:%S"` Backup successful! >> $LogFile
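The last entry in mysql-bin.index is skipped on purpose: immediately after flush-logs it is the binary log the server is currently writing to. For illustration, if the index file contained the three entries below (actual contents will differ per server), mysql-bin.000003 would be logged as "skip!" and the first two files would be copied unless they are already present in the daily directory:
./mysql-bin.000001
./mysql-bin.000002
./mysql-bin.000003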
3. Set up crontab entries to run the backup scripts
# crontab -l    # the entries are as follows
#Run the full backup script every Sunday at 3:00 AM
0 3 * * 0 /root/DBFullyBak.sh >/dev/null 2>&1
#Run the incremental backup script Monday through Saturday at 3:00 AM
0 3 * * 1-6 /root/DBDailyBak.sh >/dev/null 2>&1
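After installing the entries, the schedule can be confirmed with crontab -l, and the result of each run checked in the backup log, for example:
# crontab -l
# tail /home/mysql/backup/bak.log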