This article explains how to set up regular automatic backup and deletion of website data on a Linux system. The method described here is simple, fast, and practical, so it is worth trying if you are interested.
The requirements are as follows: the website files and database must be backed up automatically every day, and backups older than a certain period must be deleted from the backup directory, for example keeping only the last 14 days.
The VPS in this example happens to run Red Hat, which does not have the crond service installed by default. Install it with the following command:
yum install cronie
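After installation, the crond service still needs to be started and set to launch at boot. The commands below are a minimal sketch; which pair applies depends on the Red Hat release and its init system:
# RHEL/CentOS 6 and earlier (SysV init)
service crond start
chkconfig crond on
# RHEL/CentOS 7 and later (systemd)
systemctl start crond
systemctl enable crond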
Create the backup script:
vi /root/bakweb.sh
Edit it and enter the following:
#!/bin/bash
find /home/bak/ -name '*' -type f -mtime +14 -exec rm {} \;
tar zcvf /home/bak/www.penglei.name_$(date +%F).tar.gz /var/www/html
mysqldump -u root --password=PASSWORD DBNAME > /home/bak/sql.penglei.name_$(date +%F).sql
exit 0
Type :wq to save and exit. The script uses /home/bak as the site backup directory.
The first step deletes files under /home/bak that are more than 14 days old.
The second step backs up the website directory /var/www/html into a file named www.penglei.name_ plus the date.
The third step exports the database into a file named sql.penglei.name_ plus the date. Replace PASSWORD and DBNAME with your own MySQL root password and database name.
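Before scheduling the script, it is worth making it executable and running it once by hand to confirm that the archives land in /home/bak. A minimal check, assuming the paths used above:
chmod +x /root/bakweb.sh
/root/bakweb.sh
ls -lh /home/bak/
# the www.penglei.name_*.tar.gz and sql.penglei.name_*.sql files should appear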
Finally, create a crond entry so the script runs automatically at 5 o'clock every morning. Create the crond file:
vi /etc/cron.d/bakweb
Edit it and enter the following:
0 5 * * * root /root/bakweb.sh
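Files under /etc/cron.d are picked up by crond automatically, so no restart is needed. To confirm that the job actually fired, one option is to check the cron log; the path below is the Red Hat default and may differ on other distributions:
grep bakweb /var/log/cron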
At this point you should have a solid understanding of how to set up regular automatic backup and deletion of website data on a Linux system. Try it out in practice.