
Use bash scripts to back up AWS EC2 data to S3


1. IAM access key authorization (basic)

Amazon Simple Storage Service (Amazon S3) is a storage service for the Internet. You can use Amazon S3 to store and retrieve any amount of data, at any time, from anywhere on the web, and you can accomplish these tasks through the simple, intuitive web interface of the AWS Management Console.

1.1. Open IAM

Click Users on the left side of the AWS Identity and Access Management (IAM) console.

Click the add user button.

1.2. Add users

Enter the user name in the User name text box (in this example, we use aws_backup), and select Programmatic access in the "Select AWS access type" section. Click Next: Permissions.

Click Attach existing policies directly. Select AdministratorAccess, and then click Next: Review.

Click Create user.

Click the Download credentials button and save the credentials.csv file to a secure location (you will need these credentials later when running aws configure), then click Close.
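
If you prefer the command line, the same user can be created with the AWS CLI from a machine that already has administrator credentials configured. This is only a sketch of equivalent steps, not part of the original walkthrough:

# Create the IAM user and give it programmatic access (sketch; assumes existing admin credentials)
aws iam create-user --user-name aws_backup
aws iam attach-user-policy --user-name aws_backup \
    --policy-arn arn:aws:iam::aws:policy/AdministratorAccess
# Generate the access key pair; record AccessKeyId and SecretAccessKey from the output
aws iam create-access-key --user-name aws_backup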

1.3. Install the AWS CLI

Now that you have an IAM user, you need to install the AWS command line interface (CLI).

See https://docs.aws.amazon.com/zh_cn/cli/latest/userguide/cli-chap-install.html for installation documentation.

Here I install it directly from my distribution's package repository:

apt install awscli
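
If the distribution package is too old or unavailable, the CLI can also be installed with pip; either way it is worth verifying the install afterwards. A quick sketch:

# Alternative install via pip (AWS CLI version 1)
pip install awscli --user
# Verify that the aws command is available
aws --version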

1.4. Configure authorization

Log in to the Ubuntu system, type aws configure, and press Enter. Enter the following information when prompted:

root@ip-172-31-47-132:~# aws configure
AWS Access Key ID [None]: AKIA5NAGHF6NVEPKATFQ
AWS Secret Access Key [None]: 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
Default region name [None]: ap-northeast-1
Default output format [None]: json

2. IAM role authorization (secure)

The method above is straightforward, but the key information is stored on the server in plain text, which is insecure. So let's configure a more secure approach: a role. A role is a secure way to grant permissions to a trusted entity.

2.1. Create a role for EC2 to access S3

Select the trusted entity, and we choose EC2 here.

Give access to S3.

Create a role.
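
For reference, the same role can also be created from the CLI. This is only a sketch under assumptions not in the original: the role name ec2-s3-backup-role, the instance profile name ec2-s3-backup-profile, and the managed AmazonS3FullAccess policy are placeholders you can adjust.

# Trust policy that lets EC2 assume the role
cat > trust-policy.json <<'EOF'
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": { "Service": "ec2.amazonaws.com" },
      "Action": "sts:AssumeRole"
    }
  ]
}
EOF
# Create the role and grant it S3 access
aws iam create-role --role-name ec2-s3-backup-role \
    --assume-role-policy-document file://trust-policy.json
aws iam attach-role-policy --role-name ec2-s3-backup-role \
    --policy-arn arn:aws:iam::aws:policy/AmazonS3FullAccess
# EC2 attaches roles through an instance profile
aws iam create-instance-profile --instance-profile-name ec2-s3-backup-profile
aws iam add-role-to-instance-profile --instance-profile-name ec2-s3-backup-profile \
    --role-name ec2-s3-backup-role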

2.2. Attach the S3 access role to the EC2 instance

Locate the EC2 instance interface.

Select the role we just created.
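
This step can also be done from the CLI with a command along the following lines; the instance ID is a placeholder, and the instance profile name assumes the sketch in section 2.1:

# Attach the instance profile to a running instance (instance ID is a placeholder)
aws ec2 associate-iam-instance-profile \
    --instance-id i-0123456789abcdef0 \
    --iam-instance-profile Name=ec2-s3-backup-profile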

2.3. Testing

The prerequisite here is that no access key authorization has been configured on the instance (i.e. aws configure has not been run).

aws s3 ls s3://myweb-backup-eu-west-1/

3. Back up data with the AWS CLI and S3

3.1. Create a bucket

To create a new bucket named myweb-backup, enter:

root@ip-172-31-47-132:~# aws s3 mb s3://myweb-backup
make_bucket: myweb-backup

3.2. Upload files

To upload the /etc/passwd file to the S3 bucket myweb-backup, use the following command:

root@ip-172-31-47-132:~# aws s3 cp /etc/passwd s3://myweb-backup/
upload: ../../etc/passwd to s3://myweb-backup/passwd

3.3. Download the file

To download passwd from S3 to a local directory, we simply reverse the source and destination arguments, as follows:

root@ip-172-31-47-132:~# aws s3 cp s3://myweb-backup/passwd .
download: s3://myweb-backup/passwd to ./passwd

3.4. Delete files

To remove passwd from your myweb-backup bucket, use the following command:

root@ip-172-31-47-132:~# aws s3 rm s3://myweb-backup/passwd
delete: s3://myweb-backup/passwd

4. Practical example (using the access key method)

For example, I pack up everything under /etc every day and upload it as a backup.

#!/bin/bash
# Daily backup of /etc to S3 (bucket name reconstructed from the examples above)
export HOME="/home/ubuntu"
cd /tmp/
tar -zcPf etc$(date +%Y%m%d).tar.gz /etc
aws s3 cp etc$(date +%Y%m%d).tar.gz s3://myweb-backup/
rm -f etc$(date +%Y%m%d).tar.gz

Then add a scheduled task (cron job).
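
A minimal crontab entry might look like the following. It assumes the script above is saved as /home/ubuntu/backup_etc.sh (the path, schedule, and log file are just examples), runs it at 02:00 every day, and keeps a log for troubleshooting:

# Edit the crontab with: crontab -e
# Run the backup script at 02:00 every day and log its output
0 2 * * * /bin/bash /home/ubuntu/backup_etc.sh >> /tmp/backup_etc.log 2>&1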

I ran into a problem here: after setting up the scheduled task, the upload kept failing, even though running the script by hand worked fine. (I don't know whether the same thing happens with role authorization; I suspect not, but you can try it yourself.) By printing a log I found an "unable to locate credentials" error, which took a long time to track down and was finally solved by adding the environment variable export HOME="/home/ubuntu" to the script. It was quite a bit of trouble. If a script works when run manually but fails as a scheduled task, you can compare the environment variables of the two runs as follows:

# In the scheduled task, dump the environment variables
set | sort > /tmp/env.cron
# When running the script manually, dump them again
set | sort > /tmp/env.interactive
# Then compare the two
diff /tmp/env.cron /tmp/env.interactive

To save space, we can add lifecycle rules.
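
As a sketch, a lifecycle rule that expires backup objects after 30 days can be applied from the CLI; the bucket name follows the earlier examples and the 30-day retention is just an illustration:

# lifecycle.json: expire every object in the bucket after 30 days
cat > lifecycle.json <<'EOF'
{
  "Rules": [
    {
      "ID": "expire-old-backups",
      "Status": "Enabled",
      "Filter": { "Prefix": "" },
      "Expiration": { "Days": 30 }
    }
  ]
}
EOF
aws s3api put-bucket-lifecycle-configuration \
    --bucket myweb-backup \
    --lifecycle-configuration file://lifecycle.json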

Reference document: https://aws.amazon.com/cn/getting-started/tutorials/backup-to-s3-cli/
