In this section we will work through a classic example, one I have seen in three different training videos; I don't know who the original creator was :)
Compared with the previous examples, the overall approach is much the same, apart from two points:
The Lambda function is triggered through S3.
For Python modules that AWS does not include by default, we need to upload a zip file manually or import it through S3.
The flow of this example is simple: we create two S3 buckets; when a photo is uploaded to one of them, it is automatically resized and saved to the other. Let's take a look at how to implement it.
First, create two buckets, one as the source and one as the destination.
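If you prefer to script this step, here is a minimal boto3 sketch (the bucket names are hypothetical placeholders; S3 bucket names must be globally unique):

import boto3

s3 = boto3.client('s3')

# Placeholder names; pick your own globally unique ones.
s3.create_bucket(Bucket='my-thumbnail-source')
s3.create_bucket(Bucket='my-thumbnail-dest')
# Outside us-east-1, also pass CreateBucketConfiguration={'LocationConstraint': region}.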
Then create the corresponding IAM role.
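The tutorial does this in the console; for reference, a rough boto3 sketch of an equivalent role (the role name is a placeholder, and AmazonS3FullAccess is broader than strictly necessary; a scoped-down policy is preferable in production):

import json

import boto3

iam = boto3.client('iam')

# Trust policy allowing the Lambda service to assume this role.
trust_policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Principal": {"Service": "lambda.amazonaws.com"},
        "Action": "sts:AssumeRole",
    }],
}

iam.create_role(
    RoleName='lambda-thumbnail-role',  # hypothetical name
    AssumeRolePolicyDocument=json.dumps(trust_policy),
)

# CloudWatch logging plus S3 access.
for arn in ('arn:aws:iam::aws:policy/service-role/AWSLambdaBasicExecutionRole',
            'arn:aws:iam::aws:policy/AmazonS3FullAccess'):
    iam.attach_role_policy(RoleName='lambda-thumbnail-role', PolicyArn=arn)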
Then create a Lambda function and select the role configured above.
Add a trigger, specifying the S3 source bucket.
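Behind the scenes, the console trigger does two things: it grants S3 permission to invoke the function and writes a notification configuration on the source bucket. A rough boto3 equivalent, with all names and ARNs as placeholders:

import boto3

lambda_client = boto3.client('lambda')
s3 = boto3.client('s3')

# Allow the source bucket to invoke the function.
lambda_client.add_permission(
    FunctionName='thumbnail-func',
    StatementId='s3-invoke',
    Action='lambda:InvokeFunction',
    Principal='s3.amazonaws.com',
    SourceArn='arn:aws:s3:::my-thumbnail-source',
)

# Fire the function whenever an object is created in the source bucket.
s3.put_bucket_notification_configuration(
    Bucket='my-thumbnail-source',
    NotificationConfiguration={
        'LambdaFunctionConfigurations': [{
            'LambdaFunctionArn': 'arn:aws:lambda:us-east-1:123456789012:function:thumbnail-func',
            'Events': ['s3:ObjectCreated:*'],
        }],
    },
)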
Next, configure the function code:
import os
import tempfile

import boto3
from PIL import Image

s3 = boto3.client('s3')
DEST_BUCKET = os.environ['DEST_BUCKET']
SIZE = 128, 128

def lambda_handler(event, context):
    for record in event['Records']:
        source_bucket = record['s3']['bucket']['name']
        key = record['s3']['object']['key']
        thumb = 'thumb-' + key
        with tempfile.TemporaryDirectory() as tmpdir:
            download_path = os.path.join(tmpdir, key)
            upload_path = os.path.join(tmpdir, thumb)
            # Download the original, shrink it, and upload the thumbnail.
            s3.download_file(source_bucket, key, download_path)
            generate_thumbnail(download_path, upload_path)
            s3.upload_file(upload_path, DEST_BUCKET, thumb)
            print('Thumbnail image saved at {}/{}'.format(DEST_BUCKET, thumb))

def generate_thumbnail(source_path, dest_path):
    print('Generating thumbnail from:', source_path)
    with Image.open(source_path) as image:
        image.thumbnail(SIZE)  # resizes in place, preserving the aspect ratio
        image.save(dest_path)
Attention! This function uses the Pillow imaging module, which is not present in the AWS Lambda runtime by default, so we need to upload it ourselves.
First, download the corresponding Linux whl file from https://pypi.org/project/Pillow/#files. A whl file is really just a zip archive: on Windows we can unpack it with wheel unpack, and in a Linux environment we can extract it directly with unzip.
Then zip the extracted PIL folder together with our Python file.
Then upload the zip in the Lambda console.
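A minimal packaging sketch in Python, assuming the downloaded wheel sits next to our lambda_function.py (the wheel filename and function name are placeholders):

import os
import zipfile

import boto3

# A .whl is really a zip archive; extract it to get the PIL folder.
with zipfile.ZipFile('Pillow-manylinux-x86_64.whl') as whl:  # placeholder filename
    whl.extractall('build/')

# Bundle our handler together with the extracted packages.
with zipfile.ZipFile('function.zip', 'w', zipfile.ZIP_DEFLATED) as z:
    z.write('lambda_function.py')
    for root, _, files in os.walk('build'):
        for name in files:
            path = os.path.join(root, name)
            # Strip the build/ prefix so PIL/ sits at the archive root.
            z.write(path, os.path.relpath(path, 'build'))

# The same zip can also be pushed via the API instead of the console.
lambda_client = boto3.client('lambda')
with open('function.zip', 'rb') as f:
    lambda_client.update_function_code(FunctionName='thumbnail-func', ZipFile=f.read())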
Note that the handler setting must match the names of the .py file and the entry function; for the code above, a file named lambda_function.py with entry function lambda_handler means the handler is lambda_function.lambda_handler.
And don't forget to configure the environment variables; the function reads the destination bucket name from DEST_BUCKET.
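Scripted, setting that variable looks roughly like this (function and bucket names are placeholders):

import boto3

lambda_client = boto3.client('lambda')
lambda_client.update_function_configuration(
    FunctionName='thumbnail-func',
    Environment={'Variables': {'DEST_BUCKET': 'my-thumbnail-dest'}},
)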
We can view the JSON format of the corresponding event in the built-in test templates.
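Stripped down to the fields our handler actually reads, an S3 put event boils down to something like this (a trimmed sketch; real events carry many more fields):

# Minimal shape of the event the handler consumes.
event = {
    "Records": [
        {
            "s3": {
                "bucket": {"name": "my-thumbnail-source"},
                "object": {"key": "photo.jpg"},
            }
        }
    ]
}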
Finally, let's take a look at it in action.
Upload a few pictures to the source bucket.
They are automatically shrunk and saved in the destination bucket.
The experiment was a success.