2025-04-08 Update From: SLTechnology News&Howtos
Shulou(Shulou.com)05/31 Report--
This article explains how to upload data to an S3 service through presigned URLs. The method is simple, fast, and practical, so let's walk through it step by step.
1. Basic principles
Operation procedure:
1. The client submits an upload request to the presign server, which generates the corresponding presigned URL.
2. The client uses the generated presigned URL to construct an HTTP request that uploads the data directly to the S3 service.
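To make step 1 concrete, here is a minimal, self-contained sketch of how a presigned PUT URL is computed. It assumes the AWS Signature Version 2 query-string scheme (the one boto's `generate_url` uses in the demo below) over plain HTTP, with a virtual-hosted-style URL; the endpoint, bucket, and key values are placeholders:

```python
import base64
import hmac
import time
from hashlib import sha1
from urllib.parse import quote

def presign_put_url(endpoint, bucket, key, access_key, secret_key, expires_in):
    # Expiry is carried as an absolute Unix timestamp in the URL.
    expires = int(time.time()) + expires_in
    # Canonical v2 string-to-sign for a bare PUT (no Content-MD5 / Content-Type).
    string_to_sign = "PUT\n\n\n{}\n/{}/{}".format(expires, bucket, key)
    digest = hmac.new(secret_key.encode(), string_to_sign.encode(), sha1).digest()
    signature = quote(base64.b64encode(digest), safe="")
    # Virtual-hosted-style URL, matching SubdomainCallingFormat in the demo.
    return ("http://{bucket}.{endpoint}/{key}"
            "?AWSAccessKeyId={ak}&Expires={exp}&Signature={sig}").format(
                bucket=bucket, endpoint=endpoint, key=key,
                ak=access_key, exp=expires, sig=signature)
```

Anyone holding the secret key can mint such a URL; the point of the presign server is that only it holds the key, while clients merely receive finished URLs.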
2. Advantages and disadvantages
Advantages:
1. The access key and secret key are never stored on the client side, which avoids key leakage.
2. Each presigned URL corresponds to one key name and, within its validity window, can upload or overwrite the existing file at will, which is flexible.
3. The server can combine with various auth systems to authenticate and authorize clients, making it easy to integrate with existing services.
4. Upload and download are flexible on the client side: once the presigned URL is obtained, any client that supports the HTTP protocol can upload or download with it.
Disadvantages:
1. Obtaining the URL adds one extra interaction step compared with a normal upload or download.
2. The presigned upload in this article uses a single PUT, so this method is not suitable for large files.
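Advantage 2 hinges on the validity window. A v2-style presigned URL carries its expiry as an absolute `Expires` timestamp in the query string, so a client can check locally whether a cached URL has already lapsed before attempting a request. A small stdlib-only sketch (the URL layout is assumed to match what the demo server below returns):

```python
import time
from urllib.parse import parse_qs, urlparse

def presign_url_expired(url, now=None):
    """Return True if the Expires timestamp in a v2-style presigned URL
    has already passed (relative to `now`, defaulting to the current time)."""
    now = int(time.time()) if now is None else now
    query = parse_qs(urlparse(url).query)
    return now > int(query["Expires"][0])
```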
3. Implementation
Install the server-side dependencies:

pip install boto
pip install flask-restful
The server demo code is as follows:
# -*- coding: utf-8 -*-
import re

import boto
import boto.s3.connection
from flask import Flask, request
from flask_restful import Resource, Api

s3endpoint = 's3.ceph.work'   # endpoint of the S3 service
access_key = ''               # access_key
secret_key = ''               # secret_key
bucket = 'multi-upload'       # bucket that stores the uploaded data


class Presign_Upload(Resource):
    def __init__(self):
        self.s3endpoint = s3endpoint
        self.access_key = access_key
        self.secret_key = secret_key
        self.bucket = bucket
        self.headers = {}

    def presign(self, key_name, expires_in):
        conn = boto.connect_s3(
            aws_access_key_id=self.access_key,
            aws_secret_access_key=self.secret_key,
            host=self.s3endpoint,
            port=80,
            is_secure=False,  # set to True if you are using SSL
            # calling_format=boto.s3.connection.OrdinaryCallingFormat(),  # path-style
            calling_format=boto.s3.connection.SubdomainCallingFormat(),   # virtual-hosted-style
        )
        presigned_url = conn.generate_url(expires_in, "PUT", self.bucket,
                                          key_name, headers=self.headers)
        return presigned_url

    def get(self):
        if not request.args.get("keyname"):
            return "no keyname", 400
        keyname = request.args.get("keyname")
        if not request.args.get("expires"):
            return "no expires", 400
        expires = int(request.args.get("expires"))
        if request.args.get("contentype"):
            self.headers['Content-Type'] = request.args.get("contentype")
        if request.args.get("policy"):
            self.headers['x-amz-acl'] = request.args.get("policy")
        metadata_rule = r'^x-amz-meta-'
        for parm in request.args:
            if re.match(metadata_rule, parm):
                self.headers[parm] = request.args.get(parm)
        return {"PresignURL": self.presign(keyname, expires)}


class Presign_Download(Resource):
    def __init__(self):
        self.s3endpoint = s3endpoint
        self.access_key = access_key
        self.secret_key = secret_key
        self.bucket = bucket
        self.headers = {}

    def presign(self, key_name, expires_in):
        conn = boto.connect_s3(
            aws_access_key_id=self.access_key,
            aws_secret_access_key=self.secret_key,
            host=self.s3endpoint,
            port=80,
            is_secure=False,  # set to True if you are using SSL
            # calling_format=boto.s3.connection.OrdinaryCallingFormat(),  # path-style
            calling_format=boto.s3.connection.SubdomainCallingFormat(),   # virtual-hosted-style
        )
        presigned_url = conn.generate_url(expires_in, "GET", self.bucket, key_name)
        return presigned_url

    def get(self):
        if not request.args.get("keyname"):
            return "no keyname", 400
        keyname = request.args.get("keyname")
        if not request.args.get("expires"):
            return "no expires", 400
        expires = int(request.args.get("expires"))
        return {"PresignURL": self.presign(keyname, expires)}


app = Flask(__name__)
api = Api(app)
api.add_resource(Presign_Upload, '/upload')
api.add_resource(Presign_Download, '/download')

if __name__ == '__main__':
    app.run(debug=True)
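The upload handler above forwards three kinds of query parameters into the headers that get signed: a Content-Type, a canned ACL via `x-amz-acl`, and any `x-amz-meta-*` custom metadata. That filtering logic can be isolated and exercised on its own; a stdlib-only sketch, with parameter names following the demo server:

```python
import re

def collect_amz_headers(args):
    """Mirror the demo server's header collection: pick up Content-Type,
    the canned ACL, and any x-amz-meta-* query parameters."""
    headers = {}
    if "contentype" in args:
        headers["Content-Type"] = args["contentype"]
    if "policy" in args:
        headers["x-amz-acl"] = args["policy"]
    for key, value in args.items():
        if re.match(r"^x-amz-meta-", key):
            headers[key] = value
    return headers
```

Whatever ends up in this dict must be sent verbatim by the uploading client as well, because the headers are part of the signature.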
Install the client dependency:

pip install requests
The client demo code is as follows:
# -*- coding: utf-8 -*-
import json

import requests


class Presign_client():
    def __init__(self, server_url):
        self.server_url = server_url
        self.headers = {}

    def get_upload_presignURL(self, keyname, expires, policy=None,
                              contentype=None, metadata=None):
        base_url_ = "{server_url}/upload?keyname={keyname}&expires={expires}".format(
            server_url=self.server_url, keyname=keyname, expires=expires)
        if contentype:
            self.headers['Content-Type'] = contentype
            base_url_ = "{base_url_}&contentype={contentype}".format(
                base_url_=base_url_, contentype=contentype)
        if policy:
            self.headers['x-amz-acl'] = policy
            base_url_ = "{base_url_}&policy={policy}".format(
                base_url_=base_url_, policy=policy)
        if metadata:
            for metadata_key in metadata:
                metadata_value = metadata[metadata_key]
                self.headers[metadata_key] = str(metadata_value)
                base_url_ = "{base_url_}&{metadata_key}={metadata_value}".format(
                    base_url_=base_url_, metadata_key=str(metadata_key),
                    metadata_value=metadata_value)
        res = requests.get(base_url_)
        return json.loads(res.content)['PresignURL']

    def upload_file(self, SignURL, filepath):
        with open(filepath, 'rb') as fh:
            mydata = fh.read()
        response = requests.put(SignURL, data=mydata, headers=self.headers)
        if response.status_code == 200:
            print("Upload {} Successful!".format(filepath))
        else:
            print("Failed! status_code={}".format(response.status_code))

    def get_download_presignURL(self, keyname, expires):
        base_url_ = "{server_url}/download?keyname={keyname}&expires={expires}".format(
            server_url=self.server_url, keyname=keyname, expires=expires)
        response = requests.get(base_url_)
        return json.loads(response.content)['PresignURL']

    def download_file(self, SignURL, filepath):
        response = requests.get(SignURL)
        if response.status_code == 200:
            with open(filepath, 'wb') as fh:
                fh.write(response.content)
            print("Download {} Successful!".format(filepath))
        else:
            print("Failed! status_code={}".format(response.status_code))


server_url = 'http://localhost:5000'   # fill in the presign server address
expires = 300                          # validity period of the presigned URL, in seconds
policy = 'public-read'                 # canned ACL for the object (original value was garbled; example)
contentype = 'application/pdf'         # Content-Type of the uploaded file (original value was garbled; example)
metadata = {'x-amz-meta-owner': 'demo'}  # custom metadata (original value was garbled; example)
file_path = '/Users/demouser/Downloads/1.pdf'  # path of the file to upload
save_path = '/tmp/download.pdf'        # path to store the downloaded file
keyname = 'upload.pdf'                 # upload file name, i.e. the object key name

client = Presign_client(server_url)

# Generate an upload presigned URL and upload the file.
Upload_SignURL = client.get_upload_presignURL(keyname=keyname, expires=expires,
                                              policy=policy, contentype=contentype,
                                              metadata=metadata)
client.upload_file(Upload_SignURL, file_path)

# Generate a download presigned URL and download the file.
Download_SignURL = client.get_download_presignURL(keyname=keyname, expires=expires)
client.download_file(Download_SignURL, save_path)

By now you should have a deeper understanding of how to upload data to an S3 service with presigned URLs. Try it out in practice.