

How to Realize the Cold and Hot Start of Cloud Functions in Serverless


This article explains how the cold and hot start of cloud functions works in Serverless. The content is quite practical, so it is shared here for reference; hopefully you will gain something from reading it.

Let's take a look at the cold and hot start process of cloud functions, and at the issues developers need to watch out for under this startup model.

Effect demonstration

Cloud function is called for the first time (cold start)

Cloud functions are called multiple times in succession (hot start)

Cold and Hot Start-up Models of Cloud Functions

First, let me explain what the cold and hot start model of cloud functions means here.

A cold start means the platform allocates new space on a server for a function instance to run. The process is a bit like running the function inside a virtual machine: before each run, the virtual machine has to be booted to load the function. This is relatively time-consuming, so cloud functions should minimize the number of cold starts.

A hot start means that if a cloud function keeps being triggered, the platform does not release the function instance right away; the next request is served by the instance that was created earlier. It is like keeping the virtual machine running after the function finishes, waiting for the next trigger instead of shutting it down. The advantage is that the time-consuming "boot" step is skipped; the disadvantage is higher system overhead, because the virtual machine stays active.
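A small sketch may make the distinction more concrete. In SCF-style Python functions (the handler name main_handler matches the example later in this article), module-level code runs only when a new instance is created, i.e. on a cold start, while the handler runs for every request, so a warm instance reuses whatever the module-level code set up:

import time

# Module-level code runs once per instance, i.e. only on a cold start.
INSTANCE_STARTED_AT = time.time()

def main_handler(event, context):
    # The handler runs on every request; on a hot start this instance
    # (and INSTANCE_STARTED_AT) is reused from an earlier invocation,
    # so the reported age keeps growing instead of resetting to ~0.
    return {"instance_age_seconds": round(time.time() - INSTANCE_STARTED_AT, 3)}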

Of course, we do not need to worry about how cloud function resources are allocated here; the underlying platform handles that algorithmically.

The introduction of the Tencent Cloud Function (SCF) documentation contains the following description:

Tencent Cloud Functions (SCF) is a serverless execution environment provided by Tencent Cloud. You only need to write simple, single-purpose cloud functions and associate them with events generated by your Tencent Cloud infrastructure and other cloud services. When using cloud functions, you only write code in one of the languages supported by the platform (Python, Node.js, PHP, Golang, and Java); Tencent Cloud fully manages the underlying computing resources, including server CPU, memory, and network configuration, resource maintenance, code deployment, auto scaling, load balancing, security upgrades, and resource monitoring. This also means you cannot log in to or manage the servers, or customize the system and environment. Cloud functions are automatically deployed across multiple availability zones within the same region, providing very high fault tolerance. When executing, cloud functions scale with the request load, from a few requests per day to thousands of requests per second, all handled by the underlying platform. You need no manual configuration or intervention and only pay while your functions run; if a cloud function does not run, no cost is incurred. You can decide when cloud functions run, for example when files are uploaded to or deleted from a COS bucket, when an application invokes them via the SDK, or on a schedule. Using cloud functions as data-processing triggers for COS, you can easily implement IFTTT-style logic, or build flexible, controllable software architectures with scheduled automated tasks that replace manual operations.

Now, pay attention to this sentence.

When executing, cloud functions scale with the request load, from a few requests per day to thousands of requests per second, all handled by the underlying platform.

In other words, the number of cloud function instances is scaled automatically by the underlying system through its own algorithms.

Let's keep reading.

In Serverless 2.0, we have not only completely refactored and optimized the control-flow and data-flow modules, the virtualization layer, the network layer, and the scheduling layer, but also comprehensively upgraded security, availability, and performance. A unified underlying architecture is used, with lightweight virtualization technology, a VPC proxy forwarding scheme, and other optimizations. The core real-time auto-scaling capability has been optimized, completely avoiding the much-criticized cold start problem of traditional serverless architectures. Cloud functions no longer limit the running time and support richer application scenarios. For example, service functions do not limit the duration of a single request; as long as requests keep arriving, the service stays in a long-running mode with no cold or hot start latency, and service functions support WebSocket long connections. Event functions have a per-invocation time limit, but the service likewise stays in long-running mode with no cold or hot start latency while requests keep arriving.

Note this sentence:

Event functions have a per-invocation time limit, but the service stays in long-running mode with no cold or hot start latency while requests keep arriving.

That is to say, the cloud function instances we trigger in various ways are not always cold-started from scratch; they may be instances that handled earlier calls.

Let's do an experiment together

import json

global_v = 1

# API gateway reply message format
def apiReply(reply, code=200):
    return {
        "isBase64Encoded": False,
        "statusCode": code,
        "headers": {
            'Content-Type': 'application/json',
            "Access-Control-Allow-Origin": "*"
        },
        "body": json.dumps(reply, ensure_ascii=False)
    }

def main_handler(event, context):
    global global_v
    global_v += 1
    return apiReply({
        'ok': True,
        'message': global_v - 1
    })

The above is a simple Python cloud function. We attach an API Gateway trigger to it and test what it returns.

The first call returns 1, indicating that the cloud function was cold-started.

Calling it again returns 2, indicating that the cloud function was hot-started on the previously created instance.

Trying a few more times, we find that some calls are hot starts while others are still cold starts.
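To reproduce the experiment from your own machine, the repeated calls might look like the sketch below; the gateway URL is a placeholder that you would replace with the address shown by your own API Gateway trigger:

import urllib.request

# Placeholder URL; use the one shown by your API Gateway trigger.
URL = "https://service-xxxx-1250000000.gz.apigw.tencentcs.com/release/test"

for i in range(5):
    with urllib.request.urlopen(URL) as resp:
        print(i, resp.read().decode("utf-8"))

# A 'message' value that keeps increasing means the same warm instance is
# being reused; a value that falls back to 1 means a new instance was
# cold-started for that request.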

However, this behavior is clearly not what we expect: a previous request should not affect the result of a later call to the cloud function. This is exactly where the problem lies.

Okay, let's go back to what the official documentation says.

Does SCF reuse function instances? To improve performance, SCF keeps your function instance for a certain period of time and reuses it to serve subsequent requests, but your code should not assume that this always happens. Why keep SCF functions stateless? Keeping a function stateless allows SCF to launch as many instances as needed to match the request rate.

In other words, we must make sure SCF functions are stateless when writing cloud functions; otherwise strange, unpredictable problems will occur.

So what does stateless mean? Put bluntly, your cloud function must not depend on the state or result of a previous invocation, so avoid mutable global variables!
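As a minimal sketch of a stateless rewrite of the counter above, assuming the count is moved into an external store such as Redis (the connection details below are placeholders, not part of the original example): with the state kept outside the function instance, the result no longer depends on whether a request hits a warm or a cold instance.

import json
import redis  # assumption: the counter state is kept in an external Redis store

# Hypothetical connection details; in SCF these would normally come from
# environment variables and point at a Redis instance reachable from the function.
r = redis.Redis(host="10.0.0.5", port=6379, decode_responses=True)

def apiReply(reply, code=200):
    return {
        "isBase64Encoded": False,
        "statusCode": code,
        "headers": {
            "Content-Type": "application/json",
            "Access-Control-Allow-Origin": "*",
        },
        "body": json.dumps(reply, ensure_ascii=False),
    }

def main_handler(event, context):
    # The counter lives outside the instance, so warm and cold starts
    # return consistent, predictable values.
    count = r.incr("call_counter")
    return apiReply({"ok": True, "message": count})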

The above is how the cold and hot start of cloud functions works in Serverless. Some of these points may well come up in everyday work, and hopefully this article has helped you learn something from them.
