
A thorough look at serverless computing: Origin, scenarios and problems


1. What is Serverless computing?

Cloud computing has produced many technologies that change traditional IT architecture and operations, such as virtual machines, containers, and microservices. Whatever the scenario in which these technologies are applied, reducing cost and improving efficiency is the eternal theme of cloud services. Over the past decade we have turned many common parts of applications and environments into services, and the emergence of Serverless takes this a leap further. Serverless outsources host management, operating system management, resource allocation, scaling, and even application logic as a kind of commodity: vendors provide the services and we pay for what we use. In the past the model was "build a framework, run it on a server, and respond to multiple events"; with Serverless it becomes "build or use a micro-service or micro-function to respond to an event", so that resources are brought up when a call arrives and released once the run completes, making true pay-per-use billing possible. This is a natural step in the deepening of cloud computing.

Serverless is a complete approach to building and managing a microservice-based architecture that lets you manage application deployment at the level of services rather than servers. It differs from traditional architectures in that it is fully managed by a third party, triggered by events, and runs in stateless compute containers with only temporary storage (which may exist for just one invocation). Building serverless applications means developers can focus on product code without managing or operating cloud or on-premises servers and runtimes. Serverless genuinely lets applications be deployed without touching infrastructure: services are built, deployed, and started automatically.

Major cloud vendors at home and abroad, including Amazon, Microsoft, Google, IBM, Aliyun, Tencent Cloud, and Huawei Cloud, have successively launched Serverless products. Serverless has gradually moved from concept and vision to real use in enterprises.

2. Understanding Serverless technology: FaaS and BaaS

In a Serverless architecture, the server-side logic written by the developer runs in stateless compute containers that are triggered by events and fully managed by a third party; business-level state is kept in the databases and storage resources that the developer uses.

Serverless covers many technologies, which fall into two categories: FaaS and BaaS.

1) Function-as-a-Service (FaaS)

Small pieces of code that are deployed on demand and scale on demand, with no infrastructure-related parts to manage.

Event-driven computing: the function is triggered by an event or invoked by an HTTP request.

2) Backend-as-a-Service (BaaS)

Third parties provide the basic functional modules of application development as API-based services.

These API-based services scale automatically and require no management.

1. FaaS

FaaS means you can run back-end code directly without managing your own server systems or your own long-running server applications. Not having to run a server application of your own is the biggest difference between this technology and other modern architectures such as containers and PaaS (Platform as a Service).

FaaS can replace the servers that would otherwise process these requests (possibly physical machines, but in any case something that runs an application), so you neither provision servers yourself nor keep an application running around the clock.

FaaS products do not require you to develop with a specific framework or library. In terms of language and environment, a FaaS function is a regular application. For example, AWS Lambda functions can be written in JavaScript, Python, and any JVM language (Java, Clojure, Scala), among others. A Lambda function can also execute any process bundled with the required deployment artifacts, so you can use any language as long as it can be compiled into a Unix process. FaaS functions do have some architectural limitations, particularly around state and execution time.
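As a rough illustration of how ordinary such a function is, here is a minimal sketch of a Node.js (TypeScript) handler in the shape AWS Lambda expects; the event field shown is a placeholder, not a definitive contract.

```typescript
// Minimal sketch of a FaaS function (AWS Lambda style, Node.js/TypeScript).
// There is no server, no framework, no main loop: just a handler that the
// platform invokes with an event object.
interface GreetEvent {
  name?: string; // hypothetical field supplied by whatever triggers the function
}

export const handler = async (event: GreetEvent): Promise<{ message: string }> => {
  const who = event.name ?? "world";
  // Any ordinary library code can run here; only long-lived local state is off-limits.
  return { message: `Hello, ${who}` };
};
```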

During a move to FaaS, the only code that typically needs to change is the "main method / startup" code: you may need to remove the top-level message-handler code (the implementation of the message-listener interface), though often only the method signature changes. In the FaaS world, all the rest of the code, such as the code that writes to the database, stays the same.

Compared to traditional systems, deployment changes considerably: you upload the code to the FaaS vendor and the vendor handles everything else. In practice this currently means uploading a new definition of the code (for example a zip or JAR file) and then calling a proprietary API to trigger the update.

Functions in FaaS are triggered by event types defined by the vendor. On Amazon AWS, such triggers include S3 (file) updates, timers (scheduled tasks), and messages added to a message bus (such as Kinesis). Your function usually has to specify which event source it binds to.
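For instance, a function bound to S3 update events receives a batch of object records. The sketch below assumes the @types/aws-lambda type definitions and simply logs the affected keys; real processing would replace the log line.

```typescript
// Sketch of a function triggered by S3 "file updated" events.
// Assumes the @types/aws-lambda package for the S3Event type.
import { S3Event } from "aws-lambda";

export const handler = async (event: S3Event): Promise<void> => {
  for (const record of event.Records) {
    const bucket = record.s3.bucket.name;
    const key = decodeURIComponent(record.s3.object.key.replace(/\+/g, " "));
    // Real processing (transcoding, indexing, ...) would go here.
    console.log(`object updated: s3://${bucket}/${key}`);
  }
};
```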

Most vendors also allow functions to be triggered by incoming HTTP requests, usually via some kind of API gateway (such as AWS API Gateway or Webtask).

2. BaaS

BaaS (Backend as a Service) means that we no longer write or manage all of the server-side components ourselves; instead, common remote components (rather than in-process libraries) provide those services. To understand BaaS, you need to understand how it differs from PaaS.

First of all, BaaS is not PaaS. The difference is that PaaS participates in the lifecycle management of the application, while BaaS only provides the third-party services the application depends on. A typical PaaS platform must give developers the means to deploy and configure applications, for example deploying an application into a Tomcat container automatically and managing the application's lifecycle. BaaS includes none of this; it only exposes the back-end services the application depends on, such as databases and object storage, through APIs. BaaS can be offered by a public cloud provider or by a third-party vendor. Secondly, in functional terms, BaaS can be regarded as a subset of PaaS: the part that provides third-party dependencies.

BaaS services also let us rely on application logic that others have already implemented. Authentication is a good example: many applications write their own code for registration, login, password management and so on, and this code is much the same from one application to the next. Such repeated work can be extracted into an external service, which is exactly the goal of products such as Auth0 and Amazon Cognito. They provide full authentication and user management, so the development team no longer has to write or maintain that code itself.

3. How does Serverless computing work?

Serverless computing provides a higher level of abstraction than deploying and managing applications on virtual machines or other lower-level infrastructure, because each serverless service exposes a different abstraction and a different set of "triggers".

For compute, the abstraction is a specific function and the trigger is usually an event. For a database, the abstraction may be a table, and the trigger is a query or search against the table, or an event generated by an operation on it.

For example, a mobile game lets users see a high-score table of the world's top players across platforms. When this information is requested, the request goes from the application to an API endpoint. The API endpoint may trigger an AWS Lambda function, or another serverless function, which fetches the data from a database table and returns the top five scores in an agreed format.

Once built, the functionality of the application can be reused in mobile-based and Web-based versions of the game.

Unlike a conventional setup, there is no Amazon EC2 instance or server sitting there waiting for requests. The environment is triggered by an event, and the logic required to respond executes only at the moment of response. This means the resources that run the function are created only while the function runs, which makes for a very efficient way to build applications.
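A minimal sketch of the high-score function described above might look like the following; the query helper and response shape are hypothetical, and a real implementation would read from DynamoDB or another store behind the API gateway.

```typescript
// Sketch of the "top five scores" function behind an API gateway.
// The fetchTopScores helper is a hypothetical stand-in for a real
// database query (e.g. against DynamoDB).
import { APIGatewayProxyEvent, APIGatewayProxyResult } from "aws-lambda";

interface Score {
  player: string;
  points: number;
}

// Placeholder for a real query against the high-score table.
async function fetchTopScores(limit: number): Promise<Score[]> {
  return []; // a real implementation would read from the database here
}

export const handler = async (
  _event: APIGatewayProxyEvent
): Promise<APIGatewayProxyResult> => {
  const top = await fetchTopScores(5);
  return {
    statusCode: 200,
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ top }),
  };
};
```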

4. What scenarios does Serverless apply to?

At this stage, Serverless is mainly used in the following scenarios. In Web and mobile services, an API gateway can be combined with Serverless services to build Web and mobile back ends, helping developers build flexible and highly available applications. In IoT scenarios, devices generate large volumes of real-time stream data, which Serverless services can classify and write into back-end processing efficiently. In real-time media processing, users upload audio and video to object storage (OBS), and the upload events trigger multiple functions that perform HD transcoding, audio transcoding, and similar work, meeting users' demands for real-time performance and concurrency. Serverless computing also suits any event-driven use case, including the Internet of Things, mobile applications, Web applications, and chatbots. Here are two simple scenarios to think about.

Scenario 1: application load with significant peaks and troughs

Whether a Serverless application succeeds is judged not by the size of the company but by the specific technical problems behind its business, for example whether the load has pronounced peaks and troughs and how to smooth them out. When a company's load fluctuates like this, machine capacity has to be sized for the peak, so utilization drops sharply during the troughs; because the idle resources cannot be reused, they are wasted.

There is broad consensus in the industry that when the utilization of self-owned machines is below 30%, moving to Serverless brings a marked improvement in efficiency. For a cloud provider with enough users, the various peaks and troughs flatten out once superimposed, and the pooled resources can be reused far more effectively. For example, food-delivery companies peak at mealtimes while the security industry peaks at night, each constrained by its own business; for a mature cloud vendor whose platform is large enough and has enough users, there should be no pronounced peaks or troughs at all.

Scenario 2: a typical use case, event-driven data processing

Common back-end requirements of a video-processing system include video transcoding, data extraction, face recognition and so on. These are all general-purpose computing tasks and can be executed by function compute.

Developers write the implementation logic for each task and then connect the tasks with control flow, while the cloud vendor takes care of actually executing each task. Development becomes simpler, and the resulting system is naturally highly available and elastic in real time, with no need to worry about anything at the machine level.
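One way to picture this split of responsibilities is a thin orchestration sketch like the one below: the developer supplies the task logic and the ordering, while in a real system a workflow service (for example AWS Step Functions) would schedule each step, retry failures, and scale the underlying functions. The task names and helpers here are illustrative only.

```typescript
// Illustrative sketch only: the developer writes each task and the control
// flow; the platform is responsible for actually running each step.
type VideoTask = (videoKey: string) => Promise<void>;

const transcode: VideoTask = async (key) => { /* call a transcoding function */ };
const extractMetadata: VideoTask = async (key) => { /* call an extraction function */ };
const recognizeFaces: VideoTask = async (key) => { /* call a recognition function */ };

// The "control flow" mentioned above: run the general-purpose tasks in order.
export async function processVideo(videoKey: string): Promise<void> {
  for (const task of [transcode, extractMetadata, recognizeFaces]) {
    await task(videoKey);
  }
}
```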

5. The problems with Serverless

For enterprises, platforms that support Serverless computing save a great deal of time and cost while freeing up staff, letting developers do more valuable work instead of managing infrastructure. They also improve agility, so new applications and services launch faster and customer satisfaction improves. But Serverless is not perfect; it has problems of its own and must be applied carefully in production.

1. Not suitable for long-running applications

A serverless function does not run until a request arrives. When the application is idle it goes into "hibernation", and the next request pays a startup penalty, the cold start. If your application must run continuously and handle a large number of requests, the Serverless architecture may not suit it. Waking the application regularly via CRON or CloudWatch keeps it warm but consumes extra resources, so this needs tuning: if a function is called frequently, its resources stay resident in memory after the first cold start and it keeps serving requests until no new calls arrive for a while, at which point it goes back to "sleep" or is even reclaimed, so that it consumes no resources.
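One common mitigation, sketched below under assumed event shapes, is to let a scheduled rule invoke the function periodically and have the handler short-circuit when it recognizes the warm-up ping. The `warmer` flag is a made-up convention defined by our own scheduled payload, not part of any AWS API.

```typescript
// Sketch of a cold-start mitigation: a scheduled rule pings the function to
// keep it warm, and the handler returns early on those pings.
interface IncomingEvent {
  warmer?: boolean;  // set by our own scheduled "keep warm" event (assumption)
  payload?: unknown; // a real request
}

export const handler = async (event: IncomingEvent): Promise<string> => {
  if (event.warmer) {
    return "warmed"; // skip real work; this call exists only to avoid a cold start
  }
  // ... real request handling goes here ...
  return "handled";
};
```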

2. Completely dependent on third-party services

When your enterprise cloud environment already contains a lot of infrastructure, Serverless may not be a good fit. Once we adopt a cloud provider's Serverless architecture, we are bound to that provider, and moving the service to a different cloud provider later is not easy.

Migrating means modifying a lot of low-level code. One mitigation is to build an isolation layer: when designing the application, isolate the API gateway and the database layer behind your own interfaces, bearing in mind that there is no mature ORM on the market that supports, say, both Firebase and DynamoDB. This isolation brings extra cost of its own and may create more problems than it solves.
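A minimal sketch of such an isolation layer, with hypothetical names: the application codes against its own interface, and a provider-specific adapter is the only place that knows about DynamoDB (or Firebase, or anything else).

```typescript
// Sketch of an isolation layer against vendor lock-in. The interface and the
// adapter below are hypothetical; the point is that only the adapter knows
// which BaaS product (DynamoDB, Firebase, ...) is actually behind it.
export interface ScoreStore {
  put(player: string, points: number): Promise<void>;
  topN(limit: number): Promise<Array<{ player: string; points: number }>>;
}

// Provider-specific adapter; swapping clouds means rewriting only this class.
export class DynamoScoreStore implements ScoreStore {
  async put(player: string, points: number): Promise<void> {
    // would call the DynamoDB SDK here
  }
  async topN(limit: number): Promise<Array<{ player: string; points: number }>> {
    // would query DynamoDB here
    return [];
  }
}
```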

3. Lack of debugging and development tools

When using the Serverless Framework I ran into exactly this problem: a lack of debugging and development tools. Things improved somewhat once I found plug-ins such as serverless-offline and dynamodb-local, but logging remained a daunting challenge.

Every round of debugging means uploading the code again, and every upload is effectively a new deployment, so problems cannot always be located quickly. I later found winston, a Node.js logging library similar to log4j, which supports six log levels: error, warn, info, verbose, debug and silly. Only by combining log collection with analysis and filtering can problems be located quickly.
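A minimal winston setup along these lines (assuming winston 3.x's createLogger API; the log-level environment variable and the metadata fields are illustrative) looks roughly like this:

```typescript
// Minimal winston logger sketch. Structured, leveled logs make it possible
// to filter and analyze function output after the fact.
import { createLogger, format, transports } from "winston";

const logger = createLogger({
  level: process.env.LOG_LEVEL ?? "info", // e.g. "debug" while troubleshooting (assumption)
  format: format.combine(format.timestamp(), format.json()),
  transports: [new transports.Console()], // console output from Lambda lands in CloudWatch Logs
});

logger.info("function invoked", { requestId: "abc-123" }); // example metadata
logger.error("lookup failed", { table: "scores" });
```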

4. Complex configuration

Serverless is cheap, but that does not make it simple. The CloudFormation configuration for AWS Lambda is complex and hard to read and write (it is JSON). CloudFormation does provide templates, but to use one you must create a Stack, specify the Template the Stack uses, and then AWS creates and initializes the resources according to the Template's definitions.

The Serverless Framework's configuration is simpler, using YAML; at deploy time it generates the CloudFormation configuration from ours. Even so, this is not a production-grade configuration, and real application scenarios are far more complex.

6. Summary

After years of development, cloud computing has gradually evolved to the point where users only need to think about their business and the resources it requires. With orchestration tools such as K8S (Kubernetes), for example, users only specify their workloads and the resources they need (CPU, memory, and so on) without worrying about the machines underneath.

The Serverless architecture lets people stop worrying about the resources needed to run their code, focus on their business logic, and pay only for the resources actually consumed. It can be said that with the rise of Serverless, the real era of cloud computing has arrived.

Any new concept or technology ultimately lands by being combined with a concrete business to solve concrete problems. Serverless is still immature in many respects and urgently needs to improve, but its strengths are attractive to developers. With the rapid development of the technology, Serverless has unlimited possibilities ahead.
