How to transfer CKafka data to ES with Serverless Cloud function


In this issue, the editor shows how to use a Serverless cloud function to transfer CKafka data to ES. The article covers the topic in some depth from a practical, professional point of view; I hope you get something out of it.

Background introduction

Tencent Cloud Serverless Cloud Function (SCF) is a next-generation general-purpose computing platform that provides a secure, stable, efficient, easy-to-use, and low-cost serverless runtime environment. As the Kafka community has flourished, more and more users have adopted Kafka for log collection, big data analysis, streaming data processing, and so on. CKafka, the corresponding product on the public cloud, has also made many optimizations with the help of the open source community:

Based on Apache Kafka: distributed, highly scalable, high throughput

100% compatible with the Apache Kafka API (0.9, 0.10)

Use all the features of Kafka directly without deployment

CKafka encapsulates all cluster details, so users do not need to handle operations and maintenance

Supports scaling instance configuration up and down dynamically, with pay-as-you-go billing (under development)

Optimized messaging engine, with performance up to 50% higher than the community edition

At the same time, on the public cloud, SCF is deeply integrated with CKafka and has introduced a number of very practical features.

Operation principle

As shown in the figure below, SCF can consume CKafka messages in real time for scenarios such as data dumping, log cleaning, and real-time processing. Moreover, capabilities such as data transfer have been integrated into the CKafka console, where users can enable them with one click, which greatly reduces complexity for users.
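For readers who want a sense of what such a consumer function looks like, here is a minimal sketch of an SCF handler in Python. The event layout (a Records list whose entries carry the message under Ckafka.msgBody) is an assumption based on the commonly documented trigger format, not code taken from this article:

```python
# Minimal sketch of an SCF handler that consumes a Ckafka trigger event.
# The event layout below (Records -> Ckafka -> msgBody) is an assumption
# based on the commonly documented trigger format, not taken from this article.

def main_handler(event, context):
    records = event.get("Records", [])
    for record in records:
        msg = record.get("Ckafka", {})
        body = msg.get("msgBody", "")
        # Real-time processing goes here: dump to COS/ES/DB, clean logs, etc.
        print("topic=%s offset=%s body=%s"
              % (msg.get("topic"), msg.get("offset"), body))
    return "processed %d records" % len(records)
```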

Scheme advantage

Compared with running a self-built CKafka consumer on CVM, cloud functions shield users from a lot of unnecessary overhead:

A CKafka trigger can be enabled on the SCF console to create the consumer automatically, and the SCF platform maintains the high availability of the consumer components.

The CKafka trigger itself supports many practical configurations: the offset position, aggregation of 10 to 10,000 messages per batch, up to 10,000 retries, and so on.

Business logic built on cloud functions scales automatically by nature, with no need to build and maintain server clusters.

Common scenarios

As shown in the figure below, with the help of cloud functions and CKafka triggers, it is very convenient to transfer CKafka messages to COS, ES, DB, and so on. In this article, a cloud function is used in place of Logstash to dump CKafka messages to ES.

Compared with self-built Logstash on CVM, SCF has the following advantages:

The cloud function comes with its own consumer component, which can aggregate messages on its own.

The cloud function template already implements message aggregation and some cleaning capabilities, and it can be extended further as needed.

The cloud function cluster comes with built-in high availability, monitoring, and logging, which lets the business go live faster.

Cloud functions are billed by actual usage, which compares favorably with the cost of a self-built cluster and can save about 50%.

Deployment process

Prerequisites (taking the Guangzhou region as an example):

Enable the Elasticsearch service

Enable the Ckafka service

1. Create a cloud function

Log in to the SCF console, select the region, create a new function, choose Python 3.6 as the runtime environment, and search for "Ckafka". After selecting the template function, go to the next step.

In the next step, click Advanced Settings: configure environment variables as follows:

Required parameters: ES_Address, ES_User, ES_Password, ES_Index_KeyWord

Optional parameters:

ES_Log_IgnoreWord (keywords to be removed; by default the full message is written; for example: name, password)

ES_Index_TimeFormat (split the index by day or by hour; by day by default; for example, enter hour)

In Advanced Settings, configure the VPC: select the same VPC that the ES instance is in, then complete the function creation.
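To make the role of these environment variables concrete, here is a minimal sketch of how a function might read them and bulk-write incoming messages into ES with the elasticsearch Python client. It is not the actual template code shipped on the SCF console; the event layout and index-naming scheme are illustrative assumptions:

```python
import os
from datetime import datetime

from elasticsearch import Elasticsearch, helpers  # elasticsearch-py client

# Required environment variables configured in the console
ES_ADDRESS = os.environ["ES_Address"]
ES_USER = os.environ["ES_User"]
ES_PASSWORD = os.environ["ES_Password"]
ES_INDEX_KEYWORD = os.environ["ES_Index_KeyWord"]
# Optional: split the index by "day" (default) or "hour"
ES_INDEX_TIMEFORMAT = os.environ.get("ES_Index_TimeFormat", "day")

es = Elasticsearch([ES_ADDRESS], http_auth=(ES_USER, ES_PASSWORD))


def build_index_name():
    # e.g. keyword-2021-06-01 (by day) or keyword-2021-06-01-13 (by hour)
    fmt = "%Y-%m-%d-%H" if ES_INDEX_TIMEFORMAT == "hour" else "%Y-%m-%d"
    return "%s-%s" % (ES_INDEX_KEYWORD, datetime.now().strftime(fmt))


def main_handler(event, context):
    # Collect message bodies from the Ckafka trigger event (assumed layout)
    actions = [
        {"_index": build_index_name(),
         "_source": {"message": r.get("Ckafka", {}).get("msgBody", "")}}
        for r in event.get("Records", [])
    ]
    if actions:
        helpers.bulk(es, actions)  # bulk-write the batch into ES
    return "indexed %d docs" % len(actions)
```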

2. Create a Ckafka trigger

On the "trigger Management" page of the function, create a trigger, configure the trigger method for the corresponding Topic, and take effect after submission.

3. View the ES and function run logs

View the function run log

View Kibana

Introduction to Extensibility

If you want to implement advanced log cleaning logic, you can modify the logic directly in the function code.
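As an illustration only, a hypothetical cleaning hook might strip the keywords listed in ES_Log_IgnoreWord from each JSON log line before it is indexed; the function name and the comma-separated variable format are assumptions, not the template's actual code:

```python
import json
import os

# Comma-separated keywords to drop, e.g. "name,password" (assumed format)
IGNORE_WORDS = [w.strip()
                for w in os.environ.get("ES_Log_IgnoreWord", "").split(",")
                if w.strip()]


def clean_message(raw_body):
    """Remove ignored fields from a JSON log line; pass non-JSON lines through unchanged."""
    try:
        doc = json.loads(raw_body)
    except (ValueError, TypeError):
        return {"message": raw_body}
    for word in IGNORE_WORDS:
        doc.pop(word, None)
    return doc
```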

Based on the function's running status, you can configure monitoring alarms to keep track of the business in real time.

The above is how a Serverless cloud function transfers CKafka data to ES, as shared by the editor. If you have similar questions, the analysis above may help you work through them. If you want to learn more, you are welcome to follow the industry information channel.
