
How to connect real-time computing Flink with your own environment


This article explains how to connect real-time computing Flink with your own environment, looking at the topic from a practical, professional perspective. Hopefully you will find something useful in it.

How do you solve data processing problems with real-time computing Flink? The sections below walk through connecting the real-time computing Flink service with your own environment.

I. How the Jar that runs a job is stored on OSS

There are two ways to upload a job jar on the VVP platform.

Method 1: Use the resource upload function provided by VVP to upload the Jar directly. This function currently supports Jar packages of up to 200 MB. When creating a job, simply select the uploaded Jar package. The steps are as follows:

● Open the VVP platform, click the resource upload function on the left, then click Upload Resource in the upper-right corner of the page, select the Jar package, and complete the upload;

● After the upload succeeds, click Create Job on the left and fill in the job name and other information. In the Jar URI field, select the Jar package you just uploaded from the drop-down list, click OK to create the job, and then start it.

Method 2: Upload the Jar directly in the OSS console and then use the Jar link that OSS provides. This is also straightforward:

● Open the OSS console, select the Bucket that was used when creating the VVP instance, choose a directory, click Upload File, set the permission to Public Read, and complete the upload;

● To use the Jar, click Details on the right of the uploaded package in the OSS console to obtain the URL of the Jar package;

● When creating a job, fill in the Jar URI field with that URL.

Note that the link shown on the OSS details page is a public-network URL, and a newly enabled VVP instance cannot access the public network directly. Therefore, when creating a job with an HTTPS link, you need to use the endpoint that is reachable over the VPC so that the job can start normally.
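For example, assuming a bucket named my-vvp-bucket in the cn-hangzhou region and a Jar uploaded under jars/my-job.jar (all of these names are placeholders), the two kinds of links would look roughly like this:

    Public endpoint shown on the OSS details page:
        https://my-vvp-bucket.oss-cn-hangzhou.aliyuncs.com/jars/my-job.jar

    VPC (internal) endpoint that VVP can reach without public-network access:
        https://my-vvp-bucket.oss-cn-hangzhou-internal.aliyuncs.com/jars/my-job.jar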

What if you want to use a public-network HTTPS link anyway? You can first enable public-network access for VVP. In short, the steps are as follows:

First, create a NAT gateway. When creating it, select "Purchase EIP together", then choose the region, fill in the name and other information, and bind the elastic public IP (EIP) to complete the creation;

Second, create an SNAT entry. After the NAT gateway is created, click "Create SNAT Entry", select the vSwitch in the pop-up window, fill in the name, and complete the creation.

After completing these two steps, the VVP instance is connected to the public network, and you can use a Jar package accessible over public HTTPS directly when creating a Deployment.

II. How Flink interacts with typical data sources on the VVP platform

This section describes how Flink interacts with external data storage systems through SQL and connectors, using SLS and Kafka as example sources and sinks.

In the SQL editor, create a Datagen table, which generates random data, and click Run. Then create an SLS table, fill in the required parameter information, and click Create.

Once both tables are created, write an SQL statement such as INSERT INTO sls SELECT id, name FROM datagen, save it, and click Run to create a Deployment and start it.
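As a minimal sketch of what the two tables and the INSERT statement might look like (the SLS connector option names such as endPoint, project, logStore, accessId and accessKey are assumptions here; check the SLS connector documentation for the exact keys and fill in your own values):

    -- Datagen source: generates random rows
    CREATE TEMPORARY TABLE datagen (
      id   INT,
      name STRING
    ) WITH (
      'connector' = 'datagen'
    );

    -- SLS sink: option names are assumptions, values are placeholders
    CREATE TEMPORARY TABLE sls (
      id   INT,
      name STRING
    ) WITH (
      'connector' = 'sls',
      'endPoint'  = '<your-sls-endpoint>',
      'project'   = '<your-sls-project>',
      'logStore'  = '<your-logstore>',
      'accessId'  = '<your-access-key-id>',
      'accessKey' = '<your-access-key-secret>'
    );

    -- Write the randomly generated data into SLS
    INSERT INTO sls SELECT id, name FROM datagen;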

When the job is running successfully, query the data in SLS to confirm that the randomly generated data has been written there.

Similarly, we can read data from SLS and write it to Kafka by following similar steps:

● Create a Kafka table on the VVP SQL editor page;

● Use an SQL statement to read data from SLS and write it into Kafka, then start the job;

● After the job is running successfully, data flows from SLS into Kafka.
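A rough sketch of the Kafka side, reusing the sls table defined earlier as the source (the Kafka options shown are the standard Flink Kafka connector options; the topic and broker addresses are placeholders):

    -- Kafka sink: topic and broker addresses are placeholders
    CREATE TEMPORARY TABLE kafka_sink (
      id   INT,
      name STRING
    ) WITH (
      'connector' = 'kafka',
      'topic' = 'my-topic',
      'properties.bootstrap.servers' = 'broker1:9092,broker2:9092',
      'format' = 'json'
    );

    -- Read from SLS and write into Kafka
    INSERT INTO kafka_sink SELECT id, name FROM sls;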

III. How to send Flink metrics from the VVP platform to an external metrics system

Next, suppose you want to send the metrics of running jobs to an external system and observe them there. VVP offers two methods:

Method 1: By default, VVP sends Flink job metrics to ARMS, and no additional work is required. After the job is running, you can view the metrics through the Metrics button.

Method 2: If you have your own metrics system and want to feed Flink's job metrics into it, there are two main points: first, make sure there is network connectivity between the VVP job and your metrics system; second, configure the corresponding metrics reporter in the Flink conf. The metrics configuration is done when creating the job.

For example, to use Prometheus's PushGateway approach, choose org.apache.flink.metrics.prometheus.PrometheusPushGatewayReporter as the reporter class and configure the host and port of the PushGateway; that completes the metric reporter configuration. After the job starts successfully, check the metrics on the Grafana dashboard you have set up.
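A minimal sketch of the corresponding Flink conf entries, assuming a PushGateway reachable at a placeholder host (the reporter name promgateway is arbitrary, and the exact option keys can vary slightly between Flink versions):

    # Prometheus PushGateway reporter; host, port and jobName are placeholders
    metrics.reporter.promgateway.class: org.apache.flink.metrics.prometheus.PrometheusPushGatewayReporter
    metrics.reporter.promgateway.host: <pushgateway-host>
    metrics.reporter.promgateway.port: 9091
    metrics.reporter.promgateway.jobName: my-flink-job
    metrics.reporter.promgateway.randomJobNameSuffix: true
    metrics.reporter.promgateway.deleteOnShutdown: false
    metrics.reporter.promgateway.interval: 60 SECONDS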

IV. How to send Flink job logs to external systems

If a job suddenly fails while it is running and we want to look at the failed job's logs, the Flink job logs need to be persisted somewhere. For this purpose the VVP platform provides two options: writing the logs to OSS or to SLS. Simply put, you configure a few log parameters in the Log configuration section when creating the job.

Method 1: Write the logs to OSS. When creating the job, select User-Defined under Log Configuration in Advanced Configuration, paste the configuration template from the help document into the user-defined configuration, and replace the relevant placeholders with the required OSS parameters.

When you need to view the logs, follow the help document to locate the log files stored in OSS, then click Download to view them.

Method 2: Write the logs to SLS. This is similar to Method 1, except that the log configuration items differ slightly; downloading and viewing the logs works the same way as in Method 1.

The above covers how to connect real-time computing Flink with your own environment. If you run into similar questions, the analysis above should serve as a reference.
