
Example Analysis of Heka Reading Data from Kafka


This article presents an example analysis of Heka reading data from Kafka. It walks through the configuration step by step from a practical point of view; I hope you get something out of it.

Heka reads data from Kafka.

Configuration:

[hekad]
maxprocs = 2

[KafkaInputExample]
type = "KafkaInput"
topic = "test"
addrs = ["localhost:9092"]

[RstEncoder]

[LogOutput]
message_matcher = "TRUE"
encoder = "RstEncoder"

The above configuration only reads data from Kafka and displays it on the console; it does not write any data to Kafka.
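So that hekad has something to read, you first need a message in the topic. The following is a minimal sketch that publishes one test message to the test topic, assuming the kafka-python package is installed and a broker is listening on localhost:9092; the message body mirrors the payload shown in the result below:

# publish_test.py -- minimal sketch; assumes the kafka-python package
# is installed and a Kafka broker is reachable at localhost:9092
import json
from kafka import KafkaProducer

producer = KafkaProducer(bootstrap_servers="localhost:9092")
msg = {
    "msg": "Start Request",
    "event": "artemis.web.ensure-running1",
    "userid": "12",
    "extra": {"workspace-id": "cN907xLngi"},
    "time": "2015-05-06T20:40:05.509926234Z",
    "severity": 1,
}
# Send the JSON document as the message value and wait for delivery.
producer.send("test", json.dumps(msg).encode("utf-8"))
producer.flush()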

Result:

:Timestamp: 2016-07-21 09:39:46.342093657 +0000 UTC
:Type: heka.kafka
:Hostname: master
:Pid: 0
:Uuid: 501b0a0e-63a9-4eee-b9ca-ab572c17d273
:Logger: KafkaInputExample
:Payload: {"msg": "Start Request", "event": "artemis.web.ensure-running1", "userid": "12", "extra": {"workspace-id": "cN907xLngi"}, "time": "2015-05-06T20:40:05.509926234Z", "severity": 1}
:EnvVersion:
:Severity: 7
:Fields:
    | name:"Key" type:bytes value:
    | name:"Topic" type:string value:"test"
    | name:"Partition" type:integer value:0
    | name:"Offset" type:integer value:8

The data read is put into the Payload, while the Fields store metadata about the Kafka read (key, topic, partition, offset). You can then use a JsonDecoder to parse the payload.

[hekad]
maxprocs = 2

[KafkaInputExample]
type = "KafkaInput"
topic = "test"
addrs = ["localhost:9092"]
decoder = "JsonDecoder"

[JsonDecoder]
type = "SandboxDecoder"
filename = "lua_decoders/json.lua"

[JsonDecoder.config]
type = "artemis"
payload_keep = true
map_fields = true
Severity = "severity"

[RstEncoder]

[LogOutput]
message_matcher = "TRUE"
encoder = "RstEncoder"

The results are as follows:

:Timestamp: 2016-07-21 09:42:34 +0000 UTC
:Type: artemis
:Hostname: master
:Pid: 0
:Uuid: 3965285c-70ac-4069-a1a3-a9bcf518d3e8
:Logger: KafkaInputExample
:Payload: {"msg": "Start Request", "event": "artemis.web.ensure-running2", "userid": "11", "extra": {"workspace-id": "cN907xLngi"}, "time": "2015-05-06T20:40:05.509926234Z", "severity": 1}
:EnvVersion:
:Severity: 1
:Fields:
    | name:"time" type:string value:"2015-05-06T20:40:05.509926234Z"
    | name:"msg" type:string value:"Start Request"
    | name:"userid" type:string value:"11"
    | name:"event" type:string value:"artemis.web.ensure-running2"
    | name:"extra.workspace-id" type:string value:"cN907xLngi"

After decoder parsing, the Fields have changed, but Logger still shows KafkaInputExample. This indicates that the message is generated by the input, not by the decoder; the decoder only parses the payload and rewrites the Fields.
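Conceptually, with map_fields = true the decoder flattens the JSON payload into individual message fields, joining nested keys with dots (hence extra.workspace-id above). The following Python sketch only illustrates that flattening; the actual implementation is lua_decoders/json.lua, not this code:

# flatten.py -- illustrative sketch of the flattening that map_fields
# performs; the real implementation is lua_decoders/json.lua
import json

def flatten(obj, prefix=""):
    # Join nested keys with ".", e.g. {"extra": {"workspace-id": ...}}
    # becomes {"extra.workspace-id": ...}
    fields = {}
    for key, value in obj.items():
        name = key if not prefix else prefix + "." + key
        if isinstance(value, dict):
            fields.update(flatten(value, name))
        else:
            fields[name] = value
    return fields

payload = '{"msg": "Start Request", "userid": "11", "extra": {"workspace-id": "cN907xLngi"}}'
print(flatten(json.loads(payload)))
# {'msg': 'Start Request', 'userid': '11', 'extra.workspace-id': 'cN907xLngi'}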

Next, send the data into Elasticsearch (ES).

[hekad]
maxprocs = 2

[KafkaInputExample]
type = "KafkaInput"
topic = "test"
addrs = ["localhost:9092"]
decoder = "JsonDecoder"

[JsonDecoder]
type = "SandboxDecoder"
filename = "lua_decoders/json.lua"

[JsonDecoder.config]
type = "artemis"
payload_keep = true
map_fields = true
Severity = "severity"

[ESJsonEncoder]
index = "%{Type}-%{%Y.%m.%d}"
es_index_from_timestamp = true
type_name = "%{Type}"

[ESJsonEncoder.field_mappings]
Timestamp = "@timestamp"
Severity = "level"

[ElasticSearchOutput]
message_matcher = "TRUE"
encoder = "ESJsonEncoder"
flush_interval = 1

Importing into ES also requires JSON, so use the ESJsonEncoder and specify the index name and type; with this config the index name expands to, for example, artemis-2016.07.21. The execution result is as follows.

You can see that besides Heka's metadata fields, there are also the fields generated by JsonDecoder, which are taken from the Fields attribute that JsonDecoder populated. Note that the Payload itself is not parsed.

:Fields:
    | name:"time" type:string value:"2015-05-06T20:40:05.509926234Z"
    | name:"msg" type:string value:"Start Request"
    | name:"userid" type:string value:"11"
    | name:"event" type:string value:"artemis.web.ensure-running2"
    | name:"extra.workspace-id" type:string value:"cN907xLngi"

These fields naturally vary from message to message, which is why they are called dynamic fields.

When writing to ES, you can specify which dynamic fields to extract:

fields = ["Timestamp", "Uuid", "Type", "Logger", "Pid", "Hostname", "DynamicFields"]
dynamic_fields = ["msg", "userid"]

Whenever you use dynamic_fields, you must include DynamicFields in fields.

If dynamic_fields is not used, then fields can only enumerate a few fixed attributes; refer to the official documentation for the exact list.
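Once documents start flowing, you can check what actually landed in ES. A minimal verification sketch, assuming the requests package is installed and Elasticsearch listens on localhost:9200; since the index name follows the %{Type}-%{%Y.%m.%d} pattern above, the wildcard artemis-* matches it:

# check_es.py -- minimal verification sketch; assumes the requests
# package is installed and Elasticsearch is on localhost:9200
import requests

# artemis-* matches indexes produced by index = "%{Type}-%{%Y.%m.%d}"
resp = requests.get(
    "http://localhost:9200/artemis-*/_search",
    params={"q": "Logger:KafkaInputExample", "size": 5},
)
for hit in resp.json()["hits"]["hits"]:
    print(hit["_index"], hit["_source"])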

Complete configuration:


[hekad]
maxprocs = 2

[KafkaInputExample]
type = "KafkaInput"
topic = "test"
addrs = ["localhost:9092"]
decoder = "JsonDecoder"

[JsonDecoder]
type = "SandboxDecoder"
filename = "lua_decoders/json.lua"

[JsonDecoder.config]
type = "artemis"
payload_keep = true
map_fields = true
Severity = "severity"

[ESJsonEncoder]
index = "%{Type}-%{%Y.%m.%d}"
es_index_from_timestamp = true
type_name = "%{Type}"
fields = ["Timestamp", "Uuid", "Type", "Logger", "Pid", "Hostname", "DynamicFields"]
dynamic_fields = ["msg", "userid"]
raw_bytes_fields = ["Payload"]

[ESJsonEncoder.field_mappings]
Timestamp = "@timestamp"
Severity = "level"

[ElasticSearchOutput]
message_matcher = "TRUE"
encoder = "ESJsonEncoder"
flush_interval = 1
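To run the whole pipeline, save this configuration to a file and start hekad against it, for example: hekad -config=heka_kafka_es.toml (the file name here is illustrative).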


The above is the example analysis of Heka reading data from Kafka. If you have run into similar questions, the walkthrough above should help you understand the pieces involved.
