How to integrate Kafka with SpringBoot

2025-02-24 Update From: SLTechnology News&Howtos


Shulou(Shulou.com)06/01 Report--

This article walks through how to integrate Kafka with SpringBoot. The content is detailed and the steps are laid out clearly; I hope it helps resolve any doubts you have about the topic.

I. Preparatory work

Note in advance: if you have problems running, please check that the versions of Kafka and SpringBoot are consistent with those in my article. The environment in this article has been tested.

Kafka version: kafka_2.11-1.1.0 (Scala 2.11, Kafka 1.1.0)

SpringBoot version: 1.5.10.RELEASE

Start ZooKeeper and Kafka ahead of time, then create a Topic:

[root@Basic kafka_2.11-1.1.0]# bin/kafka-topics.sh --create --zookeeper localhost:2181 --replication-factor 1 --partitions 1 --topic test_topic

Make sure your Kafka broker is reachable from the client. If clients connect over the network, configure the advertised listener in config/server.properties:

advertised.listeners=PLAINTEXT://192.168.239.128:9092
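For context, Kafka distinguishes the address the broker binds to from the address it hands back to clients. A minimal sketch of the two related keys in config/server.properties, using this article's example IP (adjust to your own host):

```properties
# Address the broker socket binds to (0.0.0.0 = all interfaces)
listeners=PLAINTEXT://0.0.0.0:9092
# Address returned to clients in metadata responses; must be reachable from the client machine
advertised.listeners=PLAINTEXT://192.168.239.128:9092
```

If advertised.listeners points at an address the client cannot reach (for example, a VM-internal hostname), producers and consumers will connect to the bootstrap server but then fail on the metadata it returns.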

Maven dependencies:

<dependency>
    <groupId>org.springframework.kafka</groupId>
    <artifactId>spring-kafka</artifactId>
</dependency>
<dependency>
    <groupId>org.apache.kafka</groupId>
    <artifactId>kafka-clients</artifactId>
    <version>0.11.0.0</version>
</dependency>

II. Project structure

To better reflect real development practice, the producer typically pushes data into Kafka after some interface call has finished its business logic, while a consumer continuously listens on the Topic and processes the data as it arrives. So here the producer is exposed as a web interface, and the consumer is placed in the kafka package. Note the @Component annotation on the consumer; without it, @KafkaListener will not be scanned.
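The decoupling described here — the producer hands a message off and moves on, while an independent consumer picks it up later — can be sketched in plain Java with a BlockingQueue standing in for the Kafka topic. The class and method names below are illustrative only, not part of the project:

```java
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;

public class QueueHandoffDemo {

    // Stands in for the Kafka topic: producer and consumer only share the queue.
    static final BlockingQueue<String> topic = new LinkedBlockingQueue<>();

    // Producer side: fire-and-forget, analogous to kafkaTemplate.send("test_topic", msg).
    static void send(String msg) throws InterruptedException {
        topic.put(msg);
    }

    public static void main(String[] args) throws Exception {
        // Consumer side: blocks until a message arrives, analogous to a @KafkaListener method.
        Thread consumer = new Thread(() -> {
            try {
                System.out.println("consumed: " + topic.take());
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        });
        consumer.start();

        send("hello");      // the producer never waits for the consumer
        consumer.join();    // prints: consumed: hello
    }
}
```

The point of the sketch is that neither side calls the other directly; with real Kafka, the broker plays the role of the queue and the two sides can even live in different processes.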

III. Specific implementation code

SpringBoot configuration file

application.yml

spring:
  kafka:
    bootstrap-servers: 192.168.239.128:9092
    producer:
      key-serializer: org.apache.kafka.common.serialization.StringSerializer
      value-serializer: org.apache.kafka.common.serialization.StringSerializer
    consumer:
      group-id: test
      enable-auto-commit: true
      auto-commit-interval: 1000
      key-deserializer: org.apache.kafka.common.serialization.StringDeserializer
      value-deserializer: org.apache.kafka.common.serialization.StringDeserializer
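The StringSerializer/StringDeserializer pair configured above does nothing more than convert between Java strings and UTF-8 bytes on the wire. A stdlib-only illustration of that round trip (no Kafka classes involved; names are mine, not Kafka's):

```java
import java.nio.charset.StandardCharsets;
import java.util.Arrays;

public class StringCodecDemo {

    // What StringSerializer does, in essence: String -> UTF-8 bytes.
    static byte[] serialize(String value) {
        return value.getBytes(StandardCharsets.UTF_8);
    }

    // What StringDeserializer does: UTF-8 bytes -> String.
    static String deserialize(byte[] data) {
        return new String(data, StandardCharsets.UTF_8);
    }

    public static void main(String[] args) {
        byte[] wire = serialize("hello");
        System.out.println(Arrays.toString(wire));   // [104, 101, 108, 108, 111]
        System.out.println(deserialize(wire));       // hello
    }
}
```

This is why the producer's serializers and the consumer's deserializers must agree: the broker itself only ever sees the byte arrays.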

Producer

package cn.saytime.web;

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RestController;

/**
 * Kafka producer test
 */
@RestController
@RequestMapping("kafka")
public class TestKafkaProducerController {

    @Autowired
    private KafkaTemplate<String, String> kafkaTemplate;

    @RequestMapping("send")
    public String send(String msg) {
        kafkaTemplate.send("test_topic", msg);
        return "success";
    }
}

Consumer

The consumer listens on this topic and executes whenever a message arrives; there is no need for a while (true) loop.

package cn.saytime.kafka;

import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.stereotype.Component;

/**
 * Kafka consumer test
 */
@Component
public class TestConsumer {

    @KafkaListener(topics = "test_topic")
    public void listen(ConsumerRecord<String, String> record) throws Exception {
        System.out.printf("topic = %s, offset = %d, value = %s\n",
                record.topic(), record.offset(), record.value());
    }
}
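The console line produced by listen() is plain printf formatting, so the format string can be checked in isolation with String.format and hypothetical record values (the values below are made up to match section IV's output):

```java
public class LogFormatDemo {

    // Same format string as the listener's System.out.printf call.
    static String formatRecord(String topic, long offset, Object value) {
        return String.format("topic = %s, offset = %d, value = %s", topic, offset, value);
    }

    public static void main(String[] args) {
        System.out.println(formatRecord("test_topic", 19L, "hello"));
        // prints: topic = test_topic, offset = 19, value = hello
    }
}
```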

Project startup class

package cn.saytime;

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;

@SpringBootApplication
public class TestApplication {

    public static void main(String[] args) {
        SpringApplication.run(TestApplication.class, args);
    }
}

IV. Testing

Run the project and request: http://localhost:8080/kafka/send?msg=hello

Console output:

topic = test_topic, offset = 19, value = hello


To show that the consumer does not stop after handling a single message, call the interface again:

http://localhost:8080/kafka/send?msg=kafka

topic = test_topic, offset = 20, value = kafka


This shows that the consumer is in fact continuously polling the Topic for new data.

This concludes the article "How to integrate Kafka with SpringBoot". To truly master the material, you still need to practice and apply it yourself.
