2025-01-16 Update From: SLTechnology News & Howtos
This article explains how to integrate Kafka into a Spring Boot application. The explanation is kept simple and clear so that it is easy to follow; let's work through "SpringBoot Kafka integration" step by step.
Overall project structure:
Create a Spring Boot project with IDEA; this step is straightforward, so it is not covered in detail here.
1. The code of pom file is as follows:
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>

    <groupId>com.zhisheng</groupId>
    <artifactId>kafka-learning</artifactId>
    <version>0.0.1-SNAPSHOT</version>
    <packaging>jar</packaging>

    <name>kafka-learning</name>
    <description>Demo project for Spring Boot + kafka</description>

    <parent>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-parent</artifactId>
        <version>1.5.9.RELEASE</version>
    </parent>

    <properties>
        <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
        <project.reporting.outputEncoding>UTF-8</project.reporting.outputEncoding>
        <java.version>1.8</java.version>
    </properties>

    <dependencies>
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-web</artifactId>
        </dependency>
        <dependency>
            <groupId>org.projectlombok</groupId>
            <artifactId>lombok</artifactId>
            <optional>true</optional>
        </dependency>
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-test</artifactId>
            <scope>test</scope>
        </dependency>
        <dependency>
            <groupId>org.springframework.kafka</groupId>
            <artifactId>spring-kafka</artifactId>
            <version>1.1.1.RELEASE</version>
        </dependency>
        <dependency>
            <groupId>com.google.code.gson</groupId>
            <artifactId>gson</artifactId>
            <version>2.8.2</version>
        </dependency>
    </dependencies>

    <build>
        <plugins>
            <plugin>
                <groupId>org.springframework.boot</groupId>
                <artifactId>spring-boot-maven-plugin</artifactId>
            </plugin>
        </plugins>
    </build>
</project>
The key dependencies introduced here are spring-kafka, lombok, and gson.
2. The message entity class Message.java is as follows:
@Data
public class Message {
    private Long id;         // id
    private String msg;      // message content
    private Date sendTime;   // timestamp
}
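The @Data annotation comes from Lombok and generates the boilerplate at compile time. For readers who do not use Lombok, a hand-written equivalent of roughly what it generates (getters, setters, toString) would look like this; it is illustrative only and not part of the project code:

```java
import java.util.Date;

// Rough hand-written equivalent of what Lombok's @Data generates for the
// Message entity above (getters, setters, toString); equals/hashCode are
// omitted here for brevity.
class Message {
    private Long id;
    private String msg;
    private Date sendTime;

    public Long getId() { return id; }
    public void setId(Long id) { this.id = id; }

    public String getMsg() { return msg; }
    public void setMsg(String msg) { this.msg = msg; }

    public Date getSendTime() { return sendTime; }
    public void setSendTime(Date sendTime) { this.sendTime = sendTime; }

    @Override
    public String toString() {
        return "Message(id=" + id + ", msg=" + msg + ", sendTime=" + sendTime + ")";
    }
}
```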
3. Message sending class KafkaSender.java
@Component
@Slf4j
public class KafkaSender {

    @Autowired
    private KafkaTemplate<String, String> kafkaTemplate;

    private Gson gson = new GsonBuilder().create();

    // send a message
    public void send() {
        Message message = new Message();
        message.setId(System.currentTimeMillis());
        message.setMsg(UUID.randomUUID().toString());
        message.setSendTime(new Date());
        log.info("+ message = {}", gson.toJson(message));
        kafkaTemplate.send("zhisheng", gson.toJson(message));
    }
}
In this way, the sending message code is implemented.
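To make the payload concrete, here is a small stand-alone sketch (not part of the project) that builds the same id and msg values the sender uses and formats them as JSON by hand. The real code uses gson.toJson(message), which additionally serializes sendTime with a locale-dependent default Date format, so the exact string on the topic will differ slightly:

```java
import java.util.UUID;

// Illustrative sketch: shows roughly what kafkaTemplate.send() publishes
// to the "zhisheng" topic. The real sender serializes the whole Message
// with Gson, including sendTime.
class PayloadSketch {
    static String buildPayload() {
        long id = System.currentTimeMillis();       // same as message.setId(...)
        String msg = UUID.randomUUID().toString();  // same as message.setMsg(...)
        return String.format("{\"id\":%d,\"msg\":\"%s\"}", id, msg);
    }

    public static void main(String[] args) {
        System.out.println(buildPayload());
    }
}
```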
The key call here is kafkaTemplate.send(). "zhisheng" is the Kafka topic; the Java program does not need to create it in Kafka beforehand, because the broker auto-creates the topic on first send (with default broker settings). gson.toJson(message) is the message payload. I will leave it at that for now and not go into detail; later I may dig into the source code and write a separate post about it (I hit a pitfall here and stepped through the source several times).
4. Message receiving class KafkaReceiver.java
@Component
@Slf4j
public class KafkaReceiver {

    @KafkaListener(topics = {"zhisheng"})
    public void listen(ConsumerRecord record) {
        Optional kafkaMessage = Optional.ofNullable(record.value());
        if (kafkaMessage.isPresent()) {
            Object message = kafkaMessage.get();
            log.info("- record = " + record);
            log.info("- message = " + message);
        }
    }
}
Receiving messages on the consumer side is very easy: just annotate a method with @KafkaListener and set the topics to listen to in the annotation. topics is an array, so one listener can bind several topics; the code above could be changed to @KafkaListener(topics = {"zhisheng", "tian"}) to listen to two topics at once. Note that the topic here must match the topic used in the sending class KafkaSender.java.
5. Start the class KafkaApplication.java
@SpringBootApplication
public class KafkaApplication {
    public static void main(String[] args) {
        ConfigurableApplicationContext context = SpringApplication.run(KafkaApplication.class, args);
        KafkaSender sender = context.getBean(KafkaSender.class);
        for (int i = 0; i < 3; i++) {
            // call the send method of the message sending class
            sender.send();
            try {
                Thread.sleep(3000);
            } catch (InterruptedException e) {
                e.printStackTrace();
            }
        }
    }
}
6. Configuration file application.properties
#============== kafka ===================
# kafka broker address; multiple addresses are allowed
spring.kafka.bootstrap-servers=192.168.153.135:9092

#============== producer =================
spring.kafka.producer.retries=0
# batch size for sending messages
spring.kafka.producer.batch-size=16384
spring.kafka.producer.buffer-memory=33554432
# serializers for the message key and the message body
spring.kafka.producer.key-serializer=org.apache.kafka.common.serialization.StringSerializer
spring.kafka.producer.value-serializer=org.apache.kafka.common.serialization.StringSerializer

#============== consumer =================
# default consumer group id
spring.kafka.consumer.group-id=test-consumer-group
spring.kafka.consumer.auto-offset-reset=earliest
spring.kafka.consumer.enable-auto-commit=true
spring.kafka.consumer.auto-commit-interval=100
# deserializers for the message key and the message body
spring.kafka.consumer.key-deserializer=org.apache.kafka.common.serialization.StringDeserializer
spring.kafka.consumer.value-deserializer=org.apache.kafka.common.serialization.StringDeserializer
Set spring.kafka.bootstrap-servers to the IP address of the machine where Kafka is installed, keeping port 9092. For a simple integration like this, the remaining defaults are fine.
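For reference, spring.kafka.bootstrap-servers accepts a comma-separated list of brokers, so a multi-broker cluster could be configured like this (the second address is a hypothetical example):

```properties
# several brokers, comma-separated (the .136 host is hypothetical)
spring.kafka.bootstrap-servers=192.168.153.135:9092,192.168.153.136:9092
```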
Kafka Setup
In the directory where you installed Kafka:
Start ZooKeeper
Start a single-node ZooKeeper instance using the script shipped in the installation package:
bin/zookeeper-server-start.sh -daemon config/zookeeper.properties
Start the Kafka service
Start the Kafka broker with kafka-server-start.sh:
bin/kafka-server-start.sh config/server.properties
Both services should now be running.
Note: remember to turn off the firewall on your virtual machine or server, or open Kafka's port 9092.
Running
If you see the sent and received messages printed in the console log, the integration works!
Now let's look at the list of topics in Kafka:
bin/kafka-topics.sh --list --zookeeper localhost:2181
You will find that the "zhisheng" topic from our program has been created automatically.
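To inspect the messages themselves, you can also read the topic back with the console consumer that ships with Kafka (assuming the same single-node setup on localhost):

```shell
# print everything published to the topic so far
bin/kafka-console-consumer.sh --bootstrap-server localhost:9092 \
    --topic zhisheng --from-beginning
```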
Thank you for reading. That covers "SpringBoot Kafka integration"; after working through this article you should have a deeper understanding of the topic, though the details are best verified in practice. The editor will keep publishing more articles on related topics, so feel free to follow!