I. Installation and Startup
1. Download the installation package from the official Apache Kafka website.
2. Extract the archive:
tar -zxvf kafka_2.13-3.7.0.tgz
3. Enter the directory, then start ZooKeeper and Kafka:
cd kafka_2.13-3.7.0
Update the data directories in the configuration files:
vim ./config/zookeeper.properties
Change dataDir to a directory of your choice.
vim ./config/server.properties
Change log.dirs to a directory of your choice.
Start ZooKeeper:
./bin/zookeeper-server-start.sh ./config/zookeeper.properties
Start Kafka:
./bin/kafka-server-start.sh ./config/server.properties
At this point the Kafka service has started successfully. On Windows, the equivalent startup scripts are located under the bin/windows directory.
II. Integrating Kafka with Spring Boot
1. Add the Dependency
First, add the required dependency to your pom.xml:
<dependencies>
    <!-- Spring for Apache Kafka (version managed by Spring Boot) -->
    <dependency>
        <groupId>org.springframework.kafka</groupId>
        <artifactId>spring-kafka</artifactId>
    </dependency>
    <!-- other dependencies -->
</dependencies>
2. Configure Kafka Properties
Configure the Kafka properties in application.properties or application.yml:
application.properties
# Kafka broker address
spring.kafka.bootstrap-servers=localhost:9092
# Consumer group ID
spring.kafka.consumer.group-id=my-group
# Producer configuration
spring.kafka.producer.key-serializer=org.apache.kafka.common.serialization.StringSerializer
spring.kafka.producer.value-serializer=org.apache.kafka.common.serialization.StringSerializer
# Consumer configuration
spring.kafka.consumer.key-deserializer=org.apache.kafka.common.serialization.StringDeserializer
spring.kafka.consumer.value-deserializer=org.apache.kafka.common.serialization.StringDeserializer
application.yml
spring:
  kafka:
    bootstrap-servers: localhost:9092
    consumer:
      group-id: my-group
      key-deserializer: org.apache.kafka.common.serialization.StringDeserializer
      value-deserializer: org.apache.kafka.common.serialization.StringDeserializer
    producer:
      key-serializer: org.apache.kafka.common.serialization.StringSerializer
      value-serializer: org.apache.kafka.common.serialization.StringSerializer
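If you prefer Java configuration over property files, an equivalent producer setup can be declared as beans. The following is a minimal sketch, assuming the same broker address and String serializers as the properties above; the class name KafkaProducerConfig is only illustrative:
import java.util.HashMap;
import java.util.Map;

import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.common.serialization.StringSerializer;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.kafka.core.DefaultKafkaProducerFactory;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.kafka.core.ProducerFactory;

@Configuration
public class KafkaProducerConfig {

    @Bean
    public ProducerFactory<String, String> producerFactory() {
        // Mirrors the spring.kafka.producer.* properties shown above.
        Map<String, Object> props = new HashMap<>();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        return new DefaultKafkaProducerFactory<>(props);
    }

    @Bean
    public KafkaTemplate<String, String> kafkaTemplate() {
        // The template used by the producer service in the next step.
        return new KafkaTemplate<>(producerFactory());
    }
}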
3. Create a Kafka Producer
Create a Kafka producer class that sends messages:
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.stereotype.Service;

@Service
public class KafkaProducer {

    private static final String TOPIC = "my_topic";

    @Autowired
    private KafkaTemplate<String, String> kafkaTemplate;

    public void sendMessage(String message) {
        kafkaTemplate.send(TOPIC, message);
    }
}
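send() is asynchronous; to confirm delivery you can attach a callback to the returned future. A minimal sketch of an extra method for the KafkaProducer class above, assuming Spring Kafka 3.x where send() returns a CompletableFuture of SendResult (2.x returns a ListenableFuture instead):
// Additional method for the KafkaProducer class above (sketch, assumes Spring Kafka 3.x).
public void sendMessageWithCallback(String message) {
    kafkaTemplate.send(TOPIC, message).whenComplete((result, ex) -> {
        if (ex == null) {
            // On success, SendResult exposes the record metadata (partition and offset).
            System.out.println("Sent to partition " + result.getRecordMetadata().partition()
                    + " at offset " + result.getRecordMetadata().offset());
        } else {
            System.err.println("Send failed: " + ex.getMessage());
        }
    });
}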
4. Create a Kafka Consumer
Create a Kafka consumer class that receives messages:
import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.stereotype.Service;

@Service
public class KafkaConsumer {

    @KafkaListener(topics = "my_topic", groupId = "my-group")
    public void consume(String message) {
        System.out.println("Consumed message: " + message);
    }
}
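If you need more than the message payload, a listener method can instead take the full ConsumerRecord, which exposes the key, partition, and offset. A minimal sketch of such an alternative listener, reusing the topic and group from above (use it instead of the plain String listener, since two listeners in the same group would split the topic's partitions between them):
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.stereotype.Service;

@Service
public class KafkaRecordConsumer {

    // Receives the whole record instead of only the String value.
    @KafkaListener(topics = "my_topic", groupId = "my-group")
    public void consume(ConsumerRecord<String, String> record) {
        System.out.println("key=" + record.key()
                + ", partition=" + record.partition()
                + ", offset=" + record.offset()
                + ", value=" + record.value());
    }
}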
5. Test the Kafka Integration
Inject KafkaProducer into a controller (or any other service) and call sendMessage to send a message:
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RequestParam;
import org.springframework.web.bind.annotation.RestController;

@RestController
public class KafkaController {

    @Autowired
    private KafkaProducer kafkaProducer;

    @GetMapping("/send")
    public String sendMessage(@RequestParam("message") String message) {
        kafkaProducer.sendMessage(message);
        return "Message sent to Kafka topic!";
    }
}
Start your Spring Boot application and open http://localhost:8080/send?message=HelloKafka. You should see the message printed to the console by the consumer.
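The integration can also be exercised from an automated test against an embedded broker. A minimal sketch, assuming the spring-kafka-test dependency is added (it is not in the pom above) and that the application is pointed at the embedded broker through the spring.embedded.kafka.brokers property that @EmbeddedKafka provides:
import org.junit.jupiter.api.Test;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.test.context.SpringBootTest;
import org.springframework.kafka.test.context.EmbeddedKafka;

@SpringBootTest(properties = "spring.kafka.bootstrap-servers=${spring.embedded.kafka.brokers}")
@EmbeddedKafka(partitions = 1, topics = "my_topic")
class KafkaProducerTest {

    @Autowired
    private KafkaProducer kafkaProducer;

    @Test
    void sendsMessageToEmbeddedBroker() {
        // Publishes against the embedded broker started for this test;
        // the KafkaConsumer listener defined above will pick the message up and log it.
        kafkaProducer.sendMessage("HelloKafka");
    }
}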
III. Some Common Commands
Delete a topic:
bin/kafka-topics.sh --delete --topic cdc-analysis-infected-topic --bootstrap-server localhost:9092
Describe a topic:
bin/kafka-topics.sh --describe --topic cdc-analysis-infected-topic --bootstrap-server localhost:9092
List all topics:
bin/kafka-topics.sh --list --bootstrap-server localhost:9092
Check a consumer group's lag (message backlog). Note that deleting the topic is not a fix for a backlog; it can leave the group's offset stuck:
bin/kafka-consumer-groups.sh --bootstrap-server localhost:9092 --describe --group group2
Stop the consumer service first, then manually reset the offset of the problematic topic to skip past the stuck position:
./bin/kafka-consumer-groups.sh --bootstrap-server localhost:9092 --group group3 --reset-offsets --to-offset 1915 --topic cdc-analysis-infected-topic2 --execute
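The same administrative tasks can also be performed from Java with Kafka's AdminClient. A minimal sketch (the topic name mirrors the CLI examples above; adjust the broker address and topic to your environment):
import java.util.Collections;
import java.util.Properties;
import java.util.Set;

import org.apache.kafka.clients.admin.AdminClient;
import org.apache.kafka.clients.admin.AdminClientConfig;

public class KafkaAdminExample {

    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        props.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");

        try (AdminClient admin = AdminClient.create(props)) {
            // List topics (equivalent to kafka-topics.sh --list)
            Set<String> topics = admin.listTopics().names().get();
            System.out.println("Topics: " + topics);

            // Delete a topic (equivalent to kafka-topics.sh --delete)
            admin.deleteTopics(Collections.singleton("cdc-analysis-infected-topic")).all().get();
        }
    }
}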