Environment
canal version: v1.1.3-alpha-3
MySQL version: 5.6
Problem description
When the Kafka topic has multiple partitions, the data consumed on the consumer side is incomplete.
The main canal.properties configuration:

```properties
#########       binlog filter config
canal.instance.filter.druid.ddl = true
canal.instance.filter.query.dcl = true
canal.instance.filter.query.dml = false
canal.instance.filter.query.ddl = true
canal.instance.filter.table.error = true
canal.instance.filter.rows = false
canal.instance.filter.transaction.entry = false

##################################################
#########              MQ              ############
##################################################
canal.mq.servers = 192.168.1.15:9092,192.169.1.16:9092,192.168.1.15:9092
canal.mq.retries = 3
canal.mq.batchSize = 16384
canal.mq.maxRequestSize = 1048576
canal.mq.lingerMs = 1
canal.mq.bufferMemory = 33554432
canal.mq.canalBatchSize = 50
canal.mq.canalGetTimeout = 100
canal.mq.flatMessage = false
canal.mq.compressionType = none
canal.mq.acks = all
######### use transaction for kafka flatMessage batch produce
canal.mq.transaction = false
```
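For context, most of the canal.mq.* keys above are, as far as I know, passed through to the underlying Kafka producer. A rough sketch of the equivalent native producer settings (the class name is hypothetical; values are copied from the configuration above):

```java
import java.util.Properties;

public class CanalMqProducerProps {

    // Rough mapping of the canal.mq.* keys above onto the standard
    // Kafka producer settings they appear to configure.
    public static Properties kafkaProducerProps() {
        Properties props = new Properties();
        props.put("bootstrap.servers",
                "192.168.1.15:9092,192.169.1.16:9092,192.168.1.15:9092"); // canal.mq.servers
        props.put("retries", "3");                // canal.mq.retries
        props.put("batch.size", "16384");         // canal.mq.batchSize
        props.put("max.request.size", "1048576"); // canal.mq.maxRequestSize
        props.put("linger.ms", "1");              // canal.mq.lingerMs
        props.put("buffer.memory", "33554432");   // canal.mq.bufferMemory
        props.put("compression.type", "none");    // canal.mq.compressionType
        props.put("acks", "all");                 // canal.mq.acks
        return props;
    }
}
```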
The main instance.properties configuration:

```properties
#########        table regex
canal.instance.filter.regex=schema.table
#########        table black regex
canal.instance.filter.black.regex=

#########        mq config
canal.mq.topic=topic
canal.mq.partition=0
#########        hash partition config
canal.mq.partitionsNum=8
canal.mq.partitionHash=.*\\..*
```
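With canal.mq.partitionsNum=8 and a partitionHash rule matching every schema.table, canal routes each row change to a partition by hashing a key and taking it modulo the partition count. The sketch below illustrates the general technique only, not canal's exact algorithm, assuming the hash key is the schema.table name:

```java
public class PartitionHashSketch {

    // Illustrative hash-based partition routing: NOT canal's exact
    // implementation, just the modulo-hash technique it is based on.
    static int choosePartition(String schema, String table, int partitionsNum) {
        String hashKey = schema + "." + table;
        // Mask the sign bit so the index is always non-negative.
        return (hashKey.hashCode() & 0x7fffffff) % partitionsNum;
    }

    public static void main(String[] args) {
        // All rows from the same table land in the same partition,
        // so per-table ordering is preserved within that partition.
        System.out.println(choosePartition("schema", "table", 8));
    }
}
```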
Kafka consumer:

```java
@KafkaListener(topics = "${spring.kafka.consumer.topic}", containerFactory = "kafkaListenerContainerFactory")
public void consumerListener(KafkaMessage message, Acknowledgment ack) {
    try {
        boolean success = true;
        Message canalMessage = message.getMessage();
        if (canalMessage != null) {
            if (canalMessage.getId() != -1 && canalMessage.getEntries().size() > 0) {
                success = printEntry(canalMessage.getEntries());
            }
        }
        if (success) {
            ack.acknowledge();
        }
    } catch (Exception e) {
        logger.error(e.getMessage(), e);
    }
}
```
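Since the listener acknowledges manually via ack.acknowledge(), the kafkaListenerContainerFactory it references must use manual ack mode. A minimal sketch of such a factory (Spring Kafka 2.x API; KafkaMessage is the wrapper type from the listener above, and the ConsumerFactory is assumed to carry a custom deserializer for canal's binary Message format, since flatMessage is off):

```java
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.kafka.config.ConcurrentKafkaListenerContainerFactory;
import org.springframework.kafka.core.ConsumerFactory;
import org.springframework.kafka.listener.ContainerProperties;

@Configuration
public class KafkaConsumerConfig {

    @Bean
    public ConcurrentKafkaListenerContainerFactory<String, KafkaMessage> kafkaListenerContainerFactory(
            ConsumerFactory<String, KafkaMessage> consumerFactory) {
        ConcurrentKafkaListenerContainerFactory<String, KafkaMessage> factory =
                new ConcurrentKafkaListenerContainerFactory<>();
        factory.setConsumerFactory(consumerFactory);
        // Manual ack mode: offsets are committed only when the listener
        // calls ack.acknowledge() after a successful printEntry().
        factory.getContainerProperties().setAckMode(ContainerProperties.AckMode.MANUAL);
        // With an 8-partition topic, up to 8 consumer threads can each
        // own one partition; fewer threads means some own several.
        factory.setConcurrency(8);
        return factory;
    }
}
```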
Originally asked by GitHub user wangsaner
How did you test and verify this? I previously tested non-flat mode; the data was split across multiple partitions and the total count matched.
Originally answered by GitHub user agapple
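One way to check the totals independently of the listener is to assign a plain consumer to every partition, read from the beginning, and count records per partition. A rough sketch (topic name and broker taken from the configuration above; counting is at the raw message level, before any canal deserialization):

```java
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.PartitionInfo;
import org.apache.kafka.common.TopicPartition;
import org.apache.kafka.common.serialization.ByteArrayDeserializer;
import java.time.Duration;
import java.util.*;

public class PartitionCountCheck {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "192.168.1.15:9092"); // from the config above
        props.put("group.id", "count-check");                // hypothetical group id
        props.put("key.deserializer", ByteArrayDeserializer.class.getName());
        props.put("value.deserializer", ByteArrayDeserializer.class.getName());
        props.put("enable.auto.commit", "false");

        try (KafkaConsumer<byte[], byte[]> consumer = new KafkaConsumer<>(props)) {
            // Assign all partitions explicitly and rewind to the beginning.
            List<TopicPartition> tps = new ArrayList<>();
            for (PartitionInfo pi : consumer.partitionsFor("topic")) {
                tps.add(new TopicPartition(pi.topic(), pi.partition()));
            }
            consumer.assign(tps);
            consumer.seekToBeginning(tps);

            Map<Integer, Long> counts = new TreeMap<>();
            int emptyPolls = 0;
            while (emptyPolls < 5) { // stop after a few consecutive empty polls
                ConsumerRecords<byte[], byte[]> records = consumer.poll(Duration.ofSeconds(1));
                if (records.isEmpty()) { emptyPolls++; continue; }
                emptyPolls = 0;
                for (ConsumerRecord<byte[], byte[]> r : records) {
                    counts.merge(r.partition(), 1L, Long::sum);
                }
            }
            counts.forEach((p, c) -> System.out.println("partition " + p + " -> " + c + " messages"));
        }
    }
}
```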