
Flink consume Kafka with schema registry

I have run into a problem: the data in Kafka is formatted as Avro with a schema registry server. I found it is not easy to consume this topic, since the provided Kafka source does not support this format, and I do not want to write a new Kafka source. Is there any way to use the provided Kafka source to consume a topic whose records are formatted as Avro with a schema registry? *From the volunteer-compiled Flink mailing-list archive

EXCEED 2021-12-08 13:52:10 1602 0
1 answer
  • I have run into the same problem recently.

    I ended up customizing the Avro-related serde schema to support the schema registry.

    The root cause is that, at serialization time, an Avro record written under the schema registry convention differs from an "original" Avro record written without it. The former prepends a 5-byte header to the real record bytes: 1 magic byte plus a 4-byte schema ID, which is the unique ID registered in the Kafka schema registry.

    I think Apache Flink should consider this case and support both plain Avro and schema-registry-formatted Avro. *From the volunteer-compiled Flink mailing-list archive
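    The 5-byte header described above is the Confluent wire format: one magic byte (0x00), a 4-byte big-endian schema ID, then the Avro-encoded body. As a minimal sketch (the function name is illustrative, not a Flink or Confluent API), the header can be split off before handing the payload to a plain Avro decoder:

    ```python
    import struct

    def split_confluent_wire_format(message: bytes):
        """Split a schema-registry-framed Kafka message into (schema_id, avro_payload).

        Wire format: 1 magic byte (0x00) + 4-byte big-endian schema ID + Avro body.
        """
        if len(message) < 5 or message[0] != 0:
            raise ValueError("not in Confluent wire format")
        # ">I" = big-endian unsigned 32-bit integer
        schema_id = struct.unpack(">I", message[1:5])[0]
        return schema_id, message[5:]

    # Example: a message framed with schema ID 42
    msg = b"\x00" + struct.pack(">I", 42) + b"avro-bytes"
    print(split_confluent_wire_format(msg))  # (42, b'avro-bytes')
    ```

    A deserializer that understands this framing would look up schema ID 42 in the registry to obtain the writer schema, then decode the remaining bytes as ordinary Avro; a plain Avro deserializer fails because it tries to interpret the 5 header bytes as record data.
    
    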

    2021-12-08 14:42:43