Kafka Avro Deserializer with a local schema file
Background :
I used Spring Kafka to implement an Avro-based Consumer and Producer. The other important components, the Kafka broker, Zookeeper, and the Schema Registry, run in Docker containers. This works perfectly fine for me.
What I Want :
I want to have a Kafka Avro Deserializer (in the Consumer) which is independent of the Schema Registry. In my case, I have an Avro schema file which will not change. So I want to get rid of the additional step of using the Schema Registry on the Consumer side and use a local schema file instead.
If the Producer serializer uses the Schema Registry, then the Consumer should as well. Avro requires you to have a reader and a writer schema.
If the Consumer, for whatever reason, cannot access the Registry over the network, you would need to use ByteArrayDeserializer, then take the byte slice after position 5 (the 0x0 magic byte plus the 4-byte schema ID integer) of the byte[] from the part of the record you want to parse. Then, from the Avro API, you can use
GenericDatumReader along with your local Schema reference to get a GenericRecord instance, but this would assume your reader and writer schema are exactly the same (which shouldn't be relied on, as the producer could change the schema at any time). Or you can create a SpecificRecord from the schema you have and configure the KafkaAvroDeserializer to use that.
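The byte-slicing approach above can be sketched as follows. This is a minimal sketch under two assumptions: the producer used Confluent's KafkaAvroSerializer wire format (a 0x0 magic byte followed by a 4-byte big-endian schema ID, then the Avro-encoded payload), and the local schema is byte-for-byte compatible with the writer schema. The "User" schema and "name" field are made up for illustration; the main method fabricates a framed record so the sketch is self-contained.

```java
import java.io.ByteArrayOutputStream;
import java.nio.ByteBuffer;

import org.apache.avro.Schema;
import org.apache.avro.generic.GenericData;
import org.apache.avro.generic.GenericDatumReader;
import org.apache.avro.generic.GenericDatumWriter;
import org.apache.avro.generic.GenericRecord;
import org.apache.avro.io.BinaryDecoder;
import org.apache.avro.io.BinaryEncoder;
import org.apache.avro.io.DecoderFactory;
import org.apache.avro.io.EncoderFactory;

public class LocalSchemaDeserializer {

    // Hypothetical schema for illustration; in practice, load your .avsc file.
    static final String SCHEMA_JSON =
        "{\"type\":\"record\",\"name\":\"User\",\"fields\":"
      + "[{\"name\":\"name\",\"type\":\"string\"}]}";

    static final Schema SCHEMA = new Schema.Parser().parse(SCHEMA_JSON);

    // Strip the 5-byte Confluent wire-format header, then decode the rest
    // with the local schema acting as both reader and writer schema.
    public static GenericRecord deserialize(byte[] record) throws Exception {
        ByteBuffer buf = ByteBuffer.wrap(record);
        if (buf.get() != 0x0) {
            throw new IllegalArgumentException("Unknown magic byte");
        }
        buf.getInt(); // 4-byte schema ID: ignored, we trust the local schema
        byte[] payload = new byte[buf.remaining()];
        buf.get(payload);
        BinaryDecoder decoder = DecoderFactory.get().binaryDecoder(payload, null);
        return new GenericDatumReader<GenericRecord>(SCHEMA).read(null, decoder);
    }

    public static void main(String[] args) throws Exception {
        // Fabricate a wire-format record (stand-in for what the producer sends).
        GenericRecord user = new GenericData.Record(SCHEMA);
        user.put("name", "alice");
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        BinaryEncoder encoder = EncoderFactory.get().binaryEncoder(out, null);
        new GenericDatumWriter<GenericRecord>(SCHEMA).write(user, encoder);
        encoder.flush();
        byte[] avroBytes = out.toByteArray();

        ByteBuffer framed = ByteBuffer.allocate(5 + avroBytes.length);
        framed.put((byte) 0x0).putInt(42).put(avroBytes); // fake schema ID 42

        GenericRecord decoded = deserialize(framed.array());
        System.out.println(decoded.get("name")); // alice
    }
}
```

In a Spring Kafka setup you would wrap this logic in a class implementing org.apache.kafka.common.serialization.Deserializer and set it as the value deserializer, keeping in mind the caveat above: if the producer ever evolves the schema, this consumer will fail or misparse records.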