Kafka Consumer .NET: "Protocol message end-group tag did not match expected tag."

Posted 2025-01-10 04:41:48


I am trying to read data from Kafka, as you can see:

var config = new ConsumerConfig
{
    BootstrapServers = "*******",
    GroupId = Guid.NewGuid().ToString(),
    AutoOffsetReset = AutoOffsetReset.Earliest
};
MessageParser<AdminIpoChange> parser = new(() => new AdminIpoChange());
using (var consumer = new ConsumerBuilder<Ignore, byte[]>(config).Build())
{
    consumer.Subscribe("AdminIpoChange");

    while (true)
    {
        AdminIpoChange item = new AdminIpoChange();
        var cr = consumer.Consume();

        item = parser.ParseFrom(new ReadOnlySpan<byte>(cr.Message.Value).ToArray());
    }

    consumer.Close();
}

I am using Google Protobuf to send and receive the data. This code throws the following error at the parser line:

 KafkaConsumer.ConsumeAsync: Protocol message end-group tag did not match expected tag.
Google.Protobuf.InvalidProtocolBufferException: Protocol message end-group tag did not match expected tag.
   at Google.Protobuf.ParsingPrimitivesMessages.CheckLastTagWas(ParserInternalState& state, UInt32 expectedTag)
   at Google.Protobuf.ParsingPrimitivesMessages.ReadGroup(ParseContext& ctx, Int32 fieldNumber, UnknownFieldSet set)
   at Google.Protobuf.UnknownFieldSet.MergeFieldFrom(ParseContext& ctx)
   at Google.Protobuf.UnknownFieldSet.MergeFieldFrom(UnknownFieldSet unknownFields, ParseContext& ctx)
   at AdminIpoChange.pb::Google.Protobuf.IBufferMessage.InternalMergeFrom(ParseContext& input) in D:\MofidProject\domain\obj\Debug\net6.0\Protos\Rlc\AdminIpoChange.cs:line 213
   at Google.Protobuf.ParsingPrimitivesMessages.ReadRawMessage(ParseContext& ctx, IMessage message)
   at Google.Protobuf.CodedInputStream.ReadRawMessage(IMessage message)
   at AdminIpoChange.MergeFrom(CodedInputStream input) in D:\MofidProject\domain\obj\Debug\net6.0\Protos\Rlc\AdminIpoChange.cs:line 188
   at Google.Protobuf.MessageExtensions.MergeFrom(IMessage message, Byte[] data, Boolean discardUnknownFields, ExtensionRegistry registry)
   at Google.Protobuf.MessageParser`1.ParseFrom(Byte[] data)
   at infrastructure.Queue.Kafka.KafkaConsumer.ConsumeCarefully[T](Func`2 consumeFunc, String topic, String group) in D:\MofidProject\infrastructure\Queue\Kafka\KafkaConsumer.cs:line 168

D:\MofidProject\mts.consumer.plus\bin\Debug\net6.0\mts.consumer.plus.exe (process 15516) exited with code -1001.
To automatically close the console when debugging stops, enable Tools->Options->Debugging->Automatically close the console when debugging stops.

Update:

My sample data coming from Kafka:

 - {"SymbolName":"\u0641\u062F\u0631","SymbolIsin":"IRo3pzAZ0002","Date":"1400/12/15","Time":"08:00-12:00","MinPrice":17726,"MaxPrice":21666,"Share":1000,"Show":false,"Operation":0,"Id":"100d8e0b54154e9d902054bff193e875","CreateDateTime":"2022-02-26T09:47:20.0134757+03:30"}

My Rlc model:

syntax = "proto3";

message AdminIpoChange
{
    string Id = 1;
    string SymbolName = 2;
    string SymbolIsin = 3;
    string Date = 4;
    string Time = 5;
    double MinPrice = 6;
    double MaxPrice = 7;
    int32 Share = 8;
    bool Show = 9;
    int32 Operation = 10;
    string CreateDateTime = 11;

    enum AdminIpoOperation
    {
        Add = 0;
        Edit = 1;
        Delete = 2;
    }
}

My data in bytes:

7B 22 53 79 6D 62 6F 6C 4E 61 6D 65 22 3A 22 5C 75 30 36 34 31 5C 75 30 36 32 46 5C 75 30 
36 33 31 22 2C 22 53 79 6D 62 6F 6C 49 73 69 6E 22 3A 22 49 52 6F 33 70 7A 41 5A 30 30 30 
32 22 2C 22 44 61 74 65 22 3A 22 31 34 30 30 2F 31 32 2F 31 35 22 2C 22 54 69 6D 65 22 3A 
22 30 38 3A 30 30 2D 31 32 3A 30 30 22 2C 22 4D 69 6E 50 72 69 63 65 22 3A 31 37 37 32 36 
2C 22 4D 61 78 50 72 69 63 65 22 3A 32 31 36 36 36 2C 22 53 68 61 72 65 22 3A 31 30 30 30 
2C 22 53 68 6F 77 22 3A 66 61 6C 73 65 2C 22 4F 70 65 72 61 74 69 6F 6E 22 3A 30 2C 22 49 
64 22 3A 22 31 30 30 64 38 65 30 62 35 34 31 35 34 65 39 64 39 30 32 30 35 34 62 66 66 31 
39 33 65 38 37 35 22 2C 22 43 72 65 61 74 65 44 61 74 65 54 69 6D 65 22 3A 22 32 30 32 32 
2D 30 32 2D 32 36 54 30 39 3A 34 37 3A 32 30 2E 30 31 33 34 37 35 37 2B 30 33 3A 33 30 22 
7D 
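
For reference, a hex dump like the one above can be produced from the consumed message with standard .NET APIs; a minimal sketch, reusing cr from the consume loop:

// hex-dump the raw message bytes received from Kafka
string hex = BitConverter.ToString(cr.Message.Value).Replace("-", " ");
Console.WriteLine(hex); // 7B 22 53 79 6D ...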

Comments (1)

孤檠 2025-01-17 04:41:48


The data is definitely not protobuf binary; byte 0 starts a group with field number 15; inside this group is:

  • field 4, string
  • field 13, fixed32
  • field 6, varint
  • field 12, fixed32
  • field 6, varint

after this (at byte 151), an end-group token is encountered with field number 6
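
For reference, that field/wire-type breakdown falls straight out of the protobuf wire format: each tag packs the field number into the high bits and the wire type into the low 3 bits. A minimal sketch, decoding the first byte of the dump above:

byte first = 0x7B;            // first byte of the data ('{' in ASCII)
int fieldNumber = first >> 3; // 123 >> 3 = 15
int wireType = first & 0x7;   // 123 & 7 = 3, i.e. start-group
Console.WriteLine($"field {fieldNumber}, wire type {wireType}");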

There are many striking things about this:

  1. your schema doesn't use groups (in fact, the mere existence of groups is now hard to find in the docs), so ... none of this looks right
  2. end-group tokens are always required to match the last start-group field number, which it doesn't
  3. fields inside a single level are usually (although as a "should", not a "must") written in numerical order
  4. you have no field 12 or 13 declared
  5. your field 6 is of the wrong type - we expect fixed64 here, but got varint

So: there's no doubt about it: that data is ... not what you expect. It certainly isn't valid protobuf binary. Without knowing how that data is stored, all we can do is guess, but on a hunch: let's try decoding it as UTF8 and see what it looks like:

{"SymbolName":"\u0641\u062F\u0631","SymbolIsin":"IRo3pzAZ0002","Date":"1400/12/15","Time":"08:00-12:00","MinPrice":17726,"MaxPrice":21666,"Share":1000,"Show":false,"Operation":0,"Id":"100d8e0b54154e9d902054bff193e875","CreateDateTime":"2022-02-26T09:47:20.0134757+03:30"}

or (formatted)

{
  "SymbolName": "\u0641\u062F\u0631",
  "SymbolIsin": "IRo3pzAZ0002",
  "Date": "1400/12/15",
  "Time": "08:00-12:00",
  "MinPrice": 17726,
  "MaxPrice": 21666,
  "Share": 1000,
  "Show": false,
  "Operation": 0,
  "Id": "100d8e0b54154e9d902054bff193e875",
  "CreateDateTime": "2022-02-26T09:47:20.0134757+03:30"
}

Oops! You've written the data as JSON, and you're trying to decode it as binary protobuf. Decode it as JSON instead, and you should be fine. If this was written with the protobuf JSON API: decode it with the protobuf JSON API.
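
A minimal sketch of that fix, reusing cr from the consume loop in the question and assuming the producer really is emitting protobuf-JSON (Google.Protobuf's JsonParser accepts the original proto field names, which is what this payload uses):

using System.Text;
using Google.Protobuf;

// cr.Message.Value is the raw byte[] from Kafka; it holds UTF8 JSON, not binary protobuf
string json = Encoding.UTF8.GetString(cr.Message.Value);

// JsonParser is the protobuf JSON API counterpart to MessageParser.ParseFrom
AdminIpoChange item = JsonParser.Default.Parse<AdminIpoChange>(json);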
