Flink Kafka connector exactly-once error

I'm using Flink 1.15.0. For the Kafka integration I wrote the following.

KafkaSource:

public static KafkaSource<String> kafkaSource(String bootstrapServers, String topic, String groupId) {
    return KafkaSource.<String>builder()
            .setBootstrapServers(bootstrapServers)
            .setTopics(topic)
            .setGroupId(groupId)
            .setStartingOffsets(OffsetsInitializer.latest())
            .setValueOnlyDeserializer(new SimpleStringSchema())
            .build();
}

and KafkaSink:

public static KafkaSink<WordCountPojo> kafkaSink(String brokers, String topic, Properties producerProperties) {
    return KafkaSink.<WordCountPojo>builder()
            .setKafkaProducerConfig(producerProperties)
            .setBootstrapServers(brokers)
            .setRecordSerializer(new WordCountPojoSerializer(topic))
            .setDeliverGuarantee(DeliveryGuarantee.EXACTLY_ONCE)
            .setTransactionalIdPrefix(UUID.randomUUID().toString())
            .build();
}
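
For context, this is roughly how I wire the two together (a simplified sketch: broker, topic and job names are illustrative, the real word-count aggregation is reduced to a single map, a WordCountPojo(String, int) constructor is assumed, and producerProperties is shown further down):

StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
// With DeliveryGuarantee.EXACTLY_ONCE the sink commits its Kafka transaction
// only when a checkpoint completes, so checkpointing must be enabled.
env.enableCheckpointing(10_000, CheckpointingMode.EXACTLY_ONCE);

DataStream<String> lines = env.fromSource(
        kafkaSource("localhost:9092", "input-topic", "word-count-group"),
        WatermarkStrategy.noWatermarks(),
        "kafka-source");

lines.map(line -> new WordCountPojo(line, 1)) // real aggregation omitted
     .sinkTo(kafkaSink("localhost:9092", "output-topic", producerProperties));

env.execute("word-count");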

For completeness, this is my custom serializer:

import com.fasterxml.jackson.core.JsonProcessingException;
import com.fasterxml.jackson.databind.ObjectMapper;
import org.apache.flink.connector.kafka.sink.KafkaRecordSerializationSchema;
import org.apache.kafka.clients.producer.ProducerRecord;

public class WordCountPojoSerializer implements KafkaRecordSerializationSchema<WordCountPojo> {
    private final String topic;
    private final ObjectMapper mapper;

    public WordCountPojoSerializer(String topic) {
        this.topic = topic;
        mapper = new ObjectMapper();
    }

    @Override
    public ProducerRecord<byte[], byte[]> serialize(WordCountPojo wordCountPojo, KafkaSinkContext kafkaSinkContext, Long timestamp) {
        try {
            byte[] serializedValue = mapper.writeValueAsBytes(wordCountPojo);
            // No key and no explicit partition; the record carries only the JSON value.
            return new ProducerRecord<>(topic, null, timestamp, null, serializedValue);
        } catch (JsonProcessingException e) {
            // Returning null here would make the sink fail with an NPE on send,
            // so surface the serialization error instead.
            throw new IllegalStateException("Could not serialize " + wordCountPojo, e);
        }
    }
}

The property transaction.timeout.ms is set to 60_000 ms (1 minute).
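
For reference, the producer properties are built roughly like this (a sketch; ProducerConfig.TRANSACTION_TIMEOUT_CONFIG is just the constant for transaction.timeout.ms):

Properties producerProperties = new Properties();
// Must stay at or below the broker-side transaction.max.timeout.ms
// (900_000 ms by default), otherwise the broker rejects the transactional producer.
producerProperties.setProperty(ProducerConfig.TRANSACTION_TIMEOUT_CONFIG, "60000");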

My application runs for a while and then suddenly stops with this exception:

Caused by: org.apache.kafka.common.errors.InvalidProducerEpochException: Producer attempted to produce with an old epoch.

It seems that the producer tries to produce records between two transactions, but I'm not sure.
I could increase transaction.timeout.ms up to the maximum of 900_000 ms (15 minutes), but I don't think that would solve the problem.
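
My current understanding (from the docs, so take it with a grain of salt) is that with EXACTLY_ONCE a transaction stays open from one checkpoint to the next, so the broker aborts it, and bumps the producer epoch, whenever the checkpoint cycle takes longer than the transaction timeout; the next write then fails exactly like above. If that is right, the constraint would be roughly the following (reusing env from the sketch above):

// Rough constraint, not an official formula:
//   checkpoint interval + checkpoint duration + recovery time < transaction.timeout.ms
// With a 60 s timeout the checkpoint cycle has to finish comfortably within a minute, e.g.:
env.enableCheckpointing(10_000);                        // trigger a checkpoint every 10 s
env.getCheckpointConfig().setCheckpointTimeout(30_000); // abort checkpoints slower than 30 s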

Can you help me please?
