Exception in thread "main" java.lang.NoClassDefFoundError: com/fasterxml/jackson/core/util/JacksonFeature


I'm running into an exception while running a Kafka Streams Java app with the Avro serde in the IntelliJ IDEA IDE.
The same topic/app runs without problems in JSON format.

Error

Exception in thread "main" java.lang.NoClassDefFoundError: com/fasterxml/jackson/core/util/JacksonFeature
    at com.fasterxml.jackson.databind.ObjectMapper.(ObjectMapper.java:655)
    at com.fasterxml.jackson.databind.ObjectMapper.(ObjectMapper.java:558)
    at io.confluent.kafka.schemaregistry.utils.JacksonMapper.(JacksonMapper.java:24)
    at io.confluent.kafka.schemaregistry.client.rest.RestService.(RestService.java:151)
    at io.confluent.kafka.schemaregistry.client.CachedSchemaRegistryClient.(CachedSchemaRegistryClient.java:153)
    at io.confluent.kafka.serializers.AbstractKafkaSchemaSerDe.configureClientProperties(AbstractKafkaSchemaSerDe.java:83)
    at io.confluent.kafka.serializers.AbstractKafkaAvroSerializer.configure(AbstractKafkaAvroSerializer.java:56)
    at io.confluent.kafka.serializers.KafkaAvroSerializer.configure(KafkaAvroSerializer.java:50)
    at io.confluent.kafka.streams.serdes.avro.SpecificAvroSerializer.configure(SpecificAvroSerializer.java:58)

So I've changed all possible Jackson packages several times, with no luck.
I can see that jackson-databind is being called, but I don't see it present in the Confluent registry.
So I used jackson-bom in my POM file and changed all the major versions; still no luck.
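
For reference, the jackson-bom attempt looked roughly like this (it is not in the POM below because I took it back out after it made no difference, and 2.12.6 just stands in for the several versions I cycled through):

<dependencyManagement>
    <dependencies>
        <!-- Pin every com.fasterxml.jackson artifact to one consistent version -->
        <dependency>
            <groupId>com.fasterxml.jackson</groupId>
            <artifactId>jackson-bom</artifactId>
            <version>2.12.6</version>
            <type>pom</type>
            <scope>import</scope>
        </dependency>
    </dependencies>
</dependencyManagement>

If it helps, I can also post the output of mvn dependency:tree -Dincludes=com.fasterxml.jackson.core, which should show which jackson-core version actually ends up on the classpath.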

Relevant project files

POM file

<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>

    <groupId>pmu</groupId>
    <artifactId>PMU_StreamingAPP_Demo</artifactId>
    <version>0.1.1</version>

    <properties>
        <java.version>1.8</java.version>
        <confluent.version>7.0.1</confluent.version>
        <kafka.version>3.0.1</kafka.version>
        <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
        <maven.compiler.source>8</maven.compiler.source>
        <maven.compiler.target>8</maven.compiler.target>
    </properties>


    <repositories>
        <repository>
            <id>confluent</id>
            <url>https://packages.confluent.io/maven/</url>
        </repository>
    </repositories>


    <build>
        <plugins>
            <!-- Maven Compiler Plugin-->
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-compiler-plugin</artifactId>
                <version>3.10.1</version>
                <configuration>
                    <source>${java.version}</source>
                    <target>${java.version}</target>
                </configuration>
            </plugin>

            <!-- Maven Avro plugin for generating pojo-->
            <plugin>
                <groupId>org.apache.avro</groupId>
                <artifactId>avro-maven-plugin</artifactId>
                <version>1.10.2</version>
                <executions>
                    <execution>
                        <phase>generate-sources</phase>
                        <goals>
                            <goal>schema</goal>
                        </goals>
                        <configuration>
                            <sourceDirectory>${project.basedir}/src/main/resources/schema/</sourceDirectory>
                            <outputDirectory>${project.basedir}/src/main/java/</outputDirectory>
                        </configuration>
                    </execution>
                </executions>
            </plugin>
        </plugins>
    </build>

    <dependencies>
        <!-- Confluent Kafka Avro Serializer-->
        <dependency>
            <groupId>io.confluent</groupId>
            <artifactId>kafka-streams-avro-serde</artifactId>
            <version>${confluent.version}</version>
        </dependency>
        <!-- Apache Kafka Clients-->
        <dependency>
            <groupId>org.apache.kafka</groupId>
            <artifactId>kafka-streams</artifactId>
            <version>${kafka.version}</version>
        </dependency>
        <!-- Apache Log4J2 binding for SLF4J -->
        <dependency>
            <groupId>org.apache.logging.log4j</groupId>
            <artifactId>log4j-slf4j-impl</artifactId>
            <version>2.17.1</version>
        </dependency>
    </dependencies>

</project>

Topic Avro schema (PosInvoice):

{
  "namespace": "guru.learningjournal.kafka.examples.types",
  "type": "record",
  "name": "PosInvoice",
  "fields": [
    {"name": "InvoiceNumber","type": ["null","string"]},
    {"name": "CreatedTime","type": ["null","long"],"default" : null},
    {"name": "CustomerCardNo","type": ["null","double"]},
    {"name": "TotalAmount","type": ["null","double"]},
    {"name": "NumberOfItems","type": ["null","int"]},
    {"name": "PaymentMethod","type": ["null","string"]},
    {"name": "TaxableAmount","type": ["null","double"]},
    {"name": "CGST","type": ["null","double"]},
    {"name": "SGST","type": ["null","double"]},
    {"name": "CESS","type": ["null","double"]},
    {"name": "StoreID","type": ["null","string"]},
    {"name": "PosID","type": ["null","string"]},
    {"name": "CashierID","type": ["null","string"]},
    {"name": "CustomerType","type": ["null","string"]},
    {"name": "DeliveryType","type": ["null","string"]}
  ]
}

Kafka Streams code that breaks the app:

The stream is created/loaded OK:

    StreamsBuilder builder = new StreamsBuilder();
    KStream<String, PosInvoice> KS0 = builder.stream(AppConfigs.posTopicName,
        Consumed.with(AppSerdes.String(), AppSerdes.PosInvoice()));

But after that, this code breaks the app:

KS0.filter((k, v) -> v.getDeliveryType().equals("Home_Delivery"))
    .to(AppConfigs.shipmentTopicName, Produced.with(AppSerdes.String(), AppSerdes.PosInvoice()));

It seems that the stream is serialized OK, but after that the code breaks on the KStream.filter method.
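
For completeness, AppSerdes is a small helper class that is not shown above; it is essentially just a wrapper around the Confluent serdes. A simplified sketch of what it does (the schema registry URL here is a placeholder for the real value from my config, and the exact shape of the class is abbreviated):

import guru.learningjournal.kafka.examples.types.PosInvoice;
import io.confluent.kafka.serializers.AbstractKafkaSchemaSerDeConfig;
import io.confluent.kafka.streams.serdes.avro.SpecificAvroSerde;
import org.apache.kafka.common.serialization.Serde;
import org.apache.kafka.common.serialization.Serdes;

import java.util.Collections;
import java.util.Map;

// Simplified sketch of the serde helper; extending Serdes makes AppSerdes.String()
// available next to the Avro serde used for PosInvoice values.
class AppSerdes extends Serdes {
    static Serde<PosInvoice> PosInvoice() {
        SpecificAvroSerde<PosInvoice> serde = new SpecificAvroSerde<>();
        // Placeholder URL; the real value comes from the application config.
        Map<String, String> config = Collections.singletonMap(
                AbstractKafkaSchemaSerDeConfig.SCHEMA_REGISTRY_URL_CONFIG,
                "http://localhost:8081");
        serde.configure(config, false); // false = value serde, not key serde
        return serde;
    }
}

AppSerdes.String() comes straight from the Kafka Serdes base class.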
