Spark job on GCP Dataproc fails with java.lang.NoSuchMethodError: io.netty.buffer.PooledByteBufAllocator

Posted on 2025-02-12 10:22:45


I'm running a Spark job on GCP Dataproc using the command below:

gcloud dataproc workflow-templates instantiate-from-file --file=job_config.yaml  --region us-east1

Below is my job_config.yaml:

jobs:
  - sparkJob:
      args:
        - filepath
      mainJarFileUri: gs://<host>/my-project-1.0.0-SNAPSHOT.jar
    stepId: sparkjob
placement:
  managedCluster:
    clusterName: mysparkcluster
    config:
      configBucket: my-dataproc-spark
      gceClusterConfig:
        zoneUri: 'us-east1-c'
        subnetworkUri: '<uri>'
      masterConfig:
        diskConfig:
          bootDiskSizeGb: 30
          bootDiskType: 'pd-ssd'
        machineTypeUri: 'e2-standard-4'
      workerConfig:
        diskConfig:
          bootDiskSizeGb: 30
          bootDiskType: 'pd-ssd'
        machineTypeUri: 'e2-standard-4'
        numInstances: 2
      secondaryWorkerConfig:
        diskConfig:
          bootDiskType: 'pd-ssd'
        numInstances: 2
      softwareConfig:
        imageVersion: '2.0.44-debian10'
        properties:
          dataproc:dataproc.logging.stackdriver.job.driver.enable: 'true'
          dataproc:dataproc.logging.stackdriver.job.yarn.container.enable: 'true'
          dataproc:dataproc.monitoring.stackdriver.enable: 'true'
          dataproc:dataproc.scheduler.driver-size-mb: '256'
          dataproc:ranger.cloud-sql.use-private-ip: 'true'
          spark:spark.dynamicAllocation.enabled: 'false'
          spark:spark.shuffle.service.enabled: 'false'
          spark:spark.executor.instances: '30'
          spark:spark.executor.memory: '9g'
          spark:spark.executors.cores: '3'

Below is my pom.xml:

<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
     xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>com.sai.spark</groupId>
<artifactId>spark-project</artifactId>
<version>0.0.1-SNAPSHOT</version>
<packaging>jar</packaging>

<dependencies>
    <dependency>
        <groupId>com.google.cloud.spark</groupId>
        <artifactId>spark-bigquery_2.12</artifactId>
        <version>0.22.0</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.12</artifactId>
        <version>3.1.2</version>
        <scope>compile</scope>
        <exclusions>
            <exclusion>
                <groupId>com.google.code.gson</groupId>
                <artifactId>gson</artifactId>
            </exclusion>
        </exclusions>
    </dependency>
    <dependency>
        <groupId>com.google.protobuf</groupId>
        <artifactId>protobuf-java</artifactId>
        <version>3.11.4</version>
    </dependency>
    <dependency>
        <groupId>org.scala-lang</groupId>
        <artifactId>scala-library</artifactId>
        <version>2.12.14</version>
    </dependency>
    <dependency>
        <groupId>org.apache.httpcomponents</groupId>
        <artifactId>httpclient</artifactId>
        <version>4.5.4</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-sql_2.12</artifactId>
        <version>3.1.2</version>
        <scope>compile</scope>
    </dependency>

    <dependency>
        <groupId>org.elasticsearch</groupId>
        <artifactId>elasticsearch</artifactId>
        <version>7.5.2</version>
    </dependency>
    <dependency>
        <groupId>com.google.cloud</groupId>
        <artifactId>google-cloud-secretmanager</artifactId>
        <version>0.3.0</version>
        <exclusions>
            <exclusion>
                <groupId>io.grpc</groupId>
                <artifactId>grpc-grpclb</artifactId>
            </exclusion>
        </exclusions>
    </dependency>
    <!-- https://mvnrepository.com/artifact/com.typesafe/config -->
    <dependency>
        <groupId>com.typesafe</groupId>
        <artifactId>config</artifactId>
        <version>1.4.0</version>
    </dependency>

</dependencies>
<build>
    <sourceDirectory>src/main/scala</sourceDirectory>
    <plugins>
      
        <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-shade-plugin</artifactId>
            <version>3.2.3</version>
            <executions>
                <execution>
                    <phase>package</phase>
                    <goals>
                        <goal>shade</goal>
                    </goals>
                    <configuration>
                        <createDependencyReducedPom>false</createDependencyReducedPom>
                        <artifactSet>
                            <includes>
                                <!-- dependencies to be packed in the fat jar -->
                                <include>*</include>
                            </includes>
                        </artifactSet>
                        <transformers>
                            <transformer
                                    implementation="org.apache.maven.plugins.shade.resource.AppendingTransformer">
                                <resource>reference.conf</resource>
                            </transformer>
                            <transformer
                                    implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
                                <manifestEntries>
                                    <Main-Class>com.sai.spark.Test</Main-Class>
                                </manifestEntries>
                            </transformer>
                            <transformer
                                    implementation="org.apache.maven.plugins.shade.resource.ServicesResourceTransformer"/>
                        </transformers>
                        <filters>
                            <filter>
                                <artifact>*:*</artifact>
                                <excludes>
                                    <exclude>META-INF/maven/**</exclude>
                                    <exclude>META-INF/*.SF</exclude>
                                    <exclude>META-INF/*.DSA</exclude>
                                    <exclude>META-INF/*.RSA</exclude>
                                </excludes>
                            </filter>
                        </filters>
                        <relocations>
                            <relocation>
                                <pattern>com</pattern>
                                <shadedPattern>repackaged.com</shadedPattern>
                                <includes>
                                    <include>com.google.protobuf.**</include>
                                    <include>com.google.common.**</include>
                                    <include>com.google.gson.**</include>
                                    <include>com.google.api.**</include>
                                </includes>
                            </relocation>
                           
                            <relocation>
                                <pattern>org</pattern>
                                <shadedPattern>repackaged.org</shadedPattern>
                                <includes>
                                    <include>org.apache.commons.**</include>
                                </includes>
                            </relocation>
                        </relocations>
                    </configuration>
                </execution>
            </executions>
        </plugin>
    </plugins>
</build>

Stack trace:

22/07/01 16:58:00 ERROR io.grpc.internal.ManagedChannelImpl: [Channel<1>: (secretmanager.googleapis.com:443)] Uncaught exception in the SynchronizationContext. Panic!
java.lang.NoSuchMethodError: io.netty.buffer.PooledByteBufAllocator.<init>(ZIIIIIIZ)V
    at io.grpc.netty.Utils.createByteBufAllocator(Utils.java:172)
    at io.grpc.netty.Utils.access$000(Utils.java:71)
    at io.grpc.netty.Utils$ByteBufAllocatorPreferDirectHolder.<clinit>(Utils.java:93)
    at io.grpc.netty.Utils.getByteBufAllocator(Utils.java:140)
    at io.grpc.netty.NettyClientTransport.start(NettyClientTransport.java:231)
    at io.grpc.internal.ForwardingConnectionClientTransport.start(ForwardingConnectionClientTransport.java:33)
    at io.grpc.internal.ForwardingConnectionClientTransport.start(ForwardingConnectionClientTransport.java:33)
    at io.grpc.internal.InternalSubchannel.startNewTransport(InternalSubchannel.java:258)
    at io.grpc.internal.InternalSubchannel.access$400(InternalSubchannel.java:65)
    at io.grpc.internal.InternalSubchannel$2.run(InternalSubchannel.java:200)
    at io.grpc.SynchronizationContext.drain(SynchronizationContext.java:95)
    at io.grpc.SynchronizationContext.execute(SynchronizationContext.java:127)
    at io.grpc.internal.ManagedChannelImpl$NameResolverListener.onResult(ManagedChannelImpl.java:1852)
    at io.grpc.internal.DnsNameResolver$Resolve.run(DnsNameResolver.java:333)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at java.lang.Thread.run(Thread.java:750)
Exception in thread "main" repackaged.com.google.api.gax.rpc.InternalException: io.grpc.StatusRuntimeException: INTERNAL: Panic! This is a bug!
    at repackaged.com.google.api.gax.rpc.ApiExceptionFactory.createException(ApiExceptionFactory.java:67)
    at repackaged.com.google.api.gax.grpc.GrpcApiExceptionFactory.create(GrpcApiExceptionFactory.java:72)
    at repackaged.com.google.api.gax.grpc.GrpcApiExceptionFactory.create(GrpcApiExceptionFactory.java:60)
    at repackaged.com.google.api.gax.grpc.GrpcExceptionCallable$ExceptionTransformingFuture.onFailure(GrpcExceptionCallable.java:97)
    at repackaged.com.google.api.core.ApiFutures$1.onFailure(ApiFutures.java:68)
    at repackaged.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1074)
    at repackaged.com.google.common.util.concurrent.DirectExecutor.execute(DirectExecutor.java:30)
    at repackaged.com.google.common.util.concurrent.AbstractFuture.executeListener(AbstractFuture.java:1213)
    at repackaged.com.google.common.util.concurrent.AbstractFuture.complete(AbstractFuture.java:983)
    at repackaged.com.google.common.util.concurrent.AbstractFuture.setException(AbstractFuture.java:771)
    at io.grpc.stub.ClientCalls$GrpcFuture.setException(ClientCalls.java:522)
    at io.grpc.stub.ClientCalls$UnaryStreamToFuture.onClose(ClientCalls.java:497)
    at io.grpc.internal.DelayedClientCall$DelayedListener$3.run(DelayedClientCall.java:463)
    at io.grpc.internal.DelayedClientCall$DelayedListener.delayOrExecute(DelayedClientCall.java:427)
    at io.grpc.internal.DelayedClientCall$DelayedListener.onClose(DelayedClientCall.java:460)
    at io.grpc.internal.ClientCallImpl.closeObserver(ClientCallImpl.java:553)
    at io.grpc.internal.ClientCallImpl.access$300(ClientCallImpl.java:68)
    at io.grpc.internal.ClientCallImpl$ClientStreamListenerImpl$1StreamClosed.runInternal(ClientCallImpl.java:739)
    at io.grpc.internal.ClientCallImpl$ClientStreamListenerImpl$1StreamClosed.runInContext(ClientCallImpl.java:718)
    at io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
    at io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:180)
    at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at java.lang.Thread.run(Thread.java:750)

I have tried the solutions suggested in related Stack Overflow questions, but the issue is still not resolved.
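For context: a `NoSuchMethodError` on `PooledByteBufAllocator.<init>` usually means two different Netty builds are on the classpath, and grpc-netty is calling a constructor that the Netty version which actually got loaded (here presumably the older one pulled in transitively by Elasticsearch 7.5.2) does not have. `mvn dependency:tree -Dincludes=io.netty` shows who pulls in which version at build time; as a runtime check, a small helper like the sketch below (a hypothetical diagnostic, not part of the original job) prints which jar a given class was actually loaded from:

```java
// Hypothetical diagnostic: print the jar (or classloader) a class came from.
// Run it with the same classpath as the failing job to see which Netty wins.
public class WhichJar {
    static String locationOf(String className) throws ClassNotFoundException {
        java.security.CodeSource src = Class.forName(className)
                .getProtectionDomain().getCodeSource();
        // JDK/bootstrap classes have no CodeSource.
        return (src == null || src.getLocation() == null)
                ? "bootstrap classloader (JDK)"
                : src.getLocation().toString();
    }

    public static void main(String[] args) {
        // The class from the stack trace; pass another name as args[0] if needed.
        String target = args.length > 0
                ? args[0] : "io.netty.buffer.PooledByteBufAllocator";
        try {
            System.out.println(target + " -> " + locationOf(target));
        } catch (ClassNotFoundException e) {
            System.out.println(target + " is not on the classpath at all");
        }
    }
}
```

If the printed jar is an Elasticsearch-bundled or otherwise older Netty, the conflict in the stack trace above is confirmed.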


Answer from 自由如风, posted 2025-02-19 10:22:46:

After updating some dependencies as shown below, it worked for me.

    <dependency>
        <groupId>org.elasticsearch</groupId>
        <artifactId>elasticsearch</artifactId>
        <version>7.5.2</version>
        <exclusions>
            <exclusion>
                <groupId>io.netty</groupId>
                <artifactId>netty-all</artifactId>
            </exclusion>
            <exclusion>
                <groupId>io.netty</groupId>
                <artifactId>netty</artifactId>
            </exclusion>
        </exclusions>
    </dependency>
    <dependency>
        <groupId>com.google.cloud.spark</groupId>
        <artifactId>spark-bigquery-with-dependencies_2.12</artifactId>
        <version>0.23.2</version>
    </dependency>
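The exclusions above work by removing Elasticsearch's transitive Netty so only one copy ends up in the shaded jar. A complementary approach, sketched below as an assumption rather than a verified fix for this exact build, is to pin a single Netty version for the whole project via `dependencyManagement` (the version shown is illustrative; pick the one your grpc-netty expects, which `mvn dependency:tree -Dincludes=io.netty` will reveal):

    <dependencyManagement>
        <dependencies>
            <!-- Force one Netty version across all transitive dependencies.
                 4.1.52.Final is illustrative, not the verified required version. -->
            <dependency>
                <groupId>io.netty</groupId>
                <artifactId>netty-all</artifactId>
                <version>4.1.52.Final</version>
            </dependency>
        </dependencies>
    </dependencyManagement>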