java.lang.VerifyError: Operand stack overflow with the google-ads API and SBT

I am trying to migrate from the Google AdWords API to the google-ads v10 API in Spark 3.1.1 on EMR, and I am facing some dependency issues due to conflicts with existing jars. Initially, we hit a dependency conflict related to the Protobuf jar:

Exception in thread "grpc-default-executor-0" java.lang.IllegalAccessError: tried to access field com.google.protobuf.AbstractMessage.memoizedSize from class com.google.ads.googleads.v10.services.SearchGoogleAdsRequest
    at com.google.ads.googleads.v10.services.SearchGoogleAdsRequest.getSerializedSize(SearchGoogleAdsRequest.java:394)
    at io.grpc.protobuf.lite.ProtoInputStream.available(ProtoInputStream.java:108)

To resolve this, I tried shading the Protobuf jar and building an uber-jar instead. After shading, running my project locally in IntelliJ works fine, but when I try to run the executable jar I created, I get the following error:

Exception in thread "main" io.grpc.ManagedChannelProvider$ProviderNotFoundException: No functional channel service provider found. Try adding a dependency on the grpc-okhttp, grpc-netty, or grpc-netty-shaded artifact

I tried adding all of those libraries via --spark.jars.packages, but it didn't help.
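
For reference, the explicit sbt dependency that the error message suggests would look roughly like the snippet below. This is only a sketch: the io.grpc coordinates are real, but the version shown is an assumption and should be aligned with whatever gRPC version the google-ads client pulls in transitively.

// Hypothetical sketch: add the netty-shaded transport that gRPC's error message asks for.
// The version here is a guess; it must match the grpc-* versions already on the classpath.
libraryDependencies += "io.grpc" % "grpc-netty-shaded" % "1.45.0"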

java.lang.VerifyError: Operand stack overflow
Exception Details:
  Location:
    io/grpc/internal/TransportTracer.getStats()Lio/grpc/InternalChannelz$TransportStats; ...
...
...

    at io.grpc.netty.shaded.io.grpc.netty.NettyChannelBuilder.<init>(NettyChannelBuilder.java:96)
    at io.grpc.netty.shaded.io.grpc.netty.NettyChannelBuilder.forTarget(NettyChannelBuilder.java:169)
    at io.grpc.netty.shaded.io.grpc.netty.NettyChannelBuilder.forAddress(NettyChannelBuilder.java:152)
    at io.grpc.netty.shaded.io.grpc.netty.NettyChannelProvider.builderForAddress(NettyChannelProvider.java:38)
    at io.grpc.netty.shaded.io.grpc.netty.NettyChannelProvider.builderForAddress(NettyChannelProvider.java:24)
    at io.grpc.ManagedChannelBuilder.forAddress(ManagedChannelBuilder.java:39)
    at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createSingleChannel(InstantiatingGrpcChannelProvider.java:348)

Has anyone ever encountered such an issue?

Build.sbt



lazy val dependencies = new {
  val sparkRedshift = "io.github.spark-redshift-community" %% "spark-redshift" % "5.0.3" % "provided" excludeAll (ExclusionRule(organization = "com.amazonaws"))
  val jsonSimple = "com.googlecode.json-simple" % "json-simple" % "1.1" % "provided"
  val googleAdsLib = "com.google.api-ads" % "google-ads" % "17.0.1"
  val jedis = "redis.clients" % "jedis" % "3.0.1" % "provided"
  val sparkAvro = "org.apache.spark" %% "spark-avro" % sparkVersion % "provided"
  val queryBuilder = "com.itfsw" % "QueryBuilder" % "1.0.4" % "provided" excludeAll (ExclusionRule(organization = "com.fasterxml.jackson.core"))
  val protobufForGoogleAds = "com.google.protobuf" % "protobuf-java" % "3.18.1"
  val guavaForGoogleAds = "com.google.guava" % "guava" % "31.1-jre"
}

libraryDependencies ++= Seq(
  dependencies.sparkRedshift, dependencies.jsonSimple, dependencies.googleAdsLib,dependencies.guavaForGoogleAds,dependencies.protobufForGoogleAds
  ,dependencies.jedis, dependencies.sparkAvro,
  dependencies.queryBuilder
)


dependencyOverrides ++= Set(
  dependencies.guavaForGoogleAds
)

assemblyShadeRules in assembly := Seq(
  ShadeRule.rename("com.google.protobuf.**" -> "repackaged.protobuf.@1").inAll
)


assemblyMergeStrategy in assembly := {
 case PathList("META-INF", xs@_*) => MergeStrategy.discard
 case PathList("module-info.class", xs@_*) => MergeStrategy.discard
 case x => MergeStrategy.first
}
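
As a side note on the merge strategy above: gRPC locates channel providers such as the netty-shaded one through java.util.ServiceLoader, and those registrations live in files under META-INF/services, so a rule that discards all of META-INF also drops them from the uber-jar. A hedged sketch (not from the original post) of a strategy that keeps the service files while discarding the rest of META-INF could look like this:

assemblyMergeStrategy in assembly := {
  // Keep ServiceLoader registrations (e.g. io.grpc.ManagedChannelProvider), merging duplicate lines.
  case PathList("META-INF", "services", xs@_*) => MergeStrategy.filterDistinctLines
  // Discard the rest of META-INF (manifests, signatures) and Java 9 module descriptors.
  case PathList("META-INF", xs@_*) => MergeStrategy.discard
  case PathList("module-info.class", xs@_*) => MergeStrategy.discard
  case x => MergeStrategy.first
}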

Comments (2)

叫思念不要吵 2025-01-18 06:07:05

I had a similar issue and I changed the assembly merge strategy to this:

assemblyMergeStrategy in assembly := {
  case x if x.contains("io.netty.versions.properties") => MergeStrategy.discard
  case x =>
    val oldStrategy = (assemblyMergeStrategy in assembly).value
    oldStrategy(x)
}

旧时浪漫 2025-01-18 06:07:05

Solved this by using the google-ads-shadowjar as an external jar rather than depending on the google-ads library. This avoids having to deal with the dependencies manually, but it makes your jar size bigger.
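
For illustration, swapping the dependency for the pre-shaded artifact in the build.sbt above might look like the line below; the coordinates and version are assumptions and should be checked against the google-ads-java releases for your API version.

// Hypothetical sketch: use the pre-shaded google-ads artifact instead of the plain library.
// Group ID and version are assumptions; pick the release matching the v10 API.
val googleAdsLib = "com.google.api-ads" % "google-ads-shadowjar" % "17.0.1"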
