org.apache.hadoop.security.token.TokenIdentifier: Provider org.apache.hadoop.yarn.security.DockerCredentialTokenIdentifier not found

Asked 2025-02-11

While connecting from an API deployed on Tomcat to a Kerberized Hadoop cluster in order to upload files to HDFS, I get the error below. The keytab works fine on the cluster: I am able to kinit and read files in HDFS. The details in krb5.conf are also as required; the default realm and all other necessary settings are present.

Stack trace:

Exception in thread "main" java.util.ServiceConfigurationError: org.apache.hadoop.security.token.TokenIdentifier: Provider org.apache.hadoop.yarn.security.DockerCredentialTokenIdentifier not found
    at java.util.ServiceLoader.fail(ServiceLoader.java:239)
    at java.util.ServiceLoader.access$300(ServiceLoader.java:185)
    at java.util.ServiceLoader$LazyIterator.nextService(ServiceLoader.java:372)
    at java.util.ServiceLoader$LazyIterator.next(ServiceLoader.java:404)
    at java.util.ServiceLoader$1.next(ServiceLoader.java:480)
    at org.apache.hadoop.security.token.Token.getClassForIdentifier(Token.java:117)
    at org.apache.hadoop.security.token.Token.decodeIdentifier(Token.java:138)
    at org.apache.spark.deploy.security.HadoopFSDelegationTokenProvider$$anonfun$getTokenRenewalInterval$1$$anonfun$4.apply(HadoopFSDelegationTokenProvider.scala:116)
    at org.apache.spark.deploy.security.HadoopFSDelegationTokenProvider$$anonfun$getTokenRenewalInterval$1$$anonfun$4.apply(HadoopFSDelegationTokenProvider.scala:116)
    at scala.collection.TraversableLike$$anonfun$filterImpl$1.apply(TraversableLike.scala:248)
    at scala.collection.Iterator$class.foreach(Iterator.scala:891)
    at scala.collection.AbstractIterator.foreach(Iterator.scala:1334)
    at scala.collection.IterableLike$class.foreach(IterableLike.scala:72)
    at scala.collection.AbstractIterable.foreach(Iterable.scala:54)
    at scala.collection.TraversableLike$class.filterImpl(TraversableLike.scala:247)
    at scala.collection.TraversableLike$class.filter(TraversableLike.scala:259)
    at scala.collection.AbstractTraversable.filter(Traversable.scala:104)
    at org.apache.spark.deploy.security.HadoopFSDelegationTokenProvider$$anonfun$getTokenRenewalInterval$1.apply(HadoopFSDelegationTokenProvider.scala:115)
    at org.apache.spark.deploy.security.HadoopFSDelegationTokenProvider$$anonfun$getTokenRenewalInterval$1.apply(HadoopFSDelegationTokenProvider.scala:111)
    at scala.Option.flatMap(Option.scala:171)
    at org.apache.spark.deploy.security.HadoopFSDelegationTokenProvider.getTokenRenewalInterval(HadoopFSDelegationTokenProvider.scala:111)
    at org.apache.spark.deploy.security.HadoopFSDelegationTokenProvider.obtainDelegationTokens(HadoopFSDelegationTokenProvider.scala:53)
    at org.apache.spark.deploy.security.HadoopDelegationTokenManager$$anonfun$obtainDelegationTokens$2.apply(HadoopDelegationTokenManager.scala:132)
    at org.apache.spark.deploy.security.HadoopDelegationTokenManager$$anonfun$obtainDelegationTokens$2.apply(HadoopDelegationTokenManager.scala:130)
    at scala.collection.TraversableLike$$anonfun$flatMap$1.apply(TraversableLike.scala:241)
    at scala.collection.TraversableLike$$anonfun$flatMap$1.apply(TraversableLike.scala:241)
    at scala.collection.Iterator$class.foreach(Iterator.scala:891)
    at scala.collection.AbstractIterator.foreach(Iterator.scala:1334)
    at scala.collection.MapLike$DefaultValuesIterable.foreach(MapLike.scala:206)
    at scala.collection.TraversableLike$class.flatMap(TraversableLike.scala:241)
    at scala.collection.AbstractTraversable.flatMap(Traversable.scala:104)
    at org.apache.spark.deploy.security.HadoopDelegationTokenManager.obtainDelegationTokens(HadoopDelegationTokenManager.scala:130)
    at org.apache.spark.deploy.yarn.security.YARNHadoopDelegationTokenManager.obtainDelegationTokens(YARNHadoopDelegationTokenManager.scala:59)
    at org.apache.spark.deploy.yarn.Client.setupSecurityToken(Client.scala:309)
    at org.apache.spark.deploy.yarn.Client.createContainerLaunchContext(Client.scala:1013)
    at org.apache.spark.deploy.yarn.Client.submitApplication(Client.scala:178)
    at org.apache.spark.deploy.yarn.Client.run(Client.scala:1134)
    at org.apache.spark.deploy.yarn.YarnClusterApplication.start(Client.scala:1526)
    at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:849)
    at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:167)
    at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:195)
    at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:86)
    at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:924)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:933)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala) 
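
For context, the upload flow described in the question is a keytab login followed by an HDFS write. A minimal sketch of such a flow (the principal, keytab, and file paths below are placeholders, not values from the original setup):

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.security.UserGroupInformation;

    public class HdfsKeytabUpload {
        public static void main(String[] args) throws Exception {
            // picks up core-site.xml / hdfs-site.xml from the classpath
            Configuration conf = new Configuration();
            conf.set("hadoop.security.authentication", "kerberos");
            UserGroupInformation.setConfiguration(conf);
            // placeholder principal and keytab path -- substitute real values
            UserGroupInformation.loginUserFromKeytab(
                    "svc-user@EXAMPLE.COM", "/etc/security/keytabs/svc-user.keytab");
            try (FileSystem fs = FileSystem.get(conf)) {
                fs.copyFromLocalFile(new Path("/tmp/report.csv"),
                        new Path("/user/svc-user/report.csv"));
            }
        }
    }

Note that, per the trace above, the error is not thrown by the write itself but while the delegation-token machinery decodes existing tokens, which is why it can appear even though kinit and plain HDFS reads work.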



Answer from 兔小萌 (2025-02-18):


As the exception message states:

Provider org.apache.hadoop.yarn.security.DockerCredentialTokenIdentifier not found

the DockerCredentialTokenIdentifier class is missing from the classpath.
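
That class is discovered through java.util.ServiceLoader: as the trace shows, Token.decodeIdentifier() loads every TokenIdentifier implementation listed in provider-configuration files named META-INF/services/org.apache.hadoop.security.token.TokenIdentifier on the classpath. Each line of such a file names one implementation class; an illustrative excerpt (the exact entries vary with the Hadoop version bundled in a given jar):

    org.apache.hadoop.yarn.security.AMRMTokenIdentifier
    org.apache.hadoop.yarn.security.ContainerTokenIdentifier
    org.apache.hadoop.yarn.security.DockerCredentialTokenIdentifier

If any listed class cannot be found on the classpath, ServiceLoader fails with exactly the ServiceConfigurationError shown above.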

In my case, when I started analyzing how the loader knows that a token should be resolved as a DockerCredentialTokenIdentifier and not something else, I found a file under META-INF/services in the Phoenix jar that had this class listed. Since we use the Hadoop classes from the Hadoop jars themselves, I deleted that service file from the Phoenix jar, which resolved the issue:

    # remove the stale service registration from the Phoenix jar only (adjust the jar name to match yours)
    zip -d phoenix-*.jar 'META-INF/services/org.apache.hadoop.security.token.TokenIdentifier'

We were getting the same issue on YARN when it tried to launch the Spark job, and the same steps resolved the problem in both places.
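
A quick way to find which jars ship such a registration is to enumerate the service resource from the application's classloader. A minimal diagnostic sketch (an illustration, not part of the original fix):

    import java.net.URL;
    import java.util.Enumeration;

    public class FindTokenIdentifierRegistrations {
        public static void main(String[] args) throws Exception {
            String res = "META-INF/services/org.apache.hadoop.security.token.TokenIdentifier";
            Enumeration<URL> urls = FindTokenIdentifierRegistrations.class
                    .getClassLoader().getResources(res);
            // each URL points into a jar (or directory) that bundles a registration file
            while (urls.hasMoreElements()) {
                System.out.println(urls.nextElement());
            }
        }
    }

Running this with the application's classpath prints one URL per jar that contributes the file, which makes the offending jar (Phoenix, in this case) easy to spot.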
