fs.s3a.aws.credentials.provider java.lang.ClassNotFoundException: Class org.apache.hadoop.fs.s3a.auth.IAMInstanceCredentialsProvider not found

Posted 2025-02-03 21:42:24


I'm trying to read data from S3 using Spark with the following dependencies and configuration:


libraryDependencies += "org.apache.spark" %% "spark-core" % "3.2.0"
libraryDependencies += "org.apache.spark" %% "spark-sql" % "3.2.0"

libraryDependencies += "org.apache.hadoop" % "hadoop-aws" % "3.2.1"
libraryDependencies += "org.apache.hadoop" % "hadoop-client" % "3.2.1"
spark.sparkContext.hadoopConfiguration.set("fs.s3a.impl", "org.apache.hadoop.fs.s3a.S3AFileSystem")
spark.sparkContext.hadoopConfiguration.set("fs.s3a.access.key", config.s3AccessKey)
spark.sparkContext.hadoopConfiguration.set("fs.s3a.secret.key", config.s3SecretKey)
spark.sparkContext.hadoopConfiguration.set("spark.hadoop.fs.s3a.aws.credentials.provider", "org.apache.hadoop.fs.s3a.SimpleAWSCredentialsProvider")

I'm getting the following error:
java.io.IOException: From option fs.s3a.aws.credentials.provider java.lang.ClassNotFoundException: Class org.apache.hadoop.fs.s3a.auth.IAMInstanceCredentialsProvider not found

It was working fine with older versions of Spark and Hadoop. To be exact, I was previously using Spark 2.4.8 and Hadoop 2.8.5.
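One detail worth flagging in the configuration above: keys set directly on hadoopConfiguration are used verbatim, so the spark.hadoop. prefix on the last set call produces a key the S3A connector never reads. That prefix is only stripped when the property is set on SparkConf or passed via spark-submit --conf. A minimal sketch of the two equivalent ways to set the provider (the SparkSession setup here is illustrative, not taken from the original post):

import org.apache.spark.sql.SparkSession

// Option A: set the property on the session builder; Spark strips the
// "spark.hadoop." prefix when it populates the Hadoop configuration.
val spark = SparkSession.builder()
  .appName("s3a-example")
  .config("spark.hadoop.fs.s3a.aws.credentials.provider",
    "org.apache.hadoop.fs.s3a.SimpleAWSCredentialsProvider")
  .getOrCreate()

// Option B: set the S3A key directly on the Hadoop configuration.
// Here the key must be used as-is, with no "spark.hadoop." prefix.
spark.sparkContext.hadoopConfiguration.set(
  "fs.s3a.aws.credentials.provider",
  "org.apache.hadoop.fs.s3a.SimpleAWSCredentialsProvider")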


Comments (1)

倾其所爱 2025-02-10 21:42:24


I was looking forward to using the latest EMR release with Spark 3.2.0 and Hadoop 3.2.1. The issue was caused mainly by Hadoop 3.2.1, so the only option was to fall back to an older EMR release. Spark 2.4.8 and Hadoop 2.10.1 worked for me.
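For reference, a build.sbt matching the combination this answer reports as working might look like the sketch below (my reconstruction, not from the answer; on an EMR cluster you may prefer provided scope for the Spark and Hadoop artifacts):

// build.sbt - sketch of the downgraded combination the answer describes
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.4.8"
libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.4.8"

// Keep every Hadoop artifact on the same version: mixing hadoop-aws with a
// different hadoop-common is a classic source of ClassNotFoundException
// errors like the one in the question.
libraryDependencies += "org.apache.hadoop" % "hadoop-client" % "2.10.1"
libraryDependencies += "org.apache.hadoop" % "hadoop-aws" % "2.10.1"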
