How to get jars compatible with Spark SQL 3.2.1 for Java 1.8

Posted on 2025-01-21 03:18:02


My Java 1.8 program tries to use the MariaDB and Spark SQL 3.2.1 binaries.

It is compiled and run as follows:

export CLASSPATH=./hadoop-common-3.3.1.jar:/usr/share/java/slf4j-nop.jar:/mysql-connector-java-8.0.28.jar:./spark-network-common_2.12-3.2.1.jar:./spark-catalyst_2.12-3.2.1.jar:./spark-core_2.12-3.2.1.jar:./spark-sql_2.12-3.2.1.jar:./spark-hive_2.12-3.2.1.jar:$CLASSPATH
javac ordersETL.java
java ordersETL

It fails on:
SparkSession spark = SparkSession.builder().appName("ETL experiment").master("local").getOrCreate();
as follows:

Exception in thread "main" java.lang.NoSuchMethodError: scala.Product.$init$(Lscala/Product;)V
    at org.apache.spark.SparkConf$DeprecatedConfig.<init>(SparkConf.scala:799)
    at org.apache.spark.SparkConf$.<init>(SparkConf.scala:596)
    at org.apache.spark.SparkConf$.<clinit>(SparkConf.scala)
    at org.apache.spark.SparkConf.set(SparkConf.scala:94)
    at org.apache.spark.SparkConf.set(SparkConf.scala:83)
    at   org.apache.spark.sql.SparkSession$Builder.$anonfun$getOrCreate$1(SparkSession.scala:920)
    at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:99)
    at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:99)
    at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:230)
    at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:40)
    at scala.collection.mutable.HashMap.foreach(HashMap.scala:99)
    at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:920)
    at ordersETL.main(ordersETL.java:102)

I downloaded the jar files from the Maven repository, assuming that the "_2.12-3.2.1" jars are compiled with Scala 2.12 for Spark SQL 3.2.1.
Googling the error message, however, turns up multiple matches saying that I should
go back to jars compiled with Scala 2.11 -
but there are no 2.11-3.2.1 jars in the Maven repository.
My question is: how can I get jars compatible with Spark SQL 3.2.1?
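(A sketch of one likely fix, on the assumption that the error comes from an older system-wide scala-library on the classpath: Spark 3.2.1 is published for Scala 2.12 and 2.13 only, so rather than hunting for 2.11 builds, one would put a matching Scala 2.12 runtime first. Spark 3.2.x is built against Scala 2.12.15; the URLs follow the standard Maven Central layout:)

```shell
# Fetch the Scala 2.12 runtime jars that the _2.12 Spark artifacts expect
# (spark-catalyst additionally needs scala-reflect):
curl -fsSLO https://repo1.maven.org/maven2/org/scala-lang/scala-library/2.12.15/scala-library-2.12.15.jar
curl -fsSLO https://repo1.maven.org/maven2/org/scala-lang/scala-reflect/2.12.15/scala-reflect-2.12.15.jar
# Prepend them so they win over any system-wide Scala 2.11 jars:
export CLASSPATH=./scala-library-2.12.15.jar:./scala-reflect-2.12.15.jar:$CLASSPATH
```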
