java.lang.NoClassDefFoundError: com/datastax/spark/connector/ColumnSelector when building the jar
I am using a Maven project with Spark, and Cassandra as the database.
I am able to execute the code with no errors and get the result I want, but I can't submit the project with the jar file.
Here is my pom.xml:
My command to submit the project:
spark-submit --master "local[*]" --class com.sparkfinal.App target/sparkfinal-1.0-SNAPSHOT.jar
The error:
1 answer:
You need either:

1) to build an uber jar that includes the connector (and your other runtime dependencies), keeping the Spark dependencies themselves as provided, or
2) to submit with --packages com.datastax.spark:spark-cassandra-connector_2.12:3.2.0, as shown in the connector's docs.

I would recommend the first method, because I see that you have other dependencies that you'll need to provide. Also, please note that you may not need to specify the Java driver dependency, as it will be pulled in automatically by the Spark Cassandra Connector.
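For the first option, here is a minimal sketch of what the relevant pom.xml pieces could look like. The spark-sql artifact, the version numbers, and the shade-plugin setup are assumptions for illustration; adjust them to your actual Spark and Scala versions:

    <dependencies>
        <!-- Spark itself is supplied by spark-submit at runtime, so it stays provided -->
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-sql_2.12</artifactId>
            <version>3.2.0</version>
            <scope>provided</scope>
        </dependency>
        <!-- The connector (and the Java driver it pulls in) must end up inside the jar -->
        <dependency>
            <groupId>com.datastax.spark</groupId>
            <artifactId>spark-cassandra-connector_2.12</artifactId>
            <version>3.2.0</version>
        </dependency>
    </dependencies>

    <build>
        <plugins>
            <!-- maven-shade-plugin repackages the compile-scoped dependencies into one uber jar -->
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-shade-plugin</artifactId>
                <version>3.2.4</version>
                <executions>
                    <execution>
                        <phase>package</phase>
                        <goals>
                            <goal>shade</goal>
                        </goals>
                    </execution>
                </executions>
            </plugin>
        </plugins>
    </build>

After mvn package, the same spark-submit command should find the ColumnSelector class. For the second option you would instead keep the connector out of the jar and append --packages com.datastax.spark:spark-cassandra-connector_2.12:3.2.0 to the spark-submit command above.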