Could not initialize class com.datastax.oss.driver.internal.core.config.typesafe.TypesafeDriverConfig
I'm using Azure Databricks to connect to Cassandra. My Cassandra instance is exposed at a specific port and is accessible from cqlsh. The cqlsh SHOW VERSION command returns:
[cqlsh 6.0.0 | Cassandra 3.11.10 | CQL spec 3.4.4 | Native protocol v4]
I've created a cluster that runs on runtime 7.3 LTS (includes Apache Spark 3.0.1, Scala 2.12). I've installed the following libraries: com.datastax.oss:java-driver-core:4.12.0 and com.datastax.spark:spark-cassandra-connector_2.12:3.0.1. Now I'm trying to execute a simple query to load data with DataFrames:
spark.read.format("org.apache.spark.sql.cassandra")
.option("spark.cassandra.connection.host", ...)
.option("spark.cassandra.auth.username", ...)
.option("spark.cassandra.auth.password", ...)
.option("table", ...)
.option("keyspace", ...)
.load()
In response I'm getting: java.io.IOException: Failed to open native connection to Cassandra at :: Could not initialize class com.datastax.oss.driver.internal.core.config.typesafe.TypesafeDriverConfig
How can I correctly initialize the connection?
You need to use
spark-cassandra-connector-assembly (Maven Central) instead of spark-cassandra-connector. The reason: the Spark Cassandra Connector uses a newer version of the Typesafe Config library than the Databricks runtime ships, and the assembly version includes all necessary libraries as shaded versions, so the two no longer clash. You also don't need to install java-driver-core; it will be pulled in as a dependency automatically.
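Concretely, that means removing the two libraries above from the cluster and installing the single assembly coordinate instead. A sketch using the legacy Databricks CLI (the cluster ID below is a placeholder; you can equally install the same Maven coordinate through the Libraries tab in the cluster UI):

```shell
# Uninstall the plain connector and java-driver-core from the cluster first,
# then install the assembly artifact, which shades Typesafe Config.
# Version 3.0.1 matches the Spark 3.0.1 / Scala 2.12 runtime in the question.
databricks libraries install \
  --cluster-id 0123-456789-abcde \
  --maven-coordinates com.datastax.spark:spark-cassandra-connector-assembly_2.12:3.0.1
```

After the library is attached and the cluster restarted, the original `spark.read.format("org.apache.spark.sql.cassandra")` query should open the native connection without the TypesafeDriverConfig error.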