SQLException: No suitable driver found for jdbc:phoenix:(hosts)
I am running a spark-submit command that will do some database work via a Scala class.
spark-submit
--verbose
--class mycompany.MyClass
--conf spark.driver.extraJavaOptions=-Dconfig.resource=dev-test.conf
--conf spark.executor.extraJavaOptions=-Dconfig.resource=dev-test.conf
--master yarn
--driver-library-path /usr/lib/hadoop-lzo/lib/native/
--jars /home/hadoop/mydir/dbp.spark-utils-1.1.0-SNAPSHOT.jar,/usr/lib/phoenix/phoenix-client-hbase-2.4-5.1.2.jar,/usr/lib/hadoop-lzo/lib/hadoop-lzo.jar,/usr/lib/hadoop/lib/commons-compress-1.18.jar,/usr/lib/hadoop/hadoop-aws-3.2.1-amzn-5.jar,/usr/share/aws/aws-java-sdk/aws-java-sdk-bundle-1.12.31.jar
--files /home/hadoop/mydir/dev-test.conf
--num-executors 1
--executor-memory 3g
--driver-memory 3g
--queue default /home/hadoop/mydir/dbp.spark-utils-1.1.0-SNAPSHOT.jar
<<args to MyClass>>
When I run it, I get an exception:
Caused by: java.sql.SQLException: No suitable driver found for jdbc:phoenix:host1,host2,host3:2181:/hbase;
at java.sql.DriverManager.getConnection(DriverManager.java:689)
at java.sql.DriverManager.getConnection(DriverManager.java:208)
at org.apache.phoenix.util.QueryUtil.getConnection(QueryUtil.java:422)
at org.apache.phoenix.util.QueryUtil.getConnection(QueryUtil.java:414)
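For context: per the stack trace, QueryUtil.getConnection delegates to java.sql.DriverManager, which throws this exception when none of its registered drivers accepts the URL. A simplified equivalent of the failing call (with the hosts copied from the error message) would be:

import java.sql.DriverManager

// DriverManager scans its registered drivers for one that accepts the URL;
// "No suitable driver found" means none did in this JVM.
val conn = DriverManager.getConnection("jdbc:phoenix:host1,host2,host3:2181:/hbase")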
Here are the relevant parts of my Scala code:
import java.sql.{Connection => JavaConnection}
import org.apache.hadoop.hbase.Cell
import org.apache.hadoop.hbase.io.ImmutableBytesWritable
import org.apache.phoenix.util.QueryUtil
import org.apache.spark.rdd.RDD

// Wrap the Hadoop configuration so it can be shipped to executors.
val conf: SerializableHadoopConfiguration =
  new SerializableHadoopConfiguration(sc.hadoopConfiguration)
// Force-load the Phoenix driver so it registers with DriverManager.
Class.forName("org.apache.phoenix.jdbc.PhoenixDriver")

val tableRowKeyPairs: RDD[(Cell, ImmutableBytesWritable)] =
  df.rdd.mapPartitions(partition => {
    val configuration = conf.get()
    val partitionConn: JavaConnection = QueryUtil.getConnection(configuration)
    // ...
  })
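One thing to note: the Class.forName call runs in the driver JVM, while the mapPartitions closure runs in executor JVMs. If the lookup fails on the executors, a common variant (an assumption here, not something shown above) registers the driver inside the closure as well, reusing the conf and QueryUtil definitions from the snippet above:

df.rdd.mapPartitions(partition => {
  // Executors are separate JVMs, so the driver registration done on the
  // Spark driver does not carry over; load the driver class here too.
  // (Hypothetical variant of the code above, not the original.)
  Class.forName("org.apache.phoenix.jdbc.PhoenixDriver")
  val configuration = conf.get()
  val partitionConn: JavaConnection = QueryUtil.getConnection(configuration)
  // ...
})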
My spark-submit command includes /usr/lib/phoenix/phoenix-client-hbase-2.4-5.1.2.jar using the --jars argument. When I search that file for "org.apache.phoenix.jdbc.PhoenixDriver", I find it:
$ jar -tf /usr/lib/phoenix/phoenix-client-hbase-2.4-5.1.2.jar | grep -i driver
...
org/apache/phoenix/jdbc/PhoenixDriver.class
...
So why can't my program locate the driver?
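A diagnostic sketch (not part of the original program) that could be dropped next to the failing call, to see what DriverManager has actually registered in that JVM:

import java.sql.DriverManager

// Print every JDBC driver registered with DriverManager in this JVM; if
// org.apache.phoenix.jdbc.PhoenixDriver is missing, the jar is visible only
// to a classloader that DriverManager does not consult.
val drivers = DriverManager.getDrivers
while (drivers.hasMoreElements) {
  println(drivers.nextElement().getClass.getName)
}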
Comments (1)
I was able to get the program to find the driver by adding the following argument to the spark-submit command shown in the question:
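The exact flag is not shown here; judging from the linked discussion of spark-submit classpath flags, it was presumably --driver-class-path with the Phoenix client JAR. This is a hedged reconstruction, not a verbatim quote of the answer:

--driver-class-path /usr/lib/phoenix/phoenix-client-hbase-2.4-5.1.2.jar

A common explanation for why this helps: JARs passed with --jars are loaded through Spark's own classloader, which java.sql.DriverManager does not consult when matching drivers on the driver side, whereas --driver-class-path puts the JAR directly on the driver JVM's classpath.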
This StackOverflow article has great explanations for what the various arguments do: https://stackoverflow.com/questions/37132559/add-jar-files-to-a-spark-job-spark-submit