SQLException: No suitable driver found for jdbc:phoenix (hosts)

Posted on 2025-01-28 14:08:21


I am running a spark-submit command that will do some database work via a Scala class.

spark-submit \
  --verbose \
  --class mycompany.MyClass \
  --conf spark.driver.extraJavaOptions=-Dconfig.resource=dev-test.conf \
  --conf spark.executor.extraJavaOptions=-Dconfig.resource=dev-test.conf \
  --master yarn \
  --driver-library-path /usr/lib/hadoop-lzo/lib/native/ \
  --jars /home/hadoop/mydir/dbp.spark-utils-1.1.0-SNAPSHOT.jar,/usr/lib/phoenix/phoenix-client-hbase-2.4-5.1.2.jar,/usr/lib/hadoop-lzo/lib/hadoop-lzo.jar,/usr/lib/hadoop/lib/commons-compress-1.18.jar,/usr/lib/hadoop/hadoop-aws-3.2.1-amzn-5.jar,/usr/share/aws/aws-java-sdk/aws-java-sdk-bundle-1.12.31.jar \
  --files /home/hadoop/mydir/dev-test.conf \
  --num-executors 1 \
  --executor-memory 3g \
  --driver-memory 3g \
  --queue default \
  /home/hadoop/mydir/dbp.spark-utils-1.1.0-SNAPSHOT.jar \
  <<args to MyClass>>

When I run, I get an exception:

Caused by: java.sql.SQLException: No suitable driver found for jdbc:phoenix:host1,host2,host3:2181:/hbase;
   at java.sql.DriverManager.getConnection(DriverManager.java:689)
   at java.sql.DriverManager.getConnection(DriverManager.java:208)
   at org.apache.phoenix.util.QueryUtil.getConnection(QueryUtil.java:422)
   at org.apache.phoenix.util.QueryUtil.getConnection(QueryUtil.java:414)
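The "No suitable driver" message comes from `java.sql.DriverManager` itself, not from Phoenix: it is thrown whenever no driver registered in the current JVM accepts the URL scheme. A minimal, self-contained reproduction (plain JDBC, no Phoenix jar on the classpath):

```java
import java.sql.DriverManager;
import java.sql.SQLException;

// With no JDBC driver for the "jdbc:phoenix" scheme registered in this JVM,
// DriverManager has nothing that accepts the URL and throws SQLException.
public class NoSuitableDriver {
    public static void main(String[] args) {
        try {
            DriverManager.getConnection("jdbc:phoenix:host1,host2,host3:2181:/hbase");
        } catch (SQLException e) {
            System.out.println(e.getMessage());
            // prints: No suitable driver found for jdbc:phoenix:host1,host2,host3:2181:/hbase
        }
    }
}
```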

Here are the relevant parts of my Scala code:

    val conf: SerializableHadoopConfiguration =
        new SerializableHadoopConfiguration(sc.hadoopConfiguration)
    Class.forName("org.apache.phoenix.jdbc.PhoenixDriver")
    val tableRowKeyPairs: RDD[(Cell, ImmutableBytesWritable)] =
        df.rdd.mapPartitions(partition => {
            val configuration = conf.get()
            val partitionConn: JavaConnection = QueryUtil.getConnection(configuration)
            // ...
        })
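A likely explanation is that `Class.forName` registers the driver only in the JVM where it executes, here the Spark driver process, while the `mapPartitions` closure runs inside executor JVMs. The per-JVM nature of `DriverManager` registration can be illustrated with a stub driver; `StubPhoenixDriver` below is a hypothetical stand-in, not the real Phoenix class:

```java
import java.lang.reflect.Proxy;
import java.sql.Connection;
import java.sql.Driver;
import java.sql.DriverManager;
import java.sql.DriverPropertyInfo;
import java.util.Properties;
import java.util.logging.Logger;

// Illustrative only: shows that DriverManager lookups succeed only in the JVM
// where a driver has been registered -- which is why Class.forName in the
// Spark driver process does not help code running inside executor JVMs.
public class PerJvmRegistration {
    static class StubPhoenixDriver implements Driver {
        @Override public boolean acceptsURL(String url) {
            return url != null && url.startsWith("jdbc:phoenix:");
        }
        @Override public Connection connect(String url, Properties info) {
            if (!acceptsURL(url)) return null; // JDBC convention: null means "not my URL"
            // A do-nothing Connection via a dynamic proxy is enough for the demo.
            return (Connection) Proxy.newProxyInstance(
                    Connection.class.getClassLoader(),
                    new Class<?>[] { Connection.class },
                    (proxy, method, margs) -> null);
        }
        @Override public int getMajorVersion() { return 1; }
        @Override public int getMinorVersion() { return 0; }
        @Override public DriverPropertyInfo[] getPropertyInfo(String url, Properties info) {
            return new DriverPropertyInfo[0];
        }
        @Override public boolean jdbcCompliant() { return false; }
        @Override public Logger getParentLogger() { return Logger.getGlobal(); }
    }

    public static void main(String[] args) throws Exception {
        // Without this registration, the same URL would raise
        // "No suitable driver found" in this JVM.
        DriverManager.registerDriver(new StubPhoenixDriver());
        Connection conn = DriverManager.getConnection("jdbc:phoenix:host1,host2,host3:2181:/hbase");
        System.out.println("connected: " + (conn != null));
        // prints: connected: true
    }
}
```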

My spark-submit command includes /usr/lib/phoenix/phoenix-client-hbase-2.4-5.1.2.jar using the --jars argument. When I search that file for "org.apache.phoenix.jdbc.PhoenixDriver", I find it:

$ jar -tf /usr/lib/phoenix/phoenix-client-hbase-2.4-5.1.2.jar | grep -i driver
...
org/apache/phoenix/jdbc/PhoenixDriver.class
...

So why can't my program locate the driver?


Comments (1)

昔梦 2025-02-04 14:08:21


I was able to get the program to find the driver by adding the following argument to the spark-submit command shown in the question:

--conf "spark.executor.extraClassPath=/usr/lib/phoenix/phoenix-client-hbase-2.4-5.1.2.jar" 

This StackOverflow article (https://stackoverflow.com/questions/37132559/add-jar-files-to-a-spark-job-spark-submit) has great explanations of what the various arguments do.
