Sqoop import completes but Hive "show tables" shows nothing
After installing Hadoop and Hive (the CDH version), I execute:
./sqoop import -connect jdbc:mysql://10.164.11.204/server -username root -password password -table user -hive-import --hive-home /opt/hive/
Everything goes fine, but when I enter the Hive command line and execute show tables, nothing is listed. Using ./hadoop fs -ls, I can see that /user/(username)/user exists.
Any help is appreciated.
---EDIT-----------
./sqoop import -connect jdbc:mysql://10.164.11.204/server -username root -password password -table user -hive-import --target-dir /user/hive/warehouse
The import fails due to:
11/07/02 00:40:00 INFO hive.HiveImport: FAILED: Error in semantic analysis: line 2:17 Invalid Path 'hdfs://hadoop1:9000/user/ubuntu/user': No files matching path hdfs://hadoop1:9000/user/ubuntu/user
11/07/02 00:40:00 ERROR tool.ImportTool: Encountered IOException running import job: java.io.IOException: Hive exited with status 10
at com.cloudera.sqoop.hive.HiveImport.executeExternalHiveScript(HiveImport.java:326)
at com.cloudera.sqoop.hive.HiveImport.executeScript(HiveImport.java:276)
at com.cloudera.sqoop.hive.HiveImport.importTable(HiveImport.java:218)
at com.cloudera.sqoop.tool.ImportTool.importTable(ImportTool.java:362)
at com.cloudera.sqoop.tool.ImportTool.run(ImportTool.java:423)
at com.cloudera.sqoop.Sqoop.run(Sqoop.java:144)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:79)
at com.cloudera.sqoop.Sqoop.runSqoop(Sqoop.java:180)
at com.cloudera.sqoop.Sqoop.runTool(Sqoop.java:218)
at com.cloudera.sqoop.Sqoop.main(Sqoop.java:228)
7 Answers
Check your hive-site.xml for the value of the property javax.jdo.option.ConnectionURL. If you do not define this explicitly, the default value will use a relative path for creation of the Hive metastore (jdbc:derby:;databaseName=metastore_db;create=true), which will be different depending on where you launch the process from. This would explain why you cannot see the table via show tables. Define this property value in your hive-site.xml using an absolute path.
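For illustration, a minimal hive-site.xml sketch of that property with an absolute path; the location /opt/hive/metastore_db is an assumption, so adjust it to wherever you want the metastore to live:

  <property>
    <name>javax.jdo.option.ConnectionURL</name>
    <!-- absolute databaseName, so every session finds the same metastore regardless of working directory -->
    <value>jdbc:derby:;databaseName=/opt/hive/metastore_db;create=true</value>
  </property>

With a relative databaseName, each directory you launch Hive or Sqoop from gets its own metastore_db, which produces exactly the empty show tables symptom described in the question.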
No need to create the table in Hive first; refer to the query below.
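The query this answer referred to was not preserved in this copy. As a hedged reconstruction, a Sqoop command that creates the Hive table automatically (so no manual CREATE TABLE is needed) might look like the following; the connection URL, credentials, and table name are taken from the question, while --create-hive-table and the Hive table name are illustrative:

  sqoop import --connect jdbc:mysql://10.164.11.204/server --username root --password password --table user --hive-import --create-hive-table --hive-table user -m 1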
In my case Hive stores its data in the /user/hive/warehouse directory in HDFS, which is where Sqoop should put it. So I guess you have to add the warehouse location to your command (see the sketch below); that is the default location for Hive tables, though it might be different in your case. You might also want to create this table in Hive first, also sketched below.
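Both snippets this answer pointed at were lost in this copy. As a hedged sketch based on its prose, the option to add and a matching Hive table definition might look like this; the two-column schema is illustrative, not the asker's real user table, and the comma delimiter matches Sqoop's default text output:

  --target-dir /user/hive/warehouse/user

  CREATE TABLE user (id INT, name STRING)
  ROW FORMAT DELIMITED FIELDS TERMINATED BY ',';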
In my case it creates the table in the Hive default database; you can give it a try:
sqoop import --connect jdbc:mysql://xxxx.com/DATABASE_NAME --username root --password admin --table TABLE_NAME --hive-import --warehouse-dir DIR --create-hive-table --hive-table TABLE_NAME -m 1
Hive tables will be created by the Sqoop import process. Please make sure /user/hive/warehouse is created in your HDFS; you can browse HDFS at http://localhost:50070/dfshealth.jsp (Browse the File System option). Also include the full HDFS URI in the --target-dir value, i.e. hdfs://<namenode-host>:9000/user/hive/warehouse, in the sqoop import command.
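A hedged version of the asker's command with the full HDFS URI spelled out; hadoop1:9000 is taken from the error message in the question, and the rest mirrors the original command:

  sqoop import --connect jdbc:mysql://10.164.11.204/server --username root --password password --table user --hive-import --target-dir hdfs://hadoop1:9000/user/hive/warehouse/user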
For a Hive import: first of all, create the table definition in Hive with exactly the same field names and types as in MySQL. Then perform the import operation (a sketch of both steps follows).
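A hedged sketch of those two steps; the two-column schema is an illustrative stand-in for the asker's real user table, and the delimiter matches Sqoop's default text output:

  hive -e "CREATE TABLE user (id INT, name STRING) ROW FORMAT DELIMITED FIELDS TERMINATED BY ',';"
  sqoop import --connect jdbc:mysql://10.164.11.204/server --username root --password password --table user --hive-import --hive-table user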
I think all you need is to specify the Hive table where the data should go. Add --hive-table database.tablename to the sqoop command and remove --hive-home /opt/hive/. I think that should resolve the problem.
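Putting that together, a hedged version of the asker's command with an explicit Hive table and without --hive-home; default.user is an illustrative choice of database.tablename:

  sqoop import --connect jdbc:mysql://10.164.11.204/server --username root --password password --table user --hive-import --hive-table default.user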