HBase 0.20.6 fails to start the master


I'm using HBase 0.20.6 with Hadoop 0.21.0 on Ubuntu 10.04 LTS, and the master fails to start. (The error from the hbase-root-master-ubuntu.log file is attached at the end of this post.)

Does HBase 0.20.6 work fine with Hadoop 0.21.0? If not, is there a workaround?

What is the source of the problem?

Thanks for your time and consideration.

The log:

java.io.IOException: Call to localhost/127.0.0.1:54310 failed on local exception: java.io.EOFException
 at org.apache.hadoop.ipc.Client.wrapException(Client.java:775)
 at org.apache.hadoop.ipc.Client.call(Client.java:743)
 at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:220)
 at $Proxy0.getProtocolVersion(Unknown Source)
 at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:359)
 at org.apache.hadoop.hdfs.DFSClient.createRPCNamenode(DFSClient.java:106)
 at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:207)
 at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:170)
 at org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:82)
 at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:1378)
 at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:66)
 at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:1390)
 at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:196)
 at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:95)
 at org.apache.hadoop.hbase.master.HMaster.<init>(HMaster.java:195)
 at org.apache.hadoop.hbase.LocalHBaseCluster.<init>(LocalHBaseCluster.java:94)
 at org.apache.hadoop.hbase.LocalHBaseCluster.<init>(LocalHBaseCluster.java:78)
 at org.apache.hadoop.hbase.master.HMaster.doMain(HMaster.java:1229)
 at org.apache.hadoop.hbase.master.HMaster.main(HMaster.java:1274)
Caused by: java.io.EOFException
 at java.io.DataInputStream.readInt(DataInputStream.java:375)
 at org.apache.hadoop.ipc.Client$Connection.receiveResponse(Client.java:501)
 at org.apache.hadoop.ipc.Client$Connection.run(Client.java:446)
Fri Dec 24 14:02:12 EET 2010 Starting master on ubuntu
ulimit -n 1024
2010-12-24 14:02:13,267 INFO org.apache.hadoop.hbase.master.HMaster: vmName=Java HotSpot(TM) Client VM, vmVendor=Sun Microsystems Inc., vmVersion=17.1-b03
2010-12-24 14:02:13,268 INFO org.apache.hadoop.hbase.master.HMaster: vmInputArguments=[-Xmx1000m, -XX:+HeapDumpOnOutOfMemoryError, -XX:+UseConcMarkSweepGC, -XX:+CMSIncrementalMode, -XX:+HeapDumpOnOutOfMemoryError, -XX:+UseConcMarkSweepGC, -XX:+CMSIncrementalMode, -XX:+HeapDumpOnOutOfMemoryError, -XX:+UseConcMarkSweepGC, -XX:+CMSIncrementalMode, -Dhbase.log.dir=/usr/lib/hbase/bin/../logs, -Dhbase.log.file=hbase-root-master-ubuntu.log, -Dhbase.home.dir=/usr/lib/hbase/bin/.., -Dhbase.id.str=root, -Dhbase.root.logger=INFO,DRFA, -Djava.library.path=/usr/lib/hbase/bin/../lib/native/Linux-i386-32]
2010-12-24 14:02:13,353 INFO org.apache.hadoop.hbase.master.HMaster: My address is ubuntu.ubuntu-domain:60000
2010-12-24 14:02:13,593 ERROR org.apache.hadoop.hbase.master.HMaster: Can not start master


Comments (1)

情场扛把子:


There has been a discussion about this on the HBase users mailing list recently; I would suggest reading it.
http://mail-archives.apache.org/mod_mbox/hbase-user/201012.mbox/%[email protected]%3E

As a summary, I would quote what Ryan Rawson of StumbleUpon mentioned on the list:

HBase 0.20.6 is likely to run well on hadoop 21. We have many patches
that help bolster durability on top of branch-20-append, and also some
may apply to hadoop 21.

What you are possibly running in to is using hadoop 20 jars in hbase
0.90 on top of hadoop 21. Try deleting the hadoop 20 jars and copying
in your hadoop 21.

Also consider running cdh3b2+, hadoop 21 is a panned release and no
one runs it nor expects it to be run in a production setting.
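The jar swap Ryan describes can be sketched as below. The directory layout and jar names here are assumptions for illustration (a typical tarball install keeps its Hadoop client jar in HBase's lib/ directory); the sketch runs in a scratch directory so it is safe to try, but on a real system you would use your actual HBase and Hadoop paths, and stop HBase first:

```shell
# Simulate an HBase lib/ dir and a Hadoop install in scratch directories.
# Jar names below are illustrative, not exact.
HBASE_LIB=$(mktemp -d)/hbase/lib
HADOOP_HOME=$(mktemp -d)/hadoop
mkdir -p "$HBASE_LIB" "$HADOOP_HOME"
touch "$HBASE_LIB/hadoop-0.20.2-core.jar"      # Hadoop 0.20 jar bundled with HBase
touch "$HADOOP_HOME/hadoop-common-0.21.0.jar"  # jar from the cluster actually running HDFS

# The swap itself: drop the bundled 0.20 client jar, copy in the cluster's jar
# so HBase speaks the same RPC protocol version as the NameNode.
rm "$HBASE_LIB"/hadoop-0.20*.jar
cp "$HADOOP_HOME"/hadoop-*-0.21.0.jar "$HBASE_LIB"/

ls "$HBASE_LIB"   # → hadoop-common-0.21.0.jar
```

The EOFException in the log is the usual symptom of this kind of client/server RPC version mismatch: the NameNode drops the connection and the client hits end-of-stream while reading the response.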

We are using the HBase 0.90 RCs with Cloudera's CDH3b3 via Debian packages. If you want to consider it, please refer to its installation page for details. I would also recommend this page for installation on a cluster. Download the latest HBase 0.90 RC from here.
