Connecting to the Cloudera VM from my desktop
I downloaded the Cloudera VM on my Windows 7 laptop to play around. I am trying to connect to the Hadoop instance running in the VM from Windows. I ran ifconfig and got the VM's IP address. I can reach the web interfaces running in the VM from Firefox on my Windows box, so I know I can at least connect to that.
So next, I tried to connect to Hadoop from Java.
import java.net.URI;
import java.net.URL;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.FsUrlStreamHandlerFactory;

public class FileSystemWriter
{
    static
    {
        // Teach java.net.URL to understand hdfs:// URLs.
        URL.setURLStreamHandlerFactory( new FsUrlStreamHandlerFactory() );
    }

    public static void main( String[] args ) throws Exception
    {
        String uri = "hdfs://192.168.171.128/user";
        Configuration conf = new Configuration();
        System.out.println( "uri: " + uri );
        FileSystem fs = FileSystem.get( URI.create( uri ), conf );
    }
}
But I get errors.
uri: hdfs://192.168.171.128/user
Aug 9, 2011 8:29:26 AM org.apache.hadoop.ipc.Client$Connection handleConnectionFailure
INFO: Retrying connect to server: /192.168.171.128:8020. Already tried 0 time(s).
Aug 9, 2011 8:29:28 AM org.apache.hadoop.ipc.Client$Connection handleConnectionFailure
INFO: Retrying connect to server: /192.168.171.128:8020. Already tried 1 time(s).
Aug 9, 2011 8:29:30 AM org.apache.hadoop.ipc.Client$Connection handleConnectionFailure
INFO: Retrying connect to server: /192.168.171.128:8020. Already tried 2 time(s).
Can anyone help me out?
2 Answers
First, try to connect over hftp.
If you see something (and no exceptions), then you are connected.
If you do not, then your problem is not HDFS itself: you have a bad IP, Hadoop isn't running, or your ports are blocked, etc.
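That probe can be done straight from Java; here is a minimal sketch, assuming the NameNode's default web port 50070 and the IP from the question (hftp is Hadoop's read-only, HTTP-based filesystem, so it bypasses the 8020 RPC port entirely):

import java.net.URI;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class HftpProbe
{
    public static void main( String[] args ) throws Exception
    {
        // hftp goes over the NameNode's HTTP port (50070 by default),
        // so it can succeed even when the 8020 RPC port is unreachable.
        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get( URI.create( "hftp://192.168.171.128:50070/" ), conf );
        for ( FileStatus status : fs.listStatus( new Path( "/" ) ) )
        {
            System.out.println( status.getPath() );
        }
    }
}

If this prints the top-level directories, the VM and Hadoop itself are fine, and the failure is isolated to the RPC connection.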
Make sure your NameNode is listening on port 8020. You can check that, for example, with a quick connection probe:
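A minimal sketch of such a probe in plain Java, using the VM's IP from the question; it simply attempts a raw TCP connection to port 8020:

import java.net.InetSocketAddress;
import java.net.Socket;

public class PortCheck
{
    public static void main( String[] args ) throws Exception
    {
        // Try a plain TCP connection to the NameNode RPC port; a
        // ConnectException here means nothing is listening on 8020
        // (or a firewall is in the way).
        Socket socket = new Socket();
        socket.connect( new InetSocketAddress( "192.168.171.128", 8020 ), 5000 );
        System.out.println( "Port 8020 is reachable." );
        socket.close();
    }
}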
If this check fails, open HADOOP_HOME/conf/core-site.xml (e.g. with vim) and look at the NameNode port in the fs.default.name entry. Then change your Java code:
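For example, assuming purely for illustration that fs.default.name names port 9000, the URI in the question's code would gain an explicit port:

// Spell out the port from fs.default.name in the URI
// (9000 is a placeholder; use the port from your core-site.xml).
String uri = "hdfs://192.168.171.128:9000/user";
FileSystem fs = FileSystem.get( URI.create( uri ), new Configuration() );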