Connect to Cloudera VM from my desktop

Posted 2024-11-28 16:22:03


I downloaded the Cloudera VM on my Windows 7 laptop to play around. I am trying to connect, from Windows, to the Hadoop instance running in the VM. I ran ifconfig and got the IP address of the VM. I can connect to the web interfaces running in the VM from Firefox on my Windows box, so I know I can at least reach it.

So next, I tried to connect to Hadoop from Java.

import java.net.URI;
import java.net.URL;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.FsUrlStreamHandlerFactory;

public class FileSystemWriter
{
        static
        {
                // Register the hdfs:// URL handler so java.net.URL understands HDFS URIs.
                URL.setURLStreamHandlerFactory( new FsUrlStreamHandlerFactory() );
        }

        public static void main( String[] args ) throws Exception
        {
                String uri = "hdfs://192.168.171.128/user";
                Configuration conf = new Configuration();

                System.out.println( "uri: " + uri );

                FileSystem fs = FileSystem.get( URI.create( uri ), conf );
        }
}

But I get errors.

uri: hdfs://192.168.171.128/user

Aug 9, 2011 8:29:26 AM org.apache.hadoop.ipc.Client$Connection
handleConnectionFailure
INFO: Retrying connect to server: /192.168.171.128:8020. Already tried
0 time(s).
Aug 9, 2011 8:29:28 AM org.apache.hadoop.ipc.Client$Connection
handleConnectionFailure
INFO: Retrying connect to server: /192.168.171.128:8020. Already tried
1 time(s).
Aug 9, 2011 8:29:30 AM org.apache.hadoop.ipc.Client$Connection
handleConnectionFailure
INFO: Retrying connect to server: /192.168.171.128:8020. Already tried
2 time(s).

Can anyone help me out?


Comments (2)

一身骄傲 2024-12-05 16:22:03


First, try to connect over hftp.

        uri = "hftp://172.16.xxx.xxx:50070/";

        System.out.println( "uri: " + uri );           
        Configuration conf = new Configuration();

        FileSystem fs = FileSystem.get( URI.create( uri ), conf );
        fs.printStatistics();

If you see something (no exceptions), then you are connected.

If you do not, then your problem is not HDFS: either the IP is wrong, Hadoop isn't running, or the port is blocked, etc.
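Before reaching for Hadoop at all, you can rule out the "port is blocked" case with a plain TCP check that needs no Hadoop jars. A minimal sketch (the class name `PortCheck` is mine, and it assumes the VM address and default NameNode RPC port 8020 from the question):

```java
import java.io.IOException;
import java.net.InetSocketAddress;
import java.net.Socket;

public class PortCheck
{
        // Returns true if a TCP connection to host:port succeeds within timeoutMs.
        public static boolean isReachable( String host, int port, int timeoutMs )
        {
                try ( Socket socket = new Socket() )
                {
                        socket.connect( new InetSocketAddress( host, port ), timeoutMs );
                        return true;
                }
                catch ( IOException e )
                {
                        return false;
                }
        }

        public static void main( String[] args )
        {
                // The VM address from the question and the port the client kept retrying.
                System.out.println( isReachable( "192.168.171.128", 8020, 2000 ) );
        }
}
```

If this prints false while the web UIs work in Firefox, the VM's firewall or network mode (NAT vs. bridged) is blocking the RPC port rather than anything being wrong in the Java client.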

风向决定发型 2024-12-05 16:22:03
    
  1. Make sure your NameNode is listening on port 8020. You can check with this command:

    hadoop fs -ls hdfs://namenode(ip):8020
    
  2. If this check fails, open HADOOP_HOME/conf/core-site.xml (e.g. with vim) and look up your NameNode port in the fs.default.name entry.

  3. Change your Java code accordingly:

    String uri = "hdfs://192.168.171.128:portOfNameNode/user";
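Step 2 can also be done programmatically with stdlib XML parsing, so the client picks up whatever port core-site.xml declares instead of hard-coding it. A sketch (the class name `NameNodePort` and helper `portFromCoreSite` are mine, not part of Hadoop):

```java
import java.io.ByteArrayInputStream;
import java.net.URI;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;
import org.w3c.dom.Element;
import org.w3c.dom.NodeList;

public class NameNodePort
{
        // Scans the <property> entries of a core-site.xml document and returns
        // the port of the fs.default.name URI, or -1 if none is set.
        public static int portFromCoreSite( String xml ) throws Exception
        {
                Document doc = DocumentBuilderFactory.newInstance()
                        .newDocumentBuilder()
                        .parse( new ByteArrayInputStream( xml.getBytes( "UTF-8" ) ) );

                NodeList props = doc.getElementsByTagName( "property" );
                for ( int i = 0; i < props.getLength(); i++ )
                {
                        Element prop = (Element) props.item( i );
                        String name = prop.getElementsByTagName( "name" ).item( 0 ).getTextContent().trim();
                        if ( "fs.default.name".equals( name ) )
                        {
                                String value = prop.getElementsByTagName( "value" ).item( 0 ).getTextContent().trim();
                                return URI.create( value ).getPort();  // -1 if the URI has no explicit port
                        }
                }
                return -1;
        }

        public static void main( String[] args ) throws Exception
        {
                // Sample config matching the VM address from the question.
                String xml = "<configuration><property>"
                        + "<name>fs.default.name</name>"
                        + "<value>hdfs://192.168.171.128:8020</value>"
                        + "</property></configuration>";
                System.out.println( portFromCoreSite( xml ) );  // prints 8020
        }
}
```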
    