Hadoop DFS points to the current directory
A few months ago, we installed Cloudera Hadoop 3 on our local machine and everything was fine. Recently we also installed Whirr to start working with clusters. Although we faced some problems, after a while we could start up a cluster, log into its master node, and commence work. However, I found out recently that when I type:
hadoop dfs -ls
on our local machine, it now displays everything in the current local directory I am in, not the contents of the DFS. This didn't happen before, so we are thinking something got messed up when we installed Whirr.
What could have caused this, and more importantly, how can we get our local hadoop dfs to point to the correct location?
1 Answer
Is core-site.xml in the Hadoop installation's conf directory set to file:///?
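If that property has been rewritten to file:///, the hadoop client treats the local filesystem as the default, which would produce exactly the symptom described: hadoop dfs -ls listing the current local directory. A minimal core-site.xml sketch to point the client back at HDFS follows; the host and port (localhost:8020) are placeholders, not values from the question, so substitute your own NameNode address:

```xml
<?xml version="1.0"?>
<configuration>
  <!-- fs.default.name names the default filesystem for hadoop clients
       (Hadoop 0.20-era key, as used by CDH3). With file:/// here,
       "hadoop dfs -ls" lists the local current directory instead of HDFS. -->
  <property>
    <name>fs.default.name</name>
    <!-- hdfs://localhost:8020 is a placeholder; use your NameNode's
         actual host and port. -->
    <value>hdfs://localhost:8020</value>
  </property>
</configuration>
```

Since Whirr generates its own client configuration when it launches a cluster, it is plausible that installing or running it replaced or shadowed the original core-site.xml; checking which conf directory the hadoop command actually reads (for example, via the HADOOP_CONF_DIR environment variable) is worth doing before editing the file.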