Cannot open HDFS file from mapper (Hadoop)
I searched a lot but failed to find a solution to this problem.
The file I want to access is in HDFS, but it is not on the input path (the path that was given as input to the map/reduce job), and I want to access it from the mapper.
The HDFS path specified as the input path is perfectly accessible from the mapper, but other HDFS files are not.
Inside the mapper:
FileSystem FS1 = FileSystem.get(conf);
Path path = new Path("" + FS1.getHomeDirectory());
FSDataInputStream fsdis = FS1.open(path);
This results in the following error:
java.io.IOException: Cannot open filename /user/hadoop
Thanks in advance,
Harsh
2 Answers
I remember using this tutorial to get something similar working. You can give it a try; it has only a few differences from what you've written, but it might still help...

@Edit: ah, and I just noticed (after reading the comments) that you are trying to open
FS1.getHomeDirectory()
and that is a directory. You should point to a file, not a directory, I think (you can check it out in the linked tutorial under "Reading data from a file").
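For illustration, below is a minimal sketch of that fix using the org.apache.hadoop.mapreduce API: the side file is opened once in setup(), and the Path points at a concrete file rather than at the home directory itself. The file name lookup.txt, the mapper's key/value types, and what is done with the side data are assumptions made for this example, not details taken from the question or the tutorial.

import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;
import java.util.ArrayList;
import java.util.List;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataInputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;

public class SideFileMapper extends Mapper<LongWritable, Text, Text, Text> {

    // Lines of the side file, loaded once per mapper task in setup().
    private final List<String> sideLines = new ArrayList<String>();

    @Override
    protected void setup(Context context) throws IOException, InterruptedException {
        Configuration conf = context.getConfiguration();
        FileSystem fs = FileSystem.get(conf);

        // Point at a concrete file under the home directory, not at the
        // home directory itself ("lookup.txt" is a hypothetical name).
        Path sideFile = new Path(fs.getHomeDirectory(), "lookup.txt");

        FSDataInputStream in = fs.open(sideFile);
        try {
            BufferedReader reader = new BufferedReader(new InputStreamReader(in));
            String line;
            while ((line = reader.readLine()) != null) {
                sideLines.add(line);
            }
        } finally {
            in.close(); // close the stream, but not the shared FileSystem
        }
    }

    @Override
    protected void map(LongWritable key, Text value, Context context)
            throws IOException, InterruptedException {
        // Combine the regular input record with the side data as needed;
        // here we just emit the record together with the side file's line count.
        context.write(value, new Text(String.valueOf(sideLines.size())));
    }
}

Opening the file once in setup() avoids re-reading it from HDFS for every input record; the input stream is closed when done, but the shared FileSystem instance is not.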
Can you try this once?