Hadoop DFS权限错误
2009/08/11 13:25:39 [INFO] - put: org.apache.hadoop.fs.permission.AccessControlException: Permission denied: user=yskhoo, access=WRITE, inode="":bad-boy:supergroup:rwxr-xr-x

Why do I keep getting this error when I try to put some files from my LFS to HDFS?
Answers (2)
The permission denied error is exactly what it says: user yskhoo was trying to write into a path owned by bad-boy (group supergroup, mode rwxr-xr-x, so only the owner has write access). Not sure about the blank inode name.
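You can check who owns the target path and what its mode is with the HDFS shell; the owner, group, and mode columns correspond to the `bad-boy:supergroup:rwxr-xr-x` shown in the error (the path below is illustrative, since the error does not name one):

```shell
# Show owner, group, and permissions of the target HDFS directory.
# The blank inode name in the error suggests the write hit HDFS root "/".
hadoop fs -ls /

# Confirm which local user identity the Hadoop client is running as,
# since HDFS derives the HDFS user from it.
whoami
```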
The error occurs because bad-boy is the superuser in your case. Specifically, you are trying to put a file from your local file system as user yskhoo, who does not have write permission on the target HDFS directory; that is why the error is generated. Just as in Linux no user other than root can write to /root, you cannot put a file into HDFS unless you have write access to some directory inside it. I advise you to either put the file into a world-writable directory such as /tmp, or log in as the HDFS superuser (bad-boy in your case) and put it into HDFS that way.
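A common fix along the lines of the advice above is to have the superuser create a home directory for yskhoo and hand over ownership, so future puts work without switching users. A sketch, assuming bad-boy is also a local account you can switch to with sudo, and a reasonably recent Hadoop whose shell supports `-mkdir -p`:

```shell
# As the HDFS superuser (bad-boy here), create a home directory for yskhoo
# and transfer ownership of it.
sudo -u bad-boy hadoop fs -mkdir -p /user/yskhoo
sudo -u bad-boy hadoop fs -chown yskhoo:supergroup /user/yskhoo

# Now, running as yskhoo, the put into that directory is permitted.
hadoop fs -put localfile.txt /user/yskhoo/
```

Alternatively, a single `hadoop fs -put localfile.txt /tmp/` works if /tmp in HDFS is world-writable, as it typically is.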