Hadoop DFS permission issue when running a job
I'm getting the following permission error, and am not sure why hadoop is trying to write to this particular folder:
hadoop jar /usr/lib/hadoop/hadoop-*-examples.jar pi 2 100000
Number of Maps = 2
Samples per Map = 100000
Wrote input for Map #0
Wrote input for Map #1
Starting Job
org.apache.hadoop.security.AccessControlException: org.apache.hadoop.security.AccessControlException: Permission denied: user=myuser, access=WRITE, inode="/":hdfs:supergroup:drwxr-xr-x
Any idea why it is trying to write to the root of my hdfs?
Update: After temporarily setting the hdfs root (/) to 777 permissions, I saw that a "/tmp" folder is being written. I suppose one option is to just create a "/tmp" folder with open permissions for everyone to write to, but from a security standpoint it would be nicer if this were instead written to the user folder (i.e. /user/myuser/tmp).
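For reference, a minimal sketch of the two options mentioned in the update, assuming the HDFS superuser is the hdfs account shown in the error and the job user is myuser (both assumptions for illustration):

# Option A: a world-writable /tmp inside HDFS, as described in the update
sudo -u hdfs hadoop fs -mkdir /tmp
sudo -u hdfs hadoop fs -chmod 777 /tmp

# Option B: give the job user a home directory it owns, so scratch data can live under /user/myuser
sudo -u hdfs hadoop fs -mkdir /user/myuser
sudo -u hdfs hadoop fs -chown myuser /user/myuser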
4 Answers
I was able to get this working with the following setting:
A restart of the jobtracker service was required as well (special thanks to Jeff on the Hadoop mailing list for helping me track down the problem!).
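The setting itself isn't reproduced above. Since a jobtracker restart was needed, it was most likely a mapred-site.xml property; a plausible reconstruction (an assumption, not the author's confirmed config) is to move the jobtracker staging directory under the per-user home directories:

<!-- mapred-site.xml (assumed): stage job submission files under /user/${user.name} instead of the default under hadoop.tmp.dir -->
<property>
  <name>mapreduce.jobtracker.staging.root.dir</name>
  <value>/user</value>
</property>

With a setting like this, job submissions are staged under /user/myuser/.staging instead of under a /tmp path near the HDFS root, which matches the writes observed in the update.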
1) Create the {mapred.system.dir}/mapred directory in HDFS (a sketch of the command follows below)
2) Give permission on it to the mapred user
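The commands themselves are missing from this copy of the answer; a minimal sketch of what the two steps usually look like, assuming mapred.system.dir resolves to /mapred/system and that hdfs is the HDFS superuser (both assumptions):

# 1) create the system directory the jobtracker expects (the exact path is whatever mapred.system.dir points at)
sudo -u hdfs hadoop fs -mkdir /mapred/system

# 2) hand it over to the mapred user so the jobtracker can write there
sudo -u hdfs hadoop fs -chown -R mapred /mapred/system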
You can also make a new user named "hdfs". Quite a simple solution, but probably not as clean.
Of course, this applies when you are using Hue with Cloudera Hadoop Manager (CDH3).
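A rough sketch of that workaround, assuming a Linux node where the HDFS superuser is the hdfs account from the error above (note this bypasses the permission model rather than fixing it):

# create a local "hdfs" account and submit the job as that user
sudo useradd hdfs
sudo -u hdfs hadoop jar /usr/lib/hadoop/hadoop-*-examples.jar pi 2 100000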
You need to set the permission on the hadoop root directory (/), not on the system's root directory. I was confused at first too, but then realized that the directory mentioned is in hadoop's file system, not the operating system's.
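To make the distinction concrete: permissions on the HDFS root are inspected and changed through the hadoop fs shell, not with the local chmod (superuser account assumed to be hdfs, as above):

# look at / inside HDFS -- this is the directory named in the error message
hadoop fs -ls /
# change its permissions in HDFS (a local "chmod 755 /" would touch the OS root and have no effect on HDFS)
sudo -u hdfs hadoop fs -chmod 755 /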