Problem starting the tasktracker when running Hadoop on Windows

Posted 2024-11-14 04:11:27

I am trying to use Hadoop under Windows and I am running into a problem when I want to start the tasktracker. For example:

$bin/start-all.sh

then the log reads:

2011-06-08 16:32:18,157 ERROR org.apache.hadoop.mapred.TaskTracker: Can not start task tracker because java.io.IOException: Failed to set permissions of path: /tmp/hadoop-Administrator/mapred/local/taskTracker to 0755
    at org.apache.hadoop.fs.RawLocalFileSystem.checkReturnValue(RawLocalFileSystem.java:525)
    at org.apache.hadoop.fs.RawLocalFileSystem.setPermission(RawLocalFileSystem.java:507)
    at org.apache.hadoop.fs.RawLocalFileSystem.mkdirs(RawLocalFileSystem.java:318)
    at org.apache.hadoop.fs.FilterFileSystem.mkdirs(FilterFileSystem.java:183)
    at org.apache.hadoop.mapred.TaskTracker.initialize(TaskTracker.java:630)
    at org.apache.hadoop.mapred.TaskTracker.<init>(TaskTracker.java:1328)
    at org.apache.hadoop.mapred.TaskTracker.main(TaskTracker.java:3430)

What's the problem? How can I solve this? Thanks!

Comments (5)

謌踐踏愛綪 2024-11-21 04:11:27

I ran into this issue with an installation of 1.0.3 on Windows Server. I changed the default directories in hdfs-site.xml so that the directories Hadoop creates for the dfs are subdirectories of the Cygwin directory, like this...

...

 <property>
    <name>dfs.name.dir</name>
    <value>c:/cygwin/usr/mydir/dfs/logs</value>
 </property>
 <property>
    <name>dfs.data.dir</name>
    <value>c:/cygwin/usr/mydir/dfs/data</value>
 </property>
</configuration>

This seemed to resolve the problem.

The Apache documentation for the config files is here.
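
For reference, the snippet above is cut off at the top; a complete hdfs-site.xml along those lines would look roughly like the sketch below. The c:/cygwin/usr/mydir paths are just the example values from this answer, so substitute your own Cygwin directory.

    <?xml version="1.0"?>
    <!-- hdfs-site.xml: point the NameNode and DataNode storage at
         directories under the Cygwin tree, as suggested above -->
    <configuration>
      <property>
        <name>dfs.name.dir</name>
        <value>c:/cygwin/usr/mydir/dfs/logs</value>
      </property>
      <property>
        <name>dfs.data.dir</name>
        <value>c:/cygwin/usr/mydir/dfs/data</value>
      </property>
    </configuration>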

笨笨の傻瓜 2024-11-21 04:11:27

Change the owner of the hadoop-Administrator folder; you can use the chown command for that.
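
For example, from a Cygwin shell, something along these lines; the path and user name are taken from the error message above, so adjust them to your setup:

    # Re-own the local mapred directory for the account that starts the
    # tasktracker, then give it the 0755 permissions Hadoop tries to set.
    chown -R Administrator /tmp/hadoop-Administrator
    chmod -R 755 /tmp/hadoop-Administrator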

你是暖光i 2024-11-21 04:11:27

This issue was raised on the Apache Hadoop user mailing list. It appears to be a problem in some release versions of Hadoop and not others.

A simple solution is to download a different version of Hadoop (assuming you do not require a specific Hadoop version for some other reason).

I encountered this exact issue with version 1.0.0 (beta).

I then tried 0.23.0 but got a fatal ClassNotFoundException:

log4j:ERROR Could not find value for key log4j.appender.NullAppender
log4j:ERROR Could not instantiate appender named "NullAppender".
Exception in thread "main" java.lang.ClassNotFoundException: hadoop-mapreduce-examples-0.23.0.jar
    at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:423)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:356)
    at java.lang.Class.forName0(Native Method)
    at java.lang.Class.forName(Class.java:264)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:182)

Finally I tried version 0.22.0 and that worked without error. Therefore I recommend you try downloading and installing version 0.22.0: http://hadoop.apache.org/common/releases.html#10+December%2C+2011%3A+release+0.22.0+available
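
If you want to go that route, the steps are roughly as follows; the archive.apache.org mirror path is an assumption, so use whichever download link the releases page above points you to:

    # Fetch and unpack the 0.22.0 release (mirror path is a guess; take the
    # link from the releases page if it differs)
    wget http://archive.apache.org/dist/hadoop/common/hadoop-0.22.0/hadoop-0.22.0.tar.gz
    tar -xzf hadoop-0.22.0.tar.gz
    cd hadoop-0.22.0
    # Copy your conf/*.xml settings over, then start the daemons as before
    # (script names can differ slightly between release lines)
    bin/start-all.sh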

桃扇骨 2024-11-21 04:11:27

There appears to be a permissions issue related to the path
/tmp/hadoop-Administrator/mapred/local/taskTracker
as evidenced by the error message

ERROR org.apache.hadoop.mapred.TaskTracker: Can not start task tracker because java.io.IOException: Failed to set permissions of path: /tmp/hadoop-Administrator/mapred/local/taskTracker

The account the taskTracker is started under needs to be able to chmod the specified folder. It may need more control, such as being the owner, for other aspects; I don't recall the specific permissions required by the components in a Hadoop setup.

I haven't dealt much with the permissions side of setting up Hadoop, especially on Windows (at all), so what I'm saying is based heavily on the error message you've provided. I also haven't dealt with Cygwin folder permissions, so I don't know the exact fix, but hopefully this points you in the right direction.
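
As a quick sanity check, you can verify from the Cygwin shell you start the daemons from whether that account can actually chmod the directory; the path below is copied from the error message above:

    # Which account are the daemons running as?
    whoami
    # Create the directory from the error message if it is not there yet,
    # then check its ownership and try the same chmod Hadoop attempts
    mkdir -p /tmp/hadoop-Administrator/mapred/local/taskTracker
    ls -ld /tmp/hadoop-Administrator/mapred/local/taskTracker
    chmod 755 /tmp/hadoop-Administrator/mapred/local/taskTracker && echo "chmod ok"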
