Linker Errors with Hadoop Pipes

Hadoop n00b here, just started playing around with Hadoop Pipes. I'm getting linker errors while compiling a simple WordCount example using hadoop-0.20.203 (currently the most recent version) that did not appear for the same code under hadoop-0.20.2.

Linker errors of the form: undefined reference to `EVP_sha1' in HadoopPipes.cc.

EVP_sha1 (like all of the undefined references I get) is part of the OpenSSL library, which HadoopPipes.cc uses in hadoop-0.20.203 but not in hadoop-0.20.2.

I've tried adjusting my makefile to link to the ssl libraries, but I'm still out of luck. Any ideas would be greatly appreciated. Thanks!

PS, here is my current makefile:

CC = g++
HADOOP_INSTALL = /usr/local/hadoop-0.20.203.0
SSL_INSTALL = /usr/local/ssl
PLATFORM = Linux-amd64-64
CPPFLAGS = -m64 -I$(HADOOP_INSTALL)/c++/$(PLATFORM)/include -I$(SSL_INSTALL)/include

WordCount: WordCount.cc
    $(CC) $(CPPFLAGS) $< -Wall -Wextra -L$(SSL_INSTALL)/lib -lssl -lcrypto -L$(HADOOP_INSTALL)/c++/$(PLATFORM)/lib -lhadooppipes -lhadooputils -lpthread -g -O2 -o $@

The actual program I'm using can be found at http://cs.smith.edu/dftwiki/index.php/Hadoop_Tutorial_2.2_--_Running_C%2B%2B_Programs_on_Hadoop
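
Before touching the link line, it can help to confirm with nm where EVP_sha1 actually lives. A minimal sketch, assuming the same SSL_INSTALL, HADOOP_INSTALL and PLATFORM variables as the makefile above and a shared libcrypto.so under /usr/local/ssl/lib (for a static-only OpenSSL build, run plain nm against libcrypto.a instead); the check-symbols target name is purely illustrative:

check-symbols:
    # EVP_sha1 should be reported as defined (T) by libcrypto ...
    nm -D $(SSL_INSTALL)/lib/libcrypto.so | grep EVP_sha1
    # ... and as undefined (U) by the libhadooppipes.a that references it
    nm $(HADOOP_INSTALL)/c++/$(PLATFORM)/lib/libhadooppipes.a | grep EVP_sha1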


Comments (2)

卷耳 2024-11-15 09:27:02

You just need to make some changes to your Makefile. The libraries that ship natively with hadoop don't seem to do it; you'll need to "re-make" them and change your link path.
A comprehensive answer to this can be found at http://goo.gl/y5iGZF.

躲猫猫 2024-11-15 09:24:13

Had the same problem here: the answer is to add -lcrypto to the compile command line:

http://grokbase.com/p/hadoop.apache.org/common-user/2011/06/re-linker-errors-with-hadoop-pipes/09zqdt5grdudu7no7q6k3gfcynpy

Here is a patch to fix the build process:

diff --git src/examples/pipes/Makefile.in src/examples/pipes/Makefile.in
index 17efa2a..1d8af8e 100644
--- src/examples/pipes/Makefile.in
+++ src/examples/pipes/Makefile.in
@@ -233,7 +233,7 @@ AM_CXXFLAGS = -Wall -I$(HADOOP_UTILS_PREFIX)/include \
         -I$(HADOOP_PIPES_PREFIX)/include

LDADD = -L$(HADOOP_UTILS_PREFIX)/lib -L$(HADOOP_PIPES_PREFIX)/lib \
-      -lhadooppipes -lhadooputils
+      -lhadooppipes -lhadooputils -lcrypto


# Define the sources for each program
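
That patch covers hadoop's bundled pipes examples. For the standalone makefile in the question, -lcrypto is already on the command line, so the likely remaining culprit is library order: if /usr/local/ssl provides only static archives (common for a from-source OpenSSL build), GNU ld resolves archive symbols left to right, and -lcrypto has to appear after -lhadooppipes, which is where the EVP_sha1 references originate. A minimal sketch of the reordered rule, reusing the question's variables rather than anything taken verbatim from either answer:

# Link libraries after the code that needs them: -lhadooppipes pulls in the
# EVP_sha1 references, and -lssl -lcrypto listed afterwards resolve them.
WordCount: WordCount.cc
    $(CC) $(CPPFLAGS) $< -Wall -Wextra -g -O2 \
        -L$(HADOOP_INSTALL)/c++/$(PLATFORM)/lib -lhadooppipes -lhadooputils \
        -L$(SSL_INSTALL)/lib -lssl -lcrypto -lpthread -o $@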