Linker errors with Hadoop Pipes
Hadoop n00b here, just started playing around with Hadoop Pipes. I'm getting linker errors while compiling a simple WordCount example using hadoop-0.20.203 (current most recent version) that did not appear for the same code in hadoop-0.20.2.
Linker errors of the form: undefined reference to `EVP_sha1' in HadoopPipes.cc.
EVP_sha1 (and all of the undefined references I get) are part of the openssl library which HadoopPipes.cc from hadoop-0.20.203 uses, but hadoop-0.20.2 does not.
I've tried adjusting my makefile to link to the ssl libraries, but I'm still out of luck. Any ideas would be greatly appreciated. Thanks!
PS, here is my current makefile:
CC = g++
HADOOP_INSTALL = /usr/local/hadoop-0.20.203.0
SSL_INSTALL = /usr/local/ssl
PLATFORM = Linux-amd64-64
CPPFLAGS = -m64 -I$(HADOOP_INSTALL)/c++/$(PLATFORM)/include -I$(SSL_INSTALL)/include
WordCount: WordCount.cc
	$(CC) $(CPPFLAGS) $< -Wall -Wextra -L$(SSL_INSTALL)/lib -lssl -lcrypto -L$(HADOOP_INSTALL)/c++/$(PLATFORM)/lib -lhadooppipes -lhadooputils -lpthread -g -O2 -o $@
The actual program I'm using can be found at http://cs.smith.edu/dftwiki/index.php/Hadoop_Tutorial_2.2_--_Running_C%2B%2B_Programs_on_Hadoop
2 Answers
You just need to make some changes to your Makefile. The C++ libraries that ship with Hadoop don't seem to link correctly as provided; you'll need to "re-make" them and change the path you link against.
A comprehensive answer to this can be found at http://goo.gl/y5iGZF.
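If the prebuilt libraries under c++/$(PLATFORM) are indeed the problem, one way to rebuild them is a helper target like the sketch below. It assumes the hadoop-0.20.203 source tree keeps autotools projects under src/c++/utils and src/c++/pipes (as other 0.20.x releases do); the target name rebuild-native is made up for illustration.

# Hypothetical target: rebuild the native utils/pipes libraries and install
# them back under c++/$(PLATFORM), so the -L paths in the question's makefile
# point at freshly built copies. The src/c++/... layout is an assumption
# based on other 0.20.x source trees.
rebuild-native:
	cd $(HADOOP_INSTALL)/src/c++/utils && ./configure --prefix=$(HADOOP_INSTALL)/c++/$(PLATFORM) && make install
	cd $(HADOOP_INSTALL)/src/c++/pipes && ./configure --prefix=$(HADOOP_INSTALL)/c++/$(PLATFORM) && make install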
Had the same problem here: the answer is to add -lcrypto to the compile command line:
http://grokbase.com/p/hadoop.apache.org/common-user/2011/06/re-linker-errors-with-hadoop-pipes/09zqdt5grdudu7no7q6k3gfcynpy
Here is a patch to fix the build process:
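The patch itself isn't reproduced above, but the -lcrypto hint also points at library order: the question's makefile lists -lssl -lcrypto before -lhadooppipes, and GNU ld only resolves a static library's undefined symbols (such as EVP_sha1) from libraries that appear later on the command line, so the reference can stay undefined even though -lcrypto is present. A minimal sketch of a reordered link rule, reusing the variables from the question's makefile (an illustration, not the referenced patch):

# Sketch only: -lhadooppipes/-lhadooputils come first, then the OpenSSL
# libraries that satisfy their EVP_* references, then -lpthread.
WordCount: WordCount.cc
	$(CC) $(CPPFLAGS) $< -Wall -Wextra -g -O2 -L$(HADOOP_INSTALL)/c++/$(PLATFORM)/lib -lhadooppipes -lhadooputils -L$(SSL_INSTALL)/lib -lssl -lcrypto -lpthread -o $@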