Hadoop Pipes cannot find shared library
I am getting the following error while running a Hadoop Pipes program. The program compiles successfully but fails at runtime on Hadoop Pipes:

error while loading shared libraries: Lib.so.0: cannot open shared object file: No such file or directory
Makefile:
CC = g++
HADOOP_PATH = usr/lib/HADOOP
OTHERLIB1_PATH = usr/lib/OTHERLIB1
OTHERLIB2_PATH = usr/lib/OTHERLIB2
OTHERLIB3_PATH = usr/lib/OTHERLIB3
OTHERLIB4_PATH = usr/lib/OTHERLIB4
IMAGE_PATH = usr/lib/IMAGE
LIB_PATH = ../../../src/Lib
PLATFORM = Linux-amd64-64

CFLAGS_HDP = -O3 \
    -I$(LIB_PATH) \
    -I$(OTHERLIB1_PATH)/include \
    -I$(HADOOP_PATH)/$(PLATFORM)/include \
    -I$(OTHERLIB4_PATH)/include \
    -I$(OTHERLIB2_PATH)/include \
    -I$(OTHERLIB3_PATH)/include

LDFLAGS_HDP = -L$(OTHERLIB1_PATH)/lib \
    -L$(HADOOP_PATH)/$(PLATFORM)/lib \
    -L$(OTHERLIB3_PATH)/lib \
    -L$(OTHERLIB2_PATH)/lib \
    -L$(OTHERLIB4_PATH)/lib \
    -L$(LIB_PATH)/.libs \
    -lhadooppipes -lhadooputils -lpthread -lcrypto \
    -lLib -lLib4 -lLib1

all: pipes clean

clean:
	rm *.o

pipes: LibPipes.cpp xml DocToXml
	$(CC) $(CFLAGS_HDP) \
	LibPipes.cpp \
	-o Lib_Pipes base64.o \
	xml.o DocToXml.o $(LDFLAGS_HDP)

xml: xml.cpp base64
	$(CC) $(CFLAGS_HDP) -c xml.cpp -o xml.o

base64: base64.cpp
	$(CC) $(CFLAGS_HDP) -c base64.cpp -o base64.o

DocToXml: DocToXml.cpp
	$(CC) $(CFLAGS_HDP) -c DocToXml.cpp -o DocToXml.o
I run the program on hadoop using the following command:
hadoop pipes \
-D hadoop.pipes.java.recordreader=false \
-D hadoop.pipes.java.recordwriter=false \
-D mapred.map.tasks=128 \
-inputformat org.apache.hadoop.mapred.SequenceFileInputFormat \
-writer org.apache.hadoop.mapred.SequenceFileOutputFormat \
-reduce org.apache.hadoop.mapred.lib.IdentityReducer \
-input Input \
-output Output \
-program /user/uss/bin/Lib_Pipes \
-reduces 1
This seems to be a problem caused by dynamic linking. I have tried shipping the libraries to Hadoop using the -files flag. I have also tried linking the program statically with various compilation flags such as -static, -Bstatic, -static-libgcc, and -static-libstdc++, but none of these work. Does anyone know how to handle this type of binary on Hadoop Pipes? Any help would be appreciated.
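Before resorting to static linking, it can help to confirm exactly which shared objects the binary depends on and which of them the task nodes cannot resolve. A minimal diagnostic sketch (the binary name Lib_Pipes comes from the Makefile above; ideally run this on a task node, where the job actually fails):

```shell
# List the dynamic dependencies of the pipes binary; any entry
# printed as "not found" is a library the runtime linker cannot
# locate, e.g. Lib.so.0 in the error above.
ldd Lib_Pipes

# Show any RPATH/RUNPATH baked into the binary, i.e. the extra
# directories the runtime linker searches besides the defaults.
readelf -d Lib_Pipes | grep -E 'RPATH|RUNPATH'
```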
Comments (2)
The solution I use is to zip up all of your external libraries and use the -archives flag in pipes. Here lib/ contains all of the .so files that you want to link against.

EDIT: Also add -rpath my.zip/lib to LDFLAGS_HDP. For compilation to work, you will also need to do the following.

Then run pipes with your usual command, adding the -archives flag.
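Put together, the steps above might look like the following sketch. The names my.zip and lib/ come from this answer; the exact archive layout and the remaining job flags (elided with "...") follow the question's original command and are assumptions, not a verbatim recipe:

```shell
# Bundle every needed .so into my.zip. Hadoop unpacks the
# archive in the task's working directory into a directory
# named after the archive, so the libraries end up under
# my.zip/lib/ relative to where the binary runs.
zip -r my.zip lib/

# Link with an rpath relative to that working directory so the
# runtime linker finds the unpacked libraries. When invoking g++
# directly, the -rpath my.zip/lib addition to LDFLAGS_HDP would
# be spelled:
#   LDFLAGS_HDP += -Wl,-rpath,my.zip/lib

# Run the job as before, shipping the archive with -archives:
hadoop pipes \
  -archives my.zip \
  ... \
  -program /user/uss/bin/Lib_Pipes
```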
Figured out the problem. It was a space after the comma in the -files flag.