Hadoop Pipes cannot find a shared library

Posted 2024-12-13 05:39:18


I am getting this error while running a Hadoop Pipes program. The program compiles successfully but fails when run under Hadoop Pipes.

error while loading shared libraries: Lib.so.0: cannot open shared object file: No such file or directory

Makefile:

CC = g++
HADOOP_PATH = usr/lib/HADOOP
OTHERLIB1_PATH = usr/lib/OTHERLIB1
OTHERLIB2_PATH = usr/lib/OTHERLIB2
OTHERLIB3_PATH = usr/lib/OTHERLIB3
OTHERLIB4_PATH = usr/lib/OTHERLIB4
IMAGE_PATH = usr/lib/IMAGE
LIB_PATH = ../../../src/Lib
PLATFORM = Linux-amd64-64

CFLAGS_HDP =  -O3 \
        -I$(LIB_PATH) \
        -I$(OTHERLIB1_PATH)/include \
        -I$(HADOOP_PATH)/$(PLATFORM)/include \
        -I$(OTHERLIB4_PATH)/include \
        -I$(OTHERLIB2_PATH)/include \
        -I$(OTHERLIB3_PATH)/include 

LDFLAGS_HDP =   -L$(OTHERLIB1_PATH)/lib \
        -L$(HADOOP_PATH)/$(PLATFORM)/lib \
        -L$(OTHERLIB3_PATH)/lib \
        -L$(OTHERLIB2_PATH)/lib \
        -L$(OTHERLIB4_PATH)/lib \
        -L$(LIB_PATH)/.libs \
        -lhadooppipes -lhadooputils -lpthread -lcrypto \
        -lLib -lLib4 -lLib1

all: pipes clean 

clean:
        rm  *.o

pipes: LibPipes.cpp xml DocToXml
        $(CC) $(CFLAGS_HDP) \
        LibPipes.cpp \
        -o Lib_Pipes base64.o \
        xml.o DocToXml.o $(LDFLAGS_HDP)


xml: xml.cpp base64
        $(CC) $(CFLAGS_HDP) base64.o -c xml.cpp -o xml.o

base64: base64.cpp
        $(CC) $(CFLAGS_HDP) -c base64.cpp -o base64.o

DocToXml: DocToXml.cpp
        $(CC) $(CFLAGS_HDP) -c DocToXml.cpp -o  DocToXml.o

I run the program on hadoop using the following command:

hadoop pipes \
-D hadoop.pipes.java.recordreader=false \
-D hadoop.pipes.java.recordwriter=false \
-D mapred.map.tasks=128 \
-inputformat org.apache.hadoop.mapred.SequenceFileInputFormat \
-writer org.apache.hadoop.mapred.SequenceFileOutputFormat \
-reduce org.apache.hadoop.mapred.lib.IdentityReducer \
-input Input \
-output Output \
-program /user/uss/bin/Lib_Pipes \
-reduces 1

This seems to be a problem caused by dynamic linking. I have tried shipping the libraries to Hadoop with the -files flag. I have also tried linking the program statically with various compilation flags such as -static, -Bstatic, -static-libgcc, and -static-libstdc++, but none of these worked either. Does anyone know how to handle this type of binary on Hadoop Pipes? Any help would be appreciated.

Comments (2)

绅士风度i 2024-12-20 05:39:18


The solution I use is to zip up all of your external libraries and use the -archives flag in pipes.

zip -r my.zip lib/

Here lib/ contains all your .so files that you want to link against.

EDIT

Also add -rpath my.zip/lib to LDFLAGS_HDP. For compilation to work, you will also need to do the following.

# in src dir
mkdir -p my.zip/lib

/EDIT

Then use your command to run pipes and add in

-archives my.zip
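Concretely, the extra link flag might look like the fragment below. This is my sketch of the edit described above, not the answerer's exact Makefile; note that when the linker is driven through g++, -rpath typically has to be forwarded as -Wl,-rpath,&lt;dir&gt;.

```makefile
# Relative rpath: at task runtime the working directory contains the
# unpacked my.zip/ archive, so the loader resolves the .so files
# under my.zip/lib without any LD_LIBRARY_PATH tweaks.
LDFLAGS_HDP += -Wl,-rpath,my.zip/lib
```

A relative rpath works here because Hadoop unpacks -archives into the task's working directory, which is also where the program is executed.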
短叹 2024-12-20 05:39:18


Figured out the problem: it was a space after the comma in the -files flag.
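The failure mode is easy to see from how the shell tokenizes the command line (a minimal illustration of my own, using placeholder library names):

```shell
# With a space after the comma, ", libB.so" becomes a separate argument,
# so -files only receives "libA.so," and the second library is never shipped.
set -- -files libA.so, libB.so
echo "$#"    # 3

# Without the space, -files receives the whole comma-separated list.
set -- -files libA.so,libB.so
echo "$#"    # 2
```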
