Can't compile Spark because of an attach-javadocs failure
Goal and Problem
I'm trying to compile a minimal version of Spark to get our container size down. We only use Spark SQL and PySpark. Here's the Dockerfile I've been using:
FROM openjdk:20-bullseye
RUN apt-get update && \
apt-get install git -y && \
git clone --depth=1 --branch=v3.3.0 https://github.com/apache/spark /root/spark && \
cd /root/spark && \
./dev/make-distribution.sh --tgz --pip -pl :spark-core_2.12,:spark-sql_2.12 -P '!test-java-home,kubernetes,hadoop-3,apache-release' -DskipTests
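For reference, my (unverified) reading of dev/make-distribution.sh is that it forwards the trailing -pl/-P/-D arguments to an internal mvn clean package run, so the effective Maven invocation should look roughly like this (a hypothetical reconstruction; the wrapper adds flags of its own):

# Sketch of the Maven command make-distribution.sh presumably runs; not verified against the script.
./build/mvn clean package -DskipTests \
  -pl :spark-core_2.12,:spark-sql_2.12 \
  -P '!test-java-home,kubernetes,hadoop-3,apache-release'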
When compiling, I get the following error:
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-javadoc-plugin:3.1.1:jar (attach-javadocs) on project spark-core_2.12: MavenReportException: Error while generating Javadoc:
[ERROR] Exit code: 1 - /root/spark/core/src/main/java/org/apache/spark/SparkFirehoseListener.java:36: error: cannot find symbol
[ERROR] public class SparkFirehoseListener implements SparkListenerInterface {
[ERROR] ^
[ERROR] symbol: class SparkListenerInterface
[ERROR] /root/spark/core/src/main/java/org/apache/spark/SparkFirehoseListener.java:38: error: cannot find symbol
[ERROR] public void onEvent(SparkListenerEvent event) { }
...
followed by a long list of similar "error: cannot find symbol" messages.
Questions
How do I fix this so the build succeeds?
How do I turn off Javadoc generation from the command line? (I would really like to avoid editing files, since automating that is fragile.)
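For what it's worth, the lever I'd expect to work (untested on my end) is Maven's standard skip property for the maven-javadoc-plugin, appended to the same command:

# Same build, plus the documented maven-javadoc-plugin skip property.
./dev/make-distribution.sh --tgz --pip \
  -pl :spark-core_2.12,:spark-sql_2.12 \
  -P '!test-java-home,kubernetes,hadoop-3,apache-release' \
  -DskipTests -Dmaven.javadoc.skip=true

I'm not sure, though, whether the apache-release profile rebinds the attach-javadocs execution in a way that ignores this property, hence the question.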