Cannot compile Spark because of the attach-javadocs goal


Goal and Problem

I'm trying to compile a minimal version of Spark to get our container size down. We only use Spark SQL and PySpark. Here's the Dockerfile I've been using:

FROM openjdk:20-bullseye

RUN apt-get update && \
    apt-get install git -y && \
    git clone --depth=1 --branch=v3.3.0 https://github.com/apache/spark /root/spark && \
    cd /root/spark && \
    ./dev/make-distribution.sh --tgz --pip -pl :spark-core_2.12,:spark-sql_2.12 -P '!test-java-home,kubernetes,hadoop-3,apache-release' -DskipTests

When compiling, I get the following error:

[ERROR] Failed to execute goal org.apache.maven.plugins:maven-javadoc-plugin:3.1.1:jar (attach-javadocs) on project spark-core_2.12: MavenReportException: Error while generating Javadoc:
[ERROR] Exit code: 1 - /root/spark/core/src/main/java/org/apache/spark/SparkFirehoseListener.java:36: error: cannot find symbol
[ERROR] public class SparkFirehoseListener implements SparkListenerInterface {
[ERROR]                                               ^
[ERROR]   symbol: class SparkListenerInterface
[ERROR] /root/spark/core/src/main/java/org/apache/spark/SparkFirehoseListener.java:38: error: cannot find symbol
[ERROR]   public void onEvent(SparkListenerEvent event) { }
...

After that there's just a long run of similar "error: cannot find symbol" messages.

Question

How do I fix this so the build succeeds?
How do I turn off Javadoc generation from the command line? I would really like to avoid editing files in the repo, since automating that is fragile. (A sketch of what I'm considering is below.)
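For the second question, my working assumption (not verified) is that make-distribution.sh forwards flags it doesn't recognize straight to Maven, the same way -pl, -P and -DskipTests above get through, so adding the standard maven-javadoc-plugin skip property might be enough:

FROM openjdk:20-bullseye

# Same build as above; the extra -Dmaven.javadoc.skip=true asks maven-javadoc-plugin
# to skip Javadoc generation entirely (assuming make-distribution.sh passes it on to Maven).
RUN apt-get update && \
    apt-get install git -y && \
    git clone --depth=1 --branch=v3.3.0 https://github.com/apache/spark /root/spark && \
    cd /root/spark && \
    ./dev/make-distribution.sh --tgz --pip -pl :spark-core_2.12,:spark-sql_2.12 \
        -P '!test-java-home,kubernetes,hadoop-3,apache-release' \
        -DskipTests -Dmaven.javadoc.skip=true

I haven't confirmed whether this actually suppresses the attach-javadocs execution when the apache-release profile is active, so I'd still like to know the recommended way to do it.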
