Sharing build artifacts between jobs in Hudson

Published 2024-07-18 04:52:23

I'm trying to set up our build process in Hudson.

Job 1 will be a super fast (hopefully) continuous integration build job that will be built frequently.

Job 2 will be responsible for running a comprehensive test suite, either at a regular interval or triggered manually.

Job 3 will be responsible for running analysis tools across the codebase (much like Job 2).

I tried using the "Advanced Project Options > Use custom workspace" feature so that the code compiled in Job 1 can be used in Jobs 2 and 3. However, it seems that all build artifacts remain inside the Job 1 workspace. Am I doing this right? Is there a better way of doing this? I guess I'm looking for something similar to a build pipeline setup... so that things can be shared and the appropriate jobs can be executed in stages.

(I also considered using 'batch tasks'... but it seems those can't be scheduled, only triggered manually?)

Any suggestions are welcome. Thanks!

Comments (8)

吹泡泡o 2024-07-25 04:52:24

You might want to try the Copy Artifact plugin:

http://wiki.hudson-ci.org/display/HUDSON/Copy+Artifact+Plugin

Your continuous job can build the necessary artifacts, and your other two jobs can pull them in to do analysis.
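
For illustration, a minimal sketch of the handoff with an Ant build (the target name, the jar dependency, and the dist/ layout are assumptions, not part of the answer): Job 1 stages its outputs into a dist/ directory, archives them with Hudson's artifact archiving option, and Jobs 2 and 3 use the Copy Artifact plugin's build step to pull them from Job 1.

<!-- Hypothetical Job 1 target: stage build outputs where Hudson can archive them -->
<target name="stage-artifacts" depends="jar" description="Copy build artifacts into dist/ for archiving">
  <mkdir dir="${basedir}/dist"/>
  <!-- ${project.jar} is assumed to be defined elsewhere in the build -->
  <copy todir="${basedir}/dist" file="${project.jar}"/>
</target>

With dist/*.jar archived by Job 1, the downstream jobs receive the jars in their own workspaces and never have to reach into Job 1's workspace.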

╰沐子 2024-07-25 04:52:24

Hudson has a plugin for just this problem: http://wiki.hudson-ci.org/display/HUDSON/Clone+Workspace+SCM+Plugin (link currently broken)

The corresponding Jenkins page is here: https://wiki.jenkins-ci.org/display/JENKINS/Clone+Workspace+SCM+Plugin

倾城泪 2024-07-25 04:52:24

Yes, that wiki page isn't very helpful, in that it tries to make it sound very elegant. The truth is that Hudson doesn't yet support job chains very elegantly if you have to pass stuff from one job to another.

I'm also doing the zip-up-and-copy-workspace method to transfer workspaces from one job to another. I have a quick build, full analysis build, and then distribution builds. In between, I use Ant to generate timestamps and "build-stamps" to mark which job's number built which other job's number. The fingerprinting feature helps keep track of files, but since I'm not going to archive the workspace zips, fingerprinting is useless to the users because they can't actually see the workspace zips.
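
As a rough sketch of that build-stamp idea (the property and file names here are invented, not the poster's actual script), an Ant target can record which build produced the workspace and then zip it up for the next job. BUILD_NUMBER and JOB_NAME are environment variables Hudson sets for each build.

<!-- Sketch: write a build-stamp, then zip the workspace for the downstream job -->
<target name="stamp-and-zip">
  <property environment="env"/>
  <tstamp>
    <format property="build.time" pattern="yyyy-MM-dd HH:mm:ss"/>
  </tstamp>
  <echo file="build-stamp.properties">built.by.job=${env.JOB_NAME}
built.by.number=${env.BUILD_NUMBER}
built.at=${build.time}
</echo>
  <!-- exclude earlier zips so the archive does not try to include itself -->
  <zip destfile="${basedir}/workspace-${env.BUILD_NUMBER}.zip" basedir="${basedir}" excludes="workspace-*.zip"/>
</target>

The downstream job unzips this and reads build-stamp.properties to know exactly which upstream build number it is working against.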

柒七 2024-07-25 04:52:24

Have you looked at the Hudson wiki? Specifically: Splitting a big job into smaller jobs

混吃等死 2024-07-25 04:52:24

I had the same issue, and what I ended up going with is separate projects for the long-running tasks. The first step in these projects was to copy all the files from the workspace of Job 1 (i.e. last build) to the Job 2/3/etc workspaces. This usually worked unless Job 1 was building at the time Job 2/3 started, since it would get an incomplete workspace. You could work around this by detecting "end of build" in Job 1 with a sentinel file, or use the Hudson locks plugin (I haven't tried).

You don't have to use a custom workspace if you make assumptions about the placement of the other jobs relative to %WORKSPACE%.
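
A sketch of what that first build step could look like in Ant, assuming both jobs run on the same machine with the classic jobs/<name>/workspace layout and a hypothetical build-complete.marker sentinel written as Job 1's last step (the job name and paths are illustrative):

<!-- Sketch: refuse to start if Job 1 has not finished, then pull in its workspace -->
<target name="import-job1-workspace">
  <property name="job1.workspace" location="${basedir}/../../Job1/workspace"/>
  <available file="${job1.workspace}/build-complete.marker" property="job1.finished"/>
  <fail unless="job1.finished" message="Job 1 workspace is missing or still building"/>
  <copy todir="${basedir}/import" overwrite="true">
    <fileset dir="${job1.workspace}"/>
  </copy>
</target>

This still leaves a small race window (Job 1 could start a new build mid-copy), which is exactly why the locks plugin is mentioned above.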

此生挚爱伱 2024-07-25 04:52:24

I'm doing something like that now. I'd recommend avoiding any attempt to run many jobs in the same shared workspace. I've only had problems with that.

I'm using Maven and the free-form project type. One set of jobs runs when files in the version control system trigger it. They create local snapshot artifacts. A second set of jobs runs nightly, sets up an integration test environment, and then runs tests on it.

If you aren't using Maven, one option is to set up an area on disk and have the final steps in job one copy the artifacts to that spot. The first steps of job two should be to move those files over. Then run whatever you need to run.

As for job three, there are FindBugs/Checkstyle/PMD et al. plugins for Hudson now. I'd recommend just creating a version of job 1 that does a clean nightly checkout and runs those on your code base.
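
As one hedged example of what such an analysis target might look like (a sketch only, not the poster's setup: it assumes the PMD jar is checked in under tools/, and the ruleset path and task attributes vary between PMD versions):

<!-- Sketch: run PMD over the freshly checked-out sources and write an XML report -->
<target name="analysis">
  <taskdef name="pmd" classname="net.sourceforge.pmd.ant.PMDTask" classpath="tools/pmd.jar"/>
  <pmd rulesetfiles="rulesets/basic.xml">
    <formatter type="xml" toFile="pmd-report.xml"/>
    <fileset dir="src">
      <include name="**/*.java"/>
    </fileset>
  </pmd>
</target>

The Hudson PMD plugin can then be pointed at pmd-report.xml to trend the results; the FindBugs and Checkstyle plugins work the same way with their own report files.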

楠木可依 2024-07-25 04:52:24

Hudson doesn't appear to have a built-in repository for build artifacts. Our solution was to create one.

We are in a Windows environment, so I created a share that can be accessed by all Hudson servers (we give the relevant services a common account, since the system account cannot access resources across the network).

Within our build scripts (Ant), we have tasks that copy resources built by other jobs into the local workspace, and jobs that generate artifacts copy them into the common repository.

In other environments, you could publish and fetch via FTP or any other mechanism for moving files.

Simplistic examples of publish and get tasks:

<!-- ==================== Publish ==================================== -->
<target name="Publish" description="Publish files">
  <mkdir dir="${publish.dir}/lib" />
  <copy todir="${publish.dir}/lib" file="${project.jar}"/>
</target>

and

<!-- ==================== Get ==================================== -->
<target name="getdependencies" description="Get necessary results from published directory">
  <copy todir="${support.dir}">
    <fileset dir="${publish.dir}/lib">
      <include name="*.jar"/>
    </fileset>
  </copy>
</target>
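
Each job can then pass the share location in as a property on the Ant command line; the path below is made up for illustration:

ant -Dpublish.dir=\\buildserver\hudson-artifacts\job1 Publish

Pointing publish.dir at a local directory lets the same build file work on a developer machine without the share.
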
财迷小姐 2024-07-25 04:52:24

I agree that the current practice of manually copying files/artifacts/workspaces between jobs is less than elegant.

Also, I found it wasteful in space and time to archive huge tgz/zip files. In our case, these files were huge (1.5G) and took a long time to pack/archive/fingerprint/unpack.

So I settled on a slightly optimised variant of the same:

  • Jobs 1/2/3 all check out/clone the same source repository, but
  • Job 1 only packs the files that are actually build artifacts
    • Git makes this easy and fast via git ls-files -oz; not sure about other SCMs (see the sketch after this list)
  • use the Copy Artifact plugin to transfer the files
  • This reduces those files to about a third of the size in our case -> speedup, less space wasted
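
A sketch of the "pack only the build outputs" step in Ant terms (assuming the build products are the untracked files in the Git work tree; NUL-separated -z output does not suit Ant's includesfile, so this uses plain line-separated output, and all names are illustrative):

<!-- Sketch: list untracked files (the build outputs) and zip only those -->
<target name="pack-build-artifacts">
  <exec executable="git" output="untracked-files.txt" failonerror="true">
    <arg value="ls-files"/>
    <arg value="--others"/>
  </exec>
  <zip destfile="build-artifacts.zip" basedir="${basedir}"
       includesfile="untracked-files.txt"
       excludes="untracked-files.txt,build-artifacts.zip"/>
</target>

The resulting build-artifacts.zip is what the Copy Artifact plugin transfers, instead of a 1.5G workspace archive.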