Software for tuning/calibrating properties of heuristic algorithms

Published 2024-08-24 00:14:09


Today I read that there is a piece of software called WinCalibra (scroll down a bit) which can take a text file with properties as input.

This program can then optimize the input properties based on the output values of your algorithm. See this paper or the user documentation for more information (see the link above; sadly, the doc is a zipped exe).

Do you know of other software that can do the same and runs under Linux? (preferably open source)

EDIT: Since I need this for a Java application: should I invest my research in Java libraries like GAUL or Watchmaker? The problem is that I don't want to roll my own solution, nor do I have time to do so. Do you have pointers to an out-of-the-box application like Calibra? (Internet searches weren't successful; I only found libraries.)

Although I didn't find a satisfactory solution (an out-of-the-box application), I decided to give away the bounty (otherwise no one would have a benefit) :-(


Comments (4)

仙女 2024-08-31 00:14:09


Some kind of (Metropolis-algorithm-like) probability-selected random walk is a possibility in this instance. Perhaps with simulated annealing to improve the final selection. Though the timing parameters you've supplied are not optimal for getting a really great result this way.

It works like this:

  1. You start at some point. Use your existing data to pick one that looks promising (like the highest value you've got). Set o to the output value at this point.
  2. You propose a randomly selected step in the input space, assign the output value there to n.
  3. Accept the step (that is, update the working position) if 1) n > o, or 2) the new value is lower but a random number on [0,1) is less than f(n/o), for some monotonically increasing f() with range and domain on [0,1).
  4. Repeat steps 2 and 3 as long as you can afford, collecting statistics at each step.
  5. Finally compute the result. In your case an average of all points is probably sufficient.
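
The five numbered steps can be sketched in a few lines. This is my own illustration (in Python for brevity, though the question targets Java): `metropolis_walk` is not a library function, and the sketch assumes the algorithm's output values are positive, so that n/o falls in [0,1) for downhill moves and matches the simple choice f(x) = x.

```python
import random

def metropolis_walk(objective, start, step_size=0.1, n_steps=1000,
                    f=lambda x: x):
    """Metropolis-like random walk over the input space (maximizing).

    Accepts every uphill move; accepts a downhill move with
    probability f(n/o).  Assumes objective values are positive so
    that n/o lies in [0, 1) whenever the move is downhill.
    """
    pos = list(start)                 # step 1: start at a promising point
    o = objective(pos)
    samples = []
    for _ in range(n_steps):
        # Step 2: propose a random step in the input space.
        cand = [x + random.uniform(-step_size, step_size) for x in pos]
        n = objective(cand)
        # Step 3: accept if better, or with probability f(n/o) if worse.
        if n > o or random.random() < f(n / o):
            pos, o = cand, n
        samples.append(list(pos))     # step 4: collect statistics
    # Step 5: here, the average of all visited points.
    return [sum(s[i] for s in samples) / len(samples)
            for i in range(len(start))]
```

For instance, with a bell-shaped toy objective peaked at 2, the returned average settles near 2.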

Important frill: this approach has trouble if the space has many local maxima with deep dips between them, unless the step size is big enough to get past the dips; but big steps make the whole thing slow to converge. To fix this you do two things:

  1. Do simulated annealing (start with a large step size and gradually reduce it, thus allowing the walker to move between local maxima early on, but trapping it in one region later to accumulate precision results).
  2. Use several (many, if you can afford it) independent walkers so that they can get trapped in different local maxima. The more you use, and the bigger the difference in output values, the more likely you are to get the best maximum.

This is not necessary if you know that you only have one, big, broad, nicely behaved local extreme.

Finally, the selection of f(). You can just use f(x) = x, but you'll get optimal convergence if you use f(x) = exp(-(1/x)).
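
The two fixes, plus the suggested f(x) = exp(-(1/x)), can be layered on top of such a walk. Again a sketch of mine, not library code (`annealed_walk` and `best_of_walkers` are made-up names; positive output values are assumed); it additionally remembers the best point visited, a common practical touch:

```python
import math
import random

def annealed_walk(objective, start, step0=1.0, decay=0.995, n_steps=2000):
    """One walker with simulated annealing: the step size starts large
    (so the walker can cross dips between local maxima early on) and
    shrinks geometrically (trapping it in one region later)."""
    f = lambda x: math.exp(-1.0 / x)      # the suggested f(x) = exp(-(1/x))
    pos, step = list(start), step0
    o = objective(pos)
    best_pos, best_val = list(pos), o
    for _ in range(n_steps):
        cand = [x + random.uniform(-step, step) for x in pos]
        n = objective(cand)
        if n > o or random.random() < f(n / o):
            pos, o = cand, n
        if o > best_val:                  # remember the best point visited
            best_pos, best_val = list(pos), o
        step *= decay                     # anneal: reduce the step size
    return best_pos, best_val

def best_of_walkers(objective, starts):
    """Several independent walkers, trapped in (possibly) different
    local maxima; keep whichever ends up highest."""
    return max((annealed_walk(objective, s) for s in starts),
               key=lambda result: result[1])
```

On a two-peaked toy objective, walkers started in different basins end up in different maxima and the higher peak wins.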


Again, you don't have enough time for a great many steps (though if you have multiple computers, you can run separate instances to get the multiple walkers effect, which will help), so you might be better off with some kind of deterministic approach. But that is not a subject I know enough about to offer any advice.

影子的影子 2024-08-31 00:14:09


There is a lot of genetic-algorithm-based software that can do exactly that. I wrote a PhD thesis about it a decade or two ago.

A Google search for Genetic Algorithms Linux shows a load of starting points.
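
To give a flavour of what such software does internally, here is a toy genetic-algorithm loop for tuning a parameter vector. It is my own minimal sketch in Python, not taken from any of those packages; real GA libraries (GAUL, Watchmaker, etc.) provide far richer selection, crossover, and mutation operators.

```python
import random

def genetic_tune(fitness, dim, pop_size=30, generations=40,
                 mutation=0.3, lo=-5.0, hi=5.0):
    """Toy genetic algorithm: candidates are parameter vectors;
    truncation selection keeps the fitter half, offspring are made
    by uniform crossover plus Gaussian mutation."""
    pop = [[random.uniform(lo, hi) for _ in range(dim)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[: pop_size // 2]      # keep the fitter half
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = random.sample(survivors, 2)
            # Uniform crossover: each gene comes from either parent.
            child = [random.choice(pair) for pair in zip(a, b)]
            # Gaussian mutation on every gene.
            child = [g + random.gauss(0, mutation) for g in child]
            children.append(child)
        pop = survivors + children
    return max(pop, key=fitness)
```

Because survivors are carried over unchanged, the best fitness never regresses from one generation to the next.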

那伤。 2024-08-31 00:14:09


Intrigued by the question, I did a bit of poking around, trying to get a better understanding of the nature of CALIBRA, its standing in academic circles, and the existence of similar software projects in the Open Source and Linux world.
Please be kind (and, please, edit directly, or suggest edits) for the likely instances where my assertions are incomplete, inexact, or even flat-out incorrect. While I work in related fields, I'm by no means an Operational Research (OR) authority!

The [algorithm] parameter tuning problem is a relatively well-defined problem, typically framed as a solution-search problem whereby the combination of all possible parameter values constitutes a solution space, and the parameter-tuning logic's aim is to "navigate" [portions of] this space in search of an optimal (or locally optimal) set of parameters.
The optimality of a given solution is measured in various ways, and such metrics help direct the search. In the case of the parameter tuning problem, the validity of a given solution is measured, directly or through a function, from the output of the algorithm [i.e. the algorithm being tuned, not the algorithm of the tuning logic!].
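
As a concrete, if naive, instance of this framing, a pure random search already exhibits all the pieces (solution space, navigation, fitness taken from the tuned algorithm's output). The sketch below is mine, with hypothetical names; `run_algorithm` stands in for the algorithm being tuned and must return a score derived from its output.

```python
import random

def tune(run_algorithm, param_ranges, n_trials=50):
    """Random search over the parameter space: each candidate parameter
    set is scored by running the algorithm being tuned (not the tuning
    logic!) and measuring its output."""
    best_params, best_score = None, float("-inf")
    for _ in range(n_trials):
        # Sample one point of the solution space: a full parameter set.
        params = {name: random.uniform(lo, hi)
                  for name, (lo, hi) in param_ranges.items()}
        score = run_algorithm(params)   # fitness comes from the output
        if score > best_score:
            best_params, best_score = params, score
    return best_params, best_score
```

Smarter tuners (CALIBRA included) differ mainly in how the next candidate is chosen, not in this outer loop.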

Framed as a search problem, the discipline of algorithm parameter tuning doesn't differ significantly from other solution-search problems where the solution space is defined by something other than the parameters to a given algorithm. But because it works on algorithms which are in themselves solutions of sorts, this discipline is sometimes referred to as metaheuristics or metasearch. (A metaheuristic approach can be applied to various algorithms.)
Certainly there are many specific features of the parameter tuning problem as compared to other optimization applications, but with regard to the solution searching per se, the approaches and problems are generally the same.

Indeed, while well defined, the search problem is generally still broadly unsolved, and is the object of active research in very many different directions, for many different domains. Various approaches offer mixed success depending on the specific conditions and requirements of the domain, and this vibrant and diverse mix of academic research and practical applications is a common trait to Metaheuristics and to Optimization at large.

So... back to CALIBRA...
By its own authors' admission, Calibra has several limitations:

  • Limit of 5 parameters, maximum
  • Requirement of a range of values for [some of ?] the parameters
  • Works better when the parameters are relatively independent (but... wait, when that is the case, isn't the whole search problem much easier ;-) )

CALIBRA is based on a combination of approaches, which are repeated in a sequence. A mix of guided search and local optimization.

The paper where CALIBRA was presented is dated 2006. Since then, there have been relatively few references to this paper and to CALIBRA at large. Its two authors have since published several other papers in various disciplines related to Operational Research (OR).
This may be indicative that CALIBRA hasn't been perceived as a breakthrough.

彡翼 2024-08-31 00:14:09


State of the art in that area ("parameter tuning", "algorithm configuration") is the SPOT package in R. You can connect external fitness functions using a language of your choice. It is really powerful.

I am working on adapters for e.g. C++ and Java that simplify the experimental setup, which requires some getting used to in SPOT. The project goes under the name InPUT, and a first version of the tuning part will be up soon.
