OpenMDAO: Does the use of maximum() in an ExecComp interfere with constraints being impacted by the design variables?

When running the optimization driver on a large model I receive:

DerivativesWarning:Constraints or objectives [('max_current.current_constraint.current_constraint', inds=[0]), ('max_current.continuous_current_constraint.continuous_current_constraint', inds=[0])] cannot be impacted by the design variables of the problem.

I read the answer to a similar question posed here.

The values do change as the design variables change, and the two constraints are satisfied during the course of optimization.

I had assumed this was due to those components' ExecComp using a maximum(), as this is the only place in my model I use a maximum function, however when setting up a simple problem with a maximum() function in a similar manner I do not receive an error.

My model uses explicit components that are looped: there are connections in the bottom left of the N2 diagram, and NLBGS is converging the whole model. I currently think this is due to the use of only explicit components with NLBGS instead of implicit components.

Thank you for any insight you can give in resolving this warning.

Below is a simple script using maximum() that does not report the warning. (I was so sure that was it.) Once I have a minimal working example that reproduces the error in the same way as my larger model, I will upload it.

import openmdao.api as om

prob = om.Problem()

# driver setup: SLSQP through scipy
prob.driver = om.ScipyOptimizeDriver()
prob.driver.options['optimizer'] = 'SLSQP'
prob.driver.options['tol'] = 1e-6
prob.driver.options['maxiter'] = 80
prob.driver.options['disp'] = True

# single independent variable x, promoted to the top level
indeps = prob.model.add_subsystem('indeps', om.IndepVarComp())
indeps.add_output('x', val=2.0, units=None)
prob.model.promotes('indeps', outputs=['*'])

# two functions of x, their maximum, and the constraint built from that maximum
prob.model.add_subsystem('y_func_1',
                         om.ExecComp('y_func_1 = x'),
                         promotes_inputs=['x'],
                         promotes_outputs=['y_func_1'])
prob.model.add_subsystem('y_func_2',
                         om.ExecComp('y_func_2 = x**2'),
                         promotes_inputs=['x'],
                         promotes_outputs=['y_func_2'])
prob.model.add_subsystem('y_max',
                         om.ExecComp('y_max = maximum( y_func_1 , y_func_2 )'),
                         promotes_inputs=['y_func_1',
                                          'y_func_2'],
                         promotes_outputs=['y_max'])
prob.model.add_subsystem('y_check',
                         om.ExecComp('y_check = y_max - 1.1'),
                         promotes_inputs=['*'],
                         promotes_outputs=['*'])

prob.model.add_constraint('y_check', lower=0.0)

prob.model.add_design_var('x', lower=0.0, upper=2.0)
prob.model.add_objective('x')

prob.setup()
prob.run_driver()

print(prob.get_val('x'))


指尖微凉心微凉 2025-02-06 16:18:31

There is a problem with the maximum function in this context. Technically, a maximum function is not differentiable; at least not when the index of the value that is the max is subject to change. If which value is the maximum does not change, then it is differentiable... but then you didn't need the max function anyway.

One correct, differentiable way to handle a max when doing gradient-based things is to use a KS function. OpenMDAO provides the KSComp which implements it. There are other kinds of aggregation functions (like the p-norm) that you could use as well.
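For reference, here is a minimal, self-contained sketch of KSComp on its own (not the asker's model; the names and values are just illustrative). It aggregates a length-2 vector 'g' into a smooth, differentiable upper estimate of max(g).

import numpy as np
import openmdao.api as om

p = om.Problem()

# KSComp's input 'g' has shape (vec_size, width); the output 'KS' is a smooth
# upper estimate of max(g). rho controls how sharply it approximates the true max.
p.model.add_subsystem('ks', om.KSComp(width=2, rho=50.0))

p.setup()
p.set_val('ks.g', np.array([[1.0, 4.0]]))
p.run_model()

print(p.get_val('ks.KS'))  # close to 4.0, and differentiable everywhere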

However, even though maximum is not technically differentiable ... you can sort-of/kind-of get away with it. At least, numpy (which ExecComp uses) lets you apply complex-step differentiation to the maximum function, and it seems to give a non-zero derivative. So while it's not technically correct, you can maybe get away with it. At least, it's not likely to be the core of your problem.
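A quick standalone check of that claim, in plain numpy outside of OpenMDAO (purely illustrative):

import numpy as np

def f(x):
    # same structure as the test script: max of x and x**2
    return np.maximum(x, x ** 2)

x0 = 2.0
h = 1e-30
# complex-step derivative: imag(f(x + i*h)) / h
deriv = np.imag(f(x0 + 1j * h)) / h
print(deriv)  # ~4.0, i.e. the derivative of the x**2 branch, which is the max at x=2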

You mention using NLBGS, and that you have components which are looped. Your test case is purely feed forward though (here is the N2 from your test case).

[OpenMDAO N2 diagram of the feed-forward test case]

That is an important difference.
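For concreteness, a coupled version of a toy model has a feedback connection that NLBGS must iterate on. A hypothetical minimal example (again, not the asker's model):

import openmdao.api as om

p = om.Problem()
# c1 needs y2 and c2 needs y1, so there is a feedback loop
# (the connections show up in the lower-left of the N2 diagram)
p.model.add_subsystem('c1', om.ExecComp('y1 = x + 0.5*y2'), promotes=['*'])
p.model.add_subsystem('c2', om.ExecComp('y2 = 0.5*y1'), promotes=['*'])
p.model.nonlinear_solver = om.NonlinearBlockGS(maxiter=50)

p.setup()
p.set_val('x', 2.0)
p.run_model()
print(p.get_val('y1'), p.get_val('y2'))  # converged values of the coupled pair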

The problem here is with your derivatives, not with the maximum function. Since you have a nonlinear solver, you need to do something to get the derivatives right. In the example Sellar optimization, the model uses this line: prob.model.approx_totals(), which tells OpenMDAO to finite-difference across the whole model (including the nonlinear solver). This is simple and keeps the example compact, and it works regardless of whether your components define derivatives or not. It is, however, slow and can suffer from numerical difficulties, so use it on "real" problems at your own risk.
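For completeness, a minimal sketch of where that call sits in a run script (standard OpenMDAO API; the method argument is optional and defaults to 'fd'):

# approximate all total derivatives by finite differencing across the full model,
# nonlinear solver iterations included; typically called before prob.setup()
prob.model.approx_totals(method='fd')
prob.setup()
prob.run_driver()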

If you don't include that (and your above example does not, so I assume your real problem does not either) then you're basically telling OpenMDAO that you want to use analytic derivatives (yay! they are so much more awesome). That means that you need to have a linear solver to match your nonlinear one. For most problems that you start out with, you can simply put a DirectSolver right at the top of the model and it will all work out. For more advanced models, you need a more complex linear solver structure... but that's a different question.

Give this a try:
prob.model.linear_solver = om.DirectSolver()

That should give you non-zero total derivatives regardless of whether you have coupling (loops) or not.
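Put together with the solver setup described in the question, the relevant lines would look roughly like this (the placement at the top of the model is assumed, as suggested above):

# NLBGS converges the coupled explicit components; DirectSolver solves the matching
# linear system so the total derivatives across the loop are not silently zero
prob.model.nonlinear_solver = om.NonlinearBlockGS()
prob.model.linear_solver = om.DirectSolver()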
