How do I make Fabric execution follow the order of the env.hosts list?
I have the following fabfile.py:
from fabric.api import env, run
host1 = '192.168.200.181'
host2 = '192.168.200.182'
host3 = '192.168.200.183'
env.hosts = [host1, host2, host3]
def df_h():
run("df -h | grep sda3")
I get the following output:
[192.168.200.181] run: df -h | grep sda3
[192.168.200.181] out: /dev/sda3 365G 180G 185G 50% /usr/local/nwe
[192.168.200.183] run: df -h | grep sda3
[192.168.200.183] out: /dev/sda3 365G 41G 324G 12% /usr/local/nwe
[192.168.200.182] run: df -h | grep sda3
[192.168.200.182] out: /dev/sda3 365G 87G 279G 24% /usr/local/nwe
Done.
Disconnecting from 192.168.200.182... done.
Disconnecting from 192.168.200.181... done.
Disconnecting from 192.168.200.183... done.
Note that the execution order is not the same as the order specified in env.hosts.
Why does this happen? Is there a way to make the execution order match the order specified in the env.hosts list?
The exact reason that the order is not preserved from env.hosts is that there are three "levels" at which the hosts to operate on can be specified--env.hosts, the command line, and per function--and these are merged together. In fabric/main.py, on line 309, you can see that they use the set() type to remove duplicates across the three possible lists of hosts. Since set() has no ordering, the hosts come back as a list in "random" order.
There's a pretty good reason for this approach. It's a very efficient mechanism for removing duplicates from a list, and for Fabric it's important that order doesn't matter. You're asking Fabric to perform a series of completely parallel, atomic actions on various hosts. By the very nature of parallel, atomic actions, order does not affect whether the actions can be performed successfully. If order did matter, then a different strategy would be necessary and Fabric would no longer be the right tool for the job.
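For illustration, here is a minimal sketch (not Fabric's actual source) of the effect described above: passing a host list through set() discards the input ordering, so the merged list can come back in any order.
hosts = ['192.168.200.181', '192.168.200.182', '192.168.200.183']
deduped = list(set(hosts))
# The order of `deduped` is whatever the set's internal layout happens to be,
# e.g. ['192.168.200.183', '192.168.200.181', '192.168.200.182'];
# it is not guaranteed to match the order of `hosts`.
print(deduped)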
That said, is there a particular reason that you need these operations to occur in order? Perhaps if you're having some sort of problem that's a result of execution order, we can help you work that out.
Just to update: the newest Fabric, 1.1+ (I think even 1.0), now dedupes in an order-preserving way. So this should be a non-issue now.
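If you're curious what an order-preserving dedupe looks like, here is a rough sketch of the general technique (the helper name dedupe_hosts is just for this example, not Fabric's API):
def dedupe_hosts(hosts):
    """Remove duplicate hosts while keeping each one's first-seen position."""
    seen = set()
    ordered = []
    for host in hosts:
        if host not in seen:
            seen.add(host)
            ordered.append(host)
    return ordered

print(dedupe_hosts(['192.168.200.181', '192.168.200.182',
                    '192.168.200.181', '192.168.200.183']))
# -> ['192.168.200.181', '192.168.200.182', '192.168.200.183'], original order kept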