Splitting an array into chunks, running snmp_get_request on each chunk, and recombining the resulting hash references
I'm having an issue attempting to use a Nagios plugin that utilizes Net::SNMP. It attempts to query a large number of OIDs at the same time, resulting in an error, as the response would exceed the maximum MTU for the link. (The message size 2867 exceeds the maxMsgSize 1472.)
The code for this section is as follows:
$result = $session->get_request(
    Varbindlist => \@oids
);
Is there a way in Perl to
- Split @oids into smaller pieces
- Iterate over these pieces
- Combine the returned $results into a single reference to a single hash?
That would be the smallest modification to make to the script to have it support larger amounts of interfaces, correct?
Use splice() to break up the list into smaller lists. If you want ten at a time:
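Something along these lines, as a rough sketch: it assumes the $session and @oids from your plugin, reuses the same get_request call from your excerpt, and pushes each return value onto @results. Note that splice consumes @oids as it goes, so work on a copy if you still need the full list afterwards.

my @results;
while (@oids) {
    # Pull up to ten OIDs off the front of the list each pass.
    my @chunk = splice(@oids, 0, 10);

    # Same call as in the original script, just on the smaller chunk.
    push @results, $session->get_request(
        Varbindlist => \@chunk
    );
}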
This code is untested, but I'm providing it as a general idea as to how you might divide up the list and run with it.
Now if my calculations are correct, you will have @results, which will be a list of the return values from $session->get_request() per iteration. I don't know what that looks like; maybe you just concatenate it together. That's your part to figure out. ;)
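For what it's worth, one way that merge could look, as a sketch assuming each successful get_request returns a hash reference of OID => value pairs (and undef on failure, per the Net::SNMP docs):

my %combined;
for my $r (@results) {
    next unless defined $r;                # skip chunks where get_request failed
    @combined{ keys %$r } = values %$r;    # copy this chunk's OID => value pairs
}
$result = \%combined;                      # one reference to one hash, as the rest of the plugin expects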