Timeouts in the Logstash Elasticsearch filter and Elasticsearch output



I am using the http_poller input plugin, scheduled to run every 15 minutes. Based on the http_poller API response, I need to execute an Elasticsearch query.
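For reference, a stripped-down sketch of the input wiring looks roughly like this (the endpoint URL and names below are placeholders, not my real configuration):

    input {
      http_poller {
        urls => {
          # placeholder name and endpoint for the API being polled
          status_api => "https://api.example.com/status"
        }
        schedule => { every => "15m" }   # poll every 15 minutes
        request_timeout => 60
        codec => "json"
      }
    }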

To execute the Elasticsearch query, I am using the Elasticsearch filter plugin. It runs the first time without any issue, but from the second run onwards it throws the error below:

[2022-05-09T11:34:46,738][WARN ][logstash.filters.elasticsearch][logs][9c5fb8a0078cad1be396fedd387eb8680d72086b85be9efe15e6893ce2e73332] Failed to query elasticsearch for previous event {:index=>"logs-xx-prod_xx", :error=>"Read timed out"}
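The filter itself is wired roughly like this (the host, credentials, query, and field names are placeholders; only the index name is taken from the error above):

    filter {
      elasticsearch {
        hosts => ["https://xx.gcp.cloud.es.io:9243"]
        user => "elastic"
        password => "${ES_PWD}"
        index => "logs-xx-prod_xx"
        # placeholder query: look up the previous event with the same id
        query => "event_id:%{[event_id]}"
        # copy a field from the matched document into the current event
        fields => { "status" => "previous_status" }
      }
    }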

Also, the Elasticsearch output plugin throws the errors below from the second run onwards:

[2022-05-09T11:35:17,236][WARN ][logstash.outputs.elasticsearch][logs][8850a096b09c55eca7744c74cb4821d3f6e42a3e87a464228013b22ea1f0d576] Marking url as dead. Last error: [LogStash::Outputs::ElasticSearch::HttpClient::Pool::HostUnreachableError] Elasticsearch Unreachable: [https://elastic:[email protected]:9243/][Manticore::SocketException] Connection reset by peer: socket write error {:url=>https://elastic:[email protected]:9243/, :error_message=>"Elasticsearch Unreachable: [https://elastic:[email protected]:9243/][Manticore::SocketException] Connection reset by peer: socket write error", :error_class=>"LogStash::Outputs::ElasticSearch::HttpClient::Pool::HostUnreachableError"}
[2022-05-09T11:35:17,236][ERROR][logstash.outputs.elasticsearch][logs][8850a096b09c55eca7744c74cb4821d3f6e42a3e87a464228013b22ea1f0d576] Attempted to send a bulk request but Elasticsearch appears to be unreachable or down {:message=>"Elasticsearch Unreachable: [https://elastic:[email protected]:9243/][Manticore::SocketException] Connection reset by peer: socket write error", :exception=>LogStash::Outputs::ElasticSearch::HttpClient::Pool::HostUnreachableError, :will_retry_in_seconds=>2}
[2022-05-09T11:35:19,236][ERROR][logstash.outputs.elasticsearch][logs][8850a096b09c55eca7744c74cb4821d3f6e42a3e87a464228013b22ea1f0d576] Attempted to send a bulk request but there are no living connections in the pool (perhaps Elasticsearch is unreachable or down?) {:message=>"No Available connections", :exception=>LogStash::Outputs::ElasticSearch::HttpClient::Pool::NoConnectionAvailableError, :will_retry_in_seconds=>4}
[2022-05-09T11:35:19,377][WARN ][logstash.outputs.elasticsearch][logs] Restored connection to ES instance {:url=>"https://elastic:[email protected]:9243/"}

I have configured the Logstash pipeline from Kibana using the centralized pipeline management feature of ES 7.16.

I have tried the configurations below, but none of them seems to work.

  • Changed the pipeline batch size to 100, then 50, then 25
  • Set pipeline workers to 1
  • Set validate_after_inactivity to 0 in the Elasticsearch output plugin, and tried different values as well
  • Tried various timeout values such as 100, 180, 200, 600, etc. (a sketch of the output section follows this list)
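For reference, the Elasticsearch output section currently looks roughly like this (host, credentials, and index are placeholders; the comments note the values I have experimented with):

    output {
      elasticsearch {
        hosts => ["https://xx.gcp.cloud.es.io:9243"]
        user => "elastic"
        password => "${ES_PWD}"
        index => "logs-xx-prod_xx"
        timeout => 180                   # also tried 100, 200, 600
        validate_after_inactivity => 0   # also tried other values
        # pipeline.workers (1) and pipeline.batch.size (25/50/100) are set
        # in the pipeline settings of the centralized pipeline management UI.
      }
    }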

Previously, I was setting a custom document ID using the document_id parameter; that is now disabled as well.
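Concretely, the output block previously contained a line like the one below, which is now disabled (the field name is a placeholder):

    # previously inside the elasticsearch output block, now disabled:
    # document_id => "%{[event_id]}"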

One strange behavior I have noticed is that the document count in the ES index still increases even after the above errors.

Also, there is no option to set a timeout in the Elasticsearch filter plugin: when I tried to set timeout there, it threw an error saying the timeout parameter is not supported.
