Logstash container stopped because of an error from the filter (pipeline create action failed)

Hello, I'm new to Elasticsearch.

I'm working with log files coming from Filebeat and Logstash, and I'm trying to add a field "response_time" and then assign the difference between two timestamps to it.
So I created a Logstash filter and added it to the Logstash configuration file, but when I restarted the container I got the error below.

This is my Logstash configuration file:


input {
  beats {
    port => 5044
  }
}

filter {
  json {
    source => "message"
  }
  ruby {
    code => "event.set('indexDay', event.get('[@timestamp]').time.localtime('+01:00').strftime('%Y%m%d'))"
  }
  aggregate {
        add_field => {
          "response_time" => "timestamp2-timestamp1"
          }
        }
   grok {
     match => ["message","%{LOGLEVEL:loglevel},%{DATESTAMP_RFC2822:timestamp},%{NOTSPACE:event_type},%{NUMBER:capture_res_id},%{NUMBER:capture_pid},%{NUMBER:mti},%{NUMBER:node_id}
    ,%{UUID:msg_uuid},%{NOTSPACE:module},%{NUMBER :respCode}"]}
    if [event_type] == "request_inc" {
     aggregate {
       msg_uuid => "%{UUID}"
       timestamp1 => event.get('DATESTAMP_RFC2822')
       code => "map['response_time'] = 0"
       map_action => "create"
     }
   }
   if [event_type] == "response_outg" {
     aggregate {
       msg_uuid => "%{UUID}"
       event_type => event.set('event_type')
       timestamp2 => "%{DATESTAMP_RFC2822}"
       code => "map['response_time']"
       map_action => "update"
       end_of_task => true
       timeout =>120
     }
   }
}


output {
  elasticsearch {
    hosts => ["elasticsearch:9200"]
    template => "/usr/share/logstash/templates/testblogstash.template.json"
    template_name => "testblogstash"
    template_overwrite => true
    index => "testblogstash-%{indexDay}"
    codec => json
  }
  stdout {
    codec => rubydebug
  }
}

And this is an example of my log file:

{"log_level":"INFO","timestamp":"2021-12-15T16:06:24.400087Z","event_type":"s_tart","ca_id":"11","c_pid":"114","mti":"00","node_id":"00","msg_uuid":"1234","module":"cmde"}
{"log_level":"INFO","timestamp":"2021-12-15T16:06:31.993057Z","event_type":"e_nd","mti":"00","node_id":"00","msg_uuid":"1234","module":"PWC-cmde","respCode":"1"}

This is the error from the Docker logs:

[2022-06-01T14:43:24,529][ERROR][logstash.agent ] Failed to execute
action {:action=>LogStash::PipelineAction::Create/pipeline_id:main,
:exception=>"LogStash::ConfigurationError", :message=>"Expected one of
[A-Za-z0-9_-], [ \t\r\n], "#", "{", [A-Za-z0-9_], "}" at line
25, column 24 (byte 689) after filter {\r\n json {\r\n source =>
"message"\r\n }\r\n ruby {\r\n code => "event.set('indexDay',
event.get('[@timestamp]').time.localtime('+01:00').strftime('%Y%m%d'))"\r\n
}\r\n aggregate {\r\n add_field => {\r\n "response_time" =>
"timestamp2-timestamp1"\r\n\t\t }\r\n\t\t}\r\n grok {\r\n match =>
["message","%{LOGLEVEL:loglevel},%{DATESTAMP_RFC2822:timestamp},%{NOTSPACE:event_type},%{NUMBER:capture_res_id},%{NUMBER:capture_pid},%{NUMBER:mti},%{NUMBER:node_id}\r\n\t,%{UUID:msg_uuid},%{NOTSPACE:module},%{NUMBER
:respCode}"]}\r\n if [event_type] == "request_inc" {\r\n aggregate
{\r\n\t msg_uuid => "%{UUID}"\r\n\t timestamp1 => event",
:backtrace=>["/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:32:in `compile_imperative'",
"org/logstash/execution/AbstractPipelineExt.java:187:in `initialize'",
"org/logstash/execution/JavaBasePipelineExt.java:72:in `initialize'",
"/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:47:in `initialize'",
"/usr/share/logstash/logstash-core/lib/logstash/pipeline_action/create.rb:52:in `execute'",
"/usr/share/logstash/logstash-core/lib/logstash/agent.rb:383:in `block in converge_state'"]}

...

[2022-06-01T14:43:29,460][INFO ][logstash.runner ] Logstash shut down.
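
For reference, the parser stops at the first aggregate block inside the request_inc condition: msg_uuid, timestamp1, timestamp2 and event_type are not options of the aggregate filter, which only accepts settings such as task_id, code, map_action, end_of_task and timeout, so the configuration fails to compile. Below is a minimal sketch (not a drop-in fix) of how the same correlation could be expressed with valid options, assuming msg_uuid is the correlation key, the timestamp field in the JSON lines is ISO8601, and the event_type values in the conditionals match the actual data; note that the aggregate filter also needs the pipeline to run with a single worker (-w 1 or pipeline.workers: 1) to correlate reliably.

filter {
  json {
    source => "message"
  }

  # Parse the application timestamp so the difference reflects log time,
  # not ingest time (the field name "timestamp" is taken from the sample logs).
  date {
    match => ["timestamp", "ISO8601"]
  }

  if [event_type] == "request_inc" {
    aggregate {
      task_id    => "%{msg_uuid}"                               # correlation key (assumed)
      code       => "map['t1'] = event.get('@timestamp').to_f"  # remember the request time
      map_action => "create"
    }
  }

  if [event_type] == "response_outg" {
    aggregate {
      task_id     => "%{msg_uuid}"
      code        => "event.set('response_time', event.get('@timestamp').to_f - map['t1'])"  # seconds
      map_action  => "update"
      end_of_task => true
      timeout     => 120
    }
  }
}

With this shape, the response event would carry response_time as a number of seconds; the field names and event_type values above are taken from the post and the sample logs and may need adjusting to the real data.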
