Logstash error: Pipeline error {:pipeline_id=>"main", :exception=>#<Grok::PatternError: pattern %{JOBNAME:project} not defined>}
Logstash keeps crashing and I'm not sure what the issue is
Full log:

```
[2022-03-30T18:21:34,633][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600, :ssl_enabled=>false}
[2022-03-30T18:21:37,520][INFO ][org.reflections.Reflections] Reflections took 167 ms to scan 1 urls, producing 119 keys and 417 values
[2022-03-30T18:21:39,677][INFO ][logstash.outputs.elasticsearch][main] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["http://104.154.51.160:9200"]}
[2022-03-30T18:21:40,456][INFO ][logstash.outputs.elasticsearch][main] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://104.154.51.160:9200/]}}
[2022-03-30T18:21:40,912][WARN ][logstash.outputs.elasticsearch][main] Restored connection to ES instance {:url=>"http://104.154.51.160:9200/"}
[2022-03-30T18:21:40,944][INFO ][logstash.outputs.elasticsearch][main] Elasticsearch version determined (7.17.1) {:es_version=>7}
[2022-03-30T18:21:40,957][WARN ][logstash.outputs.elasticsearch][main] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>7}
[2022-03-30T18:21:41,089][INFO ][logstash.outputs.elasticsearch][main] Config is not compliant with data streams. `data_stream => auto` resolved to `false`
[2022-03-30T18:21:41,094][INFO ][logstash.outputs.elasticsearch][main] Config is not compliant with data streams. `data_stream => auto` resolved to `false`
[2022-03-30T18:21:41,209][INFO ][logstash.outputs.elasticsearch][main] Using a default mapping template {:es_version=>7, :ecs_compatibility=>:disabled}
[2022-03-30T18:21:41,559][ERROR][logstash.javapipeline    ][main] Pipeline error {:pipeline_id=>"main", :exception=>#<Grok::PatternError: pattern %{JOBNAME:project} not defined>, :backtrace=>["/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/jls-grok-0.11.5/lib/grok-pure.rb:123:in `block in compile'", "org/jruby/RubyKernel.java:1442:in `loop'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/jls-grok-0.11.5/lib/grok-pure.rb:93:in `compile'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-filter-grok-4.4.1/lib/logstash/filters/grok.rb:282:in `block in register'", "org/jruby/RubyArray.java:1821:in `each'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-filter-grok-4.4.1/lib/logstash/filters/grok.rb:276:in `block in register'", "org/jruby/RubyHash.java:1415:in `each'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-filter-grok-4.4.1/lib/logstash/filters/grok.rb:271:in `register'", "org/logstash/config/ir/compiler/AbstractFilterDelegatorExt.java:75:in `register'", "/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:232:in `block in register_plugins'", "org/jruby/RubyArray.java:1821:in `each'", "/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:231:in `register_plugins'", "/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:590:in `maybe_setup_out_plugins'", "/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:244:in `start_workers'", "/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:189:in `run'", "/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:141:in `block in start'"], "pipeline.sources"=>["/etc/logstash/conf.d/jenkins.conf"], :thread=>"#<Thread:0x7b168333 run>"}
[2022-03-30T18:21:41,570][INFO ][logstash.javapipeline ][main] Pipeline terminated {"pipeline.id"=>"main"}
[2022-03-30T18:21:41,597][ERROR][logstash.agent ] Failed to execute action {:id=>:main, :action_type=>LogStash::ConvergeResult::FailedAction, :message=>"Could not execute action: PipelineAction::Create<main>, action_result: false", :backtrace=>nil}
[2022-03-30T18:21:41,780][INFO ][logstash.runner ] Logstash shut down.
[2022-03-30T18:21:41,801][FATAL][org.logstash.Logstash ] Logstash stopped processing because of an error: (SystemExit) exit
org.jruby.exceptions.SystemExit: (SystemExit) exit
at org.jruby.RubyKernel.exit(org/jruby/RubyKernel.java:747) ~[jruby-complete-9.2.20.1.jar:?]
at org.jruby.RubyKernel.exit(org/jruby/RubyKernel.java:710) ~[jruby-complete-9.2.20.1.jar:?]
at usr.share.logstash.lib.bootstrap.environment.<main>(/usr/share/logstash/lib/bootstrap/environment.rb:94) ~[?:?]
```
My conf file:

```
input {
beats {
port => 5000
type => jenkins
}
}
filter {
if [type] == "jenkins" {
grok {
patterns_dir => ["/etc/logstash/patterns"]
match => {
"message" => "%{TIMESTAMP_ISO8601:createdAt}%{SPACE}[id=%{INT:buildId}]%{SPACE}%{LOGLEVEL:level}%{SPACE}%{JAVACLASS:class}%{DATA:state}:%{SPACE}%{JOBNAME:project} #%{NUMBER:buildNumber} %{DATA:execution}: %{WORD:status}"
}
}
}
}
output {
if [type] == "jenkins" {
elasticsearch {
hosts => 'elasticsearch server ip goes in here'
index => 'jenkins-%{+YYYY.MM.dd}'
}
}
}
```
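
From what I understand of the grok docs, `patterns_dir` is supposed to point at plain-text files where each line defines a custom pattern as a name followed by a regex. Something like this is what I believe a `/etc/logstash/patterns/jenkins` file would need to contain for `%{JOBNAME:project}` to resolve (the regex below is only an illustration, not my actual file):

```
# /etc/logstash/patterns/jenkins -- illustrative sketch only, not my real file
# Each line defines one custom pattern: a NAME, whitespace, then a regex.
JOBNAME [A-Za-z0-9._-]+
```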
I thought it might not be seeing my conf file, but then I noticed this line in the logs referring to it:
"pipeline.sources"=>["/etc/logstash/conf.d/jenkins.conf"], :thread=>"#<Thread:0x7b168333 run>"}
I'm confused, please help!