I have been working with Logstash for a month, and for the past week I have not been able to get it to start. Logstash runs from a Docker image, version 6.2.4, on an AWS machine running Linux. It worked fine before, so I don't know what happened. The only thing my boss did was upgrade it from version 6.2.3 to 6.2.4, but the error did not appear right then, only several days later, so I assume that is not the problem.
I have a logstash.conf file that holds my configuration, and the internal logs point to a specific line of the "configuration file", but the funny thing is that this line does not exist: the file has fewer lines than that. I read somewhere that Logstash concatenates all the .conf files, but I cannot find the concatenated file to check that line, and it is frustrating.
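Since I cannot find any merged file, here is a minimal sketch of how I imagine the line numbering could be checked, written in Ruby because the filter section below already uses ruby blocks. It assumes the .conf files live in the default Docker pipeline directory /usr/share/logstash/pipeline (the same tree my patterns_dir points into) and that Logstash reads them in sorted order, which I am not sure about:

offset = 0
# List every *.conf file in the pipeline directory and map each one to the
# range of lines it would occupy in the concatenated configuration.
Dir.glob('/usr/share/logstash/pipeline/*.conf').sort.each do |f|
  size = File.readlines(f).size
  puts format('%s: concatenated lines %d-%d', f, offset + 1, offset + size)
  offset += size
end

If line 205 falls beyond the end of logstash.conf in that listing, I suppose the error must be pointing at some other *.conf file that gets concatenated with mine.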
Here is my logstash.conf:
input {
beats {
port => "5044"
client_inactivity_timeout => "120"
}
}
filter {
ruby {
code => "event.set('day',event.get('source').split('/')[5].split('-').last)"
}
ruby {
code => "event.set('app',event.get('source').split('/')[5].split('-').first)"
}
ruby {
code => "event.set('categoria',event.get('source').split('/').last.split('.').first.split('-')[1])"
}
ruby {
code => "event.set('nodo',event.get('source').split('/')[4])"
}
ruby {
code => "event.set('conFecha',true)"
}
if [day] == "A"{
ruby {
code => "event.set('day',Time.now.getlocal('-03:00').strftime('%Y%m%d'))"
}
ruby {
code => "event.set('conFecha',false)"
}
}
grok {
patterns_dir => ["/usr/share/logstash/pipeline/pattern/patterns"]
match => { "message" => "%{SV_TIME:time}, %{SV_TIMESTAMP:numero}, cliente\[%{DATA:client}\], %{WORD:level} , performance - (.)* .*\[%{NUMBER:milliseconds:float}\].*\[com.vtr.servicesvtr.ws.client.factory.([a-z])*.*%{WORD:crm}.%{WORD:grupo}.%{WORD:method}.*.(ejecutar)\]" }
add_field => {
"tipo" => "SOA"
"fechahora" => "%{day} %{time}"
"performance" => "performance"
}
}
grok {
patterns_dir => ["/usr/share/logstash/pipeline/pattern/patterns"]
match => { "message" => "%{SV_TIME:time}, %{SV_TIMESTAMP:numero}, cliente\[%{DATA:client}\], %{WORD:level} , performance - (.)* .*\[%{NUMBER:milliseconds:float}\].*\[/%{SV_PACKAGE:microservicio}/%{DATA:url}\]" }
add_field => {
"tipo" => "MS"
"fechahora" => "%{day} %{time}"
"performance" => "performance"
}
}
if [performance] != "performance"{
grok {
patterns_dir => ["/usr/share/logstash/pipeline/pattern/patterns"]
match => { "message" => "%{SV_TIME:time}, %{SV_TIMESTAMP:numero}, cliente\[%{DATA:client}\], %{WORD:level}.*\, %{GREEDYDATA:resultado}" }
add_field => {
"tipo" => "AUDIT"
"fechahora" => "%{day} %{time}"
"auditoria" => "auditoria"
}
}
}
if [performance] != "performance" and [auditoria] != "auditoria"{
grok {
patterns_dir => ["/usr/share/logstash/pipeline/pattern/patterns"]
match => { "message" => "%{SV_DATE_TIME:fecha}.* seguridad \- %{DATA:usuario}\|%{DATA:rut}\|%{DATA:resultado} \-> %{GREEDYDATA:metodo}" }
add_field => {
"tipo" => "SEGURIDAD"
"fechahora" => "%{fecha}"
"su" => "su"
}
}
mutate {
lowercase => [ "usuario" ]
}
date {
match => ["fechahora", "yyyy-MM-dd HH:mm:ss,SSS"]
}
}
if [su] != "su"{
date {
match => ["fechahora", "yyyyMMdd HH:mm:ss.SSS"]
}
}
date {
match => ["timestamp" , "yyyyMMdd'T'HH:mm:ss.SSS"]
target => "@timestamp"
}
}
output {
if [performance] == "performance" and [url] != "manage/health"{
if [conFecha] {
elasticsearch {
hosts => ["elasticsearch:9200"]
index => "%{[app_id]}-performances-%{+YYYY.MM.dd}"
}
}else {
elasticsearch {
hosts => ["elasticsearch:9200"]
index => "temp-%{[app_id]}-performances-%
{+YYYY.MM.dd}"
}
}
}else if [auditoria] == "auditoria" {
if [conFecha] {
elasticsearch {
hosts => ["elasticsearch:9200"]
index => "%{[app_id]}-auditoria-%{+YYYY.MM.dd}"
}
}else {
elasticsearch {
hosts => ["elasticsearch:9200"]
index => "temp-%{[app_id]}-auditoria-%{+YYYY.MM.dd}"
}
}
}
if [performance] == "performance" and [milliseconds] >= 1000 and [url] != "manage/health"{
if [conFecha] {
elasticsearch {
hosts => ["elasticsearch:9200"]
index => "%{[app_id]}-performance-scache-%{+YYYY.MM.dd}"
}
}else {
elasticsearch {
hosts => ["elasticsearch:9200"]
index => "temp-%{[app_id]}-performance-scache-%{+YYYY.MM.dd}"
}
}
}
}
Here are the logs as well:
Sending Logstash's logs to /usr/share/logstash/logs which is now configured via log4j2.properties
[2018-04-27T15:59:06,770][INFO ][logstash.modules.scaffold] Initializing module {:module_name=>"fb_apache", :directory=>"/usr/share/logstash/modules/fb_apache/configuration"}
[2018-04-27T15:59:06,879][INFO ][logstash.modules.scaffold] Initializing module {:module_name=>"netflow", :directory=>"/usr/share/logstash/modules/netflow/configuration"}
[2018-04-27T15:59:13,551][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"6.2.4"}
[2018-04-27T15:59:17,073][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
[2018-04-27T15:59:27,711][ERROR][logstash.agent ] Failed to execute action {:action=>LogStash::PipelineAction::Create/pipeline_id:main, :exception=>"LogStash::ConfigurationError", :message=>"Expected one of #, input, filter, output at line 205, column 1 (byte 6943) after ", :backtrace=>["/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:42:in `compile_imperative'", "/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:50:in `compile_graph'", "/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:12:in `block in compile_sources'", "org/jruby/RubyArray.java:2486:in `map'", "/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:11:in `compile_sources'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:51:in `initialize'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:169:in `initialize'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline_action/create.rb:40:in `execute'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:315:in `block in converge_state'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:141:in `with_pipelines'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:312:in `block in converge_state'", "org/jruby/RubyArray.java:1734:in `each'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:299:in `converge_state'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:166:in `block in converge_state_and_update'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:141:in `with_pipelines'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:164:in `converge_state_and_update'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:90:in `execute'", "/usr/share/logstash/logstash-core/lib/logstash/runner.rb:348:in `block in execute'", "/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/stud-0.0.23/lib/stud/task.rb:24:in `block in initialize'"]}
I really hope you guys can help me, I'm desperate. Bye.