I have a problem with dynamic mapping. I have a message field which is json, and Elasticsearch, via the json filter plugin, ends up seeing the sub-fields of this field with different types. What can I do about this? I want to parse it and add the fields; I don't care what types those fields have.
I get the following error in logstash:

[2019-02-13T13:12:20,087][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"filebeat-2019.02.13", :_type=>"doc", :routing=>nil}, #], :response=>{"index"=>{"_index"=>"filebeat-2019.02.13", "_type"=>"doc", "_id"=>"uhzF5mgBmZ_b74M8qLSn", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"object mapping for [TestJson.payload] tried to parse field [payload] as object, but found a concrete value"}}}}
My grok file looks like this:
filter {
  if [source] =~ ".*request_response\.json$" {
    json {
      source => "message"
      target => "TestJson"
    }
    if [payload] =~ /{+/ { # check if it is an object
      mutate {
        add_field => { "type" => "%{[TestJson][type]}" }
        add_field => { "payload" => "%{[TestJson][payload]}" }
      }
    } # end if payload is an object
    mutate {
      convert => {
        "type" => "string"
        "payload" => "string"
      } # end convert
    } # end mutate
  } # end if source is json
} # end filter
output {
  elasticsearch {
    hosts => "localhost:9201"
    index => "%{[@metadata][beat]}-%{+YYYY.MM.dd}"
    document_type => "%{[@metadata][type]}"
  } # end elasticsearch
} # end output
There is a json named message, inside of which payload is either empty, or something simple like a date, or a more complex object. I think this error occurs when the payload in the message json is not complex enough, i.e. when it is a plain value rather than an object. I wrote this grok because I want to make fields out of the nested json object. How can I fix this error?
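One possible way out (a sketch, not tested against my setup) is to force [TestJson][payload] to a single type before indexing, for example by serializing object payloads back into a json string. The normalization logic itself, in plain Ruby, would be just:

```ruby
require "json"

# Return the payload unchanged if it is already a plain string (or absent),
# otherwise serialize it so Elasticsearch always sees a string value.
def normalize_payload(payload)
  return payload if payload.nil? || payload.is_a?(String)
  payload.to_json
end

puts normalize_payload("2019-02-13T15:29:00.276")  # date string, unchanged
puts normalize_payload({"time" => "15:29"})        # => {"time":"15:29"}
```

Inside Logstash this logic would sit in a ruby filter that reads the field with event.get("[TestJson][payload]") and writes it back with event.set; the field names here are the ones from my config, the rest is illustrative.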
EDIT:
I added a stdout output to the logstash config next to elasticsearch, and with the command

journalctl -u logstash.service --since "10 minutes ago" | grep -C 30 'Could not index event'

I get these logs right next to the error:
Feb 13 15:29:02 f logstash[19144]: [2019-02-13T15:29:02,328][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"filebeat-2019.02.13", :_type=>"doc", :routing=>nil}, #<LogStash::Event:0x4fe8d3ae>], :response=>{"index"=>{"_index"=>"filebeat-2019.02.13", "_type"=>"doc", "_id"=>"Px1C52gBmZ_b74M80KX2", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"object mapping for [TestJson.payload] tried to parse field [payload] as object, but found a concrete value"}}}}
Feb 13 15:29:05 f logstash[19144]: {
Feb 13 15:29:05 f logstash[19144]: "host" => {
Feb 13 15:29:05 f logstash[19144]: "id" => "b",
Feb 13 15:29:05 f logstash[19144]: "name" => "f",
Feb 13 15:29:05 f logstash[19144]: "containerized" => true,
Feb 13 15:29:05 f logstash[19144]: "architecture" => "x86_64",
Feb 13 15:29:05 f logstash[19144]: "os" => {
Feb 13 15:29:05 f logstash[19144]: "family" => "redhat",
Feb 13 15:29:05 f logstash[19144]: "platform" => "centos",
Feb 13 15:29:05 f logstash[19144]: "version" => "7 (Core)",
Feb 13 15:29:05 f logstash[19144]: "codename" => "Core"
Feb 13 15:29:05 f logstash[19144]: }
Feb 13 15:29:05 f logstash[19144]: },
Feb 13 15:29:05 f logstash[19144]: "pid" => "27854",
Feb 13 15:29:05 f logstash[19144]: "beat" => {
Feb 13 15:29:05 f logstash[19144]: "hostname" => "f",
Feb 13 15:29:05 f logstash[19144]: "name" => "f",
Feb 13 15:29:05 f logstash[19144]: "version" => "6.5.3"
Feb 13 15:29:05 f logstash[19144]: },
Feb 13 15:29:05 f logstash[19144]: "message" => "{\"type\":\"Response\",\"payload\":\"2019-02-13T15:29:00.276\"}",
Feb 13 15:29:05 f logstash[19144]: "severity" => "DEBUG",
Feb 13 15:29:05 f logstash[19144]: "parent" => "6fce34dc18cb0e31",
Feb 13 15:29:05 f logstash[19144]: "event" => "",
Feb 13 15:29:05 f logstash[19144]: "span" => "44c9f754c7ca5b58",
Feb 13 15:29:05 f logstash[19144]: "@timestamp" => 2019-02-13T14:29:01.669Z,
Feb 13 15:29:05 f logstash[19144]: "input" => {
Feb 13 15:29:05 f logstash[19144]: "type" => "log"
Feb 13 15:29:05 f logstash[19144]: },
Feb 13 15:29:05 f logstash[19144]: "thread" => "http-nio-9080-exec-54",
Feb 13 15:29:05 f logstash[19144]: "service" => "bi",
--
Feb 13 15:29:05 f logstash[19144]: "@timestamp" => 2019-02-13T14:29:01.669Z,
Feb 13 15:29:05 f logstash[19144]: "service" => "bi",
Feb 13 15:29:05 f logstash[19144]: "prospector" => {
Feb 13 15:29:05 f logstash[19144]: "type" => "log"
Feb 13 15:29:05 f logstash[19144]: },
Feb 13 15:29:05 f logstash[19144]: "thread" => "http-nio-9080-exec-54",
Feb 13 15:29:05 f logstash[19144]: "offset" => 3006421,
Feb 13 15:29:05 f logstash[19144]: "@version" => "1",
Feb 13 15:29:05 f logstash[19144]: "input" => {
Feb 13 15:29:05 f logstash[19144]: "type" => "log"
Feb 13 15:29:05 f logstash[19144]: },
Feb 13 15:29:05 f logstash[19144]: "tags" => [
Feb 13 15:29:05 f logstash[19144]: [0] "beats_input_codec_plain_applied"
Feb 13 15:29:05 f logstash[19144]: ],
Feb 13 15:29:05 f logstash[19144]: "TestJson" => {
Feb 13 15:29:05 f logstash[19144]: "payload" => "",
Feb 13 15:29:05 f logstash[19144]: "url" => "/bi/getTime",
Feb 13 15:29:05 f logstash[19144]: "type" => "Request",
Feb 13 15:29:05 f logstash[19144]: "sessionId" => 476,
Feb 13 15:29:05 f logstash[19144]: "username" => "k",
Feb 13 15:29:05 f logstash[19144]: "lang" => "pl",
Feb 13 15:29:05 f logstash[19144]: "contentType" => "null",
Feb 13 15:29:05 f logstash[19144]: "ipAddress" => "127.0.0.1",
Feb 13 15:29:05 f logstash[19144]: "method" => "POST",
Feb 13 15:29:05 f logstash[19144]: "queryString" => "null"
Feb 13 15:29:05 f logstash[19144]: },
Feb 13 15:29:05 f logstash[19144]: "trace" => "6fce34dc18cb0e31",
Feb 13 15:29:05 f logstash[19144]: "date" => "2019-02-13 15:29:00,277",
Feb 13 15:29:05 f logstash[19144]: "source" => "/opt/tomcat-bo/logs/bi_request_response.json"
Feb 13 15:29:05 f logstash[19144]: }
Feb 13 15:29:05 f logstash[19144]: [2019-02-13T15:29:05,326][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"filebeat-2019.02.13", :_type=>"doc", :routing=>nil}, #<LogStash::Event:0x1812fa5e>], :response=>{"index"=>{"_index"=>"filebeat-2019.02.13", "_type"=>"doc", "_id"=>"Uh1C52gBmZ_b74M83KWr", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"object mapping for [TestJson.payload] tried to parse field [payload] as object, but found a concrete value"}}}}
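Note that both rejected events above carry a concrete payload (an empty string in one, a date string in the other, see the message field), while the index's mapping for TestJson.payload is apparently already object, presumably fixed by an earlier event whose payload was a real json object. A minimal check (plain Ruby, hypothetical stand-in for what the json filter produces) of the exact message from the log above:

```ruby
require "json"

# The exact message from the journalctl output above.
message = '{"type":"Response","payload":"2019-02-13T15:29:00.276"}'
parsed = JSON.parse(message)

# payload parses to a plain String, not a Hash -- Elasticsearch rejects it
# once the mapping already says TestJson.payload is an object.
puts parsed["payload"].class  # String
```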