I am trying to ingest whois data in JSON format into Elasticsearch, but I get a _jsonparsefailure error.
TXT FILE:

Request_____________________________________________________________
GET http://ipwhois.app/json/xxx.xxx.xxx.xxx HTTP/1.1
Accept-Encoding: gzip,deflate
Host: ipwhois.app
Connection: Keep-Alive
User-Agent: Apache-HttpClient/4.1.1 (java 1.5)

Response____________________________________________________________
{"ip":"xxx.xxx.xxx.xxx","success":true,"type":"IPv4","continent":"Oceania","continent_code":"OC","country":"Australia","country_code":"AU","country_flag":"","country_capital":"","country_phone":"+61","country_neighbours":"","region":"Western Australia","city":"Perth","latitude":"-31.9505269","longitude":"115.8604572","asn":"","org":"","isp":"","timezone":"Australia/Perth","timezone_name":"Australian Western Standard Time","timezone_dstOffset":"0","timezone_gmtOffset":"28800","timezone_gmt":"GMT +8:00","currency":"Australian Dollar","currency_code":"AUD","currency_symbol":"$","currency_rates":"1.388705","currency_plural":"Australian dollars","completed_requests":158}
In Elasticsearch I only need the fields "ip", "country" and "org". My configuration looks like this:
input {
  file {
    type => "json"
    path => "C:/Users/xxxxx/Desktop/New folder/Whois/*.txt"
    start_position => "beginning"
    sincedb_path => "C:/Users/xxxxx/Desktop/null/logdbpath.txt"
    close_older => "1 hour"
    stat_interval => "1 second"
  }
}

filter {
  json {
    source => "message"
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "whoislogs"
  }
  stdout {
    codec => rubydebug
  }
}
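My guess (the errors are below) is that the file input hands every line of the dump to the json filter, including the HTTP header lines, so I was thinking of replacing the filter with something like the sketch below. The regex assumes each JSON body sits on a single line starting with "{", and the "whois" target name is just something I made up; I have not confirmed this is the right approach:

filter {
  # the Request/Response separators, the HTTP header lines and the blank
  # lines are not JSON, so drop them before they reach the json filter
  if [message] !~ /^\s*\{/ {
    drop { }
  }

  json {
    source => "message"
    target => "whois"    # arbitrary field name I picked for the parsed object
  }

  # keep only the three fields I need at the top level
  mutate {
    rename => {
      "[whois][ip]"      => "ip"
      "[whois][country]" => "country"
      "[whois][org]"     => "org"
    }
    remove_field => [ "whois", "message" ]
  }
}

Another thought was to strip the dump down to just the JSON body before Logstash ever sees it, but I would prefer to handle it in the filter. This is what the log currently shows: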
[2020-08-06T21:39:40,983][INFO ][logstash.agent] Successfully started Logstash API endpoint {:port=>9600}
[2020-08-06T21:40:41,957][WARN ][logstash.filters.json][main][b258b5251b81c4b80bf9c6f4ddba9d5668c171bdd02b2d087ddd79a8b0dc0180] Parsed JSON object/hash requires a target configuration option {:source=>"message", :raw=>"\r"}
[2020-08-06T21:40:41,957][WARN ][logstash.filters.json][main][b258b5251b81c4b80bf9c6f4ddba9d5668c171bdd02b2d087ddd79a8b0dc0180] Error parsing json {:source=>"message", :raw=>" Request_____________________________________________________________\r", :exception=>#<LogStash::Json::ParserError: Unrecognized token 'Request_____________________________________________________________': was expecting ('true', 'false' or 'null')
 at [Source: (byte[])" Request_____________________________________________________________"; line: 1, column: 72]>}
[2020-08-06T21:40:41,962][WARN ][logstash.filters.json][main][b258b5251b81c4b80bf9c6f4ddba9d5668c171bdd02b2d087ddd79a8b0dc0180] Error parsing json {:source=>"message", :raw=>"Accept-Encoding: gzip,deflate\r", :exception=>#<LogStash::Json::ParserError: Unrecognized token 'Accept': was expecting ('true', 'false' or 'null')
 at [Source: (byte[])"Accept-Encoding: gzip,deflate"; line: 1, column: 8]>}
[2020-08-06T21:40:41,964][WARN ][logstash.filters.json][main][b258b5251b81c4b80bf9c6f4ddba9d5668c171bdd02b2d087ddd79a8b0dc0180] Error parsing json {:source=>"message", :raw=>"GET http://ipwhois.app/json/2.228.76.247 HTTP/1.1\r", :exception=>#<LogStash::Json::ParserError: Unrecognized token 'GET': was expecting ('true', 'false' or 'null')
 at [Source: (byte[])"GET http://ipwhois.app/json/2.228.76.247 HTTP/1.1"; line: 1, column: 5]>}
[2020-08-06T21:40:41,982][WARN ][logstash.filters.json][main][b258b5251b81c4b80bf9c6f4ddba9d5668c171bdd02b2d087ddd79a8b0dc0180] Error parsing json {:source=>"message", :raw=>"Host: ipwhois.app\r", :exception=>#<LogStash::Json::ParserError: Unrecognized token 'Host': was expecting ('true', 'false' or 'null')
 at [Source: (byte[])"Host: ipwhois.app"; line: 1, column: 6]>}
[2020-08-06T21:40:41,986][WARN ][logstash.filters.json][main][b258b5251b81c4b80bf9c6f4ddba9d5668c171bdd02b2d087ddd79a8b0dc0180] Error parsing json {:source=>"message", :raw=>"Connection: Keep-Alive\r", :exception=>#<LogStash::Json::ParserError: Unrecognized token 'Connection': was expecting ('true', 'false' or 'null')
 at [Source: (byte[])"Connection: Keep-Alive"; line: 1, column: 12]>}
[2020-08-06T21:40:41,989][WARN ][logstash.filters.json][main][b258b5251b81c4b80bf9c6f4ddba9d5668c171bdd02b2d087ddd79a8b0dc0180] Parsed JSON object/hash requires a target configuration option {:source=>"message", :raw=>"\r"}
[2020-08-06T21:40:41,989][WARN ][logstash.filters.json][main][b258b5251b81c4b80bf9c6f4ddba9d5668c171bdd02b2d087ddd79a8b0dc0180] Parsed JSON object/hash requires a target configuration option {:source=>"message", :raw=>"\r"}
[2020-08-06T21:40:41,999][WARN ][logstash.filters.json][main][b258b5251b81c4b80bf9c6f4ddba9d5668c171bdd02b2d087ddd79a8b0dc0180] Error parsing json {:source=>"message", :raw=>" Response____________________________________________________________\r", :exception=>#<LogStash::Json::ParserError: Unrecognized token 'Response____________________________________________________________': was expecting ('true', 'false' or 'null')
 at [Source: (byte[])" Response____________________________________________________________"; line: 1, column: 72]>}
[2020-08-06T21:40:42,002][WARN ][logstash.filters.json][main][b258b5251b81c4b80bf9c6f4ddba9d5668c171bdd02b2d087ddd79a8b0dc0180] Error parsing json {:source=>"message", :raw=>"User-Agent: Apache-HttpClient/4.1.1 (java 1.5)\r", :exception=>#<LogStash::Json::ParserError: Unrecognized token 'User': was expecting ('true', 'false' or 'null')
 at [Source: (byte[])"User-Agent: Apache-HttpClient/4.1.1 (java 1.5)"; line: 1, column: 6]>}
[2020-08-06T21:40:42,020][WARN ][logstash.filters.json][main][b258b5251b81c4b80bf9c6f4ddba9d5668c171bdd02b2d087ddd79a8b0dc0180] Parsed JSON object/hash requires a target configuration option {:source=>"message", :raw=>"\r"}
F:/ELK/logstash-7.8.0/logstash-7.8.0/vendor/bundle/jruby/2.5.0/gems/awesome_print-1.7.0/lib/awesome_print/formatters/base_formatter.rb:31: warning: constant ::Fixnum is deprecated
{
          "tags" => [
        [0] "_jsonparsefailure"
    ],
          "path" => "C:/Users/Desktop/New folder/Whois/2.228.76.247.txt",
          "type" => "json",
       "message" => " Request_____________________________________________________________\r",
          "host" => "redacted",
      "@version" => "1",
    "@timestamp" => 2020-08-06T16:10:41.833Z
}
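To check the parsing on its own, I was going to feed a single dump through stdin with a throwaway config like this (again only a sketch, with the same assumptions as above):

input {
  stdin { }
}

filter {
  # only lines that look like a JSON object get parsed, everything else is dropped
  if [message] !~ /^\s*\{/ {
    drop { }
  }
  json {
    source => "message"
    target => "whois"
  }
}

output {
  stdout { codec => rubydebug }
}

and run it as bin\logstash -f test.conf < "C:\Users\xxxxx\Desktop\New folder\Whois\2.228.76.247.txt" to see whether only the parsed JSON event comes out.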
I think something is missing in the filter. Can anyone help?