Logstash: multiple plugins in the logstash input
0 votes
/ 23 November 2018

I'm currently using Logstash and VulnWhisperer (to extract OpenVAS reports as JSON into a directory). That integration went well.

Now I'm having trouble with the Logstash configuration file. Initially it only took input from a directory, but I also need to analyze information that I can get by querying Elasticsearch. So I'm trying to use two plugins in the input section of the configuration file.

As you can see below, Logstash is not working properly: it keeps restarting and shutting down because of an error in the configuration file.

Below you can see both the Logstash service status and the logs. I'm new to Logstash, so any help is much appreciated. Thanks!

IP addresses are masked as "X" for this post.

Logstash configuration file:

# Author: Austin Taylor and Justin Henderson
# Email: austin@hasecuritysolutions.com
# Last Update: 03/04/2018
# Version 0.3
# Description: Take in qualys web scan reports from vulnWhisperer and pumps into logstash

input {
  file {
    path => "/opt/VulnWhisperer/data/openvas/*.json"
    type => json
    codec => json
    start_position => "beginning"
    tags => [ "openvas_scan", "openvas" ]
  }
  elasticsearch {
    hosts => "http://XX.XXX.XXX.XXX:9200" (http://XX.XXX.XXX.XXX:9200') 
    index => "metricbeat-*"
    query => { "query": { "match": {"host.name" : "%{asset}" } } }
    size => 1
    docinfo => false
    sort => "sort": [ { "@timestamp": { "order": "desc"} } ]
  }
}

filter {
  if "openvas_scan" in [tags] {
    mutate {
      replace => [ "message", "%{message}" ]
      gsub => [
        "message", "\|\|\|", " ",
        "message", "\t\t", " ",
        "message", "    ", " ",
        "message", "   ", " ",
        "message", "  ", " ",
        "message", "nan", " ",
        "message",'\n',''
      ]
    }

    grok {
        match => { "path" => "openvas_scan_%{DATA:scan_id}_%{INT:last_updated}.json$" }
     tag_on_failure => []
    }

    mutate {
      add_field => { "risk_score" => "%{cvss}" }
    }

    if [risk] == "1" {
      mutate { add_field => { "risk_number" => 0 }}
      mutate { replace => { "risk" => "info" }}
    }
    if [risk] == "2" {
      mutate { add_field => { "risk_number" => 1 }}
      mutate { replace => { "risk" => "low" }}
    }
    if [risk] == "3" {
      mutate { add_field => { "risk_number" => 2 }}
      mutate { replace => { "risk" => "medium" }}
    }
    if [risk] == "4" {
      mutate { add_field => { "risk_number" => 3 }}
      mutate { replace => { "risk" => "high" }}
    }
    if [risk] == "5" {
      mutate { add_field => { "risk_number" => 4 }}
      mutate { replace => { "risk" => "critical" }}
    }

    mutate {
      remove_field => "message"
    }

    if [first_time_detected] {
      date {
        match => [ "first_time_detected", "dd MMM yyyy HH:mma 'GMT'ZZ", "dd MMM yyyy HH:mma 'GMT'" ]
        target => "first_time_detected"
      }
    }
    if [first_time_tested] {
      date {
        match => [ "first_time_tested", "dd MMM yyyy HH:mma 'GMT'ZZ", "dd MMM yyyy HH:mma 'GMT'" ]
        target => "first_time_tested"
      }
    }
    if [last_time_detected] {
      date {
        match => [ "last_time_detected", "dd MMM yyyy HH:mma 'GMT'ZZ", "dd MMM yyyy HH:mma 'GMT'" ]
        target => "last_time_detected"
      }
    }
    if [last_time_tested] {
      date {
        match => [ "last_time_tested", "dd MMM yyyy HH:mma 'GMT'ZZ", "dd MMM yyyy HH:mma 'GMT'" ]
        target => "last_time_tested"
      }
    }
    date {
      match => [ "last_updated", "UNIX" ]
      target => "@timestamp"
      remove_field => "last_updated"
    }
    mutate {
      convert => { "plugin_id" => "integer"}
      convert => { "id" => "integer"}
      convert => { "risk_number" => "integer"}
      convert => { "risk_score" => "float"}
      convert => { "total_times_detected" => "integer"}
      convert => { "cvss_temporal" => "float"}
      convert => { "cvss" => "float"}
    }
    if [risk_score] == 0 {
      mutate {
        add_field => { "risk_score_name" => "info" }
      }
    }
    if [risk_score] > 0 and [risk_score] < 3 {
      mutate {
        add_field => { "risk_score_name" => "low" }
      }
    }
    if [risk_score] >= 3 and [risk_score] < 6 {
      mutate {
        add_field => { "risk_score_name" => "medium" }
      }
    }
    if [risk_score] >=6 and [risk_score] < 9 {
      mutate {
        add_field => { "risk_score_name" => "high" }
      }
    }
    if [risk_score] >= 9 {
      mutate {
        add_field => { "risk_score_name" => "critical" }
      }
    }
    # Add your critical assets by subnet or by hostname. Comment this field out if you don't want to tag any, but the asset panel will break.
    if [asset] =~ "^10\.0\.100\." {
      mutate {
        add_tag => [ "critical_asset" ]
      }
    }
  }
}
output {
  if "openvas" in [tags] {
    stdout { codec => rubydebug }
    elasticsearch {
      hosts => [ "XX.XXX.XXX.XXX:XXXX" ]
      index => "logstash-vulnwhisperer-%{+YYYY.MM}"
    }
  }
}
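
As an aside, the five risk-mapping conditionals in the filter section above could be collapsed with the translate filter (from the logstash-filter-translate plugin; this is an untested sketch, and the option names changed across plugin versions: older releases use field/destination, newer ones source/target). The dictionary values come out as strings, which the later convert on risk_number already handles:

```
filter {
  translate {
    field       => "risk"          # "source" in newer plugin versions
    destination => "risk_number"   # "target" in newer plugin versions
    dictionary  => {
      "1" => "0"
      "2" => "1"
      "3" => "2"
      "4" => "3"
      "5" => "4"
    }
  }
}
```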

Logstash service status:

root@logstash:/etc/logstash/conf.d# service logstash status
● logstash.service - logstash
   Loaded: loaded (/etc/systemd/system/logstash.service; enabled; vendor preset: enabled)
   Active: active (running) since Fri 2018-11-23 12:17:29 WET; 9s ago
 Main PID: 7041 (java)
    Tasks: 17 (limit: 4915)
   CGroup: /system.slice/logstash.service
           └─7041 /usr/bin/java -Xms1g -Xmx1g -XX:+UseParNewGC -XX:+UseConcMarkSweepGC -XX:CMSInitiatingOccupancyFraction=75 -XX:+UseCMSInitiatingOccupancyOnly -Djava.awt.headless=true -Dfile.encoding=UTF-8 -Djruby.compile.invokedyna

Nov 23 12:17:29 logstash systemd[1]: logstash.service: Service hold-off time over, scheduling restart.
Nov 23 12:17:29 logstash systemd[1]: Stopped logstash.
Nov 23 12:17:29 logstash systemd[1]: Started logstash.

Logstash log:

[2018-11-23T16:16:57,156][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}
[2018-11-23T16:17:27,133][INFO ][logstash.runner          ] Starting Logstash {"logstash.version"=>"6.4.3"}
[2018-11-23T16:17:28,380][ERROR][logstash.agent           ] Failed to execute action {:action=>LogStash::PipelineAction::Create/pipeline_id:main, :exception=>"LogStash::ConfigurationError", :message=>"Expected one of #, {, \", ', } at line 31, column 43 (byte 643) after input {\n  file {\n    path => \"/opt/VulnWhisperer/data/openvas/*.json\"\n    type => json\n    codec => json\n    start_position => \"beginning\"\n    tags => [ \"openvas_scan\", \"openvas\" ]\n  }\n  elasticsearch {\n    hosts => \"http://XX.XXX.XXX.XXX:9200\" ", :backtrace=>["/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:41:in `compile_imperative'", "/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:49:in `compile_graph'", "/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:11:in `block in compile_sources'", "org/jruby/RubyArray.java:2486:in `map'", "/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:10:in `compile_sources'", "org/logstash/execution/AbstractPipelineExt.java:149:in `initialize'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:22:in `initialize'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:90:in `initialize'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline_action/create.rb:38:in `execute'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:309:in `block in converge_state'"]}
[2018-11-23T16:17:28,801][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}
[2018-11-23T16:17:58,602][INFO ][logstash.runner          ] Starting Logstash {"logstash.version"=>"6.4.3"}
[2018-11-23T16:17:59,808][ERROR][logstash.agent           ] Failed to execute action {:action=>LogStash::PipelineAction::Create/pipeline_id:main, :exception=>"LogStash::ConfigurationError", :message=>"Expected one of #, {, \", ', } at line 31, column 43 (byte 643) after input {\n  file {\n    path => \"/opt/VulnWhisperer/data/openvas/*.json\"\n    type => json\n    codec => json\n    start_position => \"beginning\"\n    tags => [ \"openvas_scan\", \"openvas\" ]\n  }\n  elasticsearch {\n    hosts => \"http://XX.XXX.XXX.XXX:XXXX\" ", :backtrace=>["/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:41:in `compile_imperative'", "/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:49:in `compile_graph'", "/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:11:in `block in compile_sources'", "org/jruby/RubyArray.java:2486:in `map'", "/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:10:in `compile_sources'", "org/logstash/execution/AbstractPipelineExt.java:149:in `initialize'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:22:in `initialize'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:90:in `initialize'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline_action/create.rb:38:in `execute'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:309:in `block in converge_state'"]}
[2018-11-23T16:18:00,174][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}

1 Answer

0 votes
/ 18 December 2018

Please change the settings below: the stray text after the hosts value must be removed, the query value must be a single quoted JSON string, and sort is not a standalone option of the elasticsearch input, so it is commented out here:

elasticsearch {
    hosts => "localhost"
    index => "metricbeat-*"
    query => '{ "query": { "match": {"host.name" : "%{asset}" } } }'
    size => 1
    docinfo => false
    #sort => "sort": [ { "@timestamp": { "order": "desc"} } ]
}
...
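
For reference, the input section from the question could then look like this (a sketch, with the address still masked as in the question): the single quotes around the query keep the inner double quotes intact, and the sort clause moves inside the query body, since the elasticsearch input has no standalone sort option.

```
input {
  file {
    path => "/opt/VulnWhisperer/data/openvas/*.json"
    type => "json"
    codec => "json"
    start_position => "beginning"
    tags => [ "openvas_scan", "openvas" ]
  }
  elasticsearch {
    hosts => "http://XX.XXX.XXX.XXX:9200"
    index => "metricbeat-*"
    # Single-quoted so the double quotes inside the JSON survive;
    # sort belongs inside the query body, not as a separate option.
    query => '{ "query": { "match": { "host.name": "%{asset}" } }, "sort": [ { "@timestamp": { "order": "desc" } } ] }'
    size => 1
    docinfo => false
  }
}
```

One caveat: %{asset} is an event-field reference, and input plugins run before any events exist, so it is most likely sent to Elasticsearch literally. If the lookup is meant to run per scan result, the elasticsearch filter plugin is the usual place for it.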