I want to sync my MongoDB data (a local MongoDB) with Elasticsearch (a local Elasticsearch) using the logstash-input-mongodb plugin.
I installed the plugin with
bin/logstash-plugin install logstash-input-mongodb
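To double-check the install, I assume the plugin should also show up when listing the installed plugins:
bin/logstash-plugin list | grep mongodb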
Then I created a mongodata.conf file in the /usr/share/logstash directory. When I run the config file, all it shows is --> Sending Logstash's logs to /var/log/logstash which is now configured via log4j2.properties
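I start the pipeline by pointing Logstash at the config file, roughly like this (the exact path is just the one from my setup above):
bin/logstash -f /usr/share/logstash/mongodata.conf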
My config file:
input {
  mongodb {
    uri => "mongodb://localhost:27017/reporterDB"
    placeholder_db_dir => "/opt/logstash-mongodb/"
    placeholder_db_name => "logstash_sqlite.db"
    collection => "iam_ms_test"
    batch_size => 5000
  }
}

filter {
}

output {
  stdout { codec => rubydebug }
  elasticsearch {
    action => "index"
    hosts => "localhost:9200"
    user => "elastic"
    password => "changeme"
    index => "mongo_log"
    document_type => "document_type"
    document_id => "%{id}"
  }
}
I get the lines below in the logstash-plain.log file:
[2019-11-01T15:41:00,869][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["//localhost:9200"]}
[2019-11-01T15:41:00,871][INFO ][logstash.pipeline ] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>6, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>5, "pipeline.max_inflight"=>750, :thread=>"#<Thread:0x351f7fd1@/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:245 run>"}
[2019-11-01T15:41:01,068][INFO ][logstash.inputs.mongodb ] Registering MongoDB input
[2019-11-01T15:41:01,116][ERROR][logstash.pipeline ] Error registering plugin {:pipeline_id=>"main", :plugin=>"<LogStash::Inputs::MongoDB uri=>\"mongodb://localhost:27017/anchorReports\", placeholder_db_dir=>\"/opt/logstash-mongodb/\", placeholder_db_name=>\"logstash_sqlite.db\", collection=>\"hi_p5m\", batch_size=>5000, id=>\"ec7682e8c6c5676deca84d5072c5f7865120a107ffce81ce21caa878c6e4ed09\", enable_metric=>true, codec=><LogStash::Codecs::Plain id=>\"plain_441f95b8-cc8a-4b9e-a45f-657ed2011e2b\", enable_metric=>true, charset=>\"UTF-8\">, since_table=>\"logstash_since\", since_column=>\"_id\", since_type=>\"id\", parse_method=>\"flatten\", isodate=>false, retry_delay=>3, generateId=>false, unpack_mongo_id=>false, message=>\"Default message...\", interval=>1>", :error=>"Java::JavaSql::SQLException: path to '/opt/logstash-mongodb/logstash_sqlite.db': '/opt/logstash-mongodb' does not exist", :thread=>"#<Thread:0x351f7fd1@/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:245 run>"}
[2019-11-01T15:41:01,869][ERROR][logstash.pipeline ] Pipeline aborted due to error {:pipeline_id=>"main", :exception=>#<Sequel::DatabaseConnectionError: Java::JavaSql::SQLException: path to '/opt/logstash-mongodb/logstash_sqlite.db': '/opt/logstash-mongodb' does not exist>, :backtrace=>["org.sqlite.core.CoreConnection.open(org/sqlite/core/CoreConnection.java:190)", "org.sqlite.core.CoreConnection.<init>(org/sqlite/core/CoreConnection.java:74)", "org.sqlite.jdbc3.JDBC3Connection.<init>(org/sqlite/jdbc3/JDBC3Connection.java:24)", "org.sqlite.jdbc4.JDBC4Connection.<init>(org/sqlite/jdbc4/JDBC4Connection.java:23)", "org.sqlite.SQLiteConnection.<init>(org/sqlite/SQLiteConnection.java:45)", "org.sqlite.JDBC.createConnection(org/sqlite/JDBC.java:114)", "org.sqlite.JDBC.connect(org/sqlite/JDBC.java:88)"
I want the records from my MongoDB to end up in the Elasticsearch index "mongo_log". I would also like to know how placeholder_db_dir and placeholder_db_name are used, and what those values should be when MongoDB is the input database.
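Once the pipeline runs, I expect to be able to verify the documents with something like (credentials taken from my config above):
curl -u elastic:changeme "http://localhost:9200/mongo_log/_search?pretty"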