Unable to import JSON data into Elasticsearch via Logstash

I'm new to the ELK stack, and I've been racking my brain trying to import a single-line JSON file into Elasticsearch using Logstash. Nothing shows up in Elasticsearch (10.10.20.13:9200/monitor/_search?q=*) or in Kibana.

The JSON looks like this:

{"host":"*********","cpu":"2.1","disk":"0.628242","memory":"0.324597","createAt":"2017-10-03T00:18:01"} 

My config file:

input {
    file {
        path => "/usr/share/logstash/log/monitor-sys-1506979201881.json"
        sincedb_path => "/dev/null"
        start_position => "beginning"
    }
}
output {
    elasticsearch {
        hosts => ["10.10.20.13:9200"]
        index => "monitor"
    }
    stdout {
        codec => rubydebug
    }
}

Another config I tried, without success (after some searching I also added a json codec & filter, but nothing changed):

input {
    file {
        path => "/usr/share/logstash/log/monitor-sys-1506979201881.json"
        sincedb_path => "/dev/null"
        start_position => "beginning"
        type => "json"
    }
}

filter {
    json {
        source => "message"
    }
}

output {
    elasticsearch {
        hosts => ["10.10.20.13:9200"]
        index => "monitor"
    }
    stdout {
        codec => rubydebug
    }
}

The command I'm running:

/usr/share/logstash/bin/logstash -f /opt/*****/sys-monit/logstash-sys-monitor.conf --path.settings /etc/logstash --verbose --debug 

The debug run produces the following output:

[2017-10-16T15:56:29,118][INFO ][logstash.modules.scaffold] Initializing module {:module_name=>"fb_apache", :directory=>"/usr/share/logstash/modules/fb_apache/configuration"} 
[2017-10-16T15:56:29,122][DEBUG][logstash.plugins.registry] Adding plugin to the registry {:name=>"fb_apache", :type=>:modules, :class=>#<LogStash::Modules::Scaffold:0x4f2ed590 @kibana_version_parts=["5", "6", "0"], @module_name="fb_apache", @directory="/usr/share/logstash/modules/fb_apache/configuration">} 
[2017-10-16T15:56:29,123][INFO ][logstash.modules.scaffold] Initializing module {:module_name=>"netflow", :directory=>"/usr/share/logstash/modules/netflow/configuration"} 
[2017-10-16T15:56:29,124][DEBUG][logstash.plugins.registry] Adding plugin to the registry {:name=>"netflow", :type=>:modules, :class=>#<LogStash::Modules::Scaffold:0xb732fcc @kibana_version_parts=["5", "6", "0"], @module_name="netflow", @directory="/usr/share/logstash/modules/netflow/configuration">} 
[2017-10-16T15:56:29,292][DEBUG][logstash.agent   ] Agent: Configuring metric collection 
[2017-10-16T15:56:29,295][DEBUG][logstash.instrument.periodicpoller.os] PeriodicPoller: Starting {:polling_interval=>5, :polling_timeout=>120} 
[2017-10-16T15:56:29,420][DEBUG][logstash.instrument.periodicpoller.jvm] PeriodicPoller: Starting {:polling_interval=>5, :polling_timeout=>120} 
[2017-10-16T15:56:29,507][DEBUG][logstash.instrument.periodicpoller.persistentqueue] PeriodicPoller: Starting {:polling_interval=>5, :polling_timeout=>120} 
[2017-10-16T15:56:29,508][DEBUG][logstash.instrument.periodicpoller.deadletterqueue] PeriodicPoller: Starting {:polling_interval=>5, :polling_timeout=>120} 
[2017-10-16T15:56:29,529][DEBUG][logstash.agent   ] Reading config file {:config_file=>"/opt/experis-cyber/sys-monitor/logstash-sys-monitor.conf"} 
[2017-10-16T15:56:29,709][DEBUG][logstash.plugins.registry] On demand adding plugin to the registry {:name=>"file", :type=>"input", :class=>LogStash::Inputs::File} 
[2017-10-16T15:56:29,733][DEBUG][logstash.plugins.registry] On demand adding plugin to the registry {:name=>"plain", :type=>"codec", :class=>LogStash::Codecs::Plain} 
[2017-10-16T15:56:29,750][DEBUG][logstash.codecs.plain ] config LogStash::Codecs::Plain/@id = "plain_2c3eb918-337a-4935-bf78-bfe3ab709129" 
[2017-10-16T15:56:29,750][DEBUG][logstash.codecs.plain ] config LogStash::Codecs::Plain/@enable_metric = true 
[2017-10-16T15:56:29,750][DEBUG][logstash.codecs.plain ] config LogStash::Codecs::Plain/@charset = "UTF-8" 
[2017-10-16T15:56:29,752][DEBUG][logstash.inputs.file  ] config LogStash::Inputs::File/@path = ["/usr/share/logstash/log/monitor-sys-1506979201881.json"] 
[2017-10-16T15:56:29,752][DEBUG][logstash.inputs.file  ] config LogStash::Inputs::File/@sincedb_path = "/dev/null" 
[2017-10-16T15:56:29,752][DEBUG][logstash.inputs.file  ] config LogStash::Inputs::File/@start_position = "beginning" 
[2017-10-16T15:56:29,752][DEBUG][logstash.inputs.file  ] config LogStash::Inputs::File/@id = "9e9162561d919c7b40b4a16e9f4e8e6e81267f8d-1" 
[2017-10-16T15:56:29,752][DEBUG][logstash.inputs.file  ] config LogStash::Inputs::File/@enable_metric = true 
[2017-10-16T15:56:29,753][DEBUG][logstash.inputs.file  ] config LogStash::Inputs::File/@codec = <LogStash::Codecs::Plain id=>"plain_2c3eb918-337a-4935-bf78-bfe3ab709129", enable_metric=>true, charset=>"UTF-8"> 
[2017-10-16T15:56:29,753][DEBUG][logstash.inputs.file  ] config LogStash::Inputs::File/@add_field = {} 
[2017-10-16T15:56:29,753][DEBUG][logstash.inputs.file  ] config LogStash::Inputs::File/@stat_interval = 1 
[2017-10-16T15:56:29,753][DEBUG][logstash.inputs.file  ] config LogStash::Inputs::File/@discover_interval = 15 
[2017-10-16T15:56:29,753][DEBUG][logstash.inputs.file  ] config LogStash::Inputs::File/@sincedb_write_interval = 15 
[2017-10-16T15:56:29,754][DEBUG][logstash.inputs.file  ] config LogStash::Inputs::File/@delimiter = "\n" 
[2017-10-16T15:56:29,754][DEBUG][logstash.inputs.file  ] config LogStash::Inputs::File/@close_older = 3600 
[2017-10-16T15:56:29,933][DEBUG][logstash.plugins.registry] On demand adding plugin to the registry {:name=>"elasticsearch", :type=>"output", :class=>LogStash::Outputs::ElasticSearch} 
[2017-10-16T15:56:29,968][DEBUG][logstash.codecs.plain ] config LogStash::Codecs::Plain/@id = "plain_ddd82ced-4bda-414a-a9c2-d70ea27bde23" 
[2017-10-16T15:56:29,969][DEBUG][logstash.codecs.plain ] config LogStash::Codecs::Plain/@enable_metric = true 
[2017-10-16T15:56:29,969][DEBUG][logstash.codecs.plain ] config LogStash::Codecs::Plain/@charset = "UTF-8" 
[2017-10-16T15:56:30,012][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@hosts = [//10.10.20.13:9200] 
[2017-10-16T15:56:30,012][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@index = "monitor" 
[2017-10-16T15:56:30,012][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@id = "9e9162561d919c7b40b4a16e9f4e8e6e81267f8d-2" 
[2017-10-16T15:56:30,012][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@enable_metric = true 
[2017-10-16T15:56:30,013][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@codec = <LogStash::Codecs::Plain id=>"plain_ddd82ced-4bda-414a-a9c2-d70ea27bde23", enable_metric=>true, charset=>"UTF-8"> 
[2017-10-16T15:56:30,013][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@workers = 1 
[2017-10-16T15:56:30,013][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@manage_template = true 
[2017-10-16T15:56:30,013][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@template_name = "logstash" 
[2017-10-16T15:56:30,014][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@template_overwrite = false 
[2017-10-16T15:56:30,014][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@parent = nil 
[2017-10-16T15:56:30,014][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@idle_flush_time = 1 
[2017-10-16T15:56:30,014][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@upsert = "" 
[2017-10-16T15:56:30,014][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@doc_as_upsert = false 
[2017-10-16T15:56:30,014][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@script = "" 
[2017-10-16T15:56:30,015][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@script_type = "inline" 
[2017-10-16T15:56:30,015][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@script_lang = "painless" 
[2017-10-16T15:56:30,015][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@script_var_name = "event" 
[2017-10-16T15:56:30,015][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@scripted_upsert = false 
[2017-10-16T15:56:30,015][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@retry_initial_interval = 2 
[2017-10-16T15:56:30,015][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@retry_max_interval = 64 
[2017-10-16T15:56:30,016][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@retry_on_conflict = 1 
[2017-10-16T15:56:30,016][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@pipeline = nil 
[2017-10-16T15:56:30,016][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@action = "index" 
[2017-10-16T15:56:30,016][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@ssl_certificate_verification = true 
[2017-10-16T15:56:30,016][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@sniffing = false 
[2017-10-16T15:56:30,016][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@sniffing_delay = 5 
[2017-10-16T15:56:30,017][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@timeout = 60 
[2017-10-16T15:56:30,017][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@failure_type_logging_whitelist = [] 
[2017-10-16T15:56:30,017][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@pool_max = 1000 
[2017-10-16T15:56:30,017][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@pool_max_per_route = 100 
[2017-10-16T15:56:30,017][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@resurrect_delay = 5 
[2017-10-16T15:56:30,017][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@validate_after_inactivity = 10000 
[2017-10-16T15:56:30,018][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@http_compression = false 
[2017-10-16T15:56:30,044][DEBUG][logstash.plugins.registry] On demand adding plugin to the registry {:name=>"stdout", :type=>"output", :class=>LogStash::Outputs::Stdout} 
[2017-10-16T15:56:30,118][DEBUG][logstash.plugins.registry] On demand adding plugin to the registry {:name=>"rubydebug", :type=>"codec", :class=>LogStash::Codecs::RubyDebug} 
[2017-10-16T15:56:30,122][DEBUG][logstash.codecs.rubydebug] config LogStash::Codecs::RubyDebug/@id = "rubydebug_9c1ee8e6-9fa4-4553-96d1-803214216fd9" 
[2017-10-16T15:56:30,122][DEBUG][logstash.codecs.rubydebug] config LogStash::Codecs::RubyDebug/@enable_metric = true 
[2017-10-16T15:56:30,122][DEBUG][logstash.codecs.rubydebug] config LogStash::Codecs::RubyDebug/@metadata = false 
[2017-10-16T15:56:30,345][DEBUG][logstash.outputs.stdout ] config LogStash::Outputs::Stdout/@codec = <LogStash::Codecs::RubyDebug id=>"rubydebug_9c1ee8e6-9fa4-4553-96d1-803214216fd9", enable_metric=>true, metadata=>false> 
[2017-10-16T15:56:30,345][DEBUG][logstash.outputs.stdout ] config LogStash::Outputs::Stdout/@id = "9e9162561d919c7b40b4a16e9f4e8e6e81267f8d-3" 
[2017-10-16T15:56:30,345][DEBUG][logstash.outputs.stdout ] config LogStash::Outputs::Stdout/@enable_metric = true 
[2017-10-16T15:56:30,345][DEBUG][logstash.outputs.stdout ] config LogStash::Outputs::Stdout/@workers = 1 
[2017-10-16T15:56:30,364][DEBUG][logstash.agent   ] starting agent 
[2017-10-16T15:56:30,367][DEBUG][logstash.agent   ] starting pipeline {:id=>"main"} 
[2017-10-16T15:56:30,384][DEBUG][logstash.outputs.elasticsearch] Normalizing http path {:path=>nil, :normalized=>nil} 
[2017-10-16T15:56:31,479][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://10.10.20.13:9200/]}} 
[2017-10-16T15:56:31,480][INFO ][logstash.outputs.elasticsearch] Running health check to see if an Elasticsearch connection is working {:healthcheck_url=>http://10.10.20.13:9200/, :path=>"/"} 
[2017-10-16T15:56:31,945][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>"http://10.10.20.13:9200/"} 
[2017-10-16T15:56:31,968][INFO ][logstash.outputs.elasticsearch] Using mapping template from {:path=>nil} 
[2017-10-16T15:56:32,144][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"template"=>"logstash-*", "version"=>50001, "settings"=>{"index.refresh_interval"=>"5s"}, "mappings"=>{"_default_"=>{"_all"=>{"enabled"=>true, "norms"=>false}, "dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date", "include_in_all"=>false}, "@version"=>{"type"=>"keyword", "include_in_all"=>false}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}} 
[2017-10-16T15:56:32,188][DEBUG][logstash.outputs.elasticsearch] Found existing Elasticsearch template. Skipping template management {:name=>"logstash"} 
[2017-10-16T15:56:32,189][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["//10.10.20.13:9200"]} 
[2017-10-16T15:56:32,192][INFO ][logstash.pipeline  ] Starting pipeline {"id"=>"main", "pipeline.workers"=>1, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>5, "pipeline.max_inflight"=>125} 
[2017-10-16T15:56:33,141][INFO ][logstash.pipeline  ] Pipeline main started 
[2017-10-16T15:56:33,202][DEBUG][logstash.inputs.file  ] _globbed_files: /usr/share/logstash/log/monitor-sys-1506979201881.json: glob is: ["/usr/share/logstash/log/monitor-sys-1506979201881.json"] 
[2017-10-16T15:56:33,203][DEBUG][logstash.inputs.file  ] _discover_file: /usr/share/logstash/log/monitor-sys-1506979201881.json: new: /usr/share/logstash/log/monitor-sys-1506979201881.json (exclude is []) 
[2017-10-16T15:56:33,204][DEBUG][logstash.inputs.file  ] _open_file: /usr/share/logstash/log/monitor-sys-1506979201881.json: opening 
[2017-10-16T15:56:33,205][DEBUG][logstash.inputs.file  ] /usr/share/logstash/log/monitor-sys-1506979201881.json: initial create, no sincedb, seeking to beginning of file 
[2017-10-16T15:56:33,206][DEBUG][logstash.inputs.file  ] writing sincedb (delta since last write = 1508158593) 
[2017-10-16T15:56:33,207][DEBUG][logstash.inputs.file  ] each: file grew: /usr/share/logstash/log/monitor-sys-1506979201881.json: old size 0, new size 115 
[2017-10-16T15:56:33,208][DEBUG][logstash.agent   ] Starting puma 
[2017-10-16T15:56:33,211][DEBUG][logstash.agent   ] Trying to start WebServer {:port=>9600} 
[2017-10-16T15:56:33,212][DEBUG][logstash.api.service  ] [api-service] start 
[2017-10-16T15:56:33,321][INFO ][logstash.agent   ] Successfully started Logstash API endpoint {:port=>9600} 
[2017-10-16T15:56:34,211][DEBUG][logstash.inputs.file  ] each: file grew: /usr/share/logstash/log/monitor-sys-1506979201881.json: old size 0, new size 115 
[2017-10-16T15:56:35,214][DEBUG][logstash.inputs.file  ] each: file grew: /usr/share/logstash/log/monitor-sys-1506979201881.json: old size 0, new size 115 
[2017-10-16T15:56:36,223][DEBUG][logstash.inputs.file  ] each: file grew: /usr/share/logstash/log/monitor-sys-1506979201881.json: old size 0, new size 115 
[2017-10-16T15:56:37,227][DEBUG][logstash.inputs.file  ] each: file grew: /usr/share/logstash/log/monitor-sys-1506979201881.json: old size 0, new size 115 
[2017-10-16T15:56:38,158][DEBUG][logstash.pipeline  ] Pushing flush onto pipeline 
[2017-10-16T15:56:38,230][DEBUG][logstash.inputs.file  ] each: file grew: /usr/share/logstash/log/monitor-sys-1506979201881.json: old size 0, new size 115 
[2017-10-16T15:56:39,233][DEBUG][logstash.inputs.file  ] each: file grew: /usr/share/logstash/log/monitor-sys-1506979201881.json: old size 0, new size 115 
[2017-10-16T15:56:40,238][DEBUG][logstash.inputs.file  ] each: file grew: /usr/share/logstash/log/monitor-sys-1506979201881.json: old size 0, new size 115 
[2017-10-16T15:56:41,240][DEBUG][logstash.inputs.file  ] each: file grew: /usr/share/logstash/log/monitor-sys-1506979201881.json: old size 0, new size 115 
[2017-10-16T15:56:42,243][DEBUG][logstash.inputs.file  ] each: file grew: /usr/share/logstash/log/monitor-sys-1506979201881.json: old size 0, new size 115 
[2017-10-16T15:56:43,159][DEBUG][logstash.pipeline  ] Pushing flush onto pipeline 
[2017-10-16T15:56:43,245][DEBUG][logstash.inputs.file  ] each: file grew: /usr/share/logstash/log/monitor-sys-1506979201881.json: old size 0, new size 115 
[2017-10-16T15:56:44,247][DEBUG][logstash.inputs.file  ] each: file grew: /usr/share/logstash/log/monitor-sys-1506979201881.json: old size 0, new size 115 
[2017-10-16T15:56:45,250][DEBUG][logstash.inputs.file  ] each: file grew: /usr/share/logstash/log/monitor-sys-1506979201881.json: old size 0, new size 115 
[2017-10-16T15:56:46,252][DEBUG][logstash.inputs.file  ] each: file grew: /usr/share/logstash/log/monitor-sys-1506979201881.json: old size 0, new size 115 
[2017-10-16T15:56:47,255][DEBUG][logstash.inputs.file  ] each: file grew: /usr/share/logstash/log/monitor-sys-1506979201881.json: old size 0, new size 115 
[2017-10-16T15:56:47,257][DEBUG][logstash.inputs.file  ] _globbed_files: /usr/share/logstash/log/monitor-sys-1506979201881.json: glob is: ["/usr/share/logstash/log/monitor-sys-1506979201881.json"] 

I would appreciate any help with this.


How about simply using 'codec => json' in your file input and removing the 'json' filter? – Val


That was one of the first things I tried. –


It should work, though... – Val
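
For reference, the variant Val is suggesting would look roughly like this (a minimal sketch assembled from the asker's own config; only the codec line differs):

input {
    file {
        path => "/usr/share/logstash/log/monitor-sys-1506979201881.json"
        sincedb_path => "/dev/null"
        start_position => "beginning"
        codec => "json"    # parse each line as JSON at the input; no json filter needed
    }
}
output {
    elasticsearch {
        hosts => ["10.10.20.13:9200"]
        index => "monitor"
    }
    stdout {
        codec => rubydebug
    }
}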

Answers


With logstash-5.5.0, my Logstash config works. For example:

input {
    file {
        path => "/opt/logs/json.log"
        start_position => "beginning"
        type => "logstash"
    }
}
filter {
    json {
        source => "message"
        skip_on_invalid_json => true
        # add_field => { "testfield" => "test_static_value" }
        # add_tag => [ "test_tag" ]
        # target => "test_target"
    }
}
output {
    if [type] == "logstash" {
        elasticsearch {
            hosts => ["elasticsearch:9200"]
            index => "logstash"
        }
    }
    stdout { codec => rubydebug }
}
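
A quick way to take the file input out of the equation and test the parsing step on its own is to pipe the sample line through a stdin pipeline using Logstash's inline -e config (a sketch; the host value below is a placeholder for the asker's masked one):

echo '{"host":"myhost","cpu":"2.1","disk":"0.628242","memory":"0.324597","createAt":"2017-10-03T00:18:01"}' | \
  /usr/share/logstash/bin/logstash -e 'input { stdin { codec => json } } output { stdout { codec => rubydebug } }'

If this prints the parsed fields via rubydebug, the parsing side is fine and the problem lies with the file input.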

No luck with this config either. Any other ideas? All I'm trying to do is get my pre-built JSON file into Elasticsearch. –


Whatever I try, Logstash keeps repeating the four lines below in the debug log:

[2017-10-18T14:23:13,175][DEBUG][logstash.inputs.file] _globbed_files: /usr/share/logstash/log/test.csv: glob is: []
[2017-10-18T14:23:14,106][DEBUG][logstash.pipeline] Pushing flush onto pipeline
[2017-10-18T14:23:19,107][DEBUG][logstash.pipeline] Pushing flush onto pipeline
[2017-10-18T14:23:24,107][DEBUG][logstash.pipeline] –


Your Logstash configs seem fine. Does your JSON file end with a newline character? If not, the line probably isn't being parsed, because Logstash gets stuck there waiting for the delimiter. I've run into this problem with JSON files a few times.
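
Building on that, a quick way to check for a trailing newline and append one if it's missing (a sketch assuming a POSIX shell; the path is the asker's):

# Show the last byte of the file; a healthy last line ends in \n
tail -c1 /usr/share/logstash/log/monitor-sys-1506979201881.json | od -c

# Command substitution strips a trailing newline, so a non-empty result
# means the last byte is NOT a newline; append one in that case
[ -n "$(tail -c1 /usr/share/logstash/log/monitor-sys-1506979201881.json)" ] && \
  echo >> /usr/share/logstash/log/monitor-sys-1506979201881.json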