
Creating an index and mapping in ES: I have been following this tutorial on importing data from a database into Logstash and creating an index and mapping in Elasticsearch ("Insert into Logstash select data from database"). How do I get the index and mapping created from Logstash?

This is the output from running my configuration file:

[2017-10-12T11:50:45,807][INFO ][logstash.modules.scaffold] Initializing module {:module_name=>"fb_apache", :directory=>"C:/Users/Bruno/Downloads/logstash-5.6.2/logstash-5.6.2/modules/fb_apache/configuration"} 
[2017-10-12T11:50:45,812][INFO ][logstash.modules.scaffold] Initializing module {:module_name=>"netflow", :directory=>"C:/Users/Bruno/Downloads/logstash-5.6.2/logstash-5.6.2/modules/netflow/configuration"} 
[2017-10-12T11:50:46,518][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://localhost:9200/]}} 
[2017-10-12T11:50:46,521][INFO ][logstash.outputs.elasticsearch] Running health check to see if an Elasticsearch connection is working {:healthcheck_url=>http://localhost:9200/, :path=>"/"} 
[2017-10-12T11:50:46,652][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>"http://localhost:9200/"} 
[2017-10-12T11:50:46,654][INFO ][logstash.outputs.elasticsearch] Using mapping template from {:path=>nil} 
[2017-10-12T11:50:46,716][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"template"=>"logstash-*", "version"=>50001, "settings"=>{"index.refresh_interval"=>"5s"}, "mappings"=>{"_default_"=>{"_all"=>{"enabled"=>true, "norms"=>false}, "dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date", "include_in_all"=>false}, "@version"=>{"type"=>"keyword", "include_in_all"=>false}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}} 
[2017-10-12T11:50:46,734][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["//localhost:9200"]} 
[2017-10-12T11:50:46,749][INFO ][logstash.pipeline  ] Starting pipeline {"id"=>"main", "pipeline.workers"=>4, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>5, "pipeline.max_inflight"=>500} 
[2017-10-12T11:50:47,053][INFO ][logstash.pipeline  ] Pipeline main started 
[2017-10-12T11:50:47,196][INFO ][logstash.agent   ] Successfully started Logstash API endpoint {:port=>9600} 
[2017-10-12T11:50:47,817][INFO ][logstash.inputs.jdbc  ] (0.130000s) SELECT * from EP_RDA_STRING 
[2017-10-12T11:50:53,095][WARN ][logstash.agent   ] stopping pipeline {:id=>"main"} 

Everything seems fine, at least to me. Except that when I query the ES server for the resulting index and mapping, it comes back empty.

http://localhost:9200/_all/_mapping 

{} 

http://localhost:9200/_cat/indices?v 

health status index uuid pri rep docs.count docs.deleted store.size pri.store.size 

This is my configuration file:

input { 
    jdbc { 
     # sqlserver jdbc connection string to our database, mydb   
     jdbc_connection_string => "jdbc:sqlserver://localhost:1433;databaseName=RDA; integratedSecurity=true;" 
     # The user we wish to execute our statement as 
     jdbc_user => "" 
     # The path to our downloaded jdbc driver 
     jdbc_driver_library => "C:\mypath\sqljdbc_6.2\enu\mssql-jdbc-6.2.1.jre8.jar" 
     # The name of the driver class for SQL Server 
     jdbc_driver_class => "com.microsoft.sqlserver.jdbc.SQLServerDriver" 
     # our query 
     statement => "SELECT * from EP_RDA_STRING" 
    } 
} 
output { 
    elasticsearch { 

     index => "RDA" 
     document_type => "RDA_string_view" 
     document_id => "%{ndb_no}" 
     hosts => "localhost:9200" 
    } 
} 

One thing to note: an ES index name must be all lowercase (i.e. 'rda', not 'RDA'), so it seems to me you probably have an error in your ES logs telling you exactly that. – Val
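
As a minimal sketch of what that fix would look like (assuming the rest of the config stays unchanged), the elasticsearch output block would simply use a lowercase index name:

output { 
    elasticsearch { 
     # index names must be lowercase in Elasticsearch 
     index => "rda" 
     document_type => "RDA_string_view" 
     document_id => "%{ndb_no}" 
     hosts => "localhost:9200" 
    } 
} 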

Answer


Which version of Logstash are you using? What command are you using to start Logstash? Please make sure your input and output blocks look similar to the ones below:

input { 
    beats { 
     port => "29600" 
     type => "weblogic-server" 
    } 
} 
filter { 
} 

output { 
    elasticsearch { 
     hosts => ["127.0.0.1:9200"] 
     index => "logstash-%{+YYYY.MM.dd}" 
    } 
    stdout { codec => rubydebug } 
}
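
For what it's worth, a typical way to start Logstash with such a config and then re-check the two endpoints from the question would be the following (jdbc-rda.conf is just a placeholder file name; on Windows the launcher is bin\logstash.bat):

bin/logstash -f jdbc-rda.conf 
curl http://localhost:9200/_cat/indices?v 
curl http://localhost:9200/_all/_mapping 

If the index still does not show up, the Elasticsearch log should contain the rejection reason, e.g. an invalid (uppercase) index name as pointed out in the comment above.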