I'm trying to run a simple CSV file through Logstash into Elasticsearch. When I run it, I get the following error converting the string in the first column to a date:
"error"=>{
  "type"=>"mapper_parsing_exception",
  "reason"=>"failed to parse [Date]",
  "caused_by"=>{
    "type"=>"illegal_argument_exception",
    "reason"=>"Invalid format: \"Date\""}}
When I remove the Date column, everything works fine.
I'm using the following CSV file:
Date,Open,High,Low,Close,Volume,Adj Close
2015-04-02,125.03,125.56,124.19,125.32,32120700,125.32
2015-04-01,124.82,125.12,123.10,124.25,40359200,124.25
2015-03-31,126.09,126.49,124.36,124.43,41852400,124.43
2015-03-30,124.05,126.40,124.00,126.37,46906700,126.37
and the following logstash.conf:
input {
  file {
    path => "path/file.csv"
    type => "core2"
    start_position => "beginning"
  }
}
filter {
  csv {
    separator => ","
    columns => ["Date","Open","High","Low","Close","Volume","Adj Close"]
  }
  mutate {convert => ["High", "float"]}
  mutate {convert => ["Open", "float"]}
  mutate {convert => ["Low", "float"]}
  mutate {convert => ["Close", "float"]}
  mutate {convert => ["Volume", "float"]}
  date {
    match => ["Date", "yyyy-MM-dd"]
    target => "Date"
  }
}
output {
  elasticsearch {
    action => "index"
    hosts => "localhost"
    index => "stock15"
    workers => 1
  }
  stdout {}
}
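As a sanity check on the pattern itself, here is a small Python sketch (assuming `yyyy-MM-dd` in the Logstash date filter corresponds to `%Y-%m-%d` in `strptime`): the date values in the data rows parse fine, while the literal string `"Date"`, which is the value shown in the error message above, does not.

```python
from datetime import datetime

# The data rows' Date values match the yyyy-MM-dd pattern.
parsed = datetime.strptime("2015-04-02", "%Y-%m-%d")
print(parsed.date())  # 2015-04-02

# The error message complains about the literal value "Date";
# that string cannot be parsed with the same pattern.
try:
    datetime.strptime("Date", "%Y-%m-%d")
except ValueError as e:
    print("parse failed:", e)
```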
It looks to me like I'm handling the date correctly. Any idea what could be going wrong?
Thanks!