2017-02-18 132 views
I keep getting this error. I'm trying to parse a CSV file, and I'm wondering if I'm missing a library or something. The Logstash pipeline aborts.

I run this from the Windows command line with logstash.bat -f logstash.conf and get the output below.

I'm using the rubydebug codec for the output.

21:19:03.781 [main] INFO logstash.setting.writabledirectory - Creating directory {:setting=>"path.queue", :path=>"C:/Users/Public/logstash-5.2.1/data/queue"} 

21:19:03.787 [LogStash::Runner] INFO logstash.agent - No persistent UUID file found. Generating new UUID {:uuid=>"0546332b-dc4d-4916-b5c6-7900d1fdd8a4", :path=>"C:/Users/Public/logstash-5.2.1/data/uuid"} 

21:19:04.138 [[main]-pipeline-manager] ERROR logstash.agent - Pipeline aborted due to error {:exception=>#<LogStash::ConfigurationError: translation missing: en.logstash.agent.configuration.invalid_plugin_register>, :backtrace=>["C:/Users/Public/logstash-5.2.1/vendor/bundle/jruby/1.9/gems/logstash-filter-mutate-3.1.3/lib/logstash/filters/mutate.rb:178:in `register'", "org/jruby/RubyHash.java:1342:in `each'", "C:/Users/Public/logstash-5.2.1/vendor/bundle/jruby/1.9/gems/logstash-filter-mutate-3.1.3/lib/logstash/filters/mutate.rb:172:in `register'", "C:/Users/Public/logstash-5.2.1/logstash-core/lib/logstash/pipeline.rb:235:in `start_workers'", "org/jruby/RubyArray.java:1613:in `each'", "C:/Users/Public/logstash-5.2.1/logstash-core/lib/logstash/pipeline.rb:235:in `start_workers'", "C:/Users/Public/logstash-5.2.1/logstash-core/lib/logstash/pipeline.rb:188:in `run'", "C:/Users/Public/logstash-5.2.1/logstash-core/lib/logstash/agent.rb:302:in `start_pipeline'"]} 

A single line I'm trying to parse:

80,17-02-2017 18:28:31,56.000,45.000,0.000,2.000,0.000,44.000,55.000,57.000,50.000 

A few lines from the log:

80,17-02-2017 18:28:31,56.000,45.000,0.000,2.000,0.000,44.000,55.000,57.000,50.000 

80,17-02-2017 18:28:32,53.000,45.000,0.000,3.000,0.000,54.000,43.000,54.000,43.000 

80,17-02-2017 18:28:33,56.000,45.000,0.000,2.000,0.000,45.000,51.000,43.000,50.000 

80,17-02-2017 18:28:34,53.000,45.000,0.000,1.000,0.000,42.000,47.000,48.000,48.000 

80,17-02-2017 18:28:35,59.000,45.000,0.000,2.000,0.000,45.000,59.000,39.000,48.000 

80,17-02-2017 18:28:36,56.000,45.000,0.000,3.000,0.000,44.000,49.000,50.000,50.000 

MY FILTER

filter { 
    csv { 
     columns => ["port", "timestamp", "tempcpuavg", "gputemp", "fanspeed", "gpuusage", "framerate", "tempcpu1", "tempcpu2", "tempcpu3", "tempcpu4"] 
        #80, 17-02-2017 18:28:31,56.000, 45.000,  0.000,  2.000,  0.000,  44.000,  55.000, 57.000,  50.000 
     separator => "," 
     skip_empty_columns => "true" 
     remove_field => ["message"] 
    } 
    mutate { 
     convert => ["port", "integer"] 
     convert => ["tempcpuavg", "double"] 
     convert => ["gputemp", "double"] 
     convert => ["fanspeed", "double"] 
     convert => ["gpuusage", "double"] 
     convert => ["framerate", "double"] 
     convert => ["tempcpu1", "double"] 
     convert => ["tempcpu2", "double"] 
     convert => ["tempcpu3", "double"] 
     convert => ["tempcpu4", "double"] 
    } 

    date { 
     match => ["@timestamp", "MM-dd-YYYY HH:mm:ss"] 
    } 
} 
The error occurs while your filter is parsing the log file. Could you paste a few lines of the log and the filter you've configured? – NutcaseDeveloper

Hey, I added one line and then a few more. Also, for clarity, I added extra blank lines between the log lines. – ScipioAfricanus

I've also added the filter. Thanks. – ScipioAfricanus

Answer

Your mutate/convert filter uses an unsupported data type, namely double. The documentation of the mutate filter states:

Valid conversion targets are: integer, float, string, and boolean.

So you simply need to change every double to float:

mutate { 
    convert => ["port", "integer"] 
    convert => ["tempcpuavg", "float"] 
    convert => ["gputemp", "float"] 
    convert => ["fanspeed", "float"] 
    convert => ["gpuusage", "float"] 
    convert => ["framerate", "float"] 
    convert => ["tempcpu1", "float"] 
    convert => ["tempcpu2", "float"] 
    convert => ["tempcpu3", "float"] 
    convert => ["tempcpu4", "float"] 
} 
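As a sanity check, here is what the csv filter plus the corrected mutate/convert block should produce for one of the sample lines above, sketched in Python. The field names come from the columns setting; this is only an illustration of the intended result, not how Logstash runs internally:

```python
# Sketch of what csv (column splitting) + mutate/convert (type casting)
# should yield for one sample log line. Not actual Logstash code.
columns = ["port", "timestamp", "tempcpuavg", "gputemp", "fanspeed",
           "gpuusage", "framerate", "tempcpu1", "tempcpu2", "tempcpu3",
           "tempcpu4"]

line = ("80,17-02-2017 18:28:31,56.000,45.000,0.000,2.000,"
        "0.000,44.000,55.000,57.000,50.000")

# csv filter: split on the separator and pair values with column names
event = dict(zip(columns, line.split(",")))

# mutate: convert => ["port", "integer"]
event["port"] = int(event["port"])

# mutate: convert => [<field>, "float"] for each numeric column
for field in columns[2:]:
    event[field] = float(event[field])

print(event["port"], event["timestamp"], event["tempcpuavg"])
# → 80 17-02-2017 18:28:31 56.0
```

Note that double fails at plugin registration time, which is why the pipeline aborts before processing any events.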
Hey, still getting the same error. I haven't installed Logstash on my Windows PC; I just run it in the console with logstash -f logstash.yml. I wonder if I need to install anything.. – ScipioAfricanus

Are you getting exactly the same error? – Val

Yes, it looks like exactly the same error. – ScipioAfricanus