
Incorrectly encoded log messages using log4j and the ELK stack

I am currently running the ELK stack in Docker (https://github.com/deviantony/docker-elk), and I have a standalone Java application from which I am trying to send logs to logstash using the log4j SocketAppender. When I view my logs in Kibana, the messages appear to be incorrectly encoded. I am very new to the ELK stack and have tried many of the solutions I found here, but nothing seems to work. Thanks in advance for your help.

logstash.conf:

input {
    log4j {
        mode => "server"
        host => "0.0.0.0"
        port => 5000
        type => "log4j"
    }
}

## Add your filters/logstash plugins configuration here 

filter {
    # All lines that do not start with %{TIMESTAMP} or ' ' + %{TIMESTAMP} belong to the previous event
    multiline {
        pattern => "(([\s]+)20[0-9]{2}-)|20[0-9]{2}-"
        negate => true
        what => "previous"
    }
}

output {
    elasticsearch {
        hosts => "elasticsearch:9200"
    }
}

log4j.properties:

log4j.rootLogger=info,tcp 

log4j.appender.tcp=org.apache.log4j.net.SocketAppender 
log4j.appender.tcp.Port=5000 
log4j.appender.tcp.RemoteHost=localhost 
log4j.appender.tcp.ReconnectionDelay=10000 
log4j.appender.tcp.Application=hello-world 
log4j.appender.tcp.encoding=UTF-8
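
For context, a minimal sketch of how the standalone application might emit logs under this configuration (the class name and messages are illustrative, not taken from the original app). Note that SocketAppender ships serialized LoggingEvent objects to the server rather than formatted text, so layout-related settings such as encoding generally do not apply to it:

import org.apache.log4j.Logger;

public class HelloWorld {

    // Inherits the rootLogger configuration, so every event at INFO or above
    // is serialized by the tcp SocketAppender and sent to logstash on port 5000
    private static final Logger LOG = Logger.getLogger(HelloWorld.class);

    public static void main(String[] args) {
        LOG.info("application started");
        LOG.warn("this message should appear in Kibana");
    }
}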

Kibana log view: [screenshot showing the incorrectly encoded messages]

Answer


It turned out this was related to running in a Windows environment. Running from a Linux environment solved the encoding problem. I don't know whether there is a way to fix the encoding issue on Windows...

The correct logstash configuration, with multiline support on the TCP input, that worked for me:

input {
    log4j {
        mode => "server"
        host => "0.0.0.0"
        port => 5000
        type => "log4j"
        codec => multiline {
            pattern => "^\s"
            what => "previous"
        }
    }
}

## Add your filters/logstash plugins configuration here 

output {
    elasticsearch {
        hosts => "elasticsearch:9200"
    }
}
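
The "^\s" pattern treats any line that begins with whitespace as a continuation of the previous event, which is what groups a stack trace into a single event instead of one event per line. A minimal sketch of the kind of multi-line log this handles (the class name and exception are illustrative):

import org.apache.log4j.Logger;

public class MultilineDemo {

    private static final Logger LOG = Logger.getLogger(MultilineDemo.class);

    public static void main(String[] args) {
        try {
            throw new IllegalStateException("demo failure");
        } catch (IllegalStateException e) {
            // The stack trace lines ("\tat ...") begin with whitespace, so
            // the "^\s" multiline pattern folds them into this single event
            LOG.error("request failed", e);
        }
    }
}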