
Can't find Elasticsearch Docker container logs

I run the public Elasticsearch container with the following parameters:

docker run -d -v /elasticsearch/data:/usr/share/elasticsearch/data -p 9200:9200 -p 9300:9300 --name my_elastic_search elasticsearch:2.4.1 -Des.cluster.name="elastic_search_name" 

I'm interested in getting the logs, but I've had no luck finding them. Where should they be? I looked in /var/log/elasticsearch and /usr/share/elasticsearch/logs, and both directories are empty.
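(For what it's worth, the image's console output can normally still be read with docker logs, even when no log files are written inside the container; using the container name from the command above:)

docker logs -f my_elastic_search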


That's strange. I checked its Dockerfile, and the logs should go to /usr/share/elasticsearch/logs. Can you check path.logs in elasticsearch.yml? – Tuan


Just to make sure – are you looking for the logs from inside the container? – ronkot


If path.logs isn't specified, is logging disabled? All I have in my config is network.host: 0.0.0.0 – jamesatha

Answer


The default logging configuration in the Docker images tagged 2.4.x does not enable file logging. One workaround is to map the elasticsearch config folder to a volume of your own that contains a logging.yml file (plus the elasticsearch.yml file and the scripts folder!):

docker run -d -v /elasticsearch/config:/usr/share/elasticsearch/config -v /elasticsearch/data:/usr/share/elasticsearch/data -p 9200:9200 -p 9300:9300 --name my_elastic_search elasticsearch:2.4.1 -Des.cluster.name="elastic_search_name"
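If the mapped host folder starts out empty, one way to seed it with the image's defaults is to copy them out of a throwaway container first. A minimal sketch, assuming the host path /elasticsearch/config used above (the container name es_config_tmp is just an example):

# create a temporary container only to copy the default config out of the image
docker create --name es_config_tmp elasticsearch:2.4.1
docker cp es_config_tmp:/usr/share/elasticsearch/config/. /elasticsearch/config/
docker rm es_config_tmp
# then edit /elasticsearch/config/logging.yml to add the file appenders shown below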

Your logging configuration must contain the desired file appenders, for example as in the default shown here:

# you can override this using by setting a system property, for example -Des.logger.level=DEBUG 
es.logger.level: INFO 
rootLogger: ${es.logger.level}, console, file 
logger: 
    # log action execution errors for easier debugging 
    action: DEBUG 

    # deprecation logging, turn to DEBUG to see them 
    deprecation: INFO, deprecation_log_file 

    # reduce the logging for aws, too much is logged under the default INFO 
    com.amazonaws: WARN 
    # aws will try to do some sketchy JMX stuff, but its not needed. 
    com.amazonaws.jmx.SdkMBeanRegistrySupport: ERROR 
    com.amazonaws.metrics.AwsSdkMetrics: ERROR 

    org.apache.http: INFO 

    # gateway 
    #gateway: DEBUG 
    #index.gateway: DEBUG 

    # peer shard recovery 
    #indices.recovery: DEBUG 

    # discovery 
    #discovery: TRACE 

    index.search.slowlog: TRACE, index_search_slow_log_file 
    index.indexing.slowlog: TRACE, index_indexing_slow_log_file 

additivity: 
    index.search.slowlog: false 
    index.indexing.slowlog: false 
    deprecation: false 

appender:
    console:
        type: console
        layout:
            type: consolePattern
            conversionPattern: "[%d{ISO8601}][%-5p][%-25c] %m%n"

    file:
        type: dailyRollingFile
        file: ${path.logs}/${cluster.name}.log
        datePattern: "'.'yyyy-MM-dd"
        layout:
            type: pattern
            conversionPattern: "[%d{ISO8601}][%-5p][%-25c] %.10000m%n"

    # Use the following log4j-extras RollingFileAppender to enable gzip compression of log files.
    # For more information see https://logging.apache.org/log4j/extras/apidocs/org/apache/log4j/rolling/RollingFileAppender.html
    #file:
        #type: extrasRollingFile
        #file: ${path.logs}/${cluster.name}.log
        #rollingPolicy: timeBased
        #rollingPolicy.FileNamePattern: ${path.logs}/${cluster.name}.log.%d{yyyy-MM-dd}.gz
        #layout:
            #type: pattern
            #conversionPattern: "[%d{ISO8601}][%-5p][%-25c] %m%n"

    deprecation_log_file:
        type: dailyRollingFile
        file: ${path.logs}/${cluster.name}_deprecation.log
        datePattern: "'.'yyyy-MM-dd"
        layout:
            type: pattern
            conversionPattern: "[%d{ISO8601}][%-5p][%-25c] %m%n"

    index_search_slow_log_file:
        type: dailyRollingFile
        file: ${path.logs}/${cluster.name}_index_search_slowlog.log
        datePattern: "'.'yyyy-MM-dd"
        layout:
            type: pattern
            conversionPattern: "[%d{ISO8601}][%-5p][%-25c] %m%n"

    index_indexing_slow_log_file:
        type: dailyRollingFile
        file: ${path.logs}/${cluster.name}_index_indexing_slowlog.log
        datePattern: "'.'yyyy-MM-dd"
        layout:
            type: pattern
            conversionPattern: "[%d{ISO8601}][%-5p][%-25c] %m%n"
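With the config mapped in and the container restarted, the log files should appear inside the container under /usr/share/elasticsearch/logs (or wherever path.logs points). A quick way to check, assuming the container and cluster names from the commands above:

docker exec my_elastic_search ls /usr/share/elasticsearch/logs
# expect files such as elastic_search_name.log once the file appender is active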