
I have some JSON files that come from an ElasticSearch database, and I'm trying to import them with ElasticDump, but I'm unable to import the _timestamp from the mapping.

Here is the mapping file, "mylog.mapping.json":

[ 
"{\"mylog\":{\"mappings\":{\"search_log\":{\"_timestamp\":{\"enabled\":true,\"store\":true},\"properties\":{\"preArray\":{\"type\":\"long\"},\"preId\":{\"type\":\"string\"},\"filteredSearch\":{\"type\":\"string\"},\"hits\":{\"type\":\"long\"},\"search\":{\"type\":\"string\"},\"searchType\":{\"properties\":{\"name\":{\"type\":\"string\"}}}}}}}}" 
] 
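For readability, the escaped string in that file expands to the following mapping (note the _timestamp block with "enabled" and "store"):

{
  "mylog": {
    "mappings": {
      "search_log": {
        "_timestamp": {
          "enabled": true,
          "store": true
        },
        "properties": {
          "preArray": { "type": "long" },
          "preId": { "type": "string" },
          "filteredSearch": { "type": "string" },
          "hits": { "type": "long" },
          "search": { "type": "string" },
          "searchType": {
            "properties": {
              "name": { "type": "string" }
            }
          }
        }
      }
    }
  }
}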

And here is the file containing the data itself, "mylog.json":

{"_index":"mylog","_type":"search_log","_id":"AU5AcRy7dbXLQfUndnNS","_score":1,"_source":{"searchType":{"name":"TypeSearchOne"},"search":"test","filteredSearch":"test","hits":1470,"preId":"","preArray":[47752,51493,52206,50159,52182,53243,43237,51329,42772,44938,44945,44952,42773,58319,43238,48963,52856,52185,47751,61542,51327,42028,51341,45356,44853,44939,48587,42774,43063,98779,46235,53533,47745,48844,44979,53209,47738,98781,47757,44948,44950,48832,97529,52186,96033,53002,48419,44943,44955,52179]},"fields":{"_timestamp":1435600231611}} 
{"_index":"mylog","_type":"search_log","_id":"AU5AcSdcdbXLQfUndnNd","_score":1,"_source":{"searchType":{"name":"TypeSearchTwo"},"search":"squared","filteredSearch":"squared","hits":34,"preId":null,"preArray":null},"fields":{"_timestamp":1435600234333}} 
{"_index":"mylog","_type":"search_log","_id":"AU5AcSiZdbXLQfUndnNj","_score":1,"_source":{"searchType":{"name":"TypeSearchOne"},"search":"test","filteredSearch":"test","hits":1354,"preId":"","preArray":[55808,53545,53543,53651,55937,53544,54943,54942,54941]},"fields":{"_timestamp":1435600234649}} 

... 

{"_index":"mylog","_type":"search_log","_id":"AU5DSVzLdbXLQfUndnPp","_score":1,"_source":{"searchType":{"name":"TypeSearchOne"},"search":"lee","filteredSearch":"lee","hits":39,"preId":"53133","preArray":null},"fields":{"_timestamp":1435647958219}} 
{"_index":"mylog","_type":"search_log","_id":"AU5D7M42dbXLQfUndnR9","_score":1,"_source":{"searchType":{"name":"TypeSearchOne"},"search":"leerwww","filteredSearch":"leerwww","hits":39,"preId":"53133","preArray":null},"fields":{"_timestamp":1435658669622}} 

To import this data into my ElasticSearch server, I tried the following ElasticDump commands:

elasticdump --input=/home/user/Desktop/LOGDATA/mylog.mapping.json --output=http://localhost:9200/mylog --type=mapping 
elasticdump --input=/home/user/Desktop/LOGDATA/mylog.json --output=http://localhost:9200/mylog --type=data 

After this, the data is available, but the _timestamp field is nowhere to be seen. If I check the mapping, this is what I get:

$ curl -XGET 'localhost:9200/mylog/_mapping' 

{
  "mylog": {
    "mappings": {
      "search_log": {
        "properties": {
          "preArray": {"type": "long"},
          "preId": {"type": "string"},
          "filteredSearch": {"type": "string"},
          "hits": {"type": "long"},
          "search": {"type": "string"},
          "searchType": {"properties": {"name": {"type": "string"}}}
        }
      }
    }
  }
}

As you can see, the _timestamp field is missing, even though it was specified in the mapping. Why does this happen, and how can I import the data without losing the timestamps?


Which version is this? – pickypg


@pickypg 2.3.3 for ElasticSearch and 2.3.0 for ElasticDump – ArthurTheLearner

Answer


As of 2.0, _timestamp is deprecated and is a special type of field known as a meta-field. It still exists in 5.0 (at least for now), but you should not rely on it, and you should expect it to be removed.

Like other meta-fields, you are not supposed to modify its mapping (e.g., by specifying store: true), nor is it meant to be set as part of the document itself.

What you should do instead is set the field as a request parameter:

PUT my_index/my_type/1?timestamp=1435600231611 
{"searchType":{"name":"TypeSearchOne"},"search":"test","filteredSearch":"test","hits":1470,"preId":"","preArray":[47752,51493,52206,50159,52182,53243,43237,51329,42772,44938,44945,44952,42773,58319,43238,48963,52856,52185,47751,61542,51327,42028,51341,45356,44853,44939,48587,42774,43063,98779,46235,53533,47745,48844,44979,53209,47738,98781,47757,44948,44950,48832,97529,52186,96033,53002,48419,44943,44955,52179]} 

I don't know enough about ElasticDump to say whether it can be instructed to do the "right thing" here, but there is actually a better option:

Modify your JSON input to remove _timestamp and replace it with an ordinary field named timestamp (or whatever name you choose):

"mappings": { 
    "my_type": { 
    "properties": { 
     "timestamp": { 
     "type": "date" 
     }, 
     ... 
    } 
    } 
} 
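Applied to the index in the question, a sketch of the full replacement mapping might look like this (the date type accepts epoch milliseconds by default, so the existing values can be indexed as-is):

curl -XPUT 'localhost:9200/mylog' -d '{
  "mappings": {
    "search_log": {
      "properties": {
        "timestamp": { "type": "date" },
        "preArray": { "type": "long" },
        "preId": { "type": "string" },
        "filteredSearch": { "type": "string" },
        "hits": { "type": "long" },
        "search": { "type": "string" },
        "searchType": {
          "properties": {
            "name": { "type": "string" }
          }
        }
      }
    }
  }
}'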

Note that your ElasticDump input has _timestamp separated out under fields rather than inside _source, so you will have to make sure your find/replace merges them back together properly:

},"fields":{"_timestamp" 

should become:

,"timestamp"