Importing JSON into Elasticsearch 5.1 using curl

I am trying to import a large JSON document into Elasticsearch 5.1. A small sample of the data looks like this:
[
{
"id": 1,
"region": "ca-central-1",
"eventName": "CreateRole",
"eventTime": "2016-02-04T03:41:19.000Z",
"userName": "[email protected]"
},
{
"id": 2,
"region": "ca-central-1",
"eventName": "AddRoleToInstanceProfile",
"eventTime": "2016-02-04T03:41:19.000Z",
"userName": "[email protected]"
},
{
"id": 3,
"region": "ca-central-1",
"eventName": "CreateInstanceProfile",
"eventTime": "2016-02-04T03:41:19.000Z",
"userName": "[email protected]"
},
{
"id": 4,
"region": "ca-central-1",
"eventName": "AttachGroupPolicy",
"eventTime": "2016-02-04T01:42:36.000Z",
"userName": "[email protected]"
},
{
"id": 5,
"region": "ca-central-1",
"eventName": "AttachGroupPolicy",
"eventTime": "2016-02-04T01:39:20.000Z",
"userName": "[email protected]"
}
]
If possible, I would like to import the data without making any changes to the source file. I think that rules out the _bulk API, since it requires an extra action line to be added for each entry.
I have tried a few different approaches without any luck. Am I wasting my time trying to import this document as-is?

Here is what I have tried:
curl -XPOST 'demo.ap-southeast-2.es.amazonaws.com/rea/test' --data-binary @Records.json
but it fails with this error:
{"error":{"root_cause":[{"type":"mapper_parsing_exception","reason":"failed to parse"}],"type":"mapper_parsing_exception","reason":"failed to parse","caused_by":{"type":"not_x_content_exception","reason":"Compressor detection can only be called on some xcontent bytes or compressed xcontent bytes"}},"status":400}
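For context, the "extra details" that _bulk needs are a small action line before each document. A minimal sketch of that conversion, assuming the index/type names from the URL above (`rea`/`test`) and an abbreviated copy of the records, so the source file itself stays untouched:

```python
import json

# Abbreviated sample of the records from the question; in practice this
# would be json.load(open("Records.json")).
records = [
    {"id": 1, "region": "ca-central-1", "eventName": "CreateRole"},
    {"id": 2, "region": "ca-central-1", "eventName": "AttachGroupPolicy"},
]

lines = []
for doc in records:
    # Action line: tells _bulk which index, type, and _id to use for the
    # document on the following line.
    lines.append(json.dumps({"index": {"_index": "rea", "_type": "test", "_id": doc["id"]}}))
    lines.append(json.dumps(doc))

# _bulk expects newline-delimited JSON with a trailing newline.
payload = "\n".join(lines) + "\n"
print(payload)
```

The resulting payload could then be POSTed to the cluster's `/_bulk` endpoint with `curl --data-binary`, leaving Records.json itself unmodified.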
Thanks!