2017-02-24

I followed the documentation for the Druid Kafka indexing service setup and edited:

druid-0.9.2/conf/druid/_common/common.runtime.properties 

and added:

"druid-kafka-indexing-service" 

to druid.extensions.loadList, then restarted all Druid services: middlemanager, overlord, coordinator, broker, historical.
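For reference, the loadList change has this shape (a minimal sketch; any extensions already in your loadList would stay alongside the new entry):

```
# druid-0.9.2/conf/druid/_common/common.runtime.properties
druid.extensions.loadList=["druid-kafka-indexing-service"]
```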

I ran:

curl -X 'POST' -H 'Content-Type:application/json' -d @kafka_connect/script.json druid_server:8090/druid/indexer/v1/task

but got:

{"error":"Could not resolve type id 'kafka' into a subtype of [simple type, class io.druid.indexing.common.task.Task]\n at [Source: [email protected]; line: 1, column: 4]"}

My input JSON is:

```json
{
  "type": "kafka",
  "dataSchema": {
    "dataSource": "sensors-kafka",
    "parser": {
      "type": "string",
      "parseSpec": {
        "format": "json",
        "timestampSpec": { "column": "timestamp", "format": "auto" },
        "dimensionsSpec": {
          "dimensions": ["machine", "key"],
          "dimensionExclusions": ["timestamp", "value"]
        }
      }
    },
    "metricsSpec": [
      { "name": "count", "type": "count" },
      { "name": "value_sum", "fieldName": "value", "type": "doubleSum" },
      { "name": "value_min", "fieldName": "value", "type": "doubleMin" },
      { "name": "value_max", "fieldName": "value", "type": "doubleMax" }
    ],
    "granularitySpec": {
      "type": "uniform",
      "segmentGranularity": "HOUR",
      "queryGranularity": "NONE"
    }
  },
  "tuningConfig": { "type": "kafka", "maxRowsPerSegment": 5000000 },
  "ioConfig": {
    "topic": "sensor",
    "consumerProperties": { "bootstrap.servers": "kafka_server:2181" },
    "taskCount": 1,
    "replicas": 1,
    "taskDuration": "PT1H"
  }
}
```

Any idea what I'm doing wrong? According to this guide, the 'type' should be 'kafka': http://druid.io/docs/0.9.2-rc3/development/extensions-core/kafka-ingestion.html

Is there a way to check that the extension is loaded correctly, or do I have to specify it in each component's runtime.properties?
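One way to verify loading (a sketch; the log location and exact message vary with how you installed and launched the services) is to look for the extension in the overlord's startup log:

```
# Hypothetical log path -- adjust to where your overlord writes its log
grep -i "druid-kafka-indexing-service" var/sv/overlord.log
```

If nothing matches, the extension was not picked up from druid.extensions.loadList when the overlord started.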

Thanks in advance.

Answer


The supervisor JSON spec has to be sent to the overlord at this endpoint: /druid/indexer/v1/supervisor

curl -X POST -H 'Content-Type: application/json' -d @kafka_connect/script.json http://druid_server:8090/druid/indexer/v1/supervisor 
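After submitting, you can check that the overlord accepted the spec (a sketch assuming the same 'druid_server:8090' overlord address as above; the supervisor id is the dataSource name, here 'sensors-kafka'):

```
# List the ids of running supervisors
curl http://druid_server:8090/druid/indexer/v1/supervisor

# Inspect the status of one supervisor
curl http://druid_server:8090/druid/indexer/v1/supervisor/sensors-kafka/status
```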

I still get the same error – KillerSnail


If the error instead says something like 'kafka' could not be resolved by the overlord, I would double-check that the extensions are loading correctly; they are printed at overlord startup – Pierre


No problem. I was using the quickstart guide, but the extensions documentation refers to 'conf' rather than 'conf-quickstart', which is what I was using to start the services... Now I get a '500 error', but that is because the data source is already coded to send 'timestamp' as an epoch rather than 'YYYY-MM-DD' etc. – KillerSnail
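For the epoch-timestamp case above, the timestampSpec in the supervisor spec can name the format explicitly instead of 'auto' (a sketch; 'millis' assumes the data sends epoch milliseconds — Druid also accepts 'posix' for epoch seconds):

```json
"timestampSpec": {
  "column": "timestamp",
  "format": "millis"
}
```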