I am using Spark 1.6.0 and the spark-streaming_2.11 artifact with the Kafka API to consume string messages, but I am unable to create a Java input DStream with KafkaUtils (Spark 1.6.0).

Following the documentation, I tried to create a direct stream with KafkaUtils, but I get the compiler error below:

The method createDirectStream(JavaStreamingContext, Class, Class, Class, Class, Map, Set) in the type KafkaUtils is not applicable for the arguments (JavaStreamingContext, Class, Class, Class, Class, Map, Set)

Here is the code snippet I wrote:

SparkConf conf = new SparkConf().setAppName("Test Streaming App").setMaster("local[*]");
JavaSparkContext sc = new JavaSparkContext(conf);
JavaStreamingContext ssc = new JavaStreamingContext(sc, new Duration(2000));

// Kafka broker list and the topic to consume from
Map<String, String> kafkaParams = new HashMap<String, String>();
kafkaParams.put("metadata.broker.list", "localhost:9092");
Set<String> topics = Collections.singleton("test");

// This is the call that fails to compile
JavaPairInputDStream<String, String> dstream = KafkaUtils.createDirectStream(ssc, String.class, String.class, StringDecoder.class, StringDecoder.class, kafkaParams, topics);

ssc.start();
ssc.awaitTermination();

Is this a problem with the artifact and the Spark version I am using? Please shed some light on this.
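
For completeness, here is a self-contained version of what I am trying to run, with the imports I assume the Spark 1.6 direct-stream API needs (kafka.serializer.StringDecoder and the KafkaUtils class from the spark-streaming-kafka_2.11 artifact; these are assumptions, since I am not certain which classes my compiler is actually resolving):

import java.util.Collections;
import java.util.HashMap;
import java.util.Map;
import java.util.Set;

import kafka.serializer.StringDecoder;

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.streaming.Duration;
import org.apache.spark.streaming.api.java.JavaPairInputDStream;
import org.apache.spark.streaming.api.java.JavaStreamingContext;
import org.apache.spark.streaming.kafka.KafkaUtils;

public class TestStreamingApp {
    public static void main(String[] args) throws InterruptedException {
        SparkConf conf = new SparkConf().setAppName("Test Streaming App").setMaster("local[*]");
        JavaSparkContext sc = new JavaSparkContext(conf);
        JavaStreamingContext ssc = new JavaStreamingContext(sc, new Duration(2000));

        // Broker list and topic, as in the snippet above
        Map<String, String> kafkaParams = new HashMap<String, String>();
        kafkaParams.put("metadata.broker.list", "localhost:9092");
        Set<String> topics = Collections.singleton("test");

        // kafka.serializer.StringDecoder is the decoder type this overload expects;
        // KafkaUtils here is org.apache.spark.streaming.kafka.KafkaUtils, which I assume
        // comes from the spark-streaming-kafka_2.11 artifact on the classpath
        JavaPairInputDStream<String, String> dstream = KafkaUtils.createDirectStream(
                ssc, String.class, String.class, StringDecoder.class, StringDecoder.class,
                kafkaParams, topics);

        dstream.print(); // at least one output operation so the context can start

        ssc.start();
        ssc.awaitTermination();
    }
}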

Answers