
Running a Spark task with SparkLauncher

From my local Scala app I want to launch a Spark task on my cluster. The task's main class is my.spark.SparkRunner and it is packaged in a jar that lives on HDFS. This is how I configure it from my local program:

val spark = new SparkLauncher() 
    //.setSparkHome("C:/spark-1.6.0-bin-hadoop2.4") 
    .setVerbose(true) 
    .setAppResource("hdfs://192.168.10.183:8020/spark/myjar.jar") 
    .setMainClass("my.spark.SparkRunner") 
    .setMaster("spark://192.168.10.183:7077") 
    //.setMaster("192.168.10.183:7077") 
    .launch(); 

spark.waitFor(); 

It doesn't throw an error, but it returns immediately and the task is never started. What am I doing wrong? Thanks...
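(Side note for diagnosing this kind of thing: launch() hands back a plain java.lang.Process for the spawned spark-submit, so one way to see why it comes back right away is to drain and print that process's error stream before calling waitFor(). The snippet below is only a sketch along those lines, reusing the launcher settings from the question.)

import scala.io.Source
import org.apache.spark.launcher.SparkLauncher

// Sketch: capture what spark-submit prints, to see why it exits right away.
val process = new SparkLauncher()
    .setVerbose(true)
    .setAppResource("hdfs://192.168.10.183:8020/spark/myjar.jar")
    .setMainClass("my.spark.SparkRunner")
    .setMaster("spark://192.168.10.183:7077")
    .launch()

// spark-submit logs to stderr; print it so failures are visible locally.
Source.fromInputStream(process.getErrorStream).getLines().foreach(println)

val exitCode = process.waitFor()
println(s"spark-submit exited with code $exitCode")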

Answer


I just added a check of the launcher's state, and this is the loop that polls it...

import org.apache.spark.launcher.{SparkAppHandle, SparkLauncher}

val spark = new SparkLauncher() 
    //.setSparkHome("C:/spark-1.6.0-bin-hadoop2.4") 
    .setVerbose(true) 
    .setAppResource("hdfs://192.168.10.183:8020/spark/myjar.jar") 
    .setMainClass("my.spark.SparkRunner") 
    .setMaster("spark://192.168.10.183:7077") 
    //.setMaster("192.168.10.183:7077") 
    .startApplication()  // returns a SparkAppHandle instead of a raw Process

// Poll the handle until the application reports FINISHED
while (spark.getState != SparkAppHandle.State.FINISHED) {
    println(spark.getState)
    Thread.sleep(1000)
}
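
An alternative to polling (just a sketch built on the same SparkAppHandle API): startApplication() accepts SparkAppHandle.Listener instances, so state changes can be pushed to you instead of polled.

import org.apache.spark.launcher.{SparkAppHandle, SparkLauncher}
import java.util.concurrent.CountDownLatch

val done = new CountDownLatch(1)

val handle = new SparkLauncher()
    .setVerbose(true)
    .setAppResource("hdfs://192.168.10.183:8020/spark/myjar.jar")
    .setMainClass("my.spark.SparkRunner")
    .setMaster("spark://192.168.10.183:7077")
    .startApplication(new SparkAppHandle.Listener {
        // Called every time the application's state changes.
        override def stateChanged(h: SparkAppHandle): Unit = {
            println(s"State: ${h.getState}")
            if (h.getState.isFinal) done.countDown()
        }
        // Called when application info (e.g. the app id) becomes available.
        override def infoChanged(h: SparkAppHandle): Unit =
            println(s"App id: ${h.getAppId}")
    })

done.await()  // block until the application reaches a final state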