2016-09-17

I ran into a problem when trying to submit a Spark application to YARN from Eclipse. I tried to submit a simple SVM program, but it gave me the following error. I am on a MacBook, and I would appreciate a detailed explanation of how to submit a Spark application to YARN from the Eclipse IDE.

16/09/17 10:04:19 ERROR SparkContext: Error initializing SparkContext. 
java.lang.IllegalStateException: Library directory '.../MyProject/assembly/target/scala-2.11/jars' does not exist; make sure Spark is built. 
    at org.apache.spark.launcher.CommandBuilderUtils.checkState(CommandBuilderUtils.java:248) 
    at org.apache.spark.launcher.CommandBuilderUtils.findJarsDir(CommandBuilderUtils.java:368) 
    at org.apache.spark.launcher.YarnCommandBuilderUtils$.findJarsDir(YarnCommandBuilderUtils.scala:38) 
    at org.apache.spark.deploy.yarn.Client.prepareLocalResources(Client.scala:500) 
    at org.apache.spark.deploy.yarn.Client.createContainerLaunchContext(Client.scala:834) 
    at org.apache.spark.deploy.yarn.Client.submitApplication(Client.scala:167) 
    at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.start(YarnClientSchedulerBackend.scala:56) 
    at org.apache.spark.scheduler.TaskSchedulerImpl.start(TaskSchedulerImpl.scala:149) 
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:500) 
    at SVM.main(SVM.java:21) 
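Judging from the stack trace, the launcher fails because it cannot find a `jars` directory under the Spark home it resolved (here it fell back to the project directory). The sketch below is a minimal, hypothetical reproduction of that check, not Spark's actual `CommandBuilderUtils` code: it shows why the `IllegalStateException` fires when `SPARK_HOME` is unset or points somewhere without a built `jars` directory.

```java
import java.io.File;
import java.io.IOException;
import java.nio.file.Files;

// Hypothetical sketch of the check the YARN client performs (inferred from the
// stack trace): it requires a jars directory under the resolved Spark home and
// fails with IllegalStateException when that directory is missing.
public class JarsDirCheck {
    // Returns true when the layout the YARN client expects is present.
    static boolean hasJarsDir(String sparkHome) {
        return sparkHome != null && new File(sparkHome, "jars").isDirectory();
    }

    public static void main(String[] args) throws IOException {
        // Using a throwaway directory to illustrate; in Eclipse the real fix is
        // to set SPARK_HOME in Run Configurations -> Environment, because a
        // launched JVM does not inherit variables exported only in your shell.
        File fakeHome = Files.createTempDirectory("spark-home").toFile();
        System.out.println(hasJarsDir(fakeHome.getPath())); // no jars/ yet -> false
        new File(fakeHome, "jars").mkdir();
        System.out.println(hasJarsDir(fakeHome.getPath())); // layout satisfied -> true
    }
}
```
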

Go to Run Configurations -> Environment in Eclipse and add the environment variable **SPARK_HOME** there. –


Thank you so much, Rakesh, for your perfect answer. It worked :) But now I see this error – marjan


16/09/17 14:44:56 WARN DFSClient: DataStreamer Exception org.apache.hadoop.ipc.RemoteException(java.io.IOException): File /user/marjanasgari/.sparkStaging/application_1474085529591_0014/__spark_libs__3176835706727949960.zip could only be replicated to 0 nodes instead of minReplication (=1). There are 0 datanode(s) running and no node(s) are excluded in this operation. – marjan
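This follow-up warning is an HDFS problem rather than a Spark one: "There are 0 datanode(s) running" means the NameNode sees no live DataNodes, so the staging zip cannot be replicated anywhere. A diagnosis sketch, assuming a standard Hadoop installation:

```shell
# Diagnosis sketch for "0 datanodes running" (assumes a standard Hadoop install).
# The actual commands are left as comments because they need a live cluster:
#
#   jps                      # should list a DataNode process on each worker
#   hdfs dfsadmin -report    # reports live vs dead DataNodes
#
# A common cause after reformatting the NameNode is DataNodes refusing to start
# because of a stale cluster/storage ID in their data directories; check the
# DataNode logs if 'jps' shows no DataNode.
echo "DataNode check: run 'jps' and 'hdfs dfsadmin -report' on the cluster"
```
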

Answer


Go to Run Configurations -> Environment in Eclipse and add the environment variable SPARK_HOME.
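To verify that the value you put into the run configuration is usable, the check below sketches what the YARN client effectively requires: a `jars` directory under `SPARK_HOME`. It uses a throwaway directory and a placeholder jar name purely for illustration; point `SPARK_HOME` at your real Spark installation.

```shell
# Sketch of the layout the YARN client expects under SPARK_HOME.
# A throwaway directory and placeholder jar name stand in for a real install.
SPARK_HOME="$(mktemp -d)"
mkdir -p "$SPARK_HOME/jars"
touch "$SPARK_HOME/jars/spark-core_2.11-2.0.0.jar"   # placeholder jar name
if [ -d "$SPARK_HOME/jars" ]; then
  echo "SPARK_HOME looks valid: $SPARK_HOME"
fi
```

If `ls "$SPARK_HOME/jars"` is empty or the directory is missing, the launcher will raise the same "Library directory ... does not exist" error, regardless of what Eclipse passes in.
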
