2016-01-29

Error when executing a Scala build with Spark 1.5.2 and Scala 2.11.7

I have a simple Scala object file with the following contents:

import org.apache.spark.SparkContext 
import org.apache.spark.SparkContext._ 
import org.apache.spark.SparkConf 


object X { 
     def main(args: Array[String]) { 

     val params = Map[String, String](
      "abc" -> "22") 
     println("Creating Spark Configuration"); 
     val conf = new SparkConf().setAppName("X") 
     val sc = new SparkContext(conf) 
     val txtFileLines = sc.textFile("/tmp/x.txt", 2).cache() 
     val count = txtFileLines.count() 
     println("Count: " + count) 
    } 
} 
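For context, the `->` used in the Map literal is syntactic sugar supplied by scala.Predef.ArrowAssoc, which is exactly the symbol named in the NoSuchMethodError in the stack trace below. A tiny standalone sketch (no Spark involved) of what the compiler rewrites it to:

```scala
object ArrowDemo {
  def main(args: Array[String]): Unit = {
    // "abc" -> "22" is rewritten by the compiler to
    // scala.Predef.ArrowAssoc("abc").->("22"), producing a Tuple2.
    val pair: (String, String) = "abc" -> "22"

    // Map(...) then consumes the resulting tuples.
    val params = Map[String, String](pair)
    println(params("abc")) // prints 22
  }
}
```

Because the call to ArrowAssoc is resolved at compile time against one Scala standard library and looked up at run time against another, mixing 2.10 and 2.11 binaries fails at exactly this call site.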

My build.sbt looks like this:

name := "x" 

version := "1.0" 

scalaVersion := "2.11.7" 

libraryDependencies += "org.apache.spark" %% "spark-core" % "1.5.2" % "provided" 
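One way to resolve this (a sketch, assuming you are running the stock Spark 1.5.2 download, which is compiled against Scala 2.10, rather than a Spark you built yourself for 2.11) is to align the project's Scala version with the Scala version of the Spark binaries:

```scala
// build.sbt — minimal sketch assuming the prebuilt Spark 1.5.2 binaries,
// which are compiled against Scala 2.10. The 2.10.6 patch version is an
// assumption; any 2.10.x should be binary compatible.
name := "x"

version := "1.0"

// Match the Scala major version used to build your Spark distribution.
scalaVersion := "2.10.6"

// %% appends the Scala binary suffix, so this resolves to spark-core_2.10.
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.5.2" % "provided"
```

With this change, sbt package emits the jar under target/scala-2.10/ instead, so adjust the spark-submit path accordingly. The alternative is to keep scalaVersion at 2.11.7 and run against a Spark distribution built for Scala 2.11.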

I then run sbt package, which creates x.jar in target/scala-2.11/.

When I execute the above with spark-submit --class X --master local[2] x.jar, I get the following error:

Creating Spark Configuration 
Exception in thread "main" java.lang.NoSuchMethodError: scala.Predef$.ArrowAssoc(Ljava/lang/Object;)Ljava/lang/Object; 
    at Sweeper$.main(Sweeper.scala:14) 
    at Sweeper.main(Sweeper.scala) 
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) 
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) 
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) 
    at java.lang.reflect.Method.invoke(Method.java:498) 
    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:674) 
    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180) 
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205) 
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:120) 
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala) 

Is your Spark built against Scala 2.10 or 2.11? – Reactormonk


It's Spark 1.5.2. – Neel


The problem is which Scala version was used to build it. Scala binaries are not compatible between major versions (i.e. 2.10 and 2.11), and Spark's default Scala version is 2.10. – zero323
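Before changing any versions, it helps to confirm which Scala version the installed Spark distribution was actually built with. One quick check (the exact banner text varies by Spark version):

```shell
# Prints a version banner for the installed Spark distribution,
# including the Scala version the binaries were compiled against
# (a line such as "Using Scala version 2.10.x").
spark-submit --version
```

If this reports Scala 2.10 while the project is built with 2.11.7, that mismatch is the source of the NoSuchMethodError.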

Answer