2016-09-21

I want to create a JSON string from a Scala object, as described here: Scala/Spark: NoClassDefFoundError: net/liftweb/json/Formats

I have the following code:

import scala.collection.mutable._ 
import net.liftweb.json._ 
import net.liftweb.json.Serialization.write 

case class Person(name: String, address: Address) 
case class Address(city: String, state: String) 

object LiftJsonTest extends App { 

    val p = Person("Alvin Alexander", Address("Talkeetna", "AK")) 

    // create a JSON string from the Person, then print it 
    implicit val formats = DefaultFormats 
    val jsonString = write(p) 
    println(jsonString) 

} 

My build.sbt file contains the following:

libraryDependencies += "net.liftweb" %% "lift-json" % "2.5+" 

When I build it with sbt package, it succeeds.

However, when I try to run it as a Spark job, like this:

spark-submit \ 
    --packages com.amazonaws:aws-java-sdk-pom:1.10.34,org.apache.hadoop:hadoop-aws:2.6.0,net.liftweb:lift-json:2.5+ \ 
    --class "com.foo.MyClass" \ 
    --master local[4] \ 
    target/scala-2.10/my-app_2.10-0.0.1.jar 

I get this error:

Exception in thread "main" java.lang.RuntimeException: [unresolved dependency: net.liftweb#lift-json;2.5+: not found] 
    at org.apache.spark.deploy.SparkSubmitUtils$.resolveMavenCoordinates(SparkSubmit.scala:1068) 
    at org.apache.spark.deploy.SparkSubmit$.prepareSubmitEnvironment(SparkSubmit.scala:287) 
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:154) 
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121) 
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala) 

What am I doing wrong here? Is the net.liftweb:lift-json:2.5+ in my --packages argument incorrect? Do I need to add a resolver in my build.sbt?

Answer


Users may also include any other dependencies by supplying a comma-delimited list of maven coordinates with --packages.

2.5+ in your build.sbt is Ivy version matcher syntax, whereas Maven coordinates require an actual artifact version. --packages apparently does not use Ivy for resolution (and I think it would be surprising if it did; your application could suddenly stop working because a new version of a dependency was released). So you need to find the version that 2.5+ actually resolves to, e.g. using https://github.com/jrudolph/sbt-dependency-graph (or by trying to find it in show dependencyClasspath).
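For example, one quick way to see which concrete version sbt resolved is to dump the compile classpath and look for the lift-json jar (a sketch; the exact path and jar name will vary with your sbt setup and Scala version):

```shell
# Print the resolved compile classpath and pull out the lift-json entry;
# the jar file name ends in the concrete version number that --packages needs
sbt "show compile:dependencyClasspath" | tr ',' '\n' | grep lift-json
```

A classpath entry ending in something like lift-json_2.10-2.6.jar tells you the actual version to put in the --packages coordinate.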


The best solution here is to use the latest version of lift-json, which is 2.6.3. –
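Putting that together, the spark-submit invocation would pin a concrete version instead of the Ivy matcher (a sketch, keeping the other coordinates from the question as-is; note the assumption that on Maven Central the artifact id carries the Scala version suffix, i.e. lift-json_2.10 for a Scala 2.10 build):

```shell
# Pin the concrete lift-json version rather than the Ivy range "2.5+",
# since --packages resolves plain Maven coordinates
spark-submit \
    --packages com.amazonaws:aws-java-sdk-pom:1.10.34,org.apache.hadoop:hadoop-aws:2.6.0,net.liftweb:lift-json_2.10:2.6.3 \
    --class "com.foo.MyClass" \
    --master local[4] \
    target/scala-2.10/my-app_2.10-0.0.1.jar
```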