My whole build.sbt for Spark unit testing is:
name := """sparktest"""
version := "1.0.0-SNAPSHOT"
scalaVersion := "2.11.8"
scalacOptions := Seq("-unchecked", "-deprecation", "-encoding", "utf8", "-Xexperimental")
parallelExecution in Test := false
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "2.0.2",
  "org.apache.spark" %% "spark-sql" % "2.0.2",
  "org.apache.avro" % "avro" % "1.8.1",
  "org.scalatest" %% "scalatest" % "3.0.1" % "test",
  "com.holdenkarau" %% "spark-testing-base" % "2.0.2_0.4.7" % "test"
)
I have a simple test. Obviously this is just a starting point; I'd like to test more:
package sparktest

import com.holdenkarau.spark.testing.DataFrameSuiteBase
import org.scalatest.FunSuite

class SampleSuite extends FunSuite with DataFrameSuiteBase {
  test("simple test") {
    assert(1 + 1 === 2)
  }
}
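For context, something like the following is the kind of test I want to get to next: an actual DataFrame comparison. This is a minimal sketch assuming the sqlContext and assertDataFrameEquals members that DataFrameSuiteBase provides in spark-testing-base 0.4.x; the suite name and column names are made up for illustration:

package sparktest

import com.holdenkarau.spark.testing.DataFrameSuiteBase
import org.scalatest.FunSuite

class DataFrameSampleSuite extends FunSuite with DataFrameSuiteBase {
  test("identical DataFrames compare equal") {
    // sqlContext is managed by DataFrameSuiteBase; assign to a local val
    // so its implicits (e.g. toDF) can be imported
    val sqlCtx = sqlContext
    import sqlCtx.implicits._
    val expected = Seq(("a", 1), ("b", 2)).toDF("key", "count")
    val actual = Seq(("a", 1), ("b", 2)).toDF("key", "count")
    // assertDataFrameEquals checks schema and row contents
    assertDataFrameEquals(expected, actual)
  }
}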
When I run sbt clean test, I get this failure:
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.hive.conf.HiveConf$ConfVars
For my development environment I'm using spark-2.0.2-bin-hadoop2.7.tar.gz.
Do I have to configure this environment in some way? Apparently HiveConf is a transitive Spark dependency.
I think you have to explicitly add "org.apache.spark" %% "spark-hive" % "2.0.2" to your dependencies.
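That is consistent with the error: spark-testing-base's DataFrameSuiteBase builds its test SparkSession with Hive support, so org.apache.hadoop.hive.conf.HiveConf must be on the test classpath even though the test itself never touches Hive. A minimal sketch of the amended dependency list (same versions as in the question; scoping spark-hive to "test" should also work if production code never uses Hive):

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "2.0.2",
  "org.apache.spark" %% "spark-sql" % "2.0.2",
  "org.apache.spark" %% "spark-hive" % "2.0.2", // supplies HiveConf for DataFrameSuiteBase
  "org.apache.avro" % "avro" % "1.8.1",
  "org.scalatest" %% "scalatest" % "3.0.1" % "test",
  "com.holdenkarau" %% "spark-testing-base" % "2.0.2_0.4.7" % "test"
)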