
I am trying to work through the Twitter streaming example project. I run into a problem when defining the sbt build: sbt.ResolveException: unresolved dependency.

My build.sbt:

name := "Tutorial" 
version := "0.1.0" 
scalaVersion := "2.11.8" 
retrieveManaged := true 
libraryDependencies ++= Seq(
    "org.apache.spark" % "spark-core" % "2.11.0", 
    "org.apache.spark" % "spark-streaming" % "1.1.0", 
    "org.apache.spark" % "spark-streaming-twitter" % "1.1.0" 
) 

Error log:

[warn] Note: Some unresolved dependencies have extra attributes. Check that these dependencies exist with the requested attributes. 
[warn]  com.eed3si9n:sbt-assembly:0.9.2 (sbtVersion=0.11.3, scalaVersion=2.11.8) 
[warn]  com.typesafe.sbteclipse:sbteclipse-plugin:2.2.0 (sbtVersion=0.11.3, scalaVersion=2.11.8) 
[warn]  com.github.mpeltonen:sbt-idea:1.5.1 (sbtVersion=0.11.3, scalaVersion=2.11.8) 
[warn] 
[error] {file:/home/muralee1857/scala/workspace/Tutorial/}default-109f4d/*:update: sbt.ResolveException: unresolved dependency: org.apache.spark#spark-core_2.11.8;1.5.1: not found 
[error] unresolved dependency: com.eed3si9n#sbt-assembly;0.9.2: not found 
[error] unresolved dependency: com.typesafe.sbteclipse#sbteclipse-plugin;2.2.0: not found 
[error] unresolved dependency: com.github.mpeltonen#sbt-idea;1.5.1: not found 

Answers

Answer 1 (score 1)

I would suggest you explicitly pin the dependencies to the packaged Scala version, as:

libraryDependencies ++= Seq(
    "org.apache.spark" % "spark-core_2.10" % "1.1.0", 
    "org.apache.spark" % "spark-streaming_2.10" % "1.1.0" % "provided", 
    "org.apache.spark" % "spark-streaming-twitter_2.10" % "1.1.0" 
) 

You can use %% without spelling out the packaged Scala version; sbt will then try to download the package built for your project's Scala version. Sometimes sbt will not find a package published for that Scala version, and that causes the dependency problem.
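For illustration, a minimal sketch of the difference (the 2.10.4 patch version is only an example): an explicit suffix fixes the artifact name, while %% appends the Scala binary version taken from scalaVersion.

// build.sbt -- minimal sketch; Spark 1.1.0 artifacts exist only for Scala 2.10
scalaVersion := "2.10.4"

// explicit suffix: always resolves spark-core_2.10, regardless of scalaVersion
libraryDependencies += "org.apache.spark" % "spark-core_2.10" % "1.1.0"

// %%: sbt appends the Scala binary version, so with scalaVersion 2.10.x this
// also resolves spark-core_2.10; with scalaVersion 2.11.x it would look for
// spark-core_2.11 version 1.1.0, which is not published and fails to resolve
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.1.0"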

Answer 2 (score 0)

This should work. Note that I am using the %% method here instead of %, so that the right Scala-version build (Scala 2.11) of the Spark libraries is picked up. Make sure you handle the other dependencies the same way, such as the sbt-assembly plugin, sbteclipse, and so on (see the sketch after the code block).

libraryDependencies ++= Seq(
    // all three modules must use one consistent Spark version; 1.5.1 (the version
    // shown in the error log) is published for Scala 2.11
    "org.apache.spark" %% "spark-core" % "1.5.1", 
    "org.apache.spark" %% "spark-streaming" % "1.5.1", 
    "org.apache.spark" %% "spark-streaming-twitter" % "1.5.1" 
) 
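The plugin entries in the error log (sbt-assembly, sbteclipse-plugin, sbt-idea) are not resolved through libraryDependencies at all; they are normally declared with addSbtPlugin in project/plugins.sbt. A minimal sketch, reusing the versions from the log (each version has to be one actually published for the sbt release in use, which is why the warning shows the sbtVersion and scalaVersion attributes):

// project/plugins.sbt -- minimal sketch; plugin versions taken from the error log above
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.9.2")

addSbtPlugin("com.typesafe.sbteclipse" % "sbteclipse-plugin" % "2.2.0")

addSbtPlugin("com.github.mpeltonen" % "sbt-idea" % "1.5.1")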
Comment

My problem is solved: after I changed my Scala compiler to 2.10, the compatibility issue with the external jars was resolved.
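For reference, a minimal sketch of the 2.10-based build the comment describes (assuming the dependencies from the first answer; the 2.10.4 patch version is only an example):

// build.sbt -- minimal sketch of the Scala 2.10 build mentioned in the comment
name := "Tutorial"

version := "0.1.0"

scalaVersion := "2.10.4"

libraryDependencies ++= Seq(
    // with scalaVersion 2.10.x, %% resolves the _2.10 artifacts that exist for Spark 1.1.0
    "org.apache.spark" %% "spark-core" % "1.1.0",
    "org.apache.spark" %% "spark-streaming" % "1.1.0" % "provided",
    "org.apache.spark" %% "spark-streaming-twitter" % "1.1.0"
)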
