
I run a Spark job through IntelliJ. The job executes and produces output. I need to get the job's jar file onto a server and run it there, but when I try to do sbt assembly it throws the following error:

[error] Not a valid command: assembly 
[error] Not a valid project ID: assembly 
[error] Expected ':' (if selecting a configuration) 
[error] Not a valid key: assembly 
[error] assembly 

My sbt version is 0.13.8.

Below is my build.sbt file:

import sbt._, Keys._ 
name := "mobilewalla" 
version := "1.0" 
scalaVersion := "2.11.7" 
libraryDependencies ++= Seq("org.apache.spark" %% "spark-core" % "2.0.0", 
"org.apache.spark" %% "spark-sql" % "2.0.0") 

I added a file assembly.sbt under the project directory. It contains:

addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.3") 
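For reference, sbt only picks up `addSbtPlugin` definitions from `.sbt` files inside the `project/` subdirectory, not from the build root, so "Not a valid command: assembly" usually means the plugin file is in the wrong place. A sketch of the expected layout (file names other than build.sbt are conventional):

```
mobilewalla/
├── build.sbt                 -- name, version, libraryDependencies, assembly settings
└── project/
    ├── assembly.sbt          -- addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.3")
    └── build.properties      -- sbt.version=0.13.8
```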

This is where I am stuck.


Did you reload after installing sbt-assembly? –


Yes, I did. Is doing it through the build.sbt file correct? Do I need to add anything else? – toofrellik


'sbt package' creates your JAR file for submitting to Spark... https://spark.apache.org/docs/2.0.0/quick-start.html#self-contained-applications –

Answers


In build.sbt:

assemblyMergeStrategy in assembly := { 
    case PathList("META-INF", xs @ _*) => MergeStrategy.discard 
    case x => MergeStrategy.first 
} 
mainClass in assembly := Some("com.SparkMain") 
resolvers += "spray repo" at "http://repo.spray.io" 
assemblyJarName in assembly := "streaming-api.jar" 

Add the lines above, and include the following in your plugins.sbt file:

addSbtPlugin("io.spray" % "sbt-revolver" % "0.7.2") 

addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.13.0") 
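With these settings in place, building and submitting the fat jar would look roughly like the following (commands are illustrative; the `scala-2.11` path segment follows from the scalaVersion in the question's build.sbt):

```
sbt clean assembly
spark-submit --class com.SparkMain target/scala-2.11/streaming-api.jar
```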

Could you let me know why we need the 'sbt-revolver' setting in the plugins file? – toofrellik


It is a very useful plugin that runs your project in a forked JVM and reloads it on changes. – Nilesh


What is missing: to assemble multiple jars into one, you need to add the plugin below in plugins.sbt under the project directory.

addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.3") 

If you need to customize the assembled jar to launch a specific main class, take assembly.sbt for example:

import sbtassembly.Plugin.AssemblyKeys._ 

Project.inConfig(Compile)(baseAssemblySettings) 

mainClass in (Compile, assembly) := Some("<main application name with package path>") 

jarName in (Compile, assembly) := s"${name.value}-${version.value}-dist.jar" 
// merge strategy below controls which files are excluded or included 
mergeStrategy in (Compile, assembly) <<= (mergeStrategy in (Compile, assembly)) { 
    (old) => { 
    case PathList(ps @ _*) if ps.last endsWith ".html" =>MergeStrategy.first 
    case "META-INF/MANIFEST.MF" => MergeStrategy.discard 
    case x => old(x) 
    } 
} 
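Note the snippet above uses the pre-0.12 sbt-assembly API (sbtassembly.Plugin, the deprecated <<= operator). On sbt-assembly 0.14.x, the equivalent settings are written roughly as follows; the main class remains a placeholder to fill in:

```scala
mainClass in assembly := Some("<main application name with package path>")

assemblyJarName in assembly := s"${name.value}-${version.value}-dist.jar"

assemblyMergeStrategy in assembly := {
  case PathList(ps @ _*) if ps.last endsWith ".html" => MergeStrategy.first
  case "META-INF/MANIFEST.MF" => MergeStrategy.discard
  case x =>
    // fall back to the plugin's default strategy for everything else
    val oldStrategy = (assemblyMergeStrategy in assembly).value
    oldStrategy(x)
}
```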

I do not think these lines are necessary for an Apache Spark project –


I added the assembly.sbt file per this reference: [sbt assembly](https://github.com/sbt/sbt-assembly), since my sbt version is 0.13.8 – toofrellik


You should still try using plugins.sbt. If that documentation were perfect, you would not have asked a question on SO in the first place :) – C4stor