Am I missing any imports when running this?
> spark-submit --class "TwitterPopularTags" --master local[2] /home/raja/begin/target/scala-2.11/simple-project_2.11-1.0.jar
>
> Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/spark/Logging
> ...
> Caused by: java.lang.ClassNotFoundException: org.apache.spark.Logging
> ...
SBT itself is fine: clean, reload, compile, and package all succeed. But when I run the spark-submit command above, I get the well-known error shown above.
My simple.sbt:
> name := "Simple Project"
> version := "1.0"
> scalaVersion := "2.11.6"
>libraryDependencies ++= Seq(
> "org.apache.spark" %% "spark-core" % "2.0.1",
> "org.apache.spark" %% "spark-streaming" % "2.0.1",
> "org.apache.spark" %% "spark-streaming-twitter" % "1.6.2",
> "com.google.code.gson" % "gson" % "2.7",
> "org.twitter4j" % "twitter4j-core" % "4.0.4",
> "org.twitter4j" % "twitter4j-stream" % "4.0.4",
>"org.apache.logging.log4j" % "log4j-slf4j-impl" % "2.7"
>)
>resolvers += "Maven Central" at "https://repo1.maven.org/maven2/"
>resolvers += "Akka Repository" at "http://repo.akka.io/releases/"
My imports:
>import org.apache.spark.streaming.StreamingContext._
>import org.apache.spark.streaming.dstream.DStream
>import org.apache.spark.streaming.twitter._
>import org.apache.spark.streaming.{Seconds, StreamingContext}
>import org.apache.spark.{SparkConf, SparkContext}
>import org.slf4j.{Logger, LoggerFactory}
>import org.slf4j.impl.StaticLoggerBinder
>import org.apache.log4j.{Level, LogManager, PropertyConfigurator}
>import org.apache.log4j.Logger
>import twitter4j.auth.OAuthAuthorization
>import twitter4j.conf.ConfigurationBuilder
>import twitter4j.Twitter
>import twitter4j.Status
>import twitter4j.auth.Authorization
>import twitter4j.TwitterFactory
Please don't tell me to change versions. I have configured log4j.properties, but no luck. Should I be looking at SLF4J or Logback instead? I also tried pulling that version of the spark-core jar from Maven, but no luck.
Can anyone pinpoint the problem for me here?

Thanks, Raja
Is there an "slf4j-api" dependency or something similar? I remember needing something like that in my own projects. In Java I needed slf4j-api-1.7.7.jar (the version may differ). – applecrusher
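A sketch of the addition applecrusher is suggesting, as it would appear in simple.sbt (the 1.7.7 version number comes from the comment above and may need adjusting for your setup):

```scala
// Hypothetical addition to simple.sbt: declare the SLF4J API explicitly,
// per applecrusher's comment. The version is an assumption and may differ.
libraryDependencies += "org.slf4j" % "slf4j-api" % "1.7.7"
```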