2017-05-30

I am trying to build the Spark 2.2.0-rc2 release with mvn, but the build fails. Why does building the Spark RC release fail with "Could not initialize class sun.util.calendar.ZoneInfoFile"?

$ uname -a 
Linux knoldus-Vostro-15-3568 4.4.0-46-generic #67-Ubuntu SMP Thu Oct 20 15:05:12 UTC 2016 x86_64 x86_64 x86_64 GNU/Linux 

$ java -version 
openjdk version "1.8.0_131" 

Below is the error stack I get:

$ ./build/mvn -Phadoop-2.7,yarn,mesos,hive,hive-thriftserver -DskipTests clean install 

... 
[INFO] --- scala-maven-plugin:3.2.2:compile (scala-compile-first) @ spark-tags_2.11 --- 
[INFO] Using zinc server for incremental compilation 
java.lang.NoClassDefFoundError: Could not initialize class sun.util.calendar.ZoneInfoFile 
    at sun.util.calendar.ZoneInfo.getTimeZone(ZoneInfo.java:589) 
    at java.util.TimeZone.getTimeZone(TimeZone.java:560) 
    at java.util.TimeZone.setDefaultZone(TimeZone.java:666) 
    at java.util.TimeZone.getDefaultRef(TimeZone.java:636) 
    at java.util.Date.<init>(Date.java:254) 
    at java.util.zip.ZipUtils.dosToJavaTime(ZipUtils.java:71) 
    at java.util.zip.ZipUtils.extendedDosToJavaTime(ZipUtils.java:88) 
    at java.util.zip.ZipEntry.getTime(ZipEntry.java:194) 
    at sbt.IO$.next$1(IO.scala:278) 
    at sbt.IO$.sbt$IO$$extract(IO.scala:286) 
    at sbt.IO$$anonfun$unzipStream$1.apply(IO.scala:255) 
    at sbt.IO$$anonfun$unzipStream$1.apply(IO.scala:255) 
    at sbt.Using.apply(Using.scala:24) 
    at sbt.IO$.unzipStream(IO.scala:255) 
    at sbt.IO$$anonfun$unzip$1.apply(IO.scala:249) 
    at sbt.IO$$anonfun$unzip$1.apply(IO.scala:249) 
    at sbt.Using.apply(Using.scala:24) 
    at sbt.IO$.unzip(IO.scala:249) 
    at sbt.compiler.AnalyzingCompiler$$anonfun$compileSources$1$$anonfun$5.apply(AnalyzingCompiler.scala:140) 
    at sbt.compiler.AnalyzingCompiler$$anonfun$compileSources$1$$anonfun$5.apply(AnalyzingCompiler.scala:140) 
    at scala.collection.LinearSeqOptimized$class.foldLeft(LinearSeqOptimized.scala:111) 
    at scala.collection.immutable.List.foldLeft(List.scala:84) 
    at scala.collection.TraversableOnce$class.$div$colon(TraversableOnce.scala:138) 
    at scala.collection.AbstractTraversable.$div$colon(Traversable.scala:105) 
    at sbt.compiler.AnalyzingCompiler$$anonfun$compileSources$1.apply(AnalyzingCompiler.scala:140) 
    at sbt.compiler.AnalyzingCompiler$$anonfun$compileSources$1.apply(AnalyzingCompiler.scala:139) 
    at sbt.IO$.withTemporaryDirectory(IO.scala:344) 
    at sbt.compiler.AnalyzingCompiler$.compileSources(AnalyzingCompiler.scala:139) 
    at sbt.compiler.IC$.compileInterfaceJar(IncrementalCompiler.scala:58) 
    at com.typesafe.zinc.Compiler$.compilerInterface(Compiler.scala:148) 
    at com.typesafe.zinc.Compiler$.create(Compiler.scala:53) 
    at com.typesafe.zinc.Compiler$$anonfun$apply$1.apply(Compiler.scala:40) 
    at com.typesafe.zinc.Compiler$$anonfun$apply$1.apply(Compiler.scala:40) 
    at com.typesafe.zinc.Cache.get(Cache.scala:41) 
    at com.typesafe.zinc.Compiler$.apply(Compiler.scala:40) 
    at com.typesafe.zinc.Main$.run(Main.scala:96) 
    at com.typesafe.zinc.Nailgun$.zinc(Nailgun.scala:93) 
    at com.typesafe.zinc.Nailgun$.nailMain(Nailgun.scala:82) 
    at com.typesafe.zinc.Nailgun.nailMain(Nailgun.scala) 
    at sun.reflect.GeneratedMethodAccessor1.invoke(Unknown Source) 
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) 
    at java.lang.reflect.Method.invoke(Method.java:498) 
    at com.martiansoftware.nailgun.NGSession.run(NGSession.java:280) 
[INFO] ------------------------------------------------------------------------ 
[INFO] Reactor Summary: 
[INFO] 
[INFO] Spark Project Parent POM ........................... SUCCESS [ 5.657 s] 
[INFO] Spark Project Tags ................................. FAILURE [ 0.371 s] 
[INFO] Spark Project Sketch ............................... SKIPPED 
[INFO] Spark Project Networking ........................... SKIPPED 
[INFO] Spark Project Shuffle Streaming Service ............ SKIPPED 
[INFO] Spark Project Unsafe ............................... SKIPPED 
[INFO] Spark Project Launcher ............................. SKIPPED 
[INFO] Spark Project Core ................................. SKIPPED 
[INFO] Spark Project ML Local Library ..................... SKIPPED 
[INFO] Spark Project GraphX ............................... SKIPPED 
[INFO] Spark Project Streaming ............................ SKIPPED 
[INFO] Spark Project Catalyst ............................. SKIPPED 
[INFO] Spark Project SQL .................................. SKIPPED 
[INFO] Spark Project ML Library ........................... SKIPPED 
[INFO] Spark Project Tools ................................ SKIPPED 
[INFO] Spark Project Hive ................................. SKIPPED 
[INFO] Spark Project REPL ................................. SKIPPED 
[INFO] Spark Project YARN Shuffle Service ................. SKIPPED 
[INFO] Spark Project YARN ................................. SKIPPED 
[INFO] Spark Project Mesos ................................ SKIPPED 
[INFO] Spark Project Hive Thrift Server ................... SKIPPED 
[INFO] Spark Project Assembly ............................. SKIPPED 
[INFO] Spark Project External Flume Sink .................. SKIPPED 
[INFO] Spark Project External Flume ....................... SKIPPED 
[INFO] Spark Project External Flume Assembly .............. SKIPPED 
[INFO] Spark Integration for Kafka 0.8 .................... SKIPPED 
[INFO] Spark Project Examples ............................. SKIPPED 
[INFO] Spark Project External Kafka Assembly .............. SKIPPED 
[INFO] Spark Integration for Kafka 0.10 ................... SKIPPED 
[INFO] Spark Integration for Kafka 0.10 Assembly .......... SKIPPED 
[INFO] Kafka 0.10 Source for Structured Streaming ......... SKIPPED 
[INFO] ------------------------------------------------------------------------ 
[INFO] BUILD FAILURE 
[INFO] ------------------------------------------------------------------------ 
[INFO] Total time: 6.855 s 
[INFO] Finished at: 2017-05-30T13:47:02+05:30 
[INFO] Final Memory: 50M/605M 
[INFO] ------------------------------------------------------------------------ 
[ERROR] Failed to execute goal net.alchim31.maven:scala-maven-plugin:3.2.2:compile (scala-compile-first) on project spark-tags_2.11: Execution scala-compile-first of goal net.alchim31.maven:scala-maven-plugin:3.2.2:compile failed.: CompileFailed -> [Help 1] 
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch. 
[ERROR] Re-run Maven using the -X switch to enable full debug logging. 
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles: 
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/PluginExecutionException 
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command 
[ERROR] mvn <goals> -rf :spark-tags_2.11 

I cannot make sense of this error.


scala> java.util.TimeZone.getDefault 
res0: java.util.TimeZone = sun.util.calendar.ZoneInfo[id="Asia/Kolkata",offset=19800000,dstSavings=0,useDaylight=false,transitions=6,lastRule=null] 

The locale settings are:

LANG=en_IN 
LANGUAGE=en_IN:en 
LC_CTYPE="en_IN" 
LC_NUMERIC="en_IN" 
LC_TIME="en_IN" 
LC_COLLATE="en_IN" 
LC_MONETARY="en_IN" 
LC_MESSAGES="en_IN" 
LC_PAPER="en_IN" 
LC_NAME="en_IN" 
LC_ADDRESS="en_IN" 
LC_TELEPHONE="en_IN" 
LC_MEASUREMENT="en_IN" 
LC_IDENTIFICATION="en_IN" 
LC_ALL=

Answer

The problem is with your time zone settings. Export LC_ALL=en_US.UTF-8 and start over. Make sure every entry reported by locale is en_US.UTF-8:

$ locale 
LANG="en_US.UTF-8" 
LC_COLLATE="en_US.UTF-8" 
LC_CTYPE="en_US.UTF-8" 
LC_MESSAGES="en_US.UTF-8" 
LC_MONETARY="en_US.UTF-8" 
LC_NUMERIC="en_US.UTF-8" 
LC_TIME="en_US.UTF-8" 
LC_ALL="en_US.UTF-8" 
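A minimal sketch of applying the fix in the current shell session and re-running the same build command from the question (it assumes you are in the Spark source tree, where the build/mvn wrapper lives):

```shell
# Override the locale for this session so the JVM starts with a
# well-formed locale; LC_ALL takes precedence over all other LC_* vars.
export LC_ALL=en_US.UTF-8
export LANG=en_US.UTF-8

# Verify that every LC_* category now reports en_US.UTF-8.
locale

# Re-run the build with the same profiles as before (only if the
# Spark source tree's mvn wrapper is present in the current directory).
if [ -x ./build/mvn ]; then
  ./build/mvn -Phadoop-2.7,yarn,mesos,hive,hive-thriftserver -DskipTests clean install
fi
```

To make the change persistent across sessions, the two export lines can be added to ~/.bashrc (or your shell's equivalent).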