
This is the cmd log I see after running the spark-shell command (C:\Spark> spark-shell). As far as I understand, this is mainly a Hadoop issue. I am using Windows 10. Could you please take a look at the problem below?

C:\Users\mac>cd c:\ 
c:\>winutils\bin\winutils.exe chmod 777 \tmp\hive 
c:\>cd c:\spark 
c:\Spark>spark-shell 


Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties 
Setting default log level to "WARN". 
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel). 
17/05/14 13:21:25 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable 
17/05/14 13:21:34 WARN General: Plugin (Bundle) "org.datanucleus.store.rdbms" is already registered. Ensure you dont have multiple JAR versions of the same plugin in the classpath. The URL "file:/c:/Spark/bin/../jars/datanucleus-rdbms-3.2.9.jar" is already registered, and you are trying to register an identical plugin located at URL "file:/C:/Spark/jars/datanucleus-rdbms-3.2.9.jar." 
17/05/14 13:21:34 WARN General: Plugin (Bundle) "org.datanucleus" is already registered. Ensure you dont have multiple JAR versions of the same plugin in the classpath. The URL "file:/c:/Spark/bin/../jars/datanucleus-core-3.2.10.jar" is already registered, and you are trying to register an identical plugin located at URL "file:/C:/Spark/jars/datanucleus-core-3.2.10.jar." 
17/05/14 13:21:34 WARN General: Plugin (Bundle) "org.datanucleus.api.jdo" is already registered. Ensure you dont have multiple JAR versions of the same plugin in the classpath. The URL "file:/c:/Spark/bin/../jars/datanucleus-api-jdo-3.2.6.jar" is already registered, and you are trying to register an identical plugin located at URL "file:/C:/Spark/jars/datanucleus-api-jdo-3.2.6.jar." 
17/05/14 13:21:48 WARN ObjectStore: Failed to get database global_temp, returning NoSuchObjectException 
Spark context Web UI available at http://192.168.1.9:4040 
Spark context available as 'sc' (master = local[*], app id = local-1494764489031). 
Spark session available as 'spark'. 
Welcome to 
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/ '_/
   /___/ .__/\_,_/_/ /_/\_\   version 2.1.1
      /_/

Using Scala version 2.11.8 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_131) 
Type in expressions to have them evaluated. 
Type :help for more information. 

Hi Samson. Thanks for your feedback. I am very new to coding and Spark. I would like to resolve the issue above because, as far as I understand, Spark is not working on my computer. – Maciej


For the record, "Unable to load native-hadoop library" would only be a problem if you had to connect to a Hadoop cluster secured with Kerberos authentication. Since you don't seem to work for an investment bank, you shouldn't bother :-) –

Answer


There is no problem in the output. These WARN messages can simply be ignored.

In other words, it looks like you have installed Spark 2.1.1 on Windows 10 correctly.

To make sure your installation is correct (so that I can drop "looks like" from the sentence above), do the following:

spark.range(1).show 

That will, by default, trigger loading Hive classes, which may or may not end with an exception on Windows due to Hadoop's requirements (hence the need for winutils.exe to deal with them).
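
For reference, on a correctly configured installation that call should print a one-row DataFrame with a single id column, roughly like the sketch below (the prompt and any extra WARN lines on your machine may differ):

scala> spark.range(1).show
+---+
| id|
+---+
|  0|
+---+

If instead you see a HiveException or a permissions error mentioning \tmp\hive, that points back to the winutils.exe setup step shown at the top of the question rather than to the Spark installation itself.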