2017-02-28

I'm currently getting a NullPointerException and don't know how to fix it. I'd like to raise the Python-side logging verbosity to see whether I can get more information out of it. Question: how do I adjust PySpark's log level — specifically, the log level of the PySpark shell?

Python 2.7.5 (default, Oct 11 2015, 17:47:16) 
[GCC 4.8.3 20140911 (Red Hat 4.8.3-9)] on linux2 
Type "help", "copyright", "credits" or "license" for more information. 
Setting default log level to "WARN". 
To adjust logging level use sc.setLogLevel(newLevel). 
17/02/28 16:52:22 ERROR spark.SparkContext: Error initializing SparkContext. 
17/02/28 16:52:22 ERROR util.Utils: Uncaught exception in thread Thread-2 
java.lang.NullPointerException 
     at org.apache.spark.network.shuffle.ExternalShuffleClient.close(ExternalShuffleClient.java:152) 
     at org.apache.spark.storage.BlockManager.stop(BlockManager.scala:1231) 
     at org.apache.spark.SparkEnv.stop(SparkEnv.scala:96) 
     at org.apache.spark.SparkContext$$anonfun$stop$12.apply$mcV$sp(SparkContext.scala:1768) 
     at org.apache.spark.util.Utils$.tryLogNonFatalError(Utils.scala:1230) 
     at org.apache.spark.SparkContext.stop(SparkContext.scala:1767) 
     at org.apache.spark.SparkContext.<init>(SparkContext.scala:614) 
     at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:59) 
     at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) 
     at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57) 
     at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) 
     at java.lang.reflect.Constructor.newInstance(Constructor.java:526) 
     at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:234) 
     at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:381) 
     at py4j.Gateway.invoke(Gateway.java:214) 
     at py4j.commands.ConstructorCommand.invokeConstructor(ConstructorCommand.java:79) 
     at py4j.commands.ConstructorCommand.execute(ConstructorCommand.java:68) 
     at py4j.GatewayConnection.run(GatewayConnection.java:209) 
     at java.lang.Thread.run(Thread.java:745) 
Traceback (most recent call last): 
    File "/opt/cloudera/parcels/CDH-5.9.0-1.cdh5.9.0.p0.23/lib/spark/python/pyspark/shell.py", line 43, in <module> 
    sc = SparkContext(pyFiles=add_files) 
    File "/opt/cloudera/parcels/CDH-5.9.0-1.cdh5.9.0.p0.23/lib/spark/python/pyspark/context.py", line 115, in __init__ 
    conf, jsc, profiler_cls) 
    File "/opt/cloudera/parcels/CDH-5.9.0-1.cdh5.9.0.p0.23/lib/spark/python/pyspark/context.py", line 172, in _do_init 
    self._jsc = jsc or self._initialize_context(self._conf._jconf) 
    File "/opt/cloudera/parcels/CDH-5.9.0-1.cdh5.9.0.p0.23/lib/spark/python/pyspark/context.py", line 235, in _initialize_context 
    return self._jvm.JavaSparkContext(jconf) 
    File "/opt/cloudera/parcels/CDH-5.9.0-1.cdh5.9.0.p0.23/lib/spark/python/lib/py4j-0.9-src.zip/py4j/java_gateway.py", line 1064, in __call__ 
    File "/opt/cloudera/parcels/CDH-5.9.0-1.cdh5.9.0.p0.23/lib/spark/python/lib/py4j-0.9-src.zip/py4j/protocol.py", line 308, in get_return_value 
py4j.protocol.Py4JJavaError: An error occurred while calling None.org.apache.spark.api.java.JavaSparkContext. 
: 
>>> 

Answer

For more detailed logging, create a custom log4j properties file for the gateway, for example:

log4j.rootCategory=INFO, console 
log4j.appender.console=org.apache.log4j.ConsoleAppender 
log4j.appender.console.target=System.err 
log4j.appender.console.layout=org.apache.log4j.PatternLayout 
log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n 

log4j.logger.org.apache.spark.api.python.PythonGatewayServer=DEBUG 

Then use it when launching pyspark, as follows:

./bin/pyspark --driver-java-options '-Dlog4j.configuration=file:log4j-debug.properties' 

After looking into the source code, https://github.com/apache/spark/blob/a36a76ac43c36a3b897a748bd9f138b629dbc684/python/pyspark/java_gateway.py, the Java gateway is launched with something like `./bin/spark-submit --conf xxkey=xxvalue --conf xxxkey=xxxvalue pyspark-shell`. Could you help clarify how the `--driver-java-options` argument is passed down to `./bin/spark-submit`? Thanks. – cdhit


It gets rewritten to `--conf spark.driver.extraJavaOptions`. See `os.environ['PYSPARK_SUBMIT_ARGS']` in the pyspark shell. – Mariusz
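The rewriting described above can be checked from inside the Python shell itself; a minimal sketch (the fallback string is hypothetical, shown only to illustrate the shape of the value — outside a pyspark session the variable is unset):

```python
import os

# Inside the pyspark shell, PYSPARK_SUBMIT_ARGS holds the arguments that the
# Python launcher forwards to spark-submit. Outside the shell it is not set,
# so we fall back to an illustrative (hypothetical) value.
submit_args = os.environ.get(
    "PYSPARK_SUBMIT_ARGS",
    '--conf "spark.driver.extraJavaOptions='
    '-Dlog4j.configuration=file:log4j-debug.properties" pyspark-shell',
)
print(submit_args)
```

The trailing `pyspark-shell` token is what tells `spark-submit` to start the gateway rather than run an application script.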