2016-05-17

I was wondering whether anyone has gotten SASL working with Spark 1.6.1 on YARN?

Basically, the Spark documentation states that you only need to enable these three parameters:

spark.authenticate.enableSaslEncryption=true  
spark.network.sasl.serverAlwaysEncrypt=true 
spark.authenticate=true 

http://spark.apache.org/docs/latest/security.html
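For reference, these can also be passed at submit time via `--conf` flags rather than in `spark-defaults.conf`; a minimal sketch (the application jar name is a placeholder):

```shell
spark-submit \
  --master yarn \
  --deploy-mode client \
  --conf spark.authenticate=true \
  --conf spark.authenticate.enableSaslEncryption=true \
  --conf spark.network.sasl.serverAlwaysEncrypt=true \
  my-app.jar
```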

However, when launching my Spark job with --master yarn and --deploy-mode client, I see the following in my Spark executor logs:

6/05/17 06:50:51 ERROR client.TransportClientFactory: Exception while bootstrapping client after 29 ms 

java.lang.RuntimeException: java.lang.IllegalArgumentException: Unknown message type: -22 
     at org.apache.spark.network.shuffle.protocol.BlockTransferMessage$Decoder.fromByteBuffer(BlockTransferMessage.java:67) 
     at org.apache.spark.network.shuffle.ExternalShuffleBlockHandler.receive(ExternalShuffleBlockHandler.java:71) 
     at org.apache.spark.network.server.TransportRequestHandler.processRpcRequest(TransportRequestHandler.java:149) 
     at org.apache.spark.network.server.TransportRequestHandler.handle(TransportRequestHandler.java:102) 
     at org.apache.spark.network.server.TransportChannelHandler.channelRead0(TransportChannelHandler.java:104) 
     at org.apache.spark.network.server.TransportChannelHandler.channelRead0(TransportChannelHandler.java:51) 
     at io.netty.channel.SimpleChannelInboundHandler.channelRead(SimpleChannelInboundHandler.java:105) 
     at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:333) 
     at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:319) 
     at io.netty.handler.timeout.IdleStateHandler.channelRead(IdleStateHandler.java:254) 
     at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:333) 
     at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:319) 
     at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:103) 
     at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:333) 
     at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:319) 
     at org.apache.spark.network.util.TransportFrameDecoder.channelRead(TransportFrameDecoder.java:86) 
     at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:333) 
     at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:319) 
     at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:787) 
     at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:130) 
     at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:511) 
     at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:468) 
     at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:382) 
     at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:354) 
     at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:116) 
     at java.lang.Thread.run(Thread.java:745) 

I'm still troubleshooting this, but if anyone has seen it before, that would be great.

Maybe this can help: https://issues.apache.org/jira/browse/SPARK-6420 – RoyaumeIX

Hi @Fabian Tan, I'm facing exactly the same issue. Did you manage to debug it? – shridharama

Answer


You also need to set spark.authenticate=true in the YARN configuration.

Taken from YarnShuffleService.java in the Spark codebase:

* The service also optionally supports authentication. This ensures that executors from one 
* application cannot read the shuffle files written by those from another. This feature can be 
* enabled by setting `spark.authenticate` in the Yarn configuration before starting the NM. 
* Note that the Spark application must also set `spark.authenticate` manually and, unlike in 
* the case of the service port, will not inherit this setting from the Yarn configuration. This 
* is because an application running on the same Yarn cluster may choose to not use the external 
* shuffle service, in which case its setting of `spark.authenticate` should be independent of 
* the service's. 

You can do this by adding the following to your Hadoop core-site.xml configuration:

<property> 
    <name>spark.authenticate</name><value>true</value> 
</property>
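Separately, since the stack trace shows `ExternalShuffleBlockHandler`, the external shuffle service itself must be registered as a NodeManager auxiliary service before SASL can work end to end. A sketch of the documented Spark-on-YARN setup in yarn-site.xml (the NodeManagers must be restarted afterwards, and `spark-<version>-yarn-shuffle.jar` must be on the NodeManager classpath):

```xml
<!-- yarn-site.xml on each NodeManager -->
<property>
  <name>yarn.nodemanager.aux-services</name>
  <value>mapreduce_shuffle,spark_shuffle</value>
</property>
<property>
  <name>yarn.nodemanager.aux-services.spark_shuffle.class</name>
  <value>org.apache.spark.network.yarn.YarnShuffleService</value>
</property>
```

The application side must then match with `spark.shuffle.service.enabled=true` in addition to the `spark.authenticate` setting discussed above.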