
Hive on Spark error - java.lang.IllegalStateException: unread block data

I have been trying to run Hive queries from the Hive CLI after configuring Hive to run on Spark.
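For context, Hive was pointed at Spark roughly like this in hive-site.xml (a sketch; hive.execution.engine and spark.master are the standard Hive-on-Spark properties, with values adjusted to my cluster):

<property>
    <name>hive.execution.engine</name>
    <value>spark</value>
</property>
<property>
    <name>spark.master</name>
    <value>spark://spark-master:7077</value>
</property>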

With spark.master set to local it works fine, but when I set it to my Spark master, spark://spark-master:7077, I get the following error in the Spark logs:

15/11/03 16:37:10 INFO util.Utils: Copying /tmp/spark-5e39df85-d3d7-446f-86e9-d2699501f97e/executor-70d24a32-6913-479d-85b8-32e535dd3dbf/-11208827301446565026180_cache to /usr/local/spark/work/app-20151103163705-0000/0/./hive-exec-1.2.1.jar 
15/11/03 16:37:11 INFO executor.Executor: Adding file:/usr/local/spark/work/app-20151103163705-0000/0/./hive-exec-1.2.1.jar to class loader 
15/11/03 16:37:11 ERROR executor.Executor: Exception in task 0.0 in stage 0.0 (TID 0) 
java.lang.IllegalStateException: unread block data 
    at java.io.ObjectInputStream$BlockDataInputStream.setBlockDataMode(ObjectInputStream.java:2428) 
    at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1382) 
    at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1997) 
    at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1921) 
    at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798) 
    at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350) 
    at java.io.ObjectInputStream.readObject(ObjectInputStream.java:370) 
    at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:69) 
    at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:95) 
    at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:194) 
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145) 
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615) 
    at java.lang.Thread.run(Thread.java:745) 

I am working with Spark 1.4.1 and Hive 1.2.1.

Comments:

Please check whether this link helps: https://issues.apache.org/jira/browse/HIVE-8300 –

Thanks. I saw that, but I'm not sure what to do with it. – sofia

Any answers here? – sofia

Answer


Just in case anyone else runs into the same problem: I managed to work through this. I believe the cause was HBase jars missing on the executor side (it only happened when running queries that touch HBase through Hive, and only in Spark cluster mode).

My fix was to add the following to spark-env.sh:

# append the HBase and Hive jars to anything already on the classpath
export SPARK_CLASSPATH=$CLASSPATH:/usr/local/hbase-1.1.2/lib/hbase-protocol-1.1.2.jar:/usr/local/hbase-1.1.2/lib/hbase-common-1.1.2.jar:/usr/local/hbase-1.1.2/lib/htrace-core-3.1.0-incubating.jar:/usr/local/hbase-1.1.2/lib/hbase-server-1.1.2.jar:/usr/local/hbase-1.1.2/lib/hbase-client-1.1.2.jar:/usr/local/hive-1.2.1/lib/hive-hbase-handler-1.2.1.jar:/usr/local/hive-1.2.1/lib/hive-common-1.2.1.jar:/usr/local/hive-1.2.1/lib/hive-exec-1.2.1.jar
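Note that SPARK_CLASSPATH is deprecated in Spark 1.x in favor of spark.executor.extraClassPath, so the hive-site.xml route below is the more forward-compatible one. If you would rather not maintain the jar list by hand, a minimal sketch like this (assuming the same install paths as above; the JARS variable is mine) builds the same colon-separated list:

# collect the HBase and Hive jars needed by the executors
JARS=""
for j in /usr/local/hbase-1.1.2/lib/{hbase-protocol,hbase-common,hbase-server,hbase-client}-1.1.2.jar \
         /usr/local/hbase-1.1.2/lib/htrace-core-3.1.0-incubating.jar \
         /usr/local/hive-1.2.1/lib/{hive-hbase-handler,hive-common,hive-exec}-1.2.1.jar; do
    JARS="$JARS:$j"
done
export SPARK_CLASSPATH="$CLASSPATH$JARS"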

Alternatively, the same jars can be added to hive-site.xml:

<property>
    <name>spark.executor.extraClassPath</name>
    <value>/usr/local/hbase-1.1.2/lib/hbase-protocol-1.1.2.jar:/usr/local/hbase-1.1.2/lib/hbase-common-1.1.2.jar:/usr/local/hbase-1.1.2/lib/htrace-core-3.1.0-incubating.jar:/usr/local/hbase-1.1.2/lib/hbase-server-1.1.2.jar:/usr/local/hbase-1.1.2/lib/hbase-client-1.1.2.jar:/usr/local/hive-1.2.1/lib/hive-hbase-handler-1.2.1.jar:/usr/local/hive-1.2.1/lib/hive-common-1.2.1.jar:/usr/local/hive-1.2.1/lib/hive-exec-1.2.1.jar</value>
</property>
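For quick experiments, the same properties can also be set per session from the Hive CLI before running the query (a sketch using the same jar paths as above; Spark-side properties are picked up when the Spark session for the query is created):

set hive.execution.engine=spark;
set spark.executor.extraClassPath=/usr/local/hbase-1.1.2/lib/hbase-protocol-1.1.2.jar:/usr/local/hbase-1.1.2/lib/hbase-common-1.1.2.jar:/usr/local/hbase-1.1.2/lib/htrace-core-3.1.0-incubating.jar:/usr/local/hbase-1.1.2/lib/hbase-server-1.1.2.jar:/usr/local/hbase-1.1.2/lib/hbase-client-1.1.2.jar:/usr/local/hive-1.2.1/lib/hive-hbase-handler-1.2.1.jar:/usr/local/hive-1.2.1/lib/hive-common-1.2.1.jar:/usr/local/hive-1.2.1/lib/hive-exec-1.2.1.jar;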