2014-10-29 88 views

I am trying to follow the HCatalog example (an HCatalog/Hive question) from the link below:

http://www.cloudera.com/content/cloudera/en/documentation/cdh4/v4-2-0/CDH4-Installation-Guide/cdh4ig_topic_19_6.html

I get the following exception when I run the job:

Exception in thread "main" com.google.common.util.concurrent.ExecutionError: java.lang.NoClassDefFoundError: org/antlr/runtime/RecognitionException 
    at com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2232) 
    at com.google.common.cache.LocalCache.get(LocalCache.java:3965) 
    at com.google.common.cache.LocalCache$LocalManualCache.get(LocalCache.java:4764) 
    at org.apache.hcatalog.common.HiveClientCache.getOrCreate(HiveClientCache.java:167) 
    at org.apache.hcatalog.common.HiveClientCache.get(HiveClientCache.java:143) 
    at org.apache.hcatalog.common.HCatUtil.getHiveClient(HCatUtil.java:544) 
    at org.apache.hcatalog.mapreduce.InitializeInput.getInputJobInfo(InitializeInput.java:103) 
    at org.apache.hcatalog.mapreduce.InitializeInput.setInput(InitializeInput.java:85) 
    at org.apache.hcatalog.mapreduce.HCatInputFormat.setInput(HCatInputFormat.java:85) 
    at org.apache.hcatalog.mapreduce.HCatInputFormat.setInput(HCatInputFormat.java:54) 
    at org.apache.hcatalog.mapreduce.HCatInputFormat.setInput(HCatInputFormat.java:46) 
    at com.otsi.hcat.UseHCat.run(UseHCat.java:69) 
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70) 
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:84) 
    at com.otsi.hcat.UseHCat.main(UseHCat.java:96) 
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) 
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57) 
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) 
    at java.lang.reflect.Method.invoke(Method.java:606) 
    at org.apache.hadoop.util.RunJar.main(RunJar.java:212) 
Caused by: java.lang.NoClassDefFoundError: org/antlr/runtime/RecognitionException 
    at java.lang.Class.forName0(Native Method) 
    at java.lang.Class.forName(Class.java:270) 
    at org.apache.hadoop.hive.metastore.MetaStoreUtils.getClass(MetaStoreUtils.java:1378) 
    at org.apache.hadoop.hive.metastore.RawStoreProxy.getProxy(RawStoreProxy.java:64) 
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStore(HiveMetaStore.java:498) 
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:476) 
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:524) 
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:398) 
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.<init>(HiveMetaStore.java:357) 
    at org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:54) 
    at org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:59) 
    at org.apache.hadoop.hive.metastore.HiveMetaStore.newHMSHandler(HiveMetaStore.java:4948) 
    at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:171) 
    at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:154) 
    at org.apache.hcatalog.common.HiveClientCache$CacheableHiveMetaStoreClient.<init>(HiveClientCache.java:246) 
    at org.apache.hcatalog.common.HiveClientCache$4.call(HiveClientCache.java:170) 
    at org.apache.hcatalog.common.HiveClientCache$4.call(HiveClientCache.java:167) 
    at com.google.common.cache.LocalCache$LocalManualCache$1.load(LocalCache.java:4767) 
    at com.google.common.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3568) 
    at com.google.common.cache.LocalCache$Segment.loadSync(LocalCache.java:2350) 
    at com.google.common.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2313) 
    at com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2228) 
    ... 19 more 
Caused by: java.lang.ClassNotFoundException: org.antlr.runtime.RecognitionException

Before running the MR job, I executed the following commands:

$ export HCAT_HOME=$HIVE_HOME/hcatalog

$ HCATJAR=$HCAT_HOME/share/hcatalog/hcatalog-core-0.11.0.jar

$ HCATPIGJAR=$HCAT_HOME/share/hcatalog/hive-hcatalog-pig-adapter-0.13.0.jar

$ export HADOOP_CLASSPATH=$HCATJAR:$HCATPIGJAR:$HIVE_HOME/lib/hive-exec-0.13.0.jar:$HIVE_HOME/lib/hive-metastore-0.13.0.jar:$HIVE_HOME/lib/jdo-api-3.0.1.jar:$HIVE_HOME/lib/libfb303-0.9.0.jar:$HIVE_HOME/lib/libthrift-0.9.0.jar:$HIVE_HOME/lib/slf4j-api-1.6.4.jar:$HIVE_HOME/conf:/usr/hadoop/hadoop-2.4.0/etc/hadoop/

$ LIBJARS=`echo $HADOOP_CLASSPATH | sed -e 's/:/,/g'`

$ export LIBJARS=$LIBJARS,$HIVE_HOME/lib/antlr-runtime-3.4.jar
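Two things are worth checking in the commands above. First, the stack trace shows the failure happening inside the client JVM (`RunJar.main` constructing the metastore client), so antlr-runtime-3.4.jar has to be on HADOOP_CLASSPATH as well, not only in LIBJARS, which only ships jars to the tasks. Second, the colon-to-comma conversion relies on command substitution; without backticks or `$(...)`, LIBJARS ends up empty. A minimal sanity check with demo paths (the jar names below are placeholders, not the real ones):

```shell
# Demo classpath; in practice this is your real HADOOP_CLASSPATH, which
# should include antlr-runtime-3.4.jar for the client-side metastore code
HADOOP_CLASSPATH="/demo/hcatalog-core.jar:/demo/antlr-runtime-3.4.jar"

# Command substitution is required to actually capture the sed output
LIBJARS=$(echo "$HADOOP_CLASSPATH" | sed -e 's/:/,/g')
echo "$LIBJARS"   # → /demo/hcatalog-core.jar,/demo/antlr-runtime-3.4.jar
```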

Answers


I don't run the CDH distribution, but I was able to get this working with the following configuration settings:

export HCAT_HOME=/usr/lib/hive-hcatalog 
export HIVE_HOME=/usr/lib/hive 
HCATJAR=$HCAT_HOME/share/hcatalog/hive-hcatalog-core-0.13.0.2.1.1.0-385.jar 
HCATPIGJAR=$HCAT_HOME/share/hcatalog/hive-hcatalog-pig-adapter-0.13.0.2.1.1.0-385.jar 
HIVE_VERSION=0.13.0.2.1.1.0-385 
export HADOOP_CLASSPATH=$HCATJAR:$HCATPIGJAR:$HIVE_HOME/lib/hive-exec-$HIVE_VERSION.jar:$HIVE_HOME/lib/hive-metastore-$HIVE_VERSION.jar:$HIVE_HOME/lib/libfb303-0.9.0.jar:$HIVE_HOME/lib/libthrift-0.9.0.jar:$HIVE_HOME/conf:/etc/hadoop/conf 
LIBJARS=`echo $HADOOP_CLASSPATH | sed -e 's/:/,/g'` 
export LIBJARS=$LIBJARS,$HIVE_HOME/lib/antlr-runtime-3.4.jar 
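For reference, the two variables serve different purposes when the job is submitted: HADOOP_CLASSPATH (colon-separated) only affects the client JVM, while the comma-separated LIBJARS list is passed through Hadoop's generic `-libjars` option to ship the jars to the tasks. A hypothetical invocation, where the driver jar name and the table/output arguments are placeholders (the command is echoed here rather than run; drop the `echo` on a real cluster):

```shell
# Demo value; in practice LIBJARS is built from HADOOP_CLASSPATH as above
LIBJARS="hcatalog-core.jar,hive-metastore.jar,antlr-runtime-3.4.jar"

# Sketch of the submission; UseHCat.jar, mytable, and /out are assumptions
echo hadoop jar UseHCat.jar com.otsi.hcat.UseHCat -libjars "$LIBJARS" mytable /out
```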

A few things to note:

  1. The comma between "$LIBJARS" and "$HIVE_HOME" on the last line is correct.
  2. I removed the references to $HIVE_HOME/lib/jdo2-api-2.3-ec.jar and $HIVE_HOME/lib/slf4j-api-1.6.4.jar because my Hadoop distribution does not include them. The code ran fine without them.
  3. Hadoop moves very fast, so jar versions change. For every jar file referenced in these settings, run an ls -l command to make sure the jar actually exists where you think it should.
  4. This code uses some deprecated API calls. My suggestion is (at least for now) not to change the code. I found that trying to switch it to the non-deprecated versions broke things (see also Radek's update to the same effect).
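Note 3 above can be scripted. A small loop that reports whether each referenced jar is actually readable on disk (the two paths listed are just examples; substitute every jar from your own HADOOP_CLASSPATH):

```shell
# Check each jar referenced in the classpath settings; unset or wrong
# variables show up as MISSING lines
for jar in \
    "$HCATJAR" \
    "$HIVE_HOME/lib/antlr-runtime-3.4.jar"
do
  if [ -r "$jar" ]; then
    echo "OK      $jar"
  else
    echo "MISSING $jar"
  fi
done
```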

I hope this helps!


Hi, I have tried the above solution but still get the same exception. I have placed antlr-runtime-3.4.jar on the build path and also exported it in libjars as above. – user1217694 2014-10-30 09:06:12


Hi... I still get the same exception when I try the above solution, but when I replace the colons with commas I get a different exception. – user1217694 2014-10-30 09:10:09


Hi... I still get the same exception when I try the above solution, but when I replace the colons with commas I get a different exception: Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/hive/ql/metadata/HiveStorageHandler – user1217694 2014-10-30 09:16:04


Make sure the following 3 datanucleus jars are on the classpath:

datanucleus-rdbms-3.x.x.jar 
datanucleus-core-3.x.x.jar 
datanucleus-api-jdo-3.x.x.jar 

It is also a good idea to add "$HIVE_HOME/conf" to HADOOP_CLASSPATH and CLASSPATH, since it holds important information about how to connect to the metastore.
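A sketch of how that might be wired up, assuming the datanucleus jars live under $HIVE_HOME/lib (the fallback path below is a guess; adjust it to your install, and note the versions vary between distributions):

```shell
# Fallback path is an assumption; normally HIVE_HOME is already set
HIVE_HOME=${HIVE_HOME:-/usr/local/hive}

# Append every datanucleus jar shipped with this Hive, then the conf dir
for jar in "$HIVE_HOME"/lib/datanucleus-*.jar; do
  HADOOP_CLASSPATH=$HADOOP_CLASSPATH:$jar
done
export HADOOP_CLASSPATH=$HADOOP_CLASSPATH:$HIVE_HOME/conf
echo "$HADOOP_CLASSPATH"
```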


You need to configure the environment variables in ~/.bashrc:

export SQOOP_HOME=/usr/lib/sqoop 
export HBASE_HOME=/usr/local/Hbase 
export HIVE_HOME=/usr/local/hive 
export HCAT_HOME=/usr/local/hive/hcatalog