2014-11-21

When I run the Hadoop MapReduce word-count jar from the shell inside the hadoop folder, it runs fine and generates the output correctly. (mapreduce · hadoop 2.4.1 · eclipse · client jars)

Since my use case is YARN on Hadoop 2.4.1, I run the MapReduce sample program from Eclipse; the map phase completes, but it fails during the reduce phase.

It is clear that the problem is with the jar configuration.

Please see the jars I have added...

[screenshot: jars added to the Eclipse build path]

Here is the error log:

INFO: reduce task executor complete.
Nov 21, 2014 8:50:35 PM org.apache.hadoop.mapred.LocalJobRunner$Job run
WARNING: job_local1638918104_0001
java.lang.Exception: java.lang.NoSuchMethodError: org.apache.hadoop.mapred.ReduceTask.setLocalMapFiles(Ljava/util/Map;)V
    at org.apache.hadoop.mapred.LocalJobRunner$Job.runTasks(LocalJobRunner.java:462)
    at org.apache.hadoop.mapred.LocalJobRunner$Job.run(LocalJobRunner.java:529)
Caused by: java.lang.NoSuchMethodError: org.apache.hadoop.mapred.ReduceTask.setLocalMapFiles(Ljava/util/Map;)V
    at org.apache.hadoop.mapred.LocalJobRunner$Job$ReduceTaskRunnable.run(LocalJobRunner.java:309)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
    at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:334)
    at java.util.concurrent.FutureTask.run(FutureTask.java:166)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
    at java.lang.Thread.run(Thread.java:722)

Exception in thread "Thread-12" java.lang.NoClassDefFoundError: org/apache/commons/httpclient/HttpMethod
    at org.apache.hadoop.mapred.LocalJobRunner$Job.run(LocalJobRunner.java:562)
Caused by: java.lang.ClassNotFoundException: org.apache.commons.httpclient.HttpMethod
    at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:423)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:356)
    ... 1 more
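A NoSuchMethodError like the one above usually means two different Hadoop versions are mixed on the classpath: the class exists, but it was loaded from an older jar that lacks the method. One quick way to check which jar a class is actually loaded from is a small probe like this (FindJar is just an illustrative name, not part of Hadoop):

```java
// Prints the location a class was loaded from, to help spot
// mismatched jar versions on the classpath.
public class FindJar {
    public static void main(String[] args) throws Exception {
        String name = args.length > 0 ? args[0] : "java.lang.String";
        Class<?> c = Class.forName(name);
        java.security.CodeSource src = c.getProtectionDomain().getCodeSource();
        // Core JDK classes have no CodeSource; everything else
        // reports the jar or directory it came from.
        System.out.println(src == null ? "bootstrap classpath" : src.getLocation());
    }
}
```

Running it with org.apache.hadoop.mapred.ReduceTask as the argument (with your job's classpath) shows which jar is winning, so you can tell whether an old client jar is shadowing the 2.4.1 one.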

Answer

According to the error in the screenshot, you are manually adding all the dependency jars to the classpath. For this, it is strongly recommended to use Maven, which automates the process of adding dependency jars to the classpath; you only need to declare the main dependencies.
Here is the pom.xml with the dependencies I use, which lets me run without any problems:

<properties> 
    <hadoop.version>2.5.2</hadoop.version> 
</properties> 

<dependencies> 
     <dependency> 
      <groupId>org.apache.hadoop</groupId> 
      <artifactId>hadoop-hdfs</artifactId> 
      <version>${hadoop.version}</version> 
     </dependency> 
     <dependency> 
      <groupId>org.apache.hadoop</groupId> 
      <artifactId>hadoop-common</artifactId> 
      <version>${hadoop.version}</version> 
     </dependency> 
     <dependency> 
      <groupId>org.apache.hadoop</groupId> 
      <artifactId>hadoop-client</artifactId> 
      <version>${hadoop.version}</version> 
     </dependency> 
     <dependency> 
      <groupId>org.apache.hadoop</groupId> 
      <artifactId>hadoop-mapreduce-client-core</artifactId> 
      <version>${hadoop.version}</version> 
     </dependency> 
     <dependency> 
      <groupId>org.apache.hadoop</groupId> 
      <artifactId>hadoop-yarn-api</artifactId> 
      <version>${hadoop.version}</version> 
     </dependency> 
     <dependency> 
      <groupId>org.apache.hadoop</groupId> 
      <artifactId>hadoop-yarn-common</artifactId> 
      <version>${hadoop.version}</version> 
     </dependency> 
     <dependency> 
      <groupId>org.apache.hadoop</groupId> 
      <artifactId>hadoop-auth</artifactId> 
      <version>${hadoop.version}</version> 
     </dependency> 
     <dependency> 
      <groupId>org.apache.hadoop</groupId> 
      <artifactId>hadoop-yarn-server-nodemanager</artifactId> 
      <version>${hadoop.version}</version> 
     </dependency> 
     <dependency> 
      <groupId>org.apache.hadoop</groupId> 
      <artifactId>hadoop-yarn-server-resourcemanager</artifactId> 
      <version>${hadoop.version}</version> 
     </dependency> 
    </dependencies> 
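If you also want to run the jar that Maven builds with hadoop jar from the command line, one option (a sketch; com.example.WordCount is a placeholder for your actual driver class) is to set the main class in the manifest via maven-jar-plugin:

```xml
<build>
  <plugins>
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-jar-plugin</artifactId>
      <configuration>
        <archive>
          <manifest>
            <!-- Placeholder: replace with your driver class -->
            <mainClass>com.example.WordCount</mainClass>
          </manifest>
        </archive>
      </configuration>
    </plugin>
  </plugins>
</build>
```

With that in place, mvn clean package produces a jar you can hand to hadoop jar without naming the main class on every invocation.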

Coming to your problem: I counted the jars in your classpath, and there happen to be 82 jar files there.
Tracking down each jar like that is tedious work.
You can add the jars function by function HERE.
The other workaround is to add all the jar files from the installed Hadoop directory, under <hadoop-installed>/share/hadoop/, including every jar in each lib folder. That is the best you can do.. or
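If you do go the manual route, collecting every jar under share/hadoop can at least be scripted rather than clicked through in Eclipse. A minimal sketch (it uses a throwaway directory with a dummy jar to stand in for a real <hadoop-installed> tree; point HADOOP_HOME at your actual install in practice):

```shell
# Stand-in for a real Hadoop install tree; in practice set
# HADOOP_HOME to your actual <hadoop-installed> path instead.
HADOOP_HOME=$(mktemp -d)
mkdir -p "$HADOOP_HOME/share/hadoop/common/lib"
touch "$HADOOP_HOME/share/hadoop/common/lib/demo.jar"   # dummy jar

# Join every jar under share/hadoop into one classpath string.
CP=$(find "$HADOOP_HOME/share/hadoop" -name '*.jar' | tr '\n' ':')
export HADOOP_CLASSPATH="$CP"
echo "$HADOOP_CLASSPATH"
```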
Or add only the avro-specific jars, since according to the screenshot the exception is thrown by avro classes. That can solve the avro jar problem, but you may then run into other dependency issues. I faced the same problem while using Hadoop V1; later I realized this and used Maven with Hadoop V2, so there was no need to worry about dependency jars anymore.
Your focus can stay on Hadoop and your business requirements. :)
Hope it helps..