2017-03-16

I am trying to set up Hadoop on a development machine running Windows 10 Home Edition, but the build fails with: Failed to execute goal maven-antrun-plugin:1.7 on project hadoop-hdfs: An Ant BuildException around an Ant part in hadoop-hdfs-project/hadoop-hdfs. I am building the following version:

hadoop-2.7.3-src

Here are some details about my local development environment:

- Windows 10 Home Edition
- Intel Core i5-6200U CPU @ 2.30GHz
- 16 GB RAM
- 64-bit operating system, x64-based processor
- Microsoft Visual Studio Community 2015 version 14.0.25431.01 Update 3
- .NET Framework 4.6.01586
- cmake version 3.7.2
- CYGWIN_NT-10.0 LTPBCV82DUG 2.7.0(0.306/5/3) 2017-02-12 13:18 x86_64 Cygwin
- java version "1.8.0_121"
- Java(TM) SE Runtime Environment (build 1.8.0_121-b13)
- Java HotSpot(TM) 64-Bit Server VM (build 25.121-b13, mixed mode)
- Apache Maven 3.3.9 (bb52d8502b132ec0a5a3f4c09453c07478323dc5; 2015-11-10T11:41:47-05:00)
- Google Protocol Buffers: protoc --version reports libprotoc 2.5.0

I opened a Developer Command Prompt for Visual Studio 2015 (VS2015) and ran:

C:\hadoop\hadoop-2.7.3-src> mvn package -Pdist,native-win -DskipTests -Dtar -X

Unfortunately, I got the following error:

[INFO] ------------------------------------------------------------------------ 
[INFO] BUILD FAILURE 
[INFO] ------------------------------------------------------------------------ 
[INFO] Total time: 06:27 min 
[INFO] Finished at: 2017-03-15T19:26:50-04:00 
[INFO] Final Memory: 102M/1591M 
[INFO] ------------------------------------------------------------------------ 
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.7:run (make) on project hadoop-hdfs: An Ant BuildException has occured: exec returned: 1 
[ERROR] around Ant part ...<exec failonerror="true" dir="C:\hadoop\hadoop-2.7.3-src\hadoop-hdfs-project\hadoop-hdfs\target/native" executable="cmake">... @ 8:126 in C:\hadoop\hadoop-2.7.3-src\hadoop-hdfs-project\hadoop-hdfs\target\antrun\build-main.xml 
[ERROR] -> [Help 1] 
org.apache.maven.lifecycle.LifecycleExecutionException: Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.7:run (make) on project hadoop-hdfs: An Ant BuildException has occured: exec returned: 1 
around Ant part ...<exec failonerror="true" dir="C:\hadoop\hadoop-2.7.3-src\hadoop-hdfs-project\hadoop-hdfs\target/native" executable="cmake">... @ 8:126 in C:\hadoop\hadoop-2.7.3-src\hadoop-hdfs-project\hadoop-hdfs\target\antrun\build-main.xml 
     at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:212) 
     at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:153) 
     at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:145) 
     at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:116) 
     at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:80) 
     at org.apache.maven.lifecycle.internal.builder.singlethreaded.SingleThreadedBuilder.build(SingleThreadedBuilder.java:51) 
     at org.apache.maven.lifecycle.internal.LifecycleStarter.execute(LifecycleStarter.java:128) 
     at org.apache.maven.DefaultMaven.doExecute(DefaultMaven.java:307) 
     at org.apache.maven.DefaultMaven.doExecute(DefaultMaven.java:193) 
     at org.apache.maven.DefaultMaven.execute(DefaultMaven.java:106) 
     at org.apache.maven.cli.MavenCli.execute(MavenCli.java:863) 
     at org.apache.maven.cli.MavenCli.doMain(MavenCli.java:288) 
     at org.apache.maven.cli.MavenCli.main(MavenCli.java:199) 
     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) 
     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) 
     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) 
     at java.lang.reflect.Method.invoke(Method.java:498) 
     at org.codehaus.plexus.classworlds.launcher.Launcher.launchEnhanced(Launcher.java:289) 
     at org.codehaus.plexus.classworlds.launcher.Launcher.launch(Launcher.java:229) 
     at org.codehaus.plexus.classworlds.launcher.Launcher.mainWithExitCode(Launcher.java:415) 
     at org.codehaus.plexus.classworlds.launcher.Launcher.main(Launcher.java:356) 
Caused by: org.apache.maven.plugin.MojoExecutionException: An Ant BuildException has occured: exec returned: 1 
around Ant part ...<exec failonerror="true" dir="C:\hadoop\hadoop-2.7.3-src\hadoop-hdfs-project\hadoop-hdfs\target/native" executable="cmake">... @ 8:126 in C:\hadoop\hadoop-2.7.3-src\hadoop-hdfs-project\hadoop-hdfs\target\antrun\build-main.xml 
     at org.apache.maven.plugin.antrun.AntRunMojo.execute(AntRunMojo.java:355) 
     at org.apache.maven.plugin.DefaultBuildPluginManager.executeMojo(DefaultBuildPluginManager.java:134) 
     at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:207) 
     ... 20 more 
Caused by: C:\hadoop\hadoop-2.7.3-src\hadoop-hdfs-project\hadoop-hdfs\target\antrun\build-main.xml:8: exec returned: 1 
     at org.apache.tools.ant.taskdefs.ExecTask.runExecute(ExecTask.java:646) 
     at org.apache.tools.ant.taskdefs.ExecTask.runExec(ExecTask.java:672) 
     at org.apache.tools.ant.taskdefs.ExecTask.execute(ExecTask.java:498) 
     at org.apache.tools.ant.UnknownElement.execute(UnknownElement.java:291) 
     at sun.reflect.GeneratedMethodAccessor21.invoke(Unknown Source) 
     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) 
     at java.lang.reflect.Method.invoke(Method.java:498) 
     at org.apache.tools.ant.dispatch.DispatchUtils.execute(DispatchUtils.java:106) 
     at org.apache.tools.ant.Task.perform(Task.java:348) 
     at org.apache.tools.ant.Target.execute(Target.java:390) 
     at org.apache.tools.ant.Target.performTasks(Target.java:411) 
     at org.apache.tools.ant.Project.executeSortedTargets(Project.java:1399) 
     at org.apache.tools.ant.Project.executeTarget(Project.java:1368) 
     at org.apache.maven.plugin.antrun.AntRunMojo.execute(AntRunMojo.java:327) 
     ... 22 more 
[ERROR] 
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles: 
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException 
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command 
[ERROR] mvn <goals> -rf :hadoop-hdfs 

Answer


I got the same (or a similar) error on CentOS 7.2 while building hadoop-common. It went away after I ran sudo yum -y install zlib and sudo yum -y install zlib-devel.

After that I got another error, in the Hadoop pipes module, for which I ran sudo yum -y install openssl-devel.

With that, my build succeeded.
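Putting the above together, a minimal sketch of what I ran on CentOS 7.2 (package names are what worked for me; the final resume line is an assumption based on the `-rf :hadoop-hdfs` hint in the Maven output above, so substitute whichever module failed for you):

```shell
# Install the development headers whose absence caused the two native-build
# failures (zlib for the hadoop-common error, OpenSSL for the pipes error):
sudo yum -y install zlib zlib-devel
sudo yum -y install openssl-devel

# Resume the build from the failing module instead of rebuilding everything,
# as the Maven error message suggests (module name is an example):
mvn package -Pdist,native -DskipTests -Dtar -rf :hadoop-hdfs
```

`-rf` (`--resume-from`) restarts the Maven reactor at the named module, which saves repeating the modules that already built successfully.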

Hope it helps.