
I just started learning Hadoop, so bear with me if this is a silly question. I am simply trying to access the Hadoop file system through Java code, but I keep getting exceptions and cannot reach HDFS:

import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class hdfsClient {

    public hdfsClient() {}

    public void addFile(String source, String dest) throws IOException {
        Configuration conf = new Configuration();
        // Load the cluster configuration from the local installation.
        conf.addResource(new Path("/usr/local/hadoop/etc/hadoop/core-site.xml"));
        conf.addResource(new Path("/usr/local/hadoop/etc/hadoop/hdfs-site.xml"));
        FileSystem fs = null;
        try {
            fs = FileSystem.get(conf);
            // Once the connection works, the actual copy would go here,
            // e.g. fs.copyFromLocalFile(new Path(source), new Path(dest));
        } catch (Exception e) {
            System.out.println("Error in getting the fileSystem");
            e.printStackTrace();
        }
    }
}

The main class looks like this:

public class testMain {

    public static void main(String[] args) throws Exception {
        hdfsClient client = new hdfsClient();

        // Check the argument count before indexing into args.
        if (args.length > 0 && args[0].equals("add")) {
            if (args.length < 3) {
                System.out.println("Usage: hdfsclient add <local_path> " +
                        "<hdfs_path>");
                System.exit(1);
            }

            client.addFile(args[1], args[2]);
        }
    }
}

I created these files in Eclipse, exported them as a JAR, and then ran

java -jar <jarname> add <path in local system> <path in hadoop> 

The exact command was

java -jar add.jar add /home/aman/test.txt/

and I got the following error:

org.apache.hadoop.ipc.RemoteException: Server IPC version 9 cannot communicate with client version 4 
at org.apache.hadoop.ipc.Client.call(Client.java:1113) 
at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:229) 
at com.sun.proxy.$Proxy1.getProtocolVersion(Unknown Source) 
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) 
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57) 
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) 
at java.lang.reflect.Method.invoke(Method.java:606) 
at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:85) 
at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:62) 
at com.sun.proxy.$Proxy1.getProtocolVersion(Unknown Source) 
at org.apache.hadoop.ipc.RPC.checkVersion(RPC.java:422) 
at org.apache.hadoop.hdfs.DFSClient.createNamenode(DFSClient.java:183) 
at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:281) 
at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:245) 
at org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:100) 
at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:1446) 
at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:67) 
at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:1464) 
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:263) 
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:124) 
at crud.crud.hdfsClient.addFile(hdfsClient.java:28) 
at crud.crud.testMain.main(testMain.java:16) 

Any help would be appreciated. I have been trying for two whole days but could not solve the problem.

PS: Output from jps:

16341 Jps 
14985 NameNode 
20704 -- process information unavailable 
15655 NodeManager 
15146 DataNode 
15349 SecondaryNameNode 
15517 ResourceManager 

See http://stackoverflow.com/questions/23634985/error-when-trying-to-write-to-hdfs-server-ipc-version-9-cannot-communicate-with and http://hortonworks.com/community/forums/topic/server-ipc-version-9/. It sounds like you have a library conflict. – spork


You should look at this: http://stackoverflow.com/questions/31453336/exception-in-thread-main-org-apache-hadoop-ipc-remoteexception-server-ipc-ver/31483536#31483536 – Abdulrahman

Answers


I found the solution. I had the hadoop-core dependency in my pom.xml file, but hadoop-core belongs to the Hadoop 1.x line of packages, while the rest of my dependencies came from Hadoop 2.x, so there was a version conflict: the old 1.x RPC client (client version 4) cannot talk to a 2.x NameNode (IPC version 9), which is exactly what the exception says. Removing the hadoop-core dependency solved the problem.
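For reference, a minimal sketch of what the fixed dependency section might look like: drop hadoop-core entirely and keep only the Hadoop 2.x client artifact. The 2.7.1 version number is illustrative; match it to the version your cluster actually runs.

<!-- pom.xml: remove hadoop-core (Hadoop 1.x) and depend only on the
     Hadoop 2.x client artifact. Version shown is illustrative. -->
<dependencies>
    <dependency>
        <groupId>org.apache.hadoop</groupId>
        <artifactId>hadoop-client</artifactId>
        <version>2.7.1</version>
    </dependency>
</dependencies>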


The problem is a version mismatch among the libraries you are using in your code. Remove all of those libraries from your build path and add the corresponding jars collected from your Hadoop installation itself, so the client jars match the version the cluster is running.
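One hedged way to do that without copying jars around is to launch the program through the hadoop command, which puts the installation's own (and therefore matching) client jars on the classpath. This assumes the Eclipse-exported JAR has its Main-Class in the manifest (it must, since java -jar worked above); the HDFS destination path is only an illustrative example.

# Run through the installed Hadoop so its own client jars are used;
# the main class comes from the JAR manifest, /user/aman/test.txt is
# an illustrative HDFS destination.
hadoop jar add.jar add /home/aman/test.txt /user/aman/test.txt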