2015-06-13
Newbie here. I am trying to run the Pail code from the DFS Datastore chapter of Nathan Marz's book Big Data. What am I doing wrong? I am trying to connect to HDFS running in a VM, and I also tried replacing hdfs with file. Any help appreciated. I cannot create files with the Pail DFS.

import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.junit.Before;
import org.junit.Test;

import com.backtype.hadoop.pail.Pail;
import com.backtype.hadoop.pail.Pail.TypedRecordOutputStream;

public class AppTest {
    private App app = new App();
    // "hdfs:////..." had two extra slashes; the scheme takes exactly two
    private String path = "hdfs://192.168.0.101:8080/mypail";

    @Before
    public void init() throws IllegalArgumentException, IOException {
        FileSystem fs = FileSystem.get(new Configuration());
        fs.delete(new Path(path), true);
    }

    @Test
    public void testAppAccess() throws IOException {
        Pail pail = Pail.create(path);
        TypedRecordOutputStream os = pail.openWrite();
        os.writeObject(new byte[] {1, 2, 3});
        os.writeObject(new byte[] {1, 2, 3, 4});
        os.writeObject(new byte[] {1, 2, 3, 4, 5});
        os.close();
    }
}

I get this error:

java.lang.IllegalArgumentException: Wrong FS: hdfs:/192.168.0.101:8080/mypail, expected: file:/// 
    at org.apache.hadoop.fs.FileSystem.checkPath(FileSystem.java:645) 
    at org.apache.hadoop.fs.RawLocalFileSystem.pathToFile(RawLocalFileSystem.java:80) 
    at org.apache.hadoop.fs.RawLocalFileSystem.deprecatedGetFileStatus(RawLocalFileSystem.java:529) 
    at org.apache.hadoop.fs.RawLocalFileSystem.getFileLinkStatusInternal(RawLocalFileSystem.java:747) 

And after replacing hdfs with file:///:

java.io.IOException: Mkdirs failed to create file:/192.168.0.101:8080/mypail (exists=false, cwd=file:/Users/joshi/git/projectcsr/projectcsr) 
    at org.apache.hadoop.fs.ChecksumFileSystem.create(ChecksumFileSystem.java:442) 
    at 

Answer

I ran into the same problem and solved it! You should add core-site.xml to Hadoop's Configuration object; something like this should work:

Configuration cfg = new Configuration();
// Load your cluster's settings (including fs.defaultFS) so the
// client talks to HDFS instead of the local file system
Path core_site_path = new Path("path/to/your/core-site.xml");
cfg.addResource(core_site_path);
FileSystem fs = FileSystem.get(cfg);

I think you can do the same thing programmatically by setting the property fs.defaultFS on the cfg object.
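A minimal sketch of that programmatic alternative, assuming the NameNode host/port from the question and hadoop-common/hadoop-hdfs on the classpath (the class name and the address are assumptions, not part of the original answer):

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;

public class DefaultFsExample {
    public static void main(String[] args) throws Exception {
        Configuration cfg = new Configuration();
        // Point the client at the NameNode directly instead of loading
        // core-site.xml; without this, FileSystem.get() falls back to
        // file:/// and you get the "Wrong FS" exception from the question
        cfg.set("fs.defaultFS", "hdfs://192.168.0.101:8080");
        FileSystem fs = FileSystem.get(cfg);
        System.out.println(fs.getUri()); // should now be the hdfs:// URI, not file:///
    }
}
```

Either approach works; addResource keeps the address out of your code, while cfg.set is handy in tests.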

Source: http://opensourceconnections.com/blog/2013/03/24/hdfs-debugging-wrong-fs-expected-file-exception/