
I have a problem running HBaseTestingUtility from the IntelliJ IDE, and from the error below it looks like the cause may be that the file paths are too long. How can I change the base directory that HBaseTestingUtility uses?

16/03/14 22:45:13 WARN datanode.DataNode: IOException in BlockReceiver.run(): 
java.io.IOException: Failed to move meta file for ReplicaBeingWritten, blk_1073741825_1001, RBW 
getNumBytes()  = 7 
getBytesOnDisk() = 7 
getVisibleLength()= 7 
getVolume()  = C:\Users\user1\Documents\work\Repos\hadoop-analys\reporting\mrkts-surveillance\target\test-data\9654a646-e923-488a-9e20-46396fd15292\dfscluster_6b264e6b-0218-4f30-ad5b-72e838940b1e\dfs\data\data1\current 
getBlockFile() = C:\Users\user1\Documents\work\Repos\hadoop-analys\reporting\mrkts-surveillance\target\test-data\9654a646-e923-488a-9e20-46396fd15292\dfscluster_6b264e6b-0218-4f30-ad5b-72e838940b1e\dfs\data\data1\current\BP-429386217-192.168.1.110-1457991908038\current\rbw\blk_1073741825 
bytesAcked=7 
bytesOnDisk=7 from C:\Users\user1\Documents\work\Repos\hadoop-analys\reporting\mrkts-surveillance\target\test-data\9654a646-e923-488a-9e20-46396fd15292\dfscluster_6b264e6b-0218-4f30-ad5b-72e838940b1e\dfs\data\data1\current\BP-429386217-192.168.1.110-1457991908038\current\rbw\blk_1073741825_1001.meta to C:\Users\user1\Documents\work\Repos\hadoop-analys\reporting\mrkts-surveillance\target\test-data\9654a646-e923-488a-9e20-46396fd15292\dfscluster_6b264e6b-0218-4f30-ad5b-72e838940b1e\dfs\data\data1\current\BP-429386217-192.168.1.110-1457991908038\current\finalized\subdir0\subdir0\blk_1073741825_1001.meta 
at org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl.moveBlockFiles(FsDatasetImpl.java:615) 
at org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.BlockPoolSlice.addBlock(BlockPoolSlice.java:250) 
at org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsVolumeImpl.addBlock(FsVolumeImpl.java:229) 
at org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl.finalizeReplica(FsDatasetImpl.java:1119) 
at org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl.finalizeBlock(FsDatasetImpl.java:1100) 
at org.apache.hadoop.hdfs.server.datanode.BlockReceiver$PacketResponder.finalizeBlock(BlockReceiver.java:1293) 
at org.apache.hadoop.hdfs.server.datanode.BlockReceiver$PacketResponder.run(BlockReceiver.java:1233) 
at java.lang.Thread.run(Thread.java:745) 
Caused by: 3: The system cannot find the path specified. 

Any ideas how I can tell HBaseTestingUtility to use a base directory other than this huge default path?

Thanks,


Basically what you need to do is something like this: `System.setProperty("test.build.data.basedirectory", "C:/hbase/");` – Stanislav


@Stanislav's suggestion solved my problem as well. –

Answer


You can use test.build.data.basedirectory.

Take a look at getDataTestDir in HBaseCommonTestingUtility:

/**
 * System property key to get base test directory value
 */
public static final String BASE_TEST_DIRECTORY_KEY =
    "test.build.data.basedirectory";

/**
 * @return Where to write test data on local filesystem, specific to
 *   the test. Useful for tests that do not use a cluster.
 *   Creates it if it does not exist already.
 */
public Path getDataTestDir() {
  if (this.dataTestDir == null) {
    setupDataTestDir();
  }
  return new Path(this.dataTestDir.getAbsolutePath());
}
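
A minimal sketch of how this could be wired up, assuming a plain test driver; the short target directory C:/hbase-test and the class name ShortPathHBaseTest are illustrative choices, and the key point is that the property must be set before HBaseTestingUtility creates its data directory:

import org.apache.hadoop.hbase.HBaseTestingUtility;

public class ShortPathHBaseTest {

  public static void main(String[] args) throws Exception {
    // Point the testing utility at a short base directory *before* it is
    // instantiated, so the mini-cluster data lands under C:/hbase-test
    // instead of the deeply nested Maven target directory.
    System.setProperty("test.build.data.basedirectory", "C:/hbase-test");

    HBaseTestingUtility testUtil = new HBaseTestingUtility();
    testUtil.startMiniCluster();   // HDFS + HBase now write under the short path
    try {
      // ... run test code against the mini-cluster here ...
    } finally {
      testUtil.shutdownMiniCluster();
    }
  }
}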