When I try to use the HDFS C driver (libhdfs), I get this error:
(unable to get stack trace for java.lang.NoClassDefFoundError exception: ExceptionUtils::getStackTrace error.)
But I appended the output of hadoop classpath --glob to my system CLASSPATH variable (Linux Mint 18.1), and nothing changed.
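For reference, this is roughly how I set it up (a sketch; the /usr/local/hadoop install path and the JDK layout are assumptions about my machine, adjust as needed):

```shell
# Append the fully expanded Hadoop classpath. --glob expands the wildcard
# entries into concrete jar paths, which the JVM embedded by libhdfs needs,
# since it does not expand wildcards in the CLASSPATH variable itself.
export CLASSPATH="$(/usr/local/hadoop/bin/hadoop classpath --glob):$CLASSPATH"

# libhdfs also has to find libjvm.so at runtime (exact path varies by JDK).
export LD_LIBRARY_PATH="$JAVA_HOME/jre/lib/amd64/server:$LD_LIBRARY_PATH"
```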
Hadoop version: 2.7.3
My C code:
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <fcntl.h>   /* O_WRONLY, O_CREAT */
#include "hdfs.h"

int main(void) {
    hdfsFS fs = hdfsConnect("default", 9000);
    const char *writePath = "/test.txt";
    /* Open for writing: the flags argument must be O_WRONLY|O_CREAT;
       0 for bufferSize, replication and blockSize selects the defaults
       (passing sizeof(writePath) here would only be the pointer size). */
    hdfsFile writeFile = hdfsOpenFile(fs, writePath, O_WRONLY | O_CREAT, 0, 0, 0);
    if (!writeFile) {
        fprintf(stderr, "Error opening HDFS file %s\n", writePath);
        exit(1);
    }
    const char *buffer = "Test ---- &^^$#@s";
    tSize num_written_bytes = hdfsWrite(fs, writeFile, (void *)buffer, strlen(buffer) + 1);
    if (hdfsFlush(fs, writeFile)) {
        fprintf(stderr, "Failed to 'flush' %s\n", writePath);
        exit(-1);
    }
    hdfsCloseFile(fs, writeFile);
    return 0;
}
System CLASSPATH variable:
/usr/local/hadoop/etc/hadoop:/usr/local/hadoop/share/hadoop/common/lib/mockito-all-1.8.5.jar:/usr/local/hadoop/share/hadoop/common/lib/xmlenc-0.52.jar:/usr/local/hadoop/share/hadoop/common/lib/protobuf-java-2.5.0.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-digester-1.8.jar:/usr/local/hadoop/share/hadoop/common/lib/stax-api-1.0-2.jar:/usr/local/hadoop/share/hadoop/common/lib/hadoop-auth-2.7.3.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-net-3.1.jar:/usr/local/hadoop/share/hadoop/common/lib/zookeeper-3.4.6.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-logging-1.1.3.jar:/usr/local/hadoop/share/hadoop/common/lib/jersey-server-1.9.jar:/usr/local/hadoop/share/hadoop/common/lib/httpcore-4.2.5.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-io-2.4.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-httpclient-3.1.jar:/usr/local/hadoop/share/hadoop/common/lib/jetty-util-6.1.26.jar:/usr/local/hadoop/share/hadoop/common/lib/jaxb-impl-2.2.3-1.jar
Why is this tagged C++ and C? Why not Java, since it's a Java exception you're dealing with? –
Because I'm using the C driver, which works through JNI – KoLiBer
The problem isn't in the C driver; it's in the way the Java code the C driver depends on can't be found. This is a Java problem, not a C problem. Skills in C won't solve it; Java and Java deployment skills might. The C tag is, IMO, not applicable. You're not planning to show the C driver's source, are you? –