2013-03-26 81 views
-1

I am new to this area. Following http://java.dzone.com/articles/hadoop-practice, I wrote https://github.com/studhadoop/xmlparsing-hadoop/blob/master/XmlParser11.java, but the XML processing in Hadoop fails.

I created the jar file and then ran the MapReduce program. My XML file:

<configuration> 
    <property> 
      <name>dfs.replication</name> 
      <value>1</value> 
      <type>tr</type> 
    </property> 
</configuration> 

root# javac -classpath /var/root/hadoop-1.0.4/hadoop-core-1.0.4.jar -d xml11 XmlParser11.java 

root# jar -cvf /var/root/xmlparser11/xmlparser.jar -C xml11/ . 

root# bin/hadoop jar /var/root/xmlparser11/xmlparser.jar com.org.XmlParser11 /user/root/xmlfiles/conf.xml /user/root/xmlfiles-outputjava3 
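
For reference, the linked dzone example works by configuring Mahout's XmlInputFormat with the start/end tags that delimit one record. The driver below is only a sketch of how that wiring might look (the class name XmlDriverSketch is hypothetical; XmlInputFormat and XmlParser11.Map are the classes referenced above and must be on the classpath); it is not the code from this post:

import org.apache.hadoop.conf.Configuration; 
import org.apache.hadoop.fs.Path; 
import org.apache.hadoop.io.Text; 
import org.apache.hadoop.mapreduce.Job; 
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat; 
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat; 

public class XmlDriverSketch {                           // hypothetical driver class 
    public static void main(String[] args) throws Exception { 
        Configuration conf = new Configuration(); 
        // Tell XmlInputFormat which tags delimit one record (one <property> block). 
        conf.set("xmlinput.start", "<property>"); 
        conf.set("xmlinput.end", "</property>"); 

        Job job = new Job(conf, "xml parsing"); 
        job.setJarByClass(XmlDriverSketch.class); 
        job.setInputFormatClass(XmlInputFormat.class);   // Mahout's XmlInputFormat, as used in the referenced article 
        job.setMapperClass(XmlParser11.Map.class);       // mapper from the linked XmlParser11.java 
        job.setMapOutputKeyClass(Text.class);            // must match the types passed to context.write() 
        job.setMapOutputValueClass(Text.class); 
        job.setOutputKeyClass(Text.class); 
        job.setOutputValueClass(Text.class); 

        FileInputFormat.addInputPath(job, new Path(args[0])); 
        FileOutputFormat.setOutputPath(job, new Path(args[1])); 
        System.exit(job.waitForCompletion(true) ? 0 : 1); 
    } 
} 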

UPDATE

13/03/30 09:39:58 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same. 
13/03/30 09:39:58 INFO input.FileInputFormat: Total input paths to process : 1 
13/03/30 09:39:58 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable 
13/03/30 09:39:58 WARN snappy.LoadSnappy: Snappy native library not loaded 
13/03/30 09:39:58 INFO mapred.JobClient: Running job: job_201303300855_0004 
13/03/30 09:39:59 INFO mapred.JobClient: map 0% reduce 0% 
13/03/30 09:40:13 INFO mapred.JobClient: Task Id : attempt_201303300855_0004_m_000000_0, Status : FAILED 
java.io.IOException: java.io.IOException: Type mismatch in key from map: expected org.apache.hadoop.io.Text, recieved java.lang.String 
    at com.org.XmlParser11$Map.map(XmlParser11.java:186) 
    at com.org.XmlParser11$Map.map(XmlParser11.java:148) 
    at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144) 
    at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:764) 
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370) 
    at org.apache.hadoop.mapred.Child$4.run(Child.java:255) 
    at java.security.AccessController.doPrivileged(Native Method) 
    at javax.security.auth.Subject.doAs(Subject.java:396) 
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121) 
    at org.apache.hadoop.mapred.Child.main(Child.java:249) 
Caused by: java.io.IOException: Type mismatch in key from map: expected org.apache.hadoop.io.Text, recieved java.lang.String 
    at org.apache.hadoop.mapred.MapTask$MapOutputBuffer.collect(MapTask.java:1014) 
    at org.apache.hadoop.mapred.MapTask$NewOutputCollector.write(MapTask.java:691) 
    at org.apache.hadoop.mapreduce.TaskInputOutputContext.write(TaskInputOutputContext.java:80) 
    at com.org.XmlParser11$Map.map(XmlParser11.java:184) 
    ... 9 more 

attempt_201303300855_0004_m_000000_0: '<property> 
attempt_201303300855_0004_m_000000_0:    <name>dfs.replication</name> 
attempt_201303300855_0004_m_000000_0:     <value>1</value> 
attempt_201303300855_0004_m_000000_0:    </property>' 
13/03/30 09:40:19 INFO mapred.JobClient: Task Id : attempt_201303300855_0004_m_000000_1, Status : FAILED 
java.io.IOException: java.io.IOException: Type mismatch in key from map: expected org.apache.hadoop.io.Text, recieved java.lang.String 
    at com.org.XmlParser11$Map.map(XmlParser11.java:186) 
    at com.org.XmlParser11$Map.map(XmlParser11.java:148) 
    at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144) 
    at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:764) 
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370) 
    at org.apache.hadoop.mapred.Child$4.run(Child.java:255) 
    at java.security.AccessController.doPrivileged(Native Method) 
    at javax.security.auth.Subject.doAs(Subject.java:396) 
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121) 
    at org.apache.hadoop.mapred.Child.main(Child.java:249) 
Caused by: java.io.IOException: Type mismatch in key from map: expected org.apache.hadoop.io.Text, recieved java.lang.String 
    at org.apache.hadoop.mapred.MapTask$MapOutputBuffer.collect(MapTask.java:1014) 
    at org.apache.hadoop.mapred.MapTask$NewOutputCollector.write(MapTask.java:691) 
    at org.apache.hadoop.mapreduce.TaskInputOutputContext.write(TaskInputOutputContext.java:80) 
    at com.org.XmlParser11$Map.map(XmlParser11.java:184) 
    ... 9 more 

attempt_201303300855_0004_m_000000_1: '<property> 
attempt_201303300855_0004_m_000000_1:    <name>dfs.replication</name> 
attempt_201303300855_0004_m_000000_1:     <value>1</value> 
attempt_201303300855_0004_m_000000_1:    </property>' 
13/03/30 09:40:25 INFO mapred.JobClient: Task Id : attempt_201303300855_0004_m_000000_2, Status : FAILED 
java.io.IOException: java.io.IOException: Type mismatch in key from map: expected org.apache.hadoop.io.Text, recieved java.lang.String 
    at com.org.XmlParser11$Map.map(XmlParser11.java:186) 
    at com.org.XmlParser11$Map.map(XmlParser11.java:148) 
    at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144) 
    at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:764) 
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370) 
    at org.apache.hadoop.mapred.Child$4.run(Child.java:255) 
    at java.security.AccessController.doPrivileged(Native Method) 
    at javax.security.auth.Subject.doAs(Subject.java:396) 
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121) 
    at org.apache.hadoop.mapred.Child.main(Child.java:249) 
Caused by: java.io.IOException: Type mismatch in key from map: expected org.apache.hadoop.io.Text, recieved java.lang.String 
    at org.apache.hadoop.mapred.MapTask$MapOutputBuffer.collect(MapTask.java:1014) 
    at org.apache.hadoop.mapred.MapTask$NewOutputCollector.write(MapTask.java:691) 
    at org.apache.hadoop.mapreduce.TaskInputOutputContext.write(TaskInputOutputContext.java:80) 
    at com.org.XmlParser11$Map.map(XmlParser11.java:184) 
    ... 9 more 

attempt_201303300855_0004_m_000000_2: '<property> 
attempt_201303300855_0004_m_000000_2:    <name>dfs.replication</name> 
attempt_201303300855_0004_m_000000_2:     <value>1</value> 
attempt_201303300855_0004_m_000000_2:    </property>' 
13/03/30 09:40:37 INFO mapred.JobClient: Job complete: job_201303300855_0004 
13/03/30 09:40:37 INFO mapred.JobClient: Counters: 7 
13/03/30 09:40:37 INFO mapred.JobClient: Job Counters 
13/03/30 09:40:37 INFO mapred.JobClient:  SLOTS_MILLIS_MAPS=27296 
13/03/30 09:40:37 INFO mapred.JobClient:  Total time spent by all reduces waiting after reserving slots (ms)=0 
13/03/30 09:40:37 INFO mapred.JobClient:  Total time spent by all maps waiting after reserving slots (ms)=0 
13/03/30 09:40:37 INFO mapred.JobClient:  Launched map tasks=4 
13/03/30 09:40:37 INFO mapred.JobClient:  Data-local map tasks=4 
13/03/30 09:40:37 INFO mapred.JobClient:  SLOTS_MILLIS_REDUCES=0 
13/03/30 09:40:37 INFO mapred.JobClient:  Failed map tasks=1 

What is wrong with the MapReduce code, and how do I fix it?

+0

Where do you see the Text: null? I can see only one System.out.println in the code. Also, why don't you have a Reducer, even though you have configured FileOutputFormat? – shazin 2013-03-27 05:55:34

+0

I am also wondering where that Text: null is coming from... I have edited my code; it prints now. – 2013-03-27 05:59:05

Answer

0

Try running this command:

root# bin/hadoop fs -cat /user/root/xmlfiles-outputjava3/part-r-00000 

and check whether you get the required output there. What you have posted is the standard output you get while running MapReduce on HDFS.

UPDATE

You will need to put System.out.println statements like this:

if (currentElement.equalsIgnoreCase("name")) { 
    propertyName += reader.getText(); 
    System.out.println(propertyName); 
} else if (currentElement.equalsIgnoreCase("value")) { 
    propertyValue += reader.getText(); 
    System.out.println(propertyValue); 
} 

to see whether the property name and value are being set. If they are not, you need to find out why.

UPDATE 2

context.write(propertyName.trim(), propertyValue.trim()); 

propertyName and propertyValue are Strings, but you have declared your mapper's output key and value types as Text.

Change it like this:

Text name = new Text(); 
Text value = new Text(); 
name.set(propertyName.trim()); 
value.set(propertyValue.trim()); 
context.write(name, value); 
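
For context, here is a minimal standalone mapper sketch showing output types that match what is passed to context.write(); the class name XmlPropertyMapper and the literal stand-in values are assumptions for illustration, not code from XmlParser11.java:

import java.io.IOException; 
import org.apache.hadoop.io.LongWritable; 
import org.apache.hadoop.io.Text; 
import org.apache.hadoop.mapreduce.Mapper; 

// Hypothetical mapper: the declared output types <Text, Text> must match the 
// objects passed to context.write(), otherwise Hadoop throws the 
// "Type mismatch in key from map" IOException seen in the log above. 
public class XmlPropertyMapper extends Mapper<LongWritable, Text, Text, Text> { 
    @Override 
    protected void map(LongWritable key, Text value, Context context) 
            throws IOException, InterruptedException { 
        // Stand-in values; in XmlParser11.java these are accumulated from 
        // reader.getText() while walking the <property> block. 
        String propertyName = "dfs.replication"; 
        String propertyValue = "1"; 

        // Wrapping the Strings in Text Writables satisfies the declared types. 
        context.write(new Text(propertyName.trim()), new Text(propertyValue.trim())); 
    } 
} 

Whether you reuse a pair of Text objects (as in UPDATE 2 above) or create new Text instances on each call, the key point is that the emitted key and value are Writable types rather than plain Strings.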
+0

Should the above MapReduce program print all the XML tags? – 2013-03-27 06:16:04

+0

I added an IOException... Now it shows the error: Type mismatch in key from map: expected org.apache.hadoop.io.Text, received java.lang.String – 2013-03-30 05:08:04

+0

Thanks for your UPDATE 2, it works :) I changed it to context.write(new Text(propertyName.trim()), new Text(propertyValue.trim())); – 2013-04-01 04:32:50