I am trying to run a MapReduce job from inside IntelliJ. This is my driver (runner) code, and the MapReduce job fails inside IntelliJ:
public class ViewCount extends Configured implements Tool {

    @Override
    public int run(String[] args) throws Exception {
        Configuration conf = this.getConf();
        Job job = Job.getInstance(conf);
        job.setJobName("viewCount");
        job.setJarByClass(ViewCount.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);
        job.setMapperClass(Map.class);
        job.setReducerClass(Reduce.class);
        Path inputFilePath = new Path(args[0]);
        Path outputFilePath = new Path(args[1]);
        FileInputFormat.addInputPath(job, inputFilePath);
        FileOutputFormat.setOutputPath(job, outputFilePath);
        return job.waitForCompletion(true) ? 0 : 1;
    }

    public static void main(String[] args) throws Exception {
        int exitCode = ToolRunner.run(new ViewCount(), args);
        System.exit(exitCode);
    }
}
The job fails to build, with the following error message:
error: incompatible types: Job cannot be converted to JobConf
FileOutputFormat.setOutputPath(job, outputFilePath);
The Apache documentation indicates that the method takes a Job rather than a JobConf, so what am I doing wrong?
This is unrelated to IntelliJ. – Moira
You are probably mixing the MapReduce 1 and MapReduce 2 APIs. Check where you are importing FileOutputFormat from: the old mapred API's version (org.apache.hadoop.mapred.FileOutputFormat) takes a JobConf, while the new one (org.apache.hadoop.mapreduce.lib.output.FileOutputFormat) takes a Job as its argument. – Amit
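If the new (mapreduce) API is the one intended, the driver above should compile with the following imports. This is a sketch of the import list only; it assumes the Map and Reduce classes referenced in the driver also extend the new-API org.apache.hadoop.mapreduce.Mapper and Reducer rather than implementing the old mapred interfaces:

```java
// Imports for the new MapReduce API. Accidentally importing
// org.apache.hadoop.mapred.FileOutputFormat instead of the
// ...mapreduce.lib.output version produces the
// "Job cannot be converted to JobConf" error, because the old
// API's setOutputPath expects a JobConf.
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.conf.Configured;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;
import org.apache.hadoop.util.Tool;
import org.apache.hadoop.util.ToolRunner;
```

The rule of thumb: everything should come from either the org.apache.hadoop.mapred package tree (old API, JobConf-based) or the org.apache.hadoop.mapreduce tree (new API, Job-based), never a mix of the two.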