
© 小弹壳 (Junior Heima)   /  2019-12-24 16:42  /  1035 views  /  0 replies  /  0 favorites. Reposting must follow the CC license; commercial use of this article is prohibited.

A month has gone by before I knew it. I don't have much to say, so here are my takeaways — straight to the code I wrote by hand.
package cn.itcast.demo1;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.conf.Configured;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.NullWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.input.TextInputFormat;
import org.apache.hadoop.mapreduce.lib.output.TextOutputFormat;
import org.apache.hadoop.util.Tool;
import org.apache.hadoop.util.ToolRunner;

public class JobMain extends Configured implements Tool {
    @Override
    public int run(String[] args) throws Exception {
        //1. Get the Job instance (the second argument is the job name, not a path;
        //   "myoutputformat_job" is just a descriptive label)
        Job job = Job.getInstance(super.getConf(), "myoutputformat_job");

        //2. Configure the job
            //1. Set the input format class and input path
             job.setInputFormatClass(TextInputFormat.class);
             TextInputFormat.addInputPath(job, new Path("file:///E:\\input\\common_friends_step_out"));

             //2. Set the Mapper class and its output key/value types
             job.setMapperClass(MyOutputFormatMapper.class);
             job.setMapOutputKeyClass(Text.class);
             job.setMapOutputValueClass(NullWritable.class);


             //3-6. Shuffle settings (partitioner, sort, combiner, grouping) left at their defaults

             //7. No Reducer is set, so the default identity Reducer runs

             //8. Set the output format class and output path
             //   (the custom format writes its data to hardcoded paths; this
             //   directory only receives the job's _SUCCESS marker)
             job.setOutputFormatClass(MyOutputFormat.class);
             MyOutputFormat.setOutputPath(job, new Path("file:///E:\\input\\common_friends_step2_out"));
        //3. Wait for the job to finish
        boolean b1 = job.waitForCompletion(true);

        return b1 ? 0 : 1;
    }

    public static void main(String[] args) throws Exception {
        Configuration configuration = new Configuration();

        //Launch the job through ToolRunner
        int run = ToolRunner.run(configuration, new JobMain(), args);

        System.exit(run);
    }
}

package cn.itcast.demo1;


import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.NullWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.RecordWriter;
import org.apache.hadoop.mapreduce.TaskAttemptContext;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

import java.io.IOException;

public class MyOutputFormat extends FileOutputFormat<Text, NullWritable> {


    @Override
    public RecordWriter<Text, NullWritable> getRecordWriter(TaskAttemptContext taskAttemptContext) throws IOException, InterruptedException {
        //1. Open output streams for the two target files
        FileSystem fileSystem = FileSystem.get(taskAttemptContext.getConfiguration());
        FSDataOutputStream goodCommentOutputStream = fileSystem.create(new Path("file:///E:\\input\\good_comments\\good_comments.txt"));
        FSDataOutputStream badCommentOutputStream = fileSystem.create(new Path("file:///E:\\input\\bad_comments\\bad_comments.txt"));

        //2. Wrap both streams in the custom RecordWriter and hand it back
        return new MyRecordWriter(goodCommentOutputStream, badCommentOutputStream);
    }
}
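The core idea of this custom OutputFormat — routing each record to one of two open streams — can be sketched without any Hadoop dependency. The class and method names below (`TwoStreamDemo`, `write`) are made up for illustration; plain `java.io` streams stand in for the `FSDataOutputStream` pair:

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.OutputStream;
import java.io.UncheckedIOException;

public class TwoStreamDemo {
    private final OutputStream good;
    private final OutputStream bad;

    TwoStreamDemo(OutputStream good, OutputStream bad) {
        this.good = good;
        this.bad = bad;
    }

    // Route one record to the matching stream, mirroring MyRecordWriter.write:
    // the record text plus a CRLF line terminator.
    void write(String line, boolean isGood) {
        try {
            OutputStream target = isGood ? good : bad;
            target.write((line + "\r\n").getBytes());
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
    }

    public static void main(String[] args) {
        ByteArrayOutputStream g = new ByteArrayOutputStream();
        ByteArrayOutputStream b = new ByteArrayOutputStream();
        TwoStreamDemo demo = new TwoStreamDemo(g, b);
        demo.write("nice product", true);
        demo.write("broken on arrival", false);
        System.out.println(g.toString()); // nice product + CRLF
        System.out.println(b.toString()); // broken on arrival + CRLF
    }
}
```

In the real job the two streams are files created by `FileSystem.create`, but the routing logic is the same.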

package cn.itcast.demo1;

import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.NullWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;

import java.io.IOException;

public class MyOutputFormatMapper extends Mapper<LongWritable, Text, Text, NullWritable> {
    //Identity mapper: pass each input line through unchanged as the key
    @Override
    protected void map(LongWritable key, Text value, Context context) throws IOException, InterruptedException {
        context.write(value, NullWritable.get());
    }
}
}

package cn.itcast.demo1;

import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.io.IOUtils;
import org.apache.hadoop.io.NullWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.RecordWriter;
import org.apache.hadoop.mapreduce.TaskAttemptContext;

import java.io.IOException;

public class MyRecordWriter extends RecordWriter<Text, NullWritable> {
    private FSDataOutputStream goodCommentsOutputStream;
    private FSDataOutputStream badCommentsOutputStream;

    public MyRecordWriter() {
    }

    public MyRecordWriter(FSDataOutputStream goodCommentsOutputStream, FSDataOutputStream badCommentsOutputStream) {
        this.goodCommentsOutputStream = goodCommentsOutputStream;
        this.badCommentsOutputStream = badCommentsOutputStream;
    }

    @Override
    public void write(Text text, NullWritable nullWritable) throws IOException, InterruptedException {
        //1. Pull the comment-status flag out of the tab-separated line (field at index 9)
        String[] split = text.toString().split("\t");
        String numStr = split[9];

        //2. Route the record to the matching output file based on the flag
        if (Integer.parseInt(numStr) <= 1) {
            //Good or neutral comment
            goodCommentsOutputStream.write(text.toString().getBytes());
            goodCommentsOutputStream.write("\r\n".getBytes());
        } else {
            //Bad comment
            badCommentsOutputStream.write(text.toString().getBytes());
            badCommentsOutputStream.write("\r\n".getBytes());
        }
    }

    @Override
    public void close(TaskAttemptContext taskAttemptContext) throws IOException, InterruptedException {
            IOUtils.closeStream(goodCommentsOutputStream);
            IOUtils.closeStream(badCommentsOutputStream);
    }
}
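The classification rule in `MyRecordWriter.write` can also be exercised on its own. In the sketch below the field layout is an assumption — only the field at index 9 (the status flag) matters; the class name `CommentClassifierDemo` is made up for this example:

```java
public class CommentClassifierDemo {
    // Mirrors the rule in MyRecordWriter: a status flag <= 1 means a good
    // or neutral comment; anything larger counts as a bad comment.
    static boolean isGoodOrNeutral(String line) {
        String[] fields = line.split("\t");
        return Integer.parseInt(fields[9]) <= 1;
    }

    public static void main(String[] args) {
        // Hypothetical records: ten tab-separated fields, where the last
        // one (index 9) is the comment-status flag.
        String good = "1\ta\tb\tc\td\te\tf\tg\th\t0";
        String bad  = "2\ta\tb\tc\td\te\tf\tg\th\t2";
        System.out.println(isGoodOrNeutral(good)); // true
        System.out.println(isGoodOrNeutral(bad));  // false
    }
}
```

Testing this rule outside the job is handy because `NumberFormatException` or `ArrayIndexOutOfBoundsException` from a malformed line would otherwise fail the whole task.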

That's what I've picked up so far — not bad, right? I'll keep working at it.
