




Big Data Fundamentals Course Design Report

I. Project Overview

This project uses Hive, MapReduce, and HBase on Hadoop to carry out a practical analysis of the Sogou 5,000,000-record search dataset. The dataset is processed production data from the Sogou search engine: it is real and large, so it meets the data requirements of this distributed-computing application course design.

Each record in the Sogou data has the format:

    access time \t user ID \t query \t rank of the URL among the returned results \t order of the user's click \t clicked URL

The user ID is assigned automatically from the cookie the browser carries when it visits the search engine, so different queries typed during the same browser session share one user ID.

II. Requirements

1. Load the raw data onto HDFS.
2. Split and recombine the time field of the raw data, appending year, month, day, and hour fields.
3. Load the processed data onto HDFS.
4. Implement each of the following with both MapReduce and Hive:
   - total number of records
   - number of non-empty queries
   - number of records with no duplicates
   - number of distinct UIDs
   - query-frequency ranking (the 50 most frequent queries)
   - number of users with more than 2 queries
   - proportion of users with more than 2 queries
   - proportion of clicks whose rank is within 10
   - proportion of queries that are a URL typed directly
   - UIDs that searched for "仙剑奇侠传" more than 3 times
5. Save the result of every query in step 4 to HDFS.
6. Import the files produced in step 5 into a single HBase table via the Java API.
7. Query the results of step 6 with HBase shell commands.

III. Procedure

1. Load the raw data onto HDFS.

2. Split and recombine the time field, appending year, month, day, and hour fields.
   (1) Write a script sogou-log-extend.sh with the following content:

    #!/bin/bash
    #infile=/root/sogou.500w.utf8
    infile=$1
    #outfile=/root/filesogou.500w.utf8.ext
    outfile=$2
    awk -F '\t' '{print $0"\t"substr($1,1,4)"年\t"substr($1,5,2)"月\t"substr($1,7,2)"日\t"substr($1,9,2)"hour"}' $infile > $outfile

   Run the script:

    bash sogou-log-extend.sh sogou.500w.utf8 sogou.500w.utf8.ext

3. Load the processed data onto HDFS:

    hadoop fs -put sogou.500w.utf8.ext /

4. Implementation with MapReduce and Hive
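The time-field extension performed by sogou-log-extend.sh can be mirrored in plain Java. This is a minimal sketch, not part of the report: the class and method names are ours, and it assumes the first tab-separated field is a 14-digit timestamp such as 20111230000005.

```java
public class SogouLogExtend {
    // Append year/month/day/hour columns derived from the timestamp field,
    // mirroring the awk script: chars 1-4 year, 5-6 month, 7-8 day, 9-10 hour.
    public static String extendLine(String line) {
        String ts = line.split("\t")[0];          // e.g. "20111230000005"
        return line + "\t" + ts.substring(0, 4) + "年"
                    + "\t" + ts.substring(4, 6) + "月"
                    + "\t" + ts.substring(6, 8) + "日"
                    + "\t" + ts.substring(8, 10) + "hour";
    }

    public static void main(String[] args) {
        System.out.println(extendLine("20111230000005\tuid1\tquery\t1\t1\thttp://example.com"));
    }
}
```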
Hive implementation:

1. List databases: show databases;
2. Create a database: create database sogou;
3. Switch to it: use sogou;
4. List tables: show tables;
5. Create the sogou table:

    create table sogou(time string, uuid string, name string, num1 int, num2 int, url string)
    row format delimited fields terminated by '\t';

6. Load the local data into the Hive table:

    load data local inpath '/root/sogou.500w.utf8' into table sogou;

7. Inspect the table: desc sogou;

(1) Total number of records:

    select count(*) from sogou;

(2) Number of non-empty queries:

    select count(*) from sogou where name is not null and name != '';

(3) Number of records with no duplicates:

    select count(*) from (select * from sogou group by time, num1, num2, uuid, name, url having count(*) = 1) a;

(4) Number of distinct UIDs:

    select count(distinct uuid) from sogou;

(5) Query-frequency ranking (the 50 most frequent queries):

    select name, count(*) as pd from sogou group by name order by pd desc limit 50;

(6) Number of users with more than 2 queries:

    select count(a.uuid) from (select uuid, count(*) as cnt from sogou group by uuid having cnt > 2) a;

(7) Proportion of users with more than 2 queries (divide this count by the distinct-UID total from (4)):

    select count(*) from (select uuid, count(*) as cnt from sogou group by uuid having cnt > 2) a;

(8) Proportion of clicks whose rank is within 10 (divide this count by the total from (1)):

    select count(*) from sogou where num1 < 11;

MapReduce implementation (import statements omitted):

(1) Total number of records:

    public class MRCountAll {
        // Counts lines with a static counter; this only works when the mappers
        // run in the same JVM as main (e.g. local mode).
        public static Integer i = 0;

        public static class CountAllMap extends Mapper<Object, Text, Text, Text> {
            @Override
            protected void map(Object key, Text value, Context context) {
                i++;                              // one input line per map() call
            }
        }

        public static void runcount(String inputPath, String outPath) throws Exception {
            Configuration conf = new Configuration();
            conf.set("fs.defaultFS", "hdfs://10.49.47.20:9000");
            Job job = Job.getInstance(conf, "count");
            job.setJarByClass(MRCountAll.class);
            job.setMapperClass(CountAllMap.class);
            job.setOutputKeyClass(Text.class);
            job.setOutputValueClass(Text.class);
            FileInputFormat.addInputPath(job, new Path(inputPath));
            FileOutputFormat.setOutputPath(job, new Path(outPath));
            job.waitForCompletion(true);
        }

        public static void main(String[] args) throws Exception {
            runcount("/sogou/data/sogou.500w.utf8", "/sogou/data/CountAll");
            System.out.println("总条数:" + i);
        }
    }

(2) Number of non-empty queries:

    public class CountNotNull {
        public static int i = 0;

        public static class wyMap extends Mapper<Object, Text, Text, IntWritable> {
            @Override
            protected void map(Object key, Text value, Context context)
                    throws IOException, InterruptedException {
                String[] values = value.toString().split("\t");
                if (values[2] != null && !values[2].isEmpty()) {   // query field is non-empty
                    context.write(new Text(values[1]), new IntWritable(1));
                    i++;
                }
            }
        }

        public static void run(String inputPath, String outputPath) throws Exception {
            Configuration conf = new Configuration();
            conf.set("fs.defaultFS", "hdfs://10.49.47.20:9000");
            Job job = Job.getInstance(conf, "countnotnull");
            job.setJarByClass(CountNotNull.class);
            job.setMapperClass(wyMap.class);
            job.setOutputKeyClass(Text.class);
            job.setOutputValueClass(IntWritable.class);
            FileInputFormat.addInputPath(job, new Path(inputPath));
            FileOutputFormat.setOutputPath(job, new Path(outputPath));
            job.waitForCompletion(true);
        }

        public static void main(String[] args) throws Exception {
            run("/sogou/data/sogou.500w.utf8", "/sogou/data/CountNotNull");
            System.out.println("非空条数:" + i);
        }
    }

(3) Number of records with no duplicates:

    public class CountNotRepeat {
        public static int i = 0;

        public static class NotRepeatMap extends Mapper<Object, Text, Text, Text> {
            @Override
            protected void map(Object key, Text value, Context context)
                    throws IOException, InterruptedException {
                String[] values = value.toString().split("\t");
                // time + uid + query + url identifies a record
                context.write(new Text(values[0] + values[1] + values[2] + values[5]),
                              new Text("1"));
            }
        }

        public static class NotRepeatReduc extends Reducer<Text, Text, Text, IntWritable> {
            @Override
            protected void reduce(Text key, Iterable<Text> values, Context context)
                    throws IOException, InterruptedException {
                i++;                              // one reduce() call per distinct record
                context.write(new Text(key.toString()), new IntWritable(i));
            }
        }

        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();
            conf.set("fs.defaultFS", "hdfs://10.49.47.20:9000");
            Job job = Job.getInstance(conf, "countnotrepeat");
            job.setJarByClass(CountNotRepeat.class);
            job.setMapperClass(NotRepeatMap.class);
            job.setReducerClass(NotRepeatReduc.class);
            job.setMapOutputValueClass(Text.class);
            job.setOutputKeyClass(Text.class);
            job.setOutputValueClass(IntWritable.class);
            FileInputFormat.addInputPath(job, new Path("/sogou/data/sogou.500w.utf8"));
            FileOutputFormat.setOutputPath(job, new Path("/sogou/data/CountNotRepeat"));
            job.waitForCompletion(true);
            System.out.println("无重复总条数为:" + i);
        }
    }

(4) Number of distinct UIDs:

    public class CountNotMoreUid {
        public static int i = 0;

        public static class UidMap extends Mapper<Object, Text, Text, Text> {
            @Override
            protected void map(Object key, Text value, Context context)
                    throws IOException, InterruptedException {
                String uid = value.toString().split("\t")[1];
                context.write(new Text(uid), new Text("1"));
            }
        }

        public static class UidReduc extends Reducer<Text, Text, Text, IntWritable> {
            @Override
            protected void reduce(Text key, Iterable<Text> values, Context context)
                    throws IOException, InterruptedException {
                i++;                              // one reduce() call per distinct uid
                context.write(new Text(key.toString()), new IntWritable(i));
            }
        }

        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();
            conf.set("fs.defaultFS", "hdfs://10.49.47.20:9000");
            Job job = Job.getInstance(conf, "countuid");
            job.setJarByClass(CountNotMoreUid.class);
            job.setMapperClass(UidMap.class);
            job.setReducerClass(UidReduc.class);
            job.setMapOutputValueClass(Text.class);
            job.setOutputKeyClass(Text.class);
            job.setOutputValueClass(IntWritable.class);
            FileInputFormat.addInputPath(job, new Path("/sogou/data/sogou.500w.utf8"));
            FileOutputFormat.setOutputPath(job, new Path("/sogou/data/CountNotMoreUid"));
            job.waitForCompletion(true);
            System.out.println("独立UID条数:" + i);
        }
    }

(5) Query-frequency ranking (the 50 most frequent queries):

    public class CountTop50 {
        public static class TopMapper extends Mapper<LongWritable, Text, Text, LongWritable> {
            Text text = new Text();

            @Override
            protected void map(LongWritable key, Text value, Context context)
                    throws IOException, InterruptedException {
                String[] line = value.toString().split("\t");
                text.set(line[2]);                // the query word
                context.write(text, new LongWritable(1));
            }
        }

        public static class TopReducer extends Reducer<Text, LongWritable, Text, LongWritable> {
            // frequency -> word; TreeMap keeps keys sorted, so firstKey() is the
            // smallest frequency seen so far. Words with equal frequency
            // overwrite each other in this map.
            TreeMap<Integer, String> map = new TreeMap<Integer, String>();

            @Override
            protected void reduce(Text key, Iterable<LongWritable> value, Context context) {
                int sum = 0;                      // occurrences of this word
                for (LongWritable ltext : value) {
                    sum += (int) ltext.get();
                }
                map.put(sum, key.toString());
                if (map.size() > 50) {            // keep only the top 50
                    map.remove(map.firstKey());
                }
            }

            @Override
            protected void cleanup(Context context) throws IOException, InterruptedException {
                for (Integer count : map.keySet()) {
                    context.write(new Text(map.get(count)), new LongWritable(count));
                }
            }
        }

        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();
            conf.set("fs.defaultFS", "hdfs://10.49.47.20:9000");
            Job job = Job.getInstance(conf, "count");
            job.setJarByClass(CountTop50.class);
            job.setJobName("Five");
            job.setOutputKeyClass(Text.class);
            job.setOutputValueClass(LongWritable.class);
            job.setMapperClass(TopMapper.class);
            job.setReducerClass(TopReducer.class);
            FileInputFormat.addInputPath(job, new Path("/sogou/data/sogou.500w.utf8"));
            FileOutputFormat.setOutputPath(job, new Path("/sogou/data/CountTop50"));
            job.waitForCompletion(true);
        }
    }

(6) Number of users with more than 2 queries:

    public class CountQueriesGreater2 {
        public static int total = 0;

        public static class MyMaper extends Mapper<Object, Text, Text, IntWritable> {
            @Override
            protected void map(Object key, Text value, Context context)
                    throws IOException, InterruptedException {
                String[] str = value.toString().split("\t");
                context.write(new Text(str[1]), new IntWritable(1));   // key: uid
            }
        }

        public static class MyReducer extends Reducer<Text, IntWritable, Text, IntWritable> {
            @Override
            protected void reduce(Text key, Iterable<IntWritable> values, Context context) {
                int sum = 0;                      // number of queries from this uid
                for (IntWritable i : values) {
                    sum += i.get();
                }
                if (sum > 2) {
                    total++;
                }
            }
        }

        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();
            conf.set("fs.defaultFS", "hdfs://10.49.47.20:9000");
            Job job = Job.getInstance(conf, "six");
            job.setMapperClass(MyMaper.class);
            // a Combiner is optional here: job.setCombinerClass(MyReducer.class);
            job.setReducerClass(MyReducer.class);
            job.setOutputKeyClass(Text.class);
            job.setOutputValueClass(IntWritable.class);
            job.setJarByClass(CountQueriesGreater2.class);
            FileInputFormat.addInputPath(job, new Path("/sogou/data/sogou.500w.utf8"));
            FileOutputFormat.setOutputPath(job, new Path("/sogou/data/CountQueriesGreater2"));
            job.waitForCompletion(true);
            System.out.println("查询次数大于2次的用户总数:" + total + "条");
        }
    }

(7) Proportion of users with more than 2 queries:

    public class CountQueriesGreaterPro {
        public static int total1 = 0;             // users with more than 2 queries
        public static int total2 = 0;             // total records

        public static class MyMaper extends Mapper<Object, Text, Text, IntWritable> {
            @Override
            protected void map(Object key, Text value, Context context)
                    throws IOException, InterruptedException {
                total2++;
                String[] str = value.toString().split("\t");
                context.write(new Text(str[1]), new IntWritable(1));   // key: uid
            }
        }

        public static class MyReducer extends Reducer<Text, IntWritable, Text, IntWritable> {
            @Override
            protected void reduce(Text key, Iterable<IntWritable> values, Context context)
                    throws IOException, InterruptedException {
                int sum = 0;                      // number of queries from this uid
                for (IntWritable i : values) {
                    sum += i.get();
                }
                if (sum > 2) {
                    total1++;
                }
                context.write(key, new IntWritable(sum));
            }
        }

        public static void main(String[] args) throws Exception {
            System.out.println("seven begin");
            Configuration conf = new Configuration();
            conf.set("fs.defaultFS", "hdfs://10.49.47.20:9000");
            Job job = Job.getInstance(conf, "seven");
            job.setMapperClass(MyMaper.class);
            job.setReducerClass(MyReducer.class);
            job.setOutputKeyClass(Text.class);
            job.setOutputValueClass(IntWritable.class);
            job.setJarByClass(CountQueriesGreaterPro.class);
            FileInputFormat.addInputPath(job, new Path("/sogou/data/sogou.500w.utf8"));
            FileOutputFormat.setOutputPath(job, new Path("/sogou/data/CountQueriesGreaterPro"));
            job.waitForCompletion(true);
        }
    }
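The TreeMap technique used in the CountTop50 reducer (insert each frequency, then evict firstKey() once the map grows past N) can be exercised outside Hadoop. A minimal sketch, not from the report: the class, data, and helper name are ours, and, like the reducer above, it keeps only one word per frequency value.

```java
import java.util.TreeMap;

public class TopNSketch {
    // Keep the n highest frequencies seen so far, mirroring the TreeMap
    // logic in CountTop50's reducer (equal frequencies overwrite each other).
    public static TreeMap<Integer, String> topN(int n, String[] words, int[] counts) {
        TreeMap<Integer, String> map = new TreeMap<Integer, String>();
        for (int k = 0; k < words.length; k++) {
            map.put(counts[k], words[k]);
            if (map.size() > n) {
                map.remove(map.firstKey());   // evict the smallest frequency
            }
        }
        return map;
    }

    public static void main(String[] args) {
        // With n = 2, only the two highest frequencies (5 and 9) survive.
        System.out.println(topN(2, new String[]{"a", "b", "c"}, new int[]{5, 1, 9}));
    }
}
```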