Hadoop Basics: Teaching Material for Introduction to Electrical Engineering (电工导论)

Hadoop

Formally speaking, Hadoop is an open source framework for writing and running distributed applications that process large amounts of data.

A Hadoop cluster has many parallel machines that store and process large data sets. Client computers send jobs into this computer cloud and obtain results.

- Accessible: Hadoop runs on large clusters of commodity machines or on cloud computing services such as Amazon's Elastic Compute Cloud (EC2).
- Robust: Because it is intended to run on commodity hardware, Hadoop is architected with the assumption of frequent hardware malfunctions. It can gracefully handle most such failures.
- Scalable: Hadoop scales linearly to handle larger data by adding more nodes to the cluster.
- Simple: Hadoop allows users to quickly write efficient parallel code.

Motivation: Large Scale Data Processing

- We want to process lots of data (> 1 TB).
- We want to parallelize across hundreds or thousands of CPUs.
- We want to make this easy.
- Example: Google Earth uses 70.5 TB: 70 TB for the raw imagery and 500 GB for the index data.

MapReduce

- Automatic parallelization & distribution
- Fault-tolerant
- Provides status and monitoring tools
- Clean abstraction for programmers

Programming Model

- Borrows from functional programming.
- Users implement an interface of two functions:

    map(in_key, in_value) -> (out_key, intermediate_value) list
    reduce(out_key, intermediate_value list) -> out_value list

map

- Records from the data source (lines out of files, rows of a database, etc.) are fed into the map function as key/value pairs, e.g. (filename, line).
- map() produces one or more intermediate values along with an output key from the input.

reduce

- After the map phase is over, all the intermediate values for a given output key are combined together into a list.
- reduce() combines those intermediate values into one or more final values for that same output key (in practice, usually only one final value per key).

Architecture

(The architecture diagram on the original slide is not reproduced in this extract.)

Example: Count word occurrences

    map(String input_key, String input_value):
        // input_key: document name
        // input_value: document contents
        for each word w in input_value:
            EmitIntermediate(w, "1");

    reduce(String output_key, Iterator intermediate_values):
        // output_key: a word
        // intermediate_values: a list of counts
        int result = 0;
        for each v in intermediate_values:
            result += ParseInt(v);
        Emit(AsString(result));

Example

- Page 1: the weather is good
- Page 2: today is good
- Page 3: good weather is good

Map output

- Worker 1: (the 1), (weather 1), (is 1), (good 1)
- Worker 2: (today 1), (is 1), (good 1)
- Worker 3: (good 1), (weather 1), (is 1), (good 1)

Reduce input

- Worker 1: (the 1)
- Worker 2: (is 1), (is 1), (is 1)
- Worker 3: (weather 1), (weather 1)
- Worker 4: (today 1)
- Worker 5: (good 1), (good 1), (good 1), (good 1)

Reduce output

- Worker 1: (the 1)
- Worker 2: (is 3)
- Worker 3: (weather 2)
- Worker 4: (today 1)
- Worker 5: (good 4)

WordCount on Hadoop

1. Start Hadoop. The original slides show the commands (steps 1 to 3) as screenshots; step 4 is to check in your web browser whether your Hadoop installation is working.
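Since those screenshots are not preserved in this extract, here is a minimal sketch of what the start-up sequence might look like on a default pseudo-distributed Hadoop 0.20.x installation (the one-time format step, the paths, and the port numbers are assumptions about an unmodified configuration, not taken from the slides; all commands run from the Hadoop installation directory):

    $ bin/hadoop namenode -format   # one-time only: format HDFS before the very first start
    $ bin/start-all.sh              # start the HDFS and MapReduce daemons
    $ jps                           # verify that NameNode, DataNode, JobTracker and TaskTracker are running

With the default ports, the NameNode web interface at http://localhost:50070 and the JobTracker web interface at http://localhost:50030 are one way to check in a browser whether Hadoop is working.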

2. Create a directory and add input files to it (the commands are shown as screenshots in the original slides). In test1.txt you can write down anything you like; the slide shows an example file.
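A minimal sketch of this step, assuming a local directory named input and reusing the three example pages from the word-count trace above (the directory and file names are illustrative, not taken from the slides):

    $ mkdir input
    $ echo "the weather is good" > input/test1.txt   # write down anything you like
    $ echo "today is good" > input/test2.txt
    $ echo "good weather is good" > input/test3.txt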

3. Copy the files to HDFS.
4. Run the wordcount example. (Both commands appear as screenshots in the original slides.)
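Assuming the local input directory from the previous step and the examples jar that ships with Hadoop 0.20.2, these two steps might look like the following (the HDFS paths are illustrative):

    $ bin/hadoop fs -put input input                                    # copy the local files into HDFS
    $ bin/hadoop fs -ls input                                           # confirm they arrived
    $ bin/hadoop jar hadoop-0.20.2-examples.jar wordcount input output  # the output directory must not exist yet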

5. The process of Map & Reduce: while the job runs, the console reports map and reduce progress (shown as a screenshot in the original slides).
6. The result (also shown as a screenshot in the original slides).
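A sketch of reading the result back, assuming the output directory used above; with the three example pages the counts would match the Reduce output slide (for instance, good 4):

    $ bin/hadoop fs -ls output             # list the job output files
    $ bin/hadoop fs -cat output/part-*     # print the (word, count) pairs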
Mini Exercise

There is another example in the hadoop-0.20.2-examples.jar file, to compute π. The command is as follows:

    $ bin/hadoop jar hadoop-0.20.2-examples.jar pi <number of maps> <number of samples>

where the first argument is the number of mapper jobs and the second is the number of samples.
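For instance (the argument values are arbitrary and only illustrate the shape of the command):

    $ bin/hadoop jar hadoop-0.20.2-examples.jar pi 10 1000000   # 10 map tasks, 1000000 samples

More samples give a more accurate estimate of π at the cost of a longer run.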
