
03.06.2020

Data Mining: Concepts and Techniques
Slides for Textbook Chapter 3
Jiawei Han and Micheline Kamber
Intelligent Database Systems Research Lab
School of Computing Science, Simon Fraser University, Canada
http://www.cs.sfu.ca

Chapter 3: Data Preprocessing
- Why preprocess the data?
- Data cleaning
- Data integration and transformation
- Data reduction
- Discretization and concept hierarchy generation
- Summary

Why Data Preprocessing?
- Data in the real world is dirty:
  - incomplete: lacking attribute values, lacking certain attributes of interest, or containing only aggregate data
  - noisy: containing errors or outliers
  - inconsistent: containing discrepancies in codes or names
- No quality data, no quality mining results!
  - Quality decisions must be based on quality data
  - A data warehouse needs consistent integration of quality data

Multi-Dimensional Measure of Data Quality
- A well-accepted multidimensional view: accuracy, completeness, consistency, timeliness, believability, value added, interpretability, accessibility
- Broad categories: intrinsic, contextual, representational, and accessibility

Major Tasks in Data Preprocessing
- Data cleaning: fill in missing values, smooth noisy data, identify or remove outliers, and resolve inconsistencies
- Data integration: integration of multiple databases, data cubes, or files
- Data transformation: normalization and aggregation
- Data reduction: obtains a representation reduced in volume that produces the same or similar analytical results
- Data discretization: part of data reduction, of particular importance for numerical data

Forms of Data Preprocessing
[Figure: overview of the cleaning, integration, transformation, and reduction steps]

Data Cleaning
- Data cleaning tasks:
  - Fill in missing values
  - Identify outliers and smooth out noisy data
  - Correct inconsistent data

Missing Data
- Data is not always available, e.g., many tuples have no recorded value for several attributes, such as customer income in sales data
- Missing data may be due to:
  - equipment malfunction
  - inconsistency with other recorded data, leading to deletion
  - data not entered due to misunderstanding
  - certain data not being considered important at the time of entry
  - failure to register the history or changes of the data
- Missing data may need to be inferred

How to Handle Missing Data?
- Ignore the tuple: usually done when the class label is missing (assuming the task is classification); not effective when the percentage of missing values per attribute varies considerably
- Fill in the missing value manually: tedious + infeasible?
- Use a global constant to fill in the missing value: e.g., "unknown" (a new class?!)
- Use the attribute mean to fill in the missing value
- Use the attribute mean for all samples belonging to the same class: smarter
- Use the most probable value: inference-based, such as a Bayesian formula or decision tree induction

Noisy Data
- Noise: random error or variance in a measured variable
- Incorrect attribute values may be due to:
  - faulty data collection instruments
  - data entry problems
  - data transmission problems
  - technology limitations
  - inconsistency in naming conventions
- Other data problems which require data cleaning: duplicate records, incomplete data, inconsistent data

How to Handle Noisy Data?
- Binning: first sort the data and partition it into (equi-depth) bins, then smooth by bin means, bin medians, bin boundaries, etc.
- Clustering: detect and remove outliers
- Combined computer and human inspection: detect suspicious values and check them by hand
- Regression: smooth by fitting the data into regression functions

Simple Discretization Methods: Binning
- Equal-width (distance) partitioning:
  - Divides the range into N intervals of equal size: a uniform grid
  - If A and B are the lowest and highest values of the attribute, the width of the intervals is W = (B - A)/N
  - The most straightforward method, but outliers may dominate the presentation, and skewed data is not handled well
- Equal-depth (frequency) partitioning:
  - Divides the range into N intervals, each containing approximately the same number of samples
  - Good data scaling; managing categorical attributes can be tricky

Binning Methods for Data Smoothing
- Sorted data for price (in dollars): 4, 8, 9, 15, 21, 21, 24, 25, 26, 28, 29, 34
- Partition into (equi-depth) bins:
  - Bin 1: 4, 8, 9, 15
  - Bin 2: 21, 21, 24, 25
  - Bin 3: 26, 28, 29, 34
- Smoothing by bin means:
  - Bin 1: 9, 9, 9, 9
  - Bin 2: 23, 23, 23, 23
  - Bin 3: 29, 29, 29, 29
- Smoothing by bin boundaries:
  - Bin 1: 4, 4, 4, 15
  - Bin 2: 21, 21, 25, 25
  - Bin 3: 26, 26, 26, 34

Cluster Analysis
[Figure: 2-D points grouped into clusters, with outliers falling outside the clusters]

Regression
[Figure: data points (x, y) with fitted line y = x + 1]

Data Integration
- Data integration: combines data from multiple sources into a coherent store
- Schema integration: integrate metadata from different sources
- Entity identification problem: identify real-world entities from multiple data sources, e.g., A.cust-id ≡ B.cust-#
- Detecting and resolving data value conflicts: for the same real-world entity, attribute values from different sources differ; possible reasons: different representations, different scales (e.g., metric vs. British units)

Handling Redundant Data in Data Integration
- Redundant data occur often when integrating multiple databases:
  - The same attribute may have different names in different databases
  - One attribute may be a "derived" attribute in another table, e.g., annual revenue
- Redundant data may be detected by correlation analysis
- Careful integration of data from multiple sources may help reduce/avoid redundancies and inconsistencies and improve mining speed and quality

Data Transformation
- Smoothing: remove noise from the data
- Aggregation: summarization, data cube construction
- Generalization: concept hierarchy climbing
- Normalization: scale values to fall within a small, specified range (min-max normalization, z-score normalization, normalization by decimal scaling)
- Attribute/feature construction: new attributes constructed from the given ones

Data Transformation: Normalization
- min-max normalization: v' = ((v - min_A)/(max_A - min_A)) * (new_max_A - new_min_A) + new_min_A
- z-score normalization: v' = (v - mean_A)/std_dev_A
- normalization by decimal scaling: v' = v/10^j, where j is the smallest integer such that max(|v'|) < 1

Data Reduction Strategies
- A warehouse may store terabytes of data: complex data analysis/mining may take a very long time to run on the complete data set
- Data reduction: obtains a reduced representation of the data set that is much smaller in volume yet produces the same (or almost the same) analytical results
- Data reduction strategies: data cube aggregation, dimensionality reduction, numerosity reduction, discretization and concept hierarchy generation

Data Cube Aggregation
- The lowest level of a data cube holds the aggregated data for an individual entity of interest, e.g., a customer in a phone-calling data warehouse
- Multiple levels of aggregation in data cubes further reduce the size of the data to deal with
- Reference appropriate levels: use the smallest representation that is sufficient to solve the task
- Queries regarding aggregated information should be answered using the data cube when possible

Dimensionality Reduction
- Feature selection (i.e., attribute subset selection):
  - Select a minimum set of features such that the probability distribution of different classes, given the values for those features, is as close as possible to the original distribution given the values of all features
  - Reduces the number of patterns, making them easier to understand
- Heuristic methods (due to the exponential number of choices): step-wise forward selection, step-wise backward elimination, combining forward selection and backward elimination, decision-tree induction

Example of Decision Tree Induction
- Initial attribute set: {A1, A2, A3, A4, A5, A6}
- [Figure: decision tree testing A4?, then A1? and A6?, with leaves Class 1 and Class 2]
- Reduced attribute set: {A1, A4, A6}

Heuristic Feature Selection Methods
- There are 2^d possible sub-features of d features
- Several heuristic feature selection methods:
  - Best single features under the feature independence assumption: choose by significance tests
  - Best step-wise feature selection: the best single feature is picked first, then the next best feature conditioned on the first, and so on
  - Step-wise feature elimination: repeatedly eliminate the worst feature
  - Best combined feature selection and elimination
  - Optimal branch and bound: use feature elimination and backtracking

Data Compression
- String compression: there are extensive theories and well-tuned algorithms; typically lossless, but only limited manipulation is possible without expansion
- Audio/video compression: typically lossy compression, with progressive refinement; sometimes small fragments of the signal can be reconstructed without reconstructing the whole
- Time sequences are not audio: typically short, and they vary slowly with time
[Figure: original data mapped to compressed data (lossless) vs. to an approximation of the original data (lossy)]

Wavelet Transforms
- Discrete wavelet transform (DWT): linear signal processing
- Compressed approximation: store only a small fraction of the strongest wavelet coefficients
- Similar to the discrete Fourier transform (DFT), but better lossy compression, localized in space
- Method:
  - The length L must be an integer power of 2 (pad with 0s when necessary)
  - Each transform has 2 functions: smoothing and difference
  - They apply to pairs of data, resulting in two sets of data of length L/2
  - The two functions are applied recursively until the desired length is reached

Principal Component Analysis
- Given N data vectors from k dimensions, find c ≤ k orthogonal vectors that can best be used to represent the data
- The original data set is reduced to one consisting of N data vectors on c principal components (reduced dimensions)
- Each data vector is a linear combination of the c principal component vectors
- Works for numeric data only
- Used when the number of dimensions is large
[Figure: axes X1, X2 with principal component directions Y1, Y2]

Numerosity Reduction
- Parametric methods: assume the data fits some model, estimate the model parameters, store only the parameters, and discard the data (except possible outliers); e.g., log-linear models obtain a value at a point in m-D space as the product on appropriate marginal subspaces
- Non-parametric methods: do not assume models; the major families are histograms, clustering, and sampling

Regression and Log-Linear Models
- Linear regression: data are modeled to fit a straight line, often using the least-squares method to fit the line
- Multiple regression: allows a response variable Y to be modeled as a linear function of a multidimensional feature vector
- Log-linear model: approximates discrete multidimensional probability distributions

Regression Analysis and Log-Linear Models
- Linear regression: Y = α + βX; the two parameters α and β specify the line and are estimated from the data at hand, using the least-squares criterion on the known values of Y1, Y2, ..., X1, X2, ...
- Multiple regression: Y = b0 + b1*X1 + b2*X2; many nonlinear functions can be transformed into this form
- Log-linear models: the multi-way table of joint probabilities is approximated by a product of lower-order tables, e.g., p(a, b, c, d) = α_ab * β_ac * χ_ad * δ_bcd

Histograms
- A popular data reduction technique
- Divide the data into buckets and store the average (or sum) for each bucket
- Can be constructed optimally in one dimension using dynamic programming
- Related to quantization problems

Clustering
- Partition the data set into clusters, and store only the cluster representations
- Can be very effective if the data is clustered, but not if the data is "smeared"
- Can use hierarchical clustering and be stored in multi-dimensional index tree structures
- There are many choices of clustering definitions and clustering algorithms, further detailed in Chapter 8

Sampling
- Allows a mining algorithm to run in complexity that is potentially sub-linear to the size of the data
- Choose a representative subset of the data: simple random sampling may have very poor performance in the presence of skew
- Develop adaptive sampling methods, e.g., stratified sampling: approximate the percentage of each class (or subpopulation of interest) in the overall database; used in conjunction with skewed data
- Sampling may not reduce database I/Os (a page at a time)
- SRSWOR (simple random sample without replacement) vs. SRSWR (with replacement)
[Figure: raw data vs. a cluster/stratified sample]

Hierarchical Reduction
- Use multi-resolution structures with different degrees of reduction
- Hierarchical clustering is often performed but tends to define partitions of data sets rather than "clusters"
- Parametric methods are usually not amenable to hierarchical representation
- Hierarchical aggregation: an index tree hierarchically divides a data set into partitions by the value range of some attributes; each partition can be considered a bucket; thus an index tree with aggregates stored at each node is a hierarchical histogram

Discretization
- Three types of attributes:
  - Nominal: values from an unordered set
  - Ordinal: values from an ordered set
  - Continuous: real numbers
- Discretization: divide the range of a continuous attribute into intervals
  - Some classification algorithms only accept categorical attributes
  - Reduce data size by discretization
  - Prepare for further analysis

Discretization and Concept Hierarchy
- Discretization reduces the number of values for a given continuous attribute by dividing the range of the attribute into intervals; interval labels can then be used to replace actual data values
- Concept hierarchies reduce the data by collecting and replacing low-level concepts (such as numeric values for the attribute age) with higher-level concepts (such as young, middle-aged, or senior)

Discretization and Concept Hierarchy Generation for Numeric Data
- Binning (see the sections before)
- Histogram analysis (see the sections before)
- Clustering analysis (see the sections before)
- Entropy-based discretization
- Segmentation by natural partitioning

Entropy-Based Discretization
- Given a set of samples S, if S is partitioned into two intervals S1 and S2 using boundary T, the entropy after partitioning is E(S, T) = (|S1|/|S|) Ent(S1) + (|S2|/|S|) Ent(S2)
- The boundary that minimizes the entropy function over all possible boundaries is selected as a binary discretization
- The process is recursively applied to the partitions obtained until some stopping criterion is met
- Experiments show that it may reduce data size and improve classification accuracy

Segmentation by Natural Partitioning
- The 3-4-5 rule can be used to segment numeric data into relatively uniform, "natural" intervals:
  - If an interval covers 3, 6, 7, or 9 distinct values at the most significant digit, partition the range into 3 equi-width intervals
  - If it covers 2, 4, or 8 distinct values at the most significant digit, partition the range into 4 intervals
  - If it covers 1, 5, or 10 distinct values at the most significant digit, partition the range into 5 intervals

Example of 3-4-5 Rule
[Figure: step-by-step application of the 3-4-5 rule to the range (-$4,000 … $5,000)]

Concept Hierarchy Generation for Categorical Data
- Specification of a partial ordering of attributes explicitly at the schema level by users or experts
- Specification of a p…
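Two of the strategies from the "How to Handle Missing Data?" slide (fill with the attribute mean, and fill with the mean of samples in the same class) can be sketched in a few lines of pure Python. The data, the `income` attribute, and the class labels below are invented for illustration.

```python
def fill_with_mean(values):
    """Replace None entries with the mean of the observed values."""
    observed = [v for v in values if v is not None]
    mean = sum(observed) / len(observed)
    return [mean if v is None else v for v in values]

def fill_with_class_mean(values, labels):
    """Replace None entries with the mean of observed values of the same class."""
    means = {}
    for label in set(labels):
        observed = [v for v, l in zip(values, labels)
                    if l == label and v is not None]
        means[label] = sum(observed) / len(observed)
    return [means[l] if v is None else v for v, l in zip(values, labels)]

income = [30, None, 50, None, 70, 90]          # hypothetical attribute values
labels = ["low", "low", "high", "high", "high", "high"]
print(fill_with_mean(income))                  # [30, 60.0, 50, 60.0, 70, 90]
print(fill_with_class_mean(income, labels))    # [30, 30.0, 50, 70.0, 70, 90]
```

The class-conditional fill is "smarter" in the slide's sense: the second missing value is filled with the "high"-class mean (70) rather than the global mean (60).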
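The price example from the "Binning Methods for Data Smoothing" slide can be reproduced with a short sketch of equi-depth binning plus the two smoothing variants (the slide rounds the bin means 22.75 and 29.25 to 23 and 29).

```python
def equi_depth_bins(sorted_values, n_bins):
    """Partition already-sorted data into bins of equal count."""
    size = len(sorted_values) // n_bins
    return [sorted_values[i * size:(i + 1) * size] for i in range(n_bins)]

def smooth_by_means(bins):
    """Replace every value in a bin by the bin mean."""
    return [[sum(b) / len(b)] * len(b) for b in bins]

def smooth_by_boundaries(bins):
    """Replace each value by the closer of the bin's min and max."""
    out = []
    for b in bins:
        lo, hi = b[0], b[-1]
        out.append([lo if v - lo <= hi - v else hi for v in b])
    return out

prices = [4, 8, 9, 15, 21, 21, 24, 25, 26, 28, 29, 34]
bins = equi_depth_bins(prices, 3)
print(bins)                        # [[4, 8, 9, 15], [21, 21, 24, 25], [26, 28, 29, 34]]
print(smooth_by_means(bins))       # bin means 9.0, 22.75, 29.25
print(smooth_by_boundaries(bins))  # [[4, 4, 4, 15], [21, 21, 25, 25], [26, 26, 26, 34]]
```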
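The "Handling Redundant Data" slide suggests correlation analysis to detect redundant attributes, such as a "derived" annual revenue. A minimal sketch with Pearson's correlation coefficient; the attribute names and values are invented, with one attribute a deterministic multiple of the other.

```python
import math

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length attributes."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

monthly_revenue = [10, 20, 30, 40]
annual_revenue = [120, 240, 360, 480]   # derived: 12 * monthly
r = pearson(monthly_revenue, annual_revenue)
print(round(r, 3))  # 1.0 -> perfectly correlated, one attribute is redundant
```

A |r| close to 1 flags a candidate pair for removal; correlation does not prove derivation, so a human check of the schema is still advisable.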
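The three formulas from the "Data Transformation: Normalization" slide, written out directly. The income figures in the usage lines are illustrative, not from the slides.

```python
def min_max(v, min_a, max_a, new_min=0.0, new_max=1.0):
    """Min-max normalization into the range [new_min, new_max]."""
    return (v - min_a) / (max_a - min_a) * (new_max - new_min) + new_min

def z_score(v, mean_a, std_a):
    """Z-score normalization: distance from the mean in standard deviations."""
    return (v - mean_a) / std_a

def decimal_scaling(values):
    """Divide by 10^j for the smallest j making every |v'| < 1."""
    j = 0
    while max(abs(v) for v in values) / (10 ** j) >= 1:
        j += 1
    return [v / 10 ** j for v in values]

print(round(min_max(73600, 12000, 98000), 3))  # 0.716
print(z_score(73600, 54000, 16000))            # 1.225
print(decimal_scaling([-991, 45, 321]))        # [-0.991, 0.045, 0.321]
```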
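The "Wavelet Transforms" slide describes the DWT method abstractly: two functions (smoothing and difference) applied to pairs, halving the length, recursively. A minimal sketch of that scheme using the Haar wavelet, with averages as the smoothing function and half-differences as the detail function; the input series is made up.

```python
def haar_step(data):
    """One level: pairwise averages (smooth) and pairwise half-differences."""
    smooth = [(data[i] + data[i + 1]) / 2 for i in range(0, len(data), 2)]
    detail = [(data[i] - data[i + 1]) / 2 for i in range(0, len(data), 2)]
    return smooth, detail

def haar_transform(data):
    """Recurse on the smoothed half until one coefficient remains.
    The length must be a power of 2 (the slide pads with 0s otherwise)."""
    coeffs = []
    while len(data) > 1:
        data, detail = haar_step(data)
        coeffs = detail + coeffs   # coarse details end up before fine ones
    return data + coeffs           # overall average first

print(haar_transform([2, 2, 0, 2, 3, 5, 4, 4]))
# [2.75, -1.25, 0.5, 0.0, 0.0, -1.0, -1.0, 0.0]
```

Compression then keeps only the largest-magnitude coefficients and zeroes the rest, which is what "store only a small fraction of the strongest wavelet coefficients" amounts to.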
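The regression slides fit Y = α + βX by the least-squares criterion; the closed-form estimates are short enough to show in full. The sample points are chosen to lie exactly on the y = x + 1 line from the regression figure.

```python
def least_squares(xs, ys):
    """Closed-form least-squares estimates for Y = alpha + beta * X."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    beta = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
            / sum((x - mx) ** 2 for x in xs))
    alpha = my - beta * mx
    return alpha, beta

xs = [0, 1, 2, 3]
ys = [1, 2, 3, 4]          # points on y = x + 1
alpha, beta = least_squares(xs, ys)
print(alpha, beta)          # 1.0 1.0
```

Storing only (α, β) instead of the raw points is the parametric numerosity-reduction idea: the data is discarded and the model parameters kept.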
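The sampling slides name SRSWOR, SRSWR, and stratified sampling. A sketch with the standard library's `random` module; the data set with a "rare" and a "common" stratum is invented to show how stratification preserves a skewed class percentage.

```python
import random

def srswor(data, n, rng):
    """Simple random sample without replacement."""
    return rng.sample(data, n)

def srswr(data, n, rng):
    """Simple random sample with replacement."""
    return [rng.choice(data) for _ in range(n)]

def stratified(data, labels, fraction, rng):
    """Sample each class (stratum) separately to keep class percentages."""
    sample = []
    for label in set(labels):
        stratum = [d for d, l in zip(data, labels) if l == label]
        k = max(1, round(len(stratum) * fraction))
        sample.extend(rng.sample(stratum, k))
    return sample

rng = random.Random(42)               # fixed seed for reproducibility
data = list(range(100))
labels = ["rare"] * 10 + ["common"] * 90
sample = stratified(data, labels, 0.1, rng)
print(len(sample))                    # 10 -> 1 rare + 9 common items
```

A plain SRSWOR of the same size could easily miss the rare class entirely, which is the skew problem the slide warns about.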
