Deep Learning: Techniques

The Challenge to AI
The true challenge to artificial intelligence proved to be solving the tasks that are easy for people to perform but hard for people to describe formally: the problems that we solve intuitively, that feel automatic, like recognizing spoken words or faces in images.

[Figure: flowcharts showing how the different parts of an AI system relate to each other within different AI disciplines; shaded boxes indicate components that are able to learn from data.]
[Figure: two of the three historical waves of artificial neural networks research, as measured by the frequency of the phrases "cybernetics" and "connectionism" or "neural networks" in books.]

Key Trends in Deep Learning
- Deep learning has become more useful as the amount of available training data has grown.
- Deep learning models have grown in size over time as computer hardware and software infrastructure for deep learning has improved.
- Deep learning has solved increasingly complicated applications with increasing accuracy over time.
In 2009, Microsoft speech recognition researchers Li Deng and Dong Yu began collaborating with Geoffrey Hinton. In 2011, Microsoft's speech recognition research achieved a breakthrough: by concatenating the features of several consecutive speech frames and extracting information features level by level, the deep model was combined seamlessly with traditional speech recognition technology, and it substantially improved recognition accuracy without adding any extra system cost. Yann LeCun: convolutional neural networks. In October 2012, Hinton's team obtained the best result on the ImageNet classification problem using a deeper CNN, greatly advancing image recognition; in 2015 still lower error rates were published.

Deep Learning in NLP & Why Neural?
The human cerebral cortex is 2 to 4 millimetres thick. The different cortical layers each contain a characteristic distribution of neuronal cell types and connections with other cortical and subcortical regions. The more ancient part of the cerebral cortex, the hippocampus, has at most three cellular layers; the most recent part, the neocortex (also called isocortex), has six. Neurons in various layers connect vertically to form small microcircuits: the cortex is organized vertically in columns and horizontally in layers. The different regions of somatosensory cortex receive their main inputs from different kinds of receptors: area 3b receives most of its projections from the superficial skin receptors, while area 3a receives input from receptors in the muscles. Input from the thalamus arrives at layer IV, where neurons distribute information up and down the layers. [Kaas et al., 1979]

Hubel-Wiesel: Hubel, David H., and Torsten N. Wiesel. "Receptive fields of single neurones in the cat's striate cortex." The Journal of Physiology 148, no. 3 (1959): 574-591. Their work was recognized with the Nobel Prize in Physiology or Medicine. [Further reading: Bruno ..., David ..., Cornell ...]

Deep learning research groups (a selection):
- University of Toronto - Machine Learning Group (Geoff Hinton, Rich Zemel, Ruslan Salakhutdinov, Brendan Frey, Radford Neal)
- Université de Montréal - Lisa Lab (Yoshua Bengio, Pascal Vincent, Aaron Courville, Roland Memisevic)
- New York University - Yann LeCun's and Rob Fergus' groups
- Stanford University - Andrew Ng's group
- University of Oxford - Deep learning group, Nando de Freitas and Phil Blunsom
- Google Research - Jeff Dean, Samy Bengio, Jason Weston, Marc'Aurelio Ranzato, Dumitru Erhan, Quoc Le et al.
- Microsoft Research - Li Deng et al.
- SUPSI - IDSIA (Jürgen Schmidhuber's group)
- UC Berkeley - Bruno Olshausen's group
- University of Washington - Pedro Domingos' group
- IDIAP Research Institute - Ronan Collobert's group
- University of California Merced - Miguel A. Carreira-Perpinan's group
- University of Helsinki - Aapo Hyvärinen's neuroinformatics group
- Université de Sherbrooke - Hugo Larochelle's group
- University of Guelph - Graham Taylor's group
- University of Michigan - Honglak Lee's group
- Technical University of Berlin - Klaus-Robert Müller's group
- Baidu - Kai Yu's group
- Aalto University - Juha Karhunen and Tapani Raiko
- University of Amsterdam - Max Welling's group
- University of California Irvine - Pierre Baldi's group
- Ghent University - Benjamin Schrauwen's group
- University of Tennessee - Itamar Arel's group
- IBM Research - Brian Kingsbury et al.
- University of Bonn - Sven Behnke's group
- Gatsby Unit @ University College London - Maneesh Sahani, Yee-Whye Teh, Peter Dayan
- Computational Cognitive Neuroscience Lab @ University of Colorado

There is an interesting history of people's changing attitudes toward deep architectures versus shallow ones. (2012 - "A survey on deep learning - one small step toward AI".)

Historical Context of Deep Learning
Around 1960, the first generation of neural networks, the Perceptron, was born. Its ability to classify basic shapes such as triangles and squares led people to believe that along this path a real intelligent machine could be invented, one that can sense, learn, remember, and recognize like a human being. But its fundamental limitations soon broke people's expectations. Criticism from Marvin Minsky: one apparent reason is that the feature layer of this Perceptron is fixed and crafted by human beings, which is absolutely against the definition of a real "intelligent" machine; another reason is that its single-layer structure limits the functions it can learn, e.g. an exclusive-or (XOR) function is beyond its learning capacity.

Around 1985, building on the Perceptron, Geoffrey Hinton replaced the original single fixed feature layer with several hidden layers, creating the second-generation neural network, trained via the back-propagation (BP) algorithm (proposed in 1969, made practicable in the 1980s). But BP did not work well in practice:
- The correcting signal is weakened as it passes back through multiple layers.
- It often gets trapped in poor local optima when batch-mode (or even stochastic) gradient descent BP is used.
- The severity increases significantly as the depth of the network grows.
- Learning is too slow across multiple hidden layers.

In 1989, Yann LeCun et al. built a deep neural network with the purpose of recognizing handwritten ZIP codes on mail. Despite the success of applying the algorithm, the time to train the network on the dataset was very long (roughly 3 days).
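The weakening of the back-propagated correcting signal can be seen numerically: with a saturating activation such as the sigmoid, whose derivative is at most 0.25, the error signal shrinks roughly geometrically with depth. The following is a minimal sketch, with the depth, width, and fixed weight scale chosen purely for illustration:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Forward pass through a chain of identical sigmoid layers,
# recording each pre-activation so we can back-propagate.
depth, width = 10, 8
W = 0.8 * np.eye(width)      # fixed toy weight matrix (illustrative scale)
h = np.ones(width)           # input activation
pre = []
for _ in range(depth):
    z = W @ h
    pre.append(z)
    h = sigmoid(z)

# Backward pass: delta <- W^T (sigmoid'(z) * delta).
grad_norms = []
delta = np.ones(width)       # error signal injected at the top layer
for z in reversed(pre):
    s = sigmoid(z)
    delta = W.T @ (s * (1 - s) * delta)
    grad_norms.append(np.linalg.norm(delta))

# The norm of the back-propagated signal shrinks layer by layer.
for i in range(1, len(grad_norms)):
    assert grad_norms[i] < grad_norms[i - 1]
print(grad_norms[0], grad_norms[-1])
```

Since each backward step multiplies the signal norm by at most 0.8 × 0.25 = 0.2 here, after ten layers the signal reaching the bottom is several orders of magnitude smaller than at the top, which is exactly the slow-learning problem listed above.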
SVMs slowed down the development of neural networks. While people were trying to make improvements to Hinton's neural networks with respect to those advantages (trying to increase the training data set and to estimate good initial weights), in 1993-1995 Vladimir N. Vapnik et al. made improvements on the original Perceptron, creating a new family of models: Support Vector Machines.

SVM - good or bad?
- SVMs make learning fast and easy, due to their simple structure.
- They are appropriate for data with simple structure, e.g. with a small number of features, or data which does not contain hierarchical features.
- But for data which itself contains complicated features, an SVM tends to perform worse because of its simple architecture. One way to solve this problem is to add prior knowledge to the SVM model in order to obtain a better feature layer, but it is hard to find a general set of priors.

SVM takes us away from the road to a real intelligent machine. Despite the fact that SVMs can work really well on many AI problems, they are not a good trend for AI because of a fatal deficiency: the shallow architecture. An SVM is still a kind of Perceptron where the features are crafted rather than learnt from the data. With the purpose of finding an architecture that meets the requirements above, some researchers started to look back to the multi-layer neural network, trying to exploit its advantages related to depth and to ease its limitations.

After 2010, the US Department of Defense's DARPA program funded deep learning projects for the first time; participants included Stanford University, New York University, and others. In 2011, speech recognition researchers at Microsoft and then Google adopted DNN techniques to reduce speech recognition error rates by 20%-30%, the biggest breakthrough in the field in 10 years. In 2012, Hinton's team reduced the Top-5 error rate on the ImageNet classification problem from 26% to 15%, and Andrew Ng's team built large-scale deep learning systems for speech and images. In 2013, DNNResearch, the company founded by Hinton, was acquired by Google, and Yann LeCun joined Facebook's artificial intelligence lab. Afterwards, Baidu applied deep learning to CTR prediction (Click-Through-Rate Prediction) and image retrieval, reaching an internationally leading level; in 2014 Andrew Ng joined Baidu. In 2013, Tencent began building its deep learning platform Mariana, which targets speech recognition, image recognition, recommendation, and many other application domains, and provides parallel implementations of default algorithms (DNN on GPU, DNN on CPU clusters, etc.) to reduce the cost of developing new algorithms.

DBN: a Deep Belief Network can be viewed as several RBMs stacked together, and can be trained by training these RBMs layer by layer from bottom to top:
(1) the bottom RBM is trained on the raw input data;
(2) the features extracted by the RBM below are used as the input for training the RBM above it.

Hinton: autoencoders. Proposed in the late 1980s, the autoencoder was mainly used for dimensionality reduction, and later related to principal component analysis. Notation: n is the number of neurons in the input and output layers, m the number of neurons in the hidden layer, p, q the biases of the neurons in each layer, w the weights between the input layer and the hidden layer, and the weights between the hidden layer and the output layer.

Y. Bengio, P. Lamblin, D. Popovici, and H. Larochelle. Greedy layer-wise training of deep networks. In Proceedings of Neural Information Processing Systems (NIPS).

2012: Hinton, deep networks. 2014/12/9
In 2013, ZF (Zeiler and Fergus): Visualizing and Understanding Convolutional Networks.

RNN (Recurrent Neural Networks): all biological neural networks are recurrent; mathematically, RNNs implement dynamical systems. Some old ...
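The greedy layer-wise DBN recipe ((1) train the bottom RBM on the raw input, (2) train the next RBM on the features extracted by the one below) can be sketched with a minimal binary RBM trained by one-step contrastive divergence (CD-1). The layer sizes, learning rate, and epoch counts below are illustrative assumptions, not Hinton's original configuration:

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class RBM:
    """Minimal binary restricted Boltzmann machine trained with CD-1."""
    def __init__(self, n_visible, n_hidden):
        self.W = 0.01 * rng.standard_normal((n_visible, n_hidden))
        self.b_v = np.zeros(n_visible)   # visible biases
        self.b_h = np.zeros(n_hidden)    # hidden biases

    def hidden_probs(self, v):
        return sigmoid(v @ self.W + self.b_h)

    def visible_probs(self, h):
        return sigmoid(h @ self.W.T + self.b_v)

    def cd1_step(self, v0, lr=0.1):
        # Positive phase: sample hidden units given the data.
        ph0 = self.hidden_probs(v0)
        h0 = (rng.random(ph0.shape) < ph0).astype(float)
        # Negative phase: one Gibbs step back to a reconstruction.
        pv1 = self.visible_probs(h0)
        ph1 = self.hidden_probs(pv1)
        # CD-1 update: <v h>_data - <v h>_reconstruction.
        self.W += lr * (v0.T @ ph0 - pv1.T @ ph1) / len(v0)
        self.b_v += lr * (v0 - pv1).mean(axis=0)
        self.b_h += lr * (ph0 - ph1).mean(axis=0)

# Toy binary data set.
data = (rng.random((200, 12)) < 0.3).astype(float)

# (1) Train the bottom RBM on the raw input data.
rbm1 = RBM(12, 8)
for _ in range(50):
    rbm1.cd1_step(data)

# (2) Use the features extracted by the bottom RBM as the
#     training input for the RBM above it.
features = rbm1.hidden_probs(data)
rbm2 = RBM(8, 4)
for _ in range(50):
    rbm2.cd1_step(features)

top = rbm2.hidden_probs(features)   # final DBN feature layer
print(top.shape)
```

In the full DBN procedure this greedy pre-training phase is followed by fine-tuning of the whole stack (for example with back-propagation, or the wake-sleep algorithm for generative use); the sketch stops at the unsupervised stacking step described above.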