Lecture 10: Recurrent Neural Networks (69 pages)

Slide 2 — Classification of NNs: feedforward NNs and recurrent NNs.
Slide 3 — The retina as a basic information-processing system. The retina has three layers of neural cells (from the bottom up): an outer layer, a middle layer, and a final layer. Light information travels from the photoreceptors through the bipolar cells to the ganglion cells, whose axons converge into the optic nerve and leave the eyeball. Horizontal cells and amacrine cells modulate the responses of the bipolar and ganglion cells through lateral connections.
Slide 4 — Feedforward NNs. A three-layer network: ganglion-cell layer, inner nuclear layer, outer nuclear layer. Neurons within a layer have no connections to each other; each layer completes its computation and then passes the result to the next layer.
Slide 5 — Feedforward NNs (figure).
Slide 6 — Recurrent NNs: contain feedback among neurons.
Slides 7–17 — Recurrent NNs: how to derive math models of RNNs? (derivation figures).
Slides 18–20 — Discrete-time RNNs: network computing; an RNN maps a network input to a network output.
Slide 21 — Computing: discrete or continuous?
Slide 22 — Discrete vs continuous: discrete-time computing and continuous-time computing.
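The discrete-time computing of slides 18–20 can be sketched in a few lines: the network state is updated by repeatedly applying the map x[k+1] = f(W x[k] + b). The tanh activation and the particular weights below are illustrative choices, not taken from the slides.

```python
import numpy as np

def rnn_step(x, W, b):
    """One discrete-time update: x[k+1] = f(W x[k] + b), with f = tanh."""
    return np.tanh(W @ x + b)

def run_rnn(x0, W, b, steps):
    """Iterate the map from initial state x0 and return the whole trajectory."""
    traj = [np.asarray(x0, dtype=float)]
    for _ in range(steps):
        traj.append(rnn_step(traj[-1], W, b))
    return traj

# Two-neuron example with mutual feedback between the neurons.
W = np.array([[0.0, 0.5],
              [0.5, 0.0]])
b = np.array([0.1, -0.1])
traj = run_rnn([1.0, -1.0], W, b, steps=50)
```

Because the map here is a contraction (spectral radius of W below 1 and |tanh'| ≤ 1), the trajectory settles onto a state satisfying x = tanh(W x + b).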

Slide 23 — Discrete vs continuous: continuous-time computing. How to derive continuous-time computing math models of RNNs?
Slides 24–29 — From discrete computing to continuous computing: changing the time steps (derivation figures).
Slide 30 — Continuous-computing RNNs.
Slide 31 — Recurrent NNs. RNN model: network state, network input, network time.
Slide 32 — Recurrent NNs. What's the output of an RNN? Network state, network input, network time, network output.
Slide 33 — Convergence of RNNs: does the network state converge? Equilibrium point (definition).
Slides 34–36 — Trajectories (figures).
Slide 37 — A simple example.
Slides 38–39 — Equilibrium points (figures).
Slide 40 — Convergence of RNNs: attractors.
Slide 41 — Convergence of RNNs: does each trajectory of an RNN converge to an equilibrium? Methods: 1. solving the differential equation directly; 2. the energy method.
Slide 42 — Method one: solving differential equations.
Slide 43 — A simple example (figure).
Slides 44–45 — Linear RNNs.
Slides 46–47 — /people/seung/index.html (screenshots of H. S. Seung's page).
Slide 48 — Linear RNNs: H. S. Seung, "How the brain keeps the eyes still," Proc. Natl. Acad. Sci. USA, vol. 93, pp. 13339–13344, 1996.
Slide 49 — How the brain keeps the eyes still (same reference). Abstract: The brain can hold the eyes still because it stores a memory of eye position. The brain's memory of horizontal eye position appears to be represented by persistent neural activity in a network known as the neural integrator, which is localized in the brainstem and cerebellum. Existing experimental data are reinterpreted as evidence for an "attractor hypothesis" that the persistent patterns of activity observed in this network form an attractive line of fixed points in its state space. Line attractor dynamics can be produced in linear or nonlinear neural networks by learning mechanisms that precisely tune positive feedback.
Slide 50 — Line attractor (Seung 1996).
Slide 51 — How the eye moves during reading. Although people always read text in a fixed order, eye tracking shows that visual attention is saccadic: the gaze settles on specific content only when the brain matches what it sees against past experience and memory. Gestalt (completion) theory: human perception has a powerful ability to "fill in" incomplete or disordered input. Demonstration (characters deliberately scrambled in the original): "研表究明,汉字序顺并不定一影阅响读!事证实明了当你看这完句话之后才发字现都乱是的" — that is, research shows that scrambling the character order does not necessarily prevent reading; you notice the scrambling only after finishing the sentence. The English analogue with scrambled letters: "Hvae a ncie day~ Hpoe you konw the ifnomariton."
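The passage from discrete to continuous computing can be illustrated by shrinking the time step: the forward-Euler discretization of the standard additive model dx/dt = -x + f(W x + b) with step h = 1 is exactly the discrete-time update, while small h approximates the continuous flow. The model form and parameters below are conventional assumptions, not copied from the slides.

```python
import numpy as np

def euler_trajectory(x0, W, b, h, steps):
    """Integrate dx/dt = -x + tanh(W x + b) by forward Euler with step h."""
    x = np.asarray(x0, dtype=float)
    for _ in range(steps):
        x = x + h * (-x + np.tanh(W @ x + b))
    return x

W = np.array([[0.0, 0.4],
              [0.4, 0.0]])
b = np.array([0.2, 0.2])

# h = 1 reduces the Euler step to the discrete-time map
# x[k+1] = tanh(W x[k] + b); h = 0.01 tracks the continuous flow.
x_coarse = euler_trajectory([1.0, -1.0], W, b, h=1.0, steps=200)
x_fine = euler_trajectory([1.0, -1.0], W, b, h=0.01, steps=20000)
```

With these contracting weights both schemes converge to the same equilibrium point, the solution of x* = tanh(W x* + b), which previews the convergence question raised on slides 33–41.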

Slide 52 — A reading experiment (figures).
Slide 53 — Linear RNNs: H. S. Seung, "Pattern analysis and synthesis in attractor neural networks," 1997. Analysis; synthesis.
Slide 54 — Pattern analysis and synthesis in attractor neural networks. H. S. Seung. In K.-Y. M. Wong, I. King, and D.-Y. Yeung, editors, Theoretical Aspects of Neural Computation: A Multidisciplinary Perspective, Singapore, 1997. Springer-Verlag. Abstract: The representation of hidden variable models by attractor neural networks is studied. Memories are stored in a dynamical attractor that is a continuous manifold of fixed points, as illustrated by linear and nonlinear networks with hidden neurons. Pattern analysis and synthesis are forms of pattern completion by recall of a stored memory. Analysis and synthesis in the linear network are performed by bottom-up and top-down connections. In the nonlinear network, the analysis computation additionally requires rectification nonlinearity and inner product inhibition between hidden neurons.
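Seung's abstract distinguishes analysis (bottom-up inference of hidden variables) and synthesis (top-down generation of a pattern) in the linear network. A toy sketch of that division of labor, assuming a least-squares reading of analysis and an arbitrary random pattern matrix (both are illustrative assumptions, not the paper's construction):

```python
import numpy as np

rng = np.random.default_rng(0)
# Columns of W are stored pattern directions; the hidden
# state x says how strongly each one is expressed.
W = rng.standard_normal((6, 2))

x_true = np.array([1.5, -0.5])
s = W @ x_true                 # synthesis: top-down generation of a pattern

# Analysis: bottom-up inference of the hidden state that
# best explains the observed pattern (least squares).
x_hat = np.linalg.pinv(W) @ s
```

Since the observed pattern lies exactly in the span of the stored directions, analysis recovers the hidden state; a corrupted pattern would instead be projected onto that span, which is the pattern-completion behavior the abstract describes.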

Slide 55 — Pattern analysis and synthesis in attractor neural networks. Linear network; energy function (equations shown in the slide figure).
Slide 56 — Pattern analysis and synthesis in attractor neural networks (same Seung 1997 reference). Nonlinear network; energy function (equations shown in the slide figure).

Slide 57 — Representing part-whole relationships in recurrent neural networks. V. Jain, V. Zhigulin, and H. S. Seung. Adv. Neural Info. Proc. Syst. 18, 563–570 (2006). Abstract: There is little consensus about the computational function of top-down synaptic connections in the visual system. Here we explore the hypothesis that top-down connections, like bottom-up connections, reflect part-whole relationships. We analyze a recurrent network with bidirectional synaptic interactions between a layer of neurons representing parts and a layer of neurons representing wholes. Within each layer, there is lateral inhibition. When the network detects a whole, it can rigorously enforce part-whole relationships by ignoring parts that do not belong. The network can complete the whole by filling in missing parts. The network can refuse to recognize a whole, if the activated parts do not conform to a stored part-whole relationship. Parameter regimes in which these behaviors happen are identified using the theory of permitted and forbidden sets. The network behaviors are illustrated by recreating Rumelhart and McClelland's "interactive activation" model.
Slide 58 — Representing part-whole relationships in recurrent neural networks (same reference; figures).
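Two of the behaviors in the abstract — detecting a whole from its active parts and filling in the missing parts — can be caricatured in a few lines, with winner-take-all standing in for lateral inhibition and a single bottom-up/top-down pass standing in for the recurrent settling. The membership matrix is invented toy data, not from the paper.

```python
import numpy as np

# Part-whole memberships: rows = wholes, columns = parts (toy data).
M = np.array([[1, 1, 1, 0, 0],   # whole 0 consists of parts {0, 1, 2}
              [0, 0, 1, 1, 1]])  # whole 1 consists of parts {2, 3, 4}

def complete(parts):
    """Pick the best-matching whole (winner-take-all as a stand-in for
    lateral inhibition), then fill in its parts top-down."""
    parts = np.asarray(parts, dtype=float)
    whole = int(np.argmax(M @ parts))   # bottom-up evidence for each whole
    return whole, M[whole]              # top-down completion of the whole

# Parts 0 and 2 are active; part 1 is missing and gets filled in.
whole, completed = complete([1, 0, 1, 0, 0])
```

The full model additionally rejects inputs whose active parts fit no stored whole; capturing that requires the paper's permitted/forbidden-set analysis rather than this one-pass caricature.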

Slides 59–60 — Representing part-whole relationships in recurrent neural networks (figures).
Slide 61 — Method two: the energy-functions method.
Slide 62 — Energy function method: Lyapunov …
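The energy-function (Lyapunov) method proves convergence without solving the dynamics: exhibit a function E that is bounded below and never increases along trajectories. A numerical illustration on the classical discrete-time Hopfield network with symmetric weights, whose energy provably never increases under asynchronous sign updates (this particular network and energy are the textbook example of the method, not taken from the slides):

```python
import numpy as np

def energy(s, W, b):
    """Hopfield energy E(s) = -1/2 s^T W s - b^T s (W symmetric, zero diagonal)."""
    return -0.5 * s @ W @ s - b @ s

def async_step(s, W, b, i):
    """Update unit i to the sign of its local field; this never increases E."""
    s = s.copy()
    s[i] = 1.0 if (W[i] @ s + b[i]) >= 0 else -1.0
    return s

rng = np.random.default_rng(1)
n = 8
A = rng.standard_normal((n, n))
W = (A + A.T) / 2            # symmetrize the weights
np.fill_diagonal(W, 0.0)     # no self-connections
b = rng.standard_normal(n)

s = np.where(rng.standard_normal(n) >= 0, 1.0, -1.0)
energies = [energy(s, W, b)]
for sweep in range(10):
    for i in range(n):
        s = async_step(s, W, b, i)
        energies.append(energy(s, W, b))
```

Each flip changes E by -(W[i] @ s + b[i]) * (s_new - s_old) ≤ 0, so the recorded energies form a non-increasing sequence; since E takes finitely many values on ±1 states, the trajectory must stop at an equilibrium, which is exactly the argument the method generalizes.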

温馨提示

  • 1. 本站所有资源如无特殊说明,都需要本地电脑安装OFFICE2007和PDF阅读器。图纸软件为CAD,CAXA,PROE,UG,SolidWorks等.压缩文件请下载最新的WinRAR软件解压。
  • 2. 本站的文档不包含任何第三方提供的附件图纸等,如果需要附件,请联系上传者。文件的所有权益归上传用户所有。
  • 3. 本站RAR压缩包中若带图纸,网页内容里面会有图纸预览,若没有图纸预览就没有图纸。
  • 4. 未经权益所有人同意不得将文件中的内容挪作商业或盈利用途。
  • 5. 人人文库网仅提供信息存储空间,仅对用户上传内容的表现方式做保护处理,对用户上传分享的文档内容本身不做任何修改或编辑,并不能对任何下载内容负责。
  • 6. 下载文件中如有侵权或不适当内容,请与我们联系,我们立即纠正。
  • 7. 本站不保证下载资源的准确性、安全性和完整性, 同时也不承担用户因使用这些下载资源对自己和他人造成任何形式的伤害或损失。

最新文档

评论

0/150

提交评论