RANLP 2015, Hissar, Bulgaria
Deep Learning in Industry Data Analytics
Junlan Feng, China Mobile Research

The starting point of AI: the Dartmouth conference
- The proposers: John McCarthy (1927-2011), Marvin Minsky (1927-2016), Nathaniel Rochester (1919-2001), Claude Shannon (1916-2001)

The stages of AI: 1950s, 1980s, 2000s, future
1. 1950s, the Dartmouth agenda: automatic computers; how a computer can be programmed to use a language; neuron nets; theory of the size of a calculation; self-improvement; abstractions; randomness and creativity
2. 1980s: rule-based expert systems
3. 2000s and beyond: general intelligence

Current AI technology: open problems
- Depends on large amounts of labeled data
- "Narrow AI": trained to complete specific tasks
- Not robust or safe enough
- No capacity for explanation; models are opaque

The current state of AI: applications

Why AI has become a hot topic
- Deep learning and reinforcement learning
- Large-scale, complex, streaming data

Outline
1. A reading of the White House strategic plan for AI R&D
2. A reading of ten technology companies' AI strategies
3. Deep learning and recent advances
4. Reinforcement learning and recent advances
5. Deep learning in enterprise data analytics

The U.S. strategic plan for AI R&D

Strategy I: make long-term investments in AI research
Goals: ensure the United States' world leadership; prioritize investment in next-generation AI technologies.

1. Advance data-centric knowledge-discovery technologies
- Efficient data-cleaning techniques to ensure the veracity and appropriateness of the data used to train AI systems
- Jointly consider data, metadata, and human feedback or knowledge
- Analysis and mining of heterogeneous, multimodal data: discrete, continuous, temporal, spatial, spatio-temporal, and graph data
- Small-data mining, stressing the importance of rare events
- Fusion of data with knowledge, especially domain knowledge bases

2. Enhance the perceptual capabilities of AI systems
- Hardware and algorithms that improve the robustness and reliability of machine perception
- Better detection, classification, discrimination, and recognition of objects in complex, dynamic environments
- Better sensing of humans by sensors and algorithms, so that AI systems can cooperate with people
- Compute and communicate the uncertainty of the perception system to the AI system for better judgment

3. Theoretical AI capabilities and limits
- The theoretical upper bound of AI under current hardware and algorithmic frameworks, across learning, language, perception, reasoning, creativity, and planning

4. General AI
- Today's systems are all "narrow AI", not "general AI" (GAI)
- GAI: flexible, multi-task, self-directed, with general competence across cognitive tasks (learning, language, perception, reasoning, creativity, planning)
- Transfer learning

5. Scalable AI systems
- Coordination of multiple AI systems; distributed planning and control

6. Human-like AI
- Self-explanation capability for AI systems
- Today's AI learns from big data as a black box; humans learn from small data, formal instruction, rules, and hints
- Human-like AI systems could serve as intelligent assistants and intelligent tutors

7. Develop practical, reliable, easy-to-use robots
- Improve robot perception for more intelligent interaction with the complex physical world
8. AI and hardware advancing each other
- GPUs: improved memory, I/O, clock speed, parallelism, and energy efficiency
- "Neuromorphic" processors; processing streaming, dynamic data
- Using AI to improve hardware: high-performance computing, optimized energy consumption, enhanced computing performance, intelligent self-configuration, optimized data movement between multi-core processors and memory

Strategy II: develop effective methods for human-AI collaboration
Not replacing humans, but working with them: emphasize the complementary roles of people and AI systems.
1. AI technologies that assist humans: many AI systems are designed for human use, replicating human computation, decision-making, and cognition
2. AI technologies that augment humans: stationary devices, wearable devices, implanted devices; assisted data understanding
3. Visualization and human-friendly AI interfaces: present data and information in ways people can understand; improve the efficiency of human-AI communication
4. More effective language-processing systems: fluent speech recognition in quiet environments is a success; recognition in noisy environments, far-field speech recognition, accents, children's speech, impaired speech, language understanding, and dialogue remain unsolved

Strategy III: understand and address the ethical, legal, and social implications of AI
1. Study the ethical, legal, and social impacts AI may bring; AI is expected to conform to human norms, and AI systems should be designed to meet human ethical standards: fairness, justice, transparency, accountability
2. Build ethical AI: how to quantify ethics, turning something vague into precise systems and algorithms; morality is often fuzzy, varying with culture, religion, and belief
3. Frameworks for implementing ethical AI: a two-layer architecture with one layer dedicated to ethics, or ethical standards embedded in every AI engineering step

Strategy IV: ensure the safety and security of AI systems and their environments
Before AI systems are in widespread use, their safety must be assured. Study the challenges of creating stable, dependable, trustworthy, understandable, and controllable AI systems, and their solutions:
1. Improve the explainability and transparency of AI systems
2. Build trust
3. Strengthen verification and validation
4. Self-monitoring, self-diagnosis, self-correction
5. Handling the unexpected; resisting attacks

Strategy V: develop shared datasets and shared simulation environments for AI
- An important public good, while fully respecting the rights and interests of companies and individuals in their data; encourage open source

Strategy VI: standards for evaluating and benchmarking AI technology
- Develop appropriate evaluation strategies and methods

Strategy VII: better understand the nation's AI R&D workforce needs
- Ensure a sufficient talent pool

Big data and AI
- Data is the source of AI
- Big-data technologies such as parallel and stream computing are what make AI practical
- AI is the principal method for analyzing big data, especially complex data

2. The AI strategies of the top 10 technology companies

Google: an AI-first strategy
- Spent USD 400 million to acquire DeepMind, the London AI startup founded in 2010 by UCL-affiliated researchers: AlphaGo, WaveNet, Q-learning
- 1. speech recognition and synthesis; 2. machine translation; 3. self-driving cars; 4. Google Glass; 5. Google Now; 6. acquired Api.ai

Facebook
- Open-sourced its deep learning code: Torch
- The Facebook M digital assistant
- Research and applications: FAIR & AML

Apple
- Siri; Apple bought Emotient and VocalIQ

Partnership on AI [29 September 2016]
- It will "conduct research, recommend best practices, and publish research under an open license in areas such as ethics, fairness and inclusivity; transparency, privacy, and interoperability; collaboration between people and AI systems; and the trustworthiness, reliability and robustness of the technology"

Elon Musk: OpenAI
- The CEO of four companies, PayPal, Tesla, SpaceX, and SolarCity, co-invested USD 1 billion to found OpenAI

Microsoft
- XiaoIce (小冰) and Cortana (小娜); open APIs; CNTK; Microsoft Research

IBM
- Speech, text, image, and video; the Watson computer

Baidu

Domestic giants
- Tencent, Alibaba, and iFlytek are investing heavily in AI

5. A case study: deep learning in enterprise data analytics

An example: AI in data analytics with deep learning — customer sentiment analysis
- Introduction; emotion recognition in text; emotion recognition in speech; emotion recognition in conversations; industrial application
- Datasets, features, methods

Introduction: interchangeable terms
- Opinion mining; sentiment analysis; emotion recognition; polarity detection; review mining

Introduction: what emotions are

Introduction: problem definition
- Positive and negative; opinions
- Target of the opinions: the entity
- Related set of components, related attributes: aspects
- Opinion holder: the opinion source
- We will focus only on document-level sentiment (opinion mining)
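Document-level sentiment, as scoped above, is commonly framed as supervised classification over the whole document. A minimal sketch with add-one-smoothed unigram Naive Bayes; the four toy training documents are hypothetical, not from the talk's datasets:

```python
from collections import Counter
import math

def train_nb(docs):
    """Train a unigram Naive Bayes polarity model.
    docs: list of (token_list, label) pairs."""
    counts = {}          # label -> Counter of unigrams
    priors = Counter()   # label -> document count
    for tokens, label in docs:
        priors[label] += 1
        counts.setdefault(label, Counter()).update(tokens)
    vocab = {w for c in counts.values() for w in c}
    return counts, priors, vocab

def classify(tokens, counts, priors, vocab):
    """Return the label maximizing log P(label) + sum_w log P(w | label),
    with add-one smoothing; out-of-vocabulary words are skipped."""
    total_docs = sum(priors.values())
    best, best_lp = None, -math.inf
    for label, wc in counts.items():
        lp = math.log(priors[label] / total_docs)
        denom = sum(wc.values()) + len(vocab)
        for w in tokens:
            if w in vocab:
                lp += math.log((wc[w] + 1) / denom)
        if lp > best_lp:
            best, best_lp = label, lp
    return best

# Hypothetical toy corpus.
train = [
    ("a thrilling engrossing movie".split(), "pos"),
    ("delivers a surprising punch".split(), "pos"),
    ("without a lot of thrills".split(), "neg"),
    ("neither erotic nor thrilling boring".split(), "neg"),
]
counts, priors, vocab = train_nb(train)
pred = classify("an engrossing surprising movie".split(), counts, priors, vocab)
```

This is exactly the kind of bag-of-words baseline whose limits (negation, composition) motivate the deep models later in the deck.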
Introduction: text examples [6 September 2015]
Emotions are expressed artistically with the help of [negation], [conjunction words], and [sentimental words], e.g.:
- a thriller [without a lot of thrills]
- An edgy thriller that [delivers a surprising punch]
- [A flawed but engrossing] thriller
- It's [unlikely] we'll see [a better thriller] this year
- An erotic thriller that's [neither too erotic nor very thrilling either]

Emotions are expressed explicitly and indirectly:
- DSE: explicitly expresses an opinion holder's attitude
- ESE: indirectly expresses the attitude of the writer

Emotions are expressed in language that is often obscured by sarcasm, ambiguity, and plays on words, all of which can be very misleading for both humans and computers:
- "A sharp tongue does not mean you have a keen mind."
- "I don't know what makes you so dumb but it really works."
- "Please, keep talking. So great. I always yawn when I am interested."

Introduction: speech and conversation examples

Typical approach: a classification task
Supervised learning: a document → features → classifier → {Pos, Neu, Neg}
- Features: n-grams (unigrams, bigrams), POS tags, term frequency, syntactic dependency, negation tags
- Classifiers: SVM, Maxent, Naive Bayes, CRF, Random Forest
Unsupervised learning: POS-tag patterns + dictionary + mutual-information rules

Typical approach for speech:
- Prosodic features: pitch, energy, formants, etc.
- Voice-quality features: harsh, tense, breathy, etc.
- Spectral features: LPC, MFCC, LPCC, etc.
- Teager Energy Operator (TEO)-based features: TEO-FM-var, TEO-Auto-Env, etc.
- Classifiers: SVM, GMM, HMM, DBN, KNN, LDA, CART → {Pos, Neu, Neg}

Challenges remain
- Text-based: capturing compositional effects with higher accuracy: negating positive sentences, negating negative sentences, conjunction
- Speech-based: effective features are unknown; emotional speech segments tend to be transcribed with lower ASR accuracy

Overview: how can deep learning change the game?
- Emotion recognition in text: word embedding for sentiment analysis; CNN for sentiment classification; RNN and LSTM for sentiment classification; prior knowledge + CNN/LSTM; parsing + RNN
- Emotion recognition in speech; emotion recognition in conversations; industrial application

Emotion classification with deep learning approaches

1. Word embedding as features
Representation of text is very important for the performance of many real-world applications, including emotion recognition:
- Local representations: n-grams; bag-of-words; 1-of-N coding
- Continuous representations: Latent Semantic Analysis; Latent Dirichlet Allocation
- Distributed representations: word embedding
(Tomas Mikolov, "Learning Representations of Text using Neural Networks", NIPS Deep Learning Workshop 2013; Bengio et al., 2006; Collobert & Weston, 2008; Mnih & Hinton, 2008; Turian et al., 2010; Mikolov et al., 2013a;c)

Word embedding
- Skip-gram and CBOW architectures; the hidden-layer vector is the word-embedding vector for w(t)

Word embedding for sentiment detection
- Widely accepted as standard features for NLP applications, including sentiment analysis, since 2013 [Mikolov 2013]
- The word vector space implicitly encodes many linguistic regularities among words, semantic and syntactic
- Example: Google word vectors pre-trained on a 100-billion-word news corpus. Do they encode polarity similarities? Top relevant words to "good":
  great 0.729151, bad 0.719005, terrific 0.688912, decent 0.683735, nice 0.683609, excellent 0.644293, fantastic 0.640778, better 0.612073, solid 0.580604, lousy 0.576420, wonderful 0.572612, terrible 0.560204, Good 0.558616
- Mostly yes, but the space does not separate antonyms well

Learning sentiment-specific word embedding
- Tang et al., "Learning Sentiment-Specific Word Embedding for Twitter Sentiment Classification", ACL 2014
- Similar in spirit to multi-task learning: trained the same way as regular word embedding, with a loss function that considers both the semantic context and the sentiment distance to the tweets' emoticon labels
- 10 million tweets selected by positive and negative emoticons as training data; evaluated on the Twitter sentiment classification track of SemEval 2013

Paragraph vectors
- Le and Mikolov, "Distributed Representations of Sentences and Documents", ICML 2014
- Paragraph vectors are distributed vector representations for pieces of text, such as sentences or paragraphs
- The paragraph vector is also asked to contribute to the task of predicting the next word given many contexts sampled from the paragraph; each paragraph corresponds to one column in D
- It acts as a memory, remembering what is missing from the current context: the topic of the paragraph
- Best results on the MR dataset

Overview
- Introduction; emotion recognition in text (word embedding for sentiment analysis; CNN for sentiment classification; RNN and LSTM for sentiment classification; prior knowledge + CNN/LSTM; dataset collection); emotion recognition in speech; emotion recognition in conversations; industrial application
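The antonym problem in the "good" similarity list above can be reproduced in miniature: antonyms occur in the same distributional contexts ("a ___ movie"), so purely context-trained vectors place them close together. A sketch with hypothetical 4-dimensional vectors (not real word2vec values):

```python
import numpy as np

def cosine(u, v):
    """Cosine similarity between two vectors."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

# Hypothetical toy "embeddings": "good", "great", and "bad" all live in the
# same region because they fill the same syntactic/semantic slots, even
# though "bad" has opposite polarity; "cat" is distributionally unrelated.
vec = {
    "good":  np.array([0.9, 0.8, 0.1, 0.2]),
    "great": np.array([0.8, 0.9, 0.2, 0.1]),
    "bad":   np.array([0.8, 0.7, 0.3, 0.4]),
    "cat":   np.array([0.1, 0.2, 0.9, 0.8]),
}

sims = {w: cosine(vec["good"], v) for w, v in vec.items() if w != "good"}
# Both the synonym "great" and the antonym "bad" rank far above "cat",
# mirroring the pre-trained word2vec neighbor list in the slide.
```

Sentiment-specific embeddings (Tang et al., ACL 2014, next slide) address exactly this by adding a sentiment term to the training loss.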
CNN for sentiment classification
Ref: Yoon Kim. Convolutional Neural Networks for Sentence Classification. EMNLP, 2014.
- A simple CNN with one layer of convolution on top of word vectors, motivated by the success of CNNs on many other NLP tasks
- Input layer: word vectors from the pre-trained Google News word2vec model
- Convolutional layer: window sizes of 3, 4, and 5 words, each with 100 feature maps; 300 features in the penultimate layer
- Pooling layer: max-over-time pooling
- Output layer: fully connected softmax, yielding a distribution over labels
- Regularization: dropout on the penultimate layer with a constraint on the l2-norms of the weight vectors; embedding vectors fine-tuned during training

Common datasets

CNN for sentiment classification: results
- CNN-rand: randomly initialize all word embeddings
- CNN-static: word2vec, keep the embeddings fixed
- CNN-non-static: fine-tune the embedding vectors

Why is it successful?
- Multiple filters and multiple feature maps
- Emotions are expressed in segments, instead of spanning the whole sentence
- Pre-trained word2vec vectors as input features; the embeddings improve further under non-static training, and antonyms are further separated after training

Resources for this work
- Source code: https:///yoonkim/CNN_sentence
- TensorFlow implementation: /dennybritz/cnn-text-classification-tf
- Extensive experiments: /pdf/1510.03820v4.pdf

Dynamic CNN for sentiment
- Kalchbrenner et al., "A Convolutional Neural Network for Modelling Sentences", ACL 2014
- Hyper-parameters in experiments: k = 4; m = 5 with 14 feature maps; m = 7 with 6 feature maps; d = 48
- Versus Kim's CNN: one-dimension vs. two-dimension convolution; 48-d word vectors randomly initialized vs. 300-d vectors initialized with Google word2vec; a more complicated architecture with dynamic pooling vs. a straightforward one; 6 and 4 feature maps vs. 100-128 feature maps

Why CNN is effective
- Johnson and Zhang, "Effective Use of Word Order for Text Categorization with Convolutional Neural Networks", ACL 2015
- The loss of word order caused by bag-of-word vectors is particularly problematic for sentiment classification; a simple remedy is to use word bigrams in addition to unigrams
- Comparing an SVM with tri-gram features against a CNN with window filters of widths 1, 2, 3, by the make-up of each model's top 100 features:

  Top 100 features | SVM | CNN
  Uni-grams        |  68 |   7
  Bi-grams         |  28 |  33
  Tri-grams        |   4 |  60

- SVMs can't fully take advantage of high-order n-grams

Sentiment classification considering features beyond text with CNN models
- Tang et al., "Learning Semantic Representations of Users and Products for Document Level Sentiment Classification", ACL 2015

Recursive Neural Tensor Network (RNTN)
- Socher et al., "Recursive Deep Models for Semantic Compositionality over a Sentiment Treebank", EMNLP 2013; /sentiment/
- The Stanford Sentiment Treebank is a corpus with fully labeled parse trees, created to facilitate analysis of the compositional effects of sentiment in language
- 10,662 sentences from movie reviews, parsed by the Stanford parser; 215,154 phrases are labeled
- Distribution of sentiment values for n-grams: stronger sentiment often builds up in longer phrases, and the majority of the shorter phrases are neutral
- f = tanh; V is the tensor that directly relates input vectors, W is the regular RNN weight matrix

LSTM for sentiment analysis
- Wang et al., "Predicting Polarities of Tweets by Composing Word Embedding with Long Short-Term Memory", ACL 2015
- LSTM works tremendously well on a large number of problems; such architectures are more capable of learning complex compositions, such as negation of word vectors, than simple RNNs; input, stored information, and output are controlled by three gates
- Dataset: the Stanford Twitter Sentiment corpus (STS)
- LSTM-TLT: word-embedding vectors as input; TLT: trainable look-up table
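Returning to Kim's one-layer CNN described at the top of this section, its two core operations, convolution over windows of word vectors followed by max-over-time pooling, can be sketched in numpy. Dimensions and random weights here are illustrative, not the paper's (which uses 300-d word2vec and 100 feature maps per window size):

```python
import numpy as np

rng = np.random.default_rng(0)

d = 8        # word-vector dimension (Kim: 300-d word2vec)
n = 10       # sentence length in tokens
h = 3        # filter window size (Kim: h = 3, 4, 5)
n_maps = 4   # feature maps per window size (Kim: 100)

X = rng.normal(size=(n, d))            # the sentence as stacked word vectors
W = rng.normal(size=(n_maps, h * d))   # one filter per feature map
b = np.zeros(n_maps)

# Convolution: apply every filter to each window of h consecutive words.
windows = np.stack([X[i:i + h].ravel() for i in range(n - h + 1)])
C = np.maximum(0.0, windows @ W.T + b)   # ReLU feature maps, (n-h+1, n_maps)

# Max-over-time pooling: one value per feature map, whatever the length.
pooled = C.max(axis=0)                   # shape (n_maps,)

# A fully connected softmax over the pooled features gives P(pos/neu/neg).
logits = pooled @ rng.normal(size=(n_maps, 3))
probs = np.exp(logits - logits.max())
probs /= probs.sum()
```

Max-over-time pooling is why "emotions expressed in segments" suits this model: only the strongest window response per filter survives, wherever in the sentence it occurs.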
- In the LSTM-TLT experiments (Wang et al., ACL 2015), it is observed that negations can be better captured

Gated recurrent neural network
- Tang et al., "Document Modeling with Gated Recurrent Neural Network for Sentiment Classification", EMNLP 2015
- Use a CNN or LSTM to generate sentence representations from word vectors
- A gated recurrent neural network (GRU) then encodes sentence relations for sentiment classification
- A GRU can be viewed as a variant of the LSTM with the output gate always on

CNN-LSTM
- J. Wang et al., "Dimensional Sentiment Analysis Using a Regional CNN-LSTM Model", ACL 2016
- The dimensional approach represents emotional states as continuous numerical values in multiple dimensions, such as the valence-arousal (VA) space (Russell, 1980); valence refers to the degree of positive and negative sentiment, whereas arousal refers to the degree of calm and excitement

Tree-LSTM
- K. S. Tai et al., "Improved Semantic Representations From Tree-Structured Long Short-Term Memory Networks", ACL 2015
- Tree-LSTM: a generalization of LSTMs to tree-structured network topologies
- Tree-LSTMs outperform all existing systems and strong LSTM baselines on two tasks: predicting the semantic relatedness of two sentences (SemEval 2014, Task 1) and sentiment classification (Stanford Sentiment Treebank)
- Achieves comparable accuracy; the constituency-tree variant performs better
- Word vectors initialized with GloVe vectors (trained on 840 billion tokens of Common Crawl data, /projects/glove/)

Prior knowledge + deep neural networks
- Hu et al., "Harnessing Deep Neural Networks with Logic Rules", ACL 2016
- For each iteration: the teacher network is obtained by projecting the student network onto a rule-regularized subspace; the student network is updated to balance between emulating the teacher's output and predicting the true labels
- The process is agnostic to the student network, applicable to any architecture: RNN/DNN/CNN
- The teacher network is created each iteration based on two criteria: [1] close enough to the student network; [2] reflects all rules
- Accuracy on SST2 datasets; "-Rule-q" is the teacher network
- One difficulty for a plain neural network is identifying contrastive sense in order to capture the dominant sentiment precisely
- Prior knowledge in the experiment: for "A but B", the overall sentiment is consistent with the sentiment of B

Text corpora for sentiment analysis

Chinese text corpora for sentiment analysis
- News and blog posts with Ekman emotions (Wang, 2014)
- Ren-CECps blog emotion corpus (Quan & Ren, 2009): sentences annotated with eight emotions: joy, expectation, love, surprise, anxiety, sorrow, anger, and hate
- 2013 Chinese Microblog Sentiment Analysis Evaluation (CMSAE): a dataset of posts from Sina Weibo annotated with seven emotions: anger, disgust, fear, happiness, like, sadness, and surprise; train set: 4,000 instances (13,252 sentences); test set: 10,000 instances (32,185 sentences)
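The rule projection of Hu et al. (ACL 2016) described above can be sketched as follows: the teacher reweights the student's distribution toward rule-consistent labels via q(y) ∝ p(y)·exp(−C·(1−r(y))). The rule-strength constant C and the toy two-class distribution are assumptions for illustration, not the paper's settings:

```python
import numpy as np

def project_to_rules(p_student, rule_score, C=6.0):
    """Teacher distribution: q(y) ∝ p(y) * exp(-C * (1 - r(y))),
    where r(y) in [0, 1] measures how well label y satisfies the rules."""
    q = p_student * np.exp(-C * (1.0 - rule_score))
    return q / q.sum()

# Toy example: the student slightly prefers "negative" (index 0), but for
# an "A but B" sentence the rule says the sentiment of clause B (positive)
# should dominate.
p_student = np.array([0.6, 0.4])     # student's P(neg), P(pos)
rule_score = np.array([0.0, 1.0])    # rule satisfied only by "pos"

q_teacher = project_to_rules(p_student, rule_score)

# The student is then updated with a mixed objective, roughly:
#   loss = (1 - pi) * CE(y_true, p) + pi * CE(q_teacher, p)
# where pi is the imitation weight.
```

The projection flips the prediction toward "pos" without retraining: the rule term suppresses any label inconsistent with the logic rule.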
- CMSAE evaluation page: /conference/2013/pages/page04eva.html
- Chinese Valence-Arousal Texts (CVAT): Liang-Chih Yu et al. 2016. "Building Chinese Affective Resources in Valence-Arousal Dimensions". NAACL/HLT-16.

From Saif M. Mohammad, "Computational Analysis of Affect and Emotion in Language", EMNLP 2015:
- Manually created lexical resources
- Shared tasks at the sentence level
- Other resources: affect corpora

Overview
- Emotion recognition in speech: the common framework; DNN for speech emotion recognition; RNN for speech emotion recognition; CNN for speech emotion recognition; data collection for speech emotion recognition

The common framework
- Step 1, segment level: CNN, DNN, or LSTM-RNN
- Step 2, utterance level: classifier (e.g., ELM)

The common features
- Frame feature set: frame length 25 ms, with 10 ms sliding; segment length 265 ms, enough to express emotion
- INTERSPEECH 2009 Emotion Challenge feature set: 12 MFCCs; F0; root-mean-square signal frame energy; zero-crossing rate of the time signal and the voicing probability computed from the ACF [?]; 1st-order derivatives
- Acoustic features: segment length 250 ms; stacked frame features → classifier → distribution over emotion states

DBN + iVector
- Rui Xia and Yang Liu, "DBN-ivector Framework for Acoustic Emotion Recognition", Interspeech 2016

DNN + ELM
- Frame-level features: 30
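The 25 ms frame / 10 ms hop segmentation in the common feature set above can be sketched as follows; the 16 kHz sampling rate and the synthetic signal are assumptions for illustration:

```python
import numpy as np

def frame_signal(x, sr, frame_ms=25, hop_ms=10):
    """Slice a 1-D signal into overlapping frames:
    frame_ms-long windows advancing by hop_ms."""
    frame_len = int(sr * frame_ms / 1000)
    hop = int(sr * hop_ms / 1000)
    n_frames = 1 + (len(x) - frame_len) // hop
    return np.stack([x[i * hop:i * hop + frame_len] for i in range(n_frames)])

sr = 16000                                     # assumed 16 kHz sampling rate
x = np.random.default_rng(0).normal(size=sr)   # 1 s of synthetic "speech"
frames = frame_signal(x, sr)                   # 400-sample frames, 160-sample hop
# Per-frame features (e.g., the 12 MFCCs + energy of the INTERSPEECH 2009
# set) would then be stacked over ~250 ms segments for the segment-level model.
```

One second of audio yields 98 frames here; each ~250 ms segment then stacks roughly 25 consecutive frames as input to the segment-level classifier.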