Review – properties of the mutual information function

Property 1: Relationship between the average mutual information and the channel input probability distribution. I(X;Y) is an upper convex (∩-shaped, i.e. concave) function of the channel input probability distribution p(x).

Property 2: Relationship between the average mutual information and the channel transition probability distribution. I(X;Y) is a lower convex (∪-shaped, i.e. convex) function of the channel transition probability distribution p(y|x).

What is a channel? The channel is a carrier transmitting messages – a passage through which the signal passes. The information is abstract, but the channel is concrete. For instance: if two people converse, the air is the channel; if the two call each other, the telephone line is the channel; if we watch television or listen to the radio, the space between the receiver and the transmitter is the channel.

4.1 The model and classification of the channel
In this part we mainly introduce two topics: channel models and channel classifications.

4.1.1 Channel models
We can treat the channel as a converter which transfers events; the channel model can be indicated as in the following figure. The Binary Symmetric Channel (BSC) is the simplest channel model; a BSC is shown below. We assume that the channel and the modulation are memoryless. The inputs and outputs can then be related by a set of conditional probabilities. Such a channel is known as a Discrete Memoryless Channel (DMC) and is depicted in the corresponding figure.

4.1.2 Channel classifications
Channels can be classified into several types, for example by the physical transmission medium: wireline channels, and wireless channels such as tropospheric (对流层) and ionospheric (电离层) propagation paths.

4.2 Channel doubt degree and average mutual information
4.2.1 Channel doubt degree
4.2.2 Average mutual information
4.2.3 Properties of the mutual information function
4.2.4 Relationship between entropy, channel doubt degree and average mutual information

4.2.1 Channel doubt degree
Assume the random variable X indicates the input set of the channel and the random variable Y indicates the output set of the channel. The channel doubt degree (equivocation) is

    H(X|Y) = -ΣΣ p(xy) log p(x|y)

The meaning of the channel doubt degree H(X|Y) is the average uncertainty that still remains about the source X after the receiving terminal gets the message Y. In fact, this uncertainty comes from the noise in the channel. This means that if the average uncertainty of the source X is H(X), we get more or less information which eliminates the uncertainty of the source X when we receive the output message Y. So we have the following concept of average mutual information (note that H(X|Y) ≤ H(X)).

4.2.2 Average mutual information
The average mutual information is the entropy of the source X minus the channel doubt degree:

    I(X;Y) = H(X) - H(X|Y)

Its meaning is the average information about X that the receiver gets from every symbol it receives when it gets a message Y.

4.2.3 Properties of the mutual information function
Property 1: Relationship between mutual information and the channel input probability distribution. I(X;Y) is an upper convex (concave) function of the channel input probability distribution p(x). This is shown in Fig. 4.5 and Fig. 4.6.
Fig. 4.5. I(X;Y) is an upper convex function of p(x)
Fig. 4.6. Message passing through the channel

E.g. 4.1 Consider a dual-element (binary) channel whose input probability distribution is {ω, 1-ω} and whose channel matrix is

    [ 1-p    p  ]
    [  p    1-p ]

where p is the probability of transmission error. Then the mutual information is

    I(X;Y) = H(ω(1-p) + (1-ω)p) - H(p)

so, for a fixed channel, I(X;Y) is a function of the input distribution ω alone. The average mutual information diagram is shown in Fig. 4.7.
Fig. 4.7. Mutual information of the dual (binary) symmetric channel

From the diagram we can see that when the input symbols satisfy the equal-probability distribution (ω = 1/2), the average mutual information I(X;Y) reaches its maximum value, and only at this time does the receiver get the largest information from every symbol it receives.

Property 2: Relationship between mutual information and the channel transition probability distribution. I(X;Y) is a lower convex (∪-shaped) function of the channel transition probability distribution p(y|x).
Fig. 4.8. I(X;Y) is a lower convex function of p(y|x)

E.g. 4.2 (This is the follow-up of E.g. 4.1.) Considering the same dual channel, we know the average mutual information is I(X;Y) = H(ω(1-p) + (1-ω)p) - H(p). When the source distribution ω is fixed, the average mutual information I(X;Y) is a lower convex function of p, as can be seen from the following diagram.
Fig.: Mutual information of the fixed binary source (I(X;Y) as a function of p)
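The curves of Fig. 4.7 and of the diagram above can be reproduced numerically. Below is a minimal sketch (not part of the original notes) that evaluates the standard BSC expression I(X;Y) = H(ω(1-p) + (1-ω)p) - H(p) on a grid, showing the maximum over ω at ω = 1/2 for a fixed channel, and the minimum over p at p = 1/2 for a fixed equiprobable source.

```python
# Minimal sketch: I(X;Y) of a BSC as a function of the input distribution w and
# the crossover probability p, illustrating Property 1 (E.g. 4.1) and Property 2 (E.g. 4.2).
import numpy as np

def H2(q):
    """Binary entropy in bits; endpoints clipped so that H2(0) = H2(1) = 0."""
    q = np.clip(q, 1e-12, 1 - 1e-12)
    return -(q * np.log2(q) + (1 - q) * np.log2(1 - q))

def bsc_mutual_info(w, p):
    """I(X;Y) for a BSC with P(X=0) = w and crossover probability p."""
    return H2(w * (1 - p) + (1 - w) * p) - H2(p)

# Fixed channel (p = 0.1): the maximum over w is at w = 1/2 (Property 1 / E.g. 4.1).
print([round(bsc_mutual_info(w, 0.1), 3) for w in np.linspace(0, 1, 11)])

# Fixed equiprobable source (w = 1/2): the minimum over p is I = 0 at p = 1/2 (E.g. 4.2).
print([round(bsc_mutual_info(0.5, p), 3) for p in np.linspace(0, 1, 11)])
```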
From the diagram we can see that once the binary source is fixed, we get a different mutual information I(X;Y) as the channel property p changes. When p = 1/2, I(X;Y) = 0, which means the receiver gets the least information from this channel: all the information is lost on the way of transmission, and this channel has the strongest noise.

Property 3: If the channel input (source) is discrete and without memory, we have the following inequality:

    I(X;Y) ≥ Σk I(Xk;Yk)

Property 4: If the channel is discrete and without memory, we have (remember this result):

    I(X;Y) ≤ Σk I(Xk;Yk)

4.2.4 Relationship between entropy, channel doubt degree and average mutual information

    I(X;Y) = H(X) - H(X|Y) = H(Y) - H(Y|X) = H(X) + H(Y) - H(XY)
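These relationships are easy to check numerically. A minimal sketch (the input distribution and channel matrix below are illustrative assumptions, not from the notes) that builds the joint p(x,y) and verifies the identities of Sec. 4.2.4:

```python
# Verify I(X;Y) = H(X) - H(X|Y) = H(Y) - H(Y|X) = H(X) + H(Y) - H(XY)
# for an assumed input distribution and channel matrix.
import numpy as np

def entropy(p):
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

p_x = np.array([0.6, 0.4])          # assumed input distribution p(x)
Q   = np.array([[0.9, 0.1],         # assumed channel matrix p(y|x)
                [0.2, 0.8]])

p_xy = p_x[:, None] * Q             # joint distribution p(x, y)
p_y  = p_xy.sum(axis=0)             # output distribution p(y)

H_X  = entropy(p_x)
H_Y  = entropy(p_y)
H_XY = entropy(p_xy.ravel())
H_X_given_Y = H_XY - H_Y            # channel doubt degree H(X|Y)
H_Y_given_X = H_XY - H_X            # noise entropy H(Y|X)

I = H_X - H_X_given_Y
print(round(I, 4), round(H_Y - H_Y_given_X, 4), round(H_X + H_Y - H_XY, 4))  # all equal
```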
E.g. 4.3 There is a source X. Its messages pass through a channel with noise, and the symbols received at the other end of the channel form Y. Given the channel's transfer matrix, please calculate:
(1) The self-information contained in each source symbol (event);
(2) The information about the source that the receiver gets when it observes a received symbol;
(3) The entropy of the source X and of the received Y;
(4) The channel doubt degree H(X|Y) and the noise entropy H(Y|X);
(5) The average mutual information obtained by the receiver when it receives Y.
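The concrete source probabilities and transfer matrix of E.g. 4.3 are not reproduced above, so the sketch below uses hypothetical values (a binary source and a 2×2 matrix, purely assumed) to show how quantities (1)–(5) would be computed.

```python
# Hypothetical worked computation in the style of E.g. 4.3 (values are assumptions).
import numpy as np

log2 = np.log2
p_x = np.array([0.7, 0.3])             # assumed P(x1), P(x2)
Q   = np.array([[5/6, 1/6],            # assumed transfer matrix p(y|x)
                [1/4, 3/4]])

p_xy = p_x[:, None] * Q                # joint p(x, y)
p_y  = p_xy.sum(axis=0)                # received distribution p(y)

# (1) self-information of the source symbols
print("I(x_i) =", -log2(p_x))

# (2) information about x_i obtained from observing y_j: I(x_i;y_j) = log p(x_i|y_j)/p(x_i)
p_x_given_y = p_xy / p_y
print("I(x_i;y_j) =", log2(p_x_given_y / p_x[:, None]))

def H(p):
    p = p[p > 0]
    return -np.sum(p * log2(p))

# (3) entropies, (4) channel doubt degree and noise entropy, (5) average mutual information
H_X, H_Y, H_XY = H(p_x), H(p_y), H(p_xy.ravel())
H_X_Y, H_Y_X = H_XY - H_Y, H_XY - H_X
print(H_X, H_Y, H_X_Y, H_Y_X, H_X - H_X_Y)
```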
Solution: parts (1)–(5) are computed directly from the definitions above (self-information, the information provided by a received symbol, the entropies H(X) and H(Y), the channel doubt degree and noise entropy, and the average mutual information).

4.3 Discrete channel without memory
Three groups of variables describe the channel:
(1) the channel input probability space;
(2) the channel output probability space;
(3) the channel transfer probabilities p(y|x).
So the channel can be represented by {X, p(y|x), Y}. This can be indicated by the following illustration, and the channel transfer matrix is the n × m matrix of transition probabilities [p(bj|ai)].
When K = 1 it degenerates to the single-message channel, and when n = m = 2 it degenerates to the binary single-message channel. If it also satisfies symmetry, it constitutes the most commonly used BSC.
Fig. 4.11. Binary symmetric (single-message) channel

4.4 Channel capacity
4.4.1 The concept of channel capacity
4.4.2 Discrete channel without memory and its channel capacity
4.4.3 Continuous channel and its channel capacity

4.4.1 The concept of channel capacity
The capacity of a channel is defined as the maximum value of the average mutual information:

    C = max over p(x) of I(X;Y)

The unit of the channel capacity C is bit/symbol or nat/symbol. From the property mentioned before, we know that I(X;Y) is an upper convex (concave) function of the probability distribution p(x) of the input variable X. Therefore, for a specific channel there always exists a source which maximizes the information of every message transmitted through the channel; that is, the maximum of I(X;Y) exists. The probability distribution p(x) that achieves it is called the optimum input distribution.

4.4.2 Discrete channel without memory and its channel capacity
Classification of the discrete message-sequence channel: a discrete channel without memory satisfies the following relationship,

    p(y1 y2 ... yN | x1 x2 ... xN) = Πk p(yk|xk)

According to Property 4 of the mutual information of a message sequence, for the discrete channel without memory we have

    I(X;Y) ≤ Σk I(Xk;Yk)

Note: only when the source is without memory may the equality in this formula be satisfied.
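A numerical check of this note (the BSC with crossover probability 0.1 used twice is an illustrative assumption): with a memoryless, equiprobable source the block mutual information equals the sum of the per-symbol terms, while a source with memory (here X2 = X1) gives a strictly smaller value.

```python
# Compare I(X1X2;Y1Y2) with I(X1;Y1) + I(X2;Y2) for two independent uses of a BSC.
import numpy as np
from itertools import product

def mutual_info(p_joint):
    """Mutual information (bits) between the row and column index of a joint pmf."""
    px = p_joint.sum(axis=1, keepdims=True)
    py = p_joint.sum(axis=0, keepdims=True)
    mask = p_joint > 0
    return np.sum(p_joint[mask] * np.log2(p_joint[mask] / (px @ py)[mask]))

p = 0.1
Q = np.array([[1 - p, p], [p, 1 - p]])       # BSC, used twice memorylessly

def block_I(p_x1x2):
    """I(X1X2;Y1Y2) when the 2-symbol block distribution p_x1x2 passes through two BSC uses."""
    joint = np.zeros((4, 4))                 # rows index (x1,x2), columns index (y1,y2)
    for (x1, x2), (y1, y2) in product(product(range(2), repeat=2), repeat=2):
        joint[2 * x1 + x2, 2 * y1 + y2] = p_x1x2[x1, x2] * Q[x1, y1] * Q[x2, y2]
    return mutual_info(joint)

p_indep = np.full((2, 2), 0.25)              # memoryless equiprobable source
p_corr  = np.array([[0.5, 0.0], [0.0, 0.5]]) # source with memory: X2 = X1

single = mutual_info(np.array([[0.5], [0.5]]) * Q)     # I(X;Y) for one use, uniform input
print(round(block_I(p_indep), 4), "==", round(2 * single, 4))   # equality
print(round(block_I(p_corr), 4), "<=", round(2 * single, 4))    # strict inequality
```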
So we can get the following deduction, which gives the formula of the channel capacity C: for a sequence of length N through the memoryless channel,

    CN = max I(X;Y) = Σk max I(Xk;Yk) = N·C,  where C = max over p(x) of I(X;Y)

Theorem (discrete channel without memory). Assume the transmission probability matrix of the discrete channel without memory is Q. The sufficient conditions under which an input letter probability distribution p* makes the mutual information I(p;Q) achieve its maximum value are

    I(x = ak; Y) = C   for every letter ak with p*(ak) > 0
    I(x = ak; Y) ≤ C   for every letter ak with p*(ak) = 0

where I(x = ak; Y) = Σj p(bj|ak) log [ p(bj|ak) / p(bj) ] is the average mutual information when the source letter ak is sent, and C is the channel capacity of this channel.

Understanding this theorem:
Firstly, under this kind of distribution, each letter whose probability is above zero provides mutual information C, and each letter whose probability is zero provides mutual information lower than or equal to C.
Secondly, only under this kind of distribution may I(p;Q) obtain the maximum value C.
Thirdly, I(X;Y) is the average of the I(x = ak; Y); that is to say, it satisfies the equation

    I(X;Y) = Σk p(ak) I(x = ak; Y)
(1) If we want to enhance I(X;Y), enhancing p(ak) for a letter with large I(x = ak; Y) may be a good idea.
(2) However, once p(ak) is enhanced, I(x = ak; Y) may be reduced.
(3) So we adjust p(ak) repeatedly until the I(x = ak; Y) are all equal to C (see the sketch below).
(4) At this time I(X;Y) = C.
The theorem only provides a sufficient condition; it does not give the concrete distribution or the value of C, but it may help to get the value of C for several kinds of channels in simple situations.
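One standard way to realize the repeated adjustment described in (1)–(4) is the classical Blahut–Arimoto algorithm. A minimal sketch (the 2-input, 3-output transfer matrix below is an illustrative assumption, not from the notes); at convergence the per-letter informations I(x = ak; Y) become equal to C:

```python
# Blahut-Arimoto sketch: iteratively reweight p(ak) in proportion to 2**I(x=ak;Y).
import numpy as np

def blahut_arimoto(Q, n_iter=200):
    """Return (C, p, D) for the DMC with transfer matrix Q[x, y] = p(y|x), in bits."""
    n_in = Q.shape[0]
    p = np.full(n_in, 1.0 / n_in)            # start from the uniform input distribution
    for _ in range(n_iter):
        q = p @ Q                            # current output distribution p(y)
        # D[k] = I(x = a_k; Y), the mutual information contributed by letter a_k
        D = np.sum(np.where(Q > 0, Q * np.log2(Q / q), 0.0), axis=1)
        p = p * 2.0 ** D                     # letters with large D[k] get more weight
        p /= p.sum()
    q = p @ Q
    D = np.sum(np.where(Q > 0, Q * np.log2(Q / q), 0.0), axis=1)
    return float(p @ D), p, D

Q = np.array([[0.80, 0.15, 0.05],            # assumed channel transfer matrix
              [0.05, 0.25, 0.70]])
C, p_opt, D = blahut_arimoto(Q)
print(round(C, 4), np.round(p_opt, 4), np.round(D, 4))   # D[k] ~= C for every used letter
```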
E.g. 4.4 Assume the transmission matrix of a binary discrete symmetric channel is

    [ 1-p    p  ]
    [  p    1-p ]

(1) For a given input probability distribution {ω, 1-ω}, calculate the average mutual information I(X;Y).
(2) Calculate the capacity of the channel and the input probability distribution that reaches the capacity.

Solution:
(1) I(X;Y) = H(ω(1-p) + (1-ω)p) - H(p), as in E.g. 4.1.
(2) C = max over ω of I(X;Y) = 1 - H(p) bit/symbol, reached when the input symbols are equiprobable (ω = 1/2).

Application Example 3.6 For a symmetric channel the capacity is

    C = log m - Hmi

where m represents the size of the output symbol set and Hmi is the entropy of a row vector of the channel matrix.
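A short numerical check of this closed form, applied to the BSC of E.g. 4.4 (the crossover probability p below is an assumed illustrative value); the resulting C is reached with equiprobable inputs:

```python
# Symmetric-channel capacity: C = log2(m) - H(row of the channel matrix).
import numpy as np

def H(row):
    row = row[row > 0]
    return -np.sum(row * np.log2(row))

p = 0.1                                     # assumed crossover probability
Q = np.array([[1 - p, p],
              [p, 1 - p]])
m = Q.shape[1]                              # size of the output alphabet
C = np.log2(m) - H(Q[0])                    # every row has the same entropy
print(round(C, 4))                          # 1 - H(p) ~= 0.531 bit/symbol
```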
4.4.3 Continuous channel and its channel capacity
Topics: characteristics of the continuous channel; the analog channel; basic knowledge of the additive channel; the Shannon formula; usage of the Shannon formula.

Characteristics of the continuous channel
Characteristic 1: the time is discrete, while the value range is continuous.
Characteristic 2: at each moment it is a single random variable whose value is continuous.

Analog channel. Basic knowledge of the additive channel:
X: channel input; N: channel noise; Y: channel output, with Y = X + N.
If two of X, Y, N have Gaussian distributions, then the third is also Gaussian. The differential entropy of a random variable with a Gaussian distribution depends only on its variance and has nothing to do with its mean value.
Fig. 4.22. Additive channel

Theorem: When a generally stationary random process source with limited frequency (F) and limited time (T) passes through a white Gaussian channel with limited noise power (PN), the channel capacity is

    C = F T log(1 + PS/PN)

This is the famous Shannon formula for the continuous channel. When T = 1, the capacity (per second) is

    C = F log(1 + PS/PN)

Proof of the Shannon formula: Assume Y = X + N, where X and N are independent random variables. Since h(Y|X) = h(N), we have I(X;Y) = h(Y) - h(Y|X) = h(Y) - h(N). By the maximum entropy theorem for limited average power, among all random variables with a given average power the Gaussian one has the largest differential entropy, so the capacity per sample is (1/2) log(1 + PS/PN).

Due to the limited frequency F and according to the Nyquist sampling theorem, the continuous signal X(t, w) can be represented by 2F discrete samples per second. That is, per second,

    C = 2F × (1/2) log(1 + PS/PN) = F log(1 + PS/PN)

and considering a time duration T,

    C = F T log(1 + PS/PN)

Fig. 4.23. Shannon formula
Usage of the Shannon formula