




Review - Properties of the mutual information function

Property 1: Relationship between average mutual information and the channel input probability distribution. I(X;Y) is an upper convex (concave, ∩) function of the channel input probability distribution p(x).

Property 2: Relationship between average mutual information and the channel transition probability distribution. I(X;Y) is a lower convex (convex, ∪) function of the channel transition probability distribution p(y|x).

What is a channel?

The channel is a carrier transmitting messages - a passage through which the signal passes. The information is abstract, but the channel is concrete. For instance: if two people converse, the air is the channel; if they call each other, the telephone line is the channel; if we watch television or listen to the radio, the space between the transmitter and the receiver is the channel.

4.1 The model and classification of the channel

In this part we will mainly introduce two topics:
- Channel models
- Channel classifications
4.1.1 Channel models

We can treat the channel as a converter that transfers events; the channel model can be depicted as a block diagram: input X → channel p(y|x) → output Y.

The Binary Symmetric Channel (BSC) is the simplest channel model: each input bit is received correctly with probability 1 - p and inverted with probability p.

We assume that the channel and the modulation are memoryless. The inputs and outputs can then be related by a set of conditional probabilities p(b_j|a_i). Such a channel is known as a Discrete Memoryless Channel (DMC).

4.1.2 Channel classifications

Channels can be classified into several types, for example by the propagation medium (troposphere, ionosphere).

4.2 Channel doubt degree and average mutual information
4.2.1 Channel doubt degree
4.2.2 Average mutual information
4.2.3 Properties of the mutual information function
4.2.4 Relationship between entropy, channel doubt degree and average mutual information

4.2.1 Channel doubt degree

Assume the random variable X denotes the input set of the channel and the random variable Y denotes the output set. The channel doubt degree (equivocation) is

  H(X|Y) = -∑_i ∑_j p(a_i, b_j) log p(a_i|b_j).

The meaning of the channel doubt degree H(X|Y) is the average uncertainty that still remains about the source X after the receiving terminal gets the message Y. In fact, this uncertainty comes from the noise in the channel. So if the average uncertainty of the source X is H(X), we obtain some information that eliminates part of the uncertainty about X when we get the output message Y. This leads to the following concept of average mutual information.

4.2.2 Average mutual information

The average mutual information is the entropy of the source X minus the channel doubt degree:

  I(X;Y) = H(X) - H(X|Y).

This means that when the receiver gets a message Y, I(X;Y) is the average information about X it obtains from every symbol it receives.

4.2.3 Properties of the mutual information function

Property 1: Relationship between mutual information and the channel input probability distribution. I(X;Y) is an upper convex function of the channel input probability distribution P(X). This is shown in Fig. 4.5 and Fig. 4.6.

Fig. 4.5. I(X;Y) is an upper convex function of P(X). Fig. 4.6. A message passing through the channel.

E.g. 4.1 Consider a binary channel whose input probability distribution is P(x=0) = w, P(x=1) = 1 - w and whose channel matrix is

  [ 1-p   p  ]
  [  p   1-p ],

where p is the probability of a transmission error. The mutual information is

  I(X;Y) = H(Y) - H(Y|X) = H(w(1-p) + (1-w)p) - H(p),

where H(·) denotes the binary entropy function. The average mutual information diagram is shown in Fig. 4.7 (mutual information of the binary symmetric channel). From the diagram we can see that when the input symbols satisfy the equal probability distribution (w = 1/2), the average mutual information I(X;Y) reaches its maximum value, and only at this time does the receiver get the largest information from every symbol it receives.

Property 2: Relationship between mutual information and the channel transition probability distribution. I(X;Y) is a lower convex (convex, ∪) function of the channel transition probability distribution p(Y|X) (Fig. 4.8).

E.g. 4.2 (follow-up of E.g. 4.1) For the same binary channel, when the source distribution is fixed at w = 1/2, the average mutual information is I(X;Y) = 1 - H(p), a function of p alone (see the diagram: mutual information of a fixed binary source).

From the diagram we can see that, once the binary source is fixed, changing the channel property p gives different mutual information I(X;Y). When p = 1/2, I(X;Y) = 0: the receiver gets the least information from this channel, all the information is lost on the way of transmission, and this channel has the most severe noise.

Property 3: If the channel input (the source) is discrete and memoryless, we have the inequality

  I(X^K; Y^K) ≥ ∑_{k=1}^{K} I(X_k; Y_k).

Property 4: If the channel is discrete and memoryless, we have (remember this result)

  I(X^K; Y^K) ≤ ∑_{k=1}^{K} I(X_k; Y_k).

4.2.4 Relationship between entropy, channel doubt degree and average mutual information

  H(X) = I(X;Y) + H(X|Y),  H(Y) = I(X;Y) + H(Y|X).
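The binary-channel mutual information discussed in E.g. 4.1 and E.g. 4.2 can be reproduced numerically. A minimal sketch (not part of the original slides), using I(X;Y) = H(w(1-p) + (1-w)p) - H(p) for input distribution P(x=0) = w and error probability p:

```python
import math

def h2(q):
    """Binary entropy function in bits; h2(0) = h2(1) = 0."""
    if q in (0.0, 1.0):
        return 0.0
    return -q * math.log2(q) - (1 - q) * math.log2(1 - q)

def bsc_mutual_info(w, p):
    """I(X;Y) = H(Y) - H(Y|X) for a BSC with P(x=0) = w and error prob p."""
    py0 = w * (1 - p) + (1 - w) * p   # P(y = 0) at the channel output
    return h2(py0) - h2(p)
```

Sweeping w at fixed p traces the ∩-shaped curve of Fig. 4.7 with its maximum at w = 1/2; fixing w = 1/2 and sweeping p traces the curve of E.g. 4.2, which vanishes at p = 1/2.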
E.g. 4.3 A source X emits messages that pass through a noisy channel; the symbols received at the other end of the channel form Y. Given the source probability space and the channel transfer matrix, please calculate:
(1) the self-information contained in the individual symbols and events;
(2) the information about X the receiver gets when it observes a received message;
(3) the entropy of the source X and of the received Y;
(4) the channel doubt degree H(X|Y) and the noise entropy H(Y|X);
(5) the average mutual information obtained by the receiver when it receives Y.

Solution: apply the definitions:
(1) I(a_i) = -log p(a_i);
(2) I(a_i; b_j) = log [ p(a_i|b_j) / p(a_i) ];
(3) H(X) = -∑_i p(a_i) log p(a_i) and H(Y) = -∑_j p(b_j) log p(b_j);
(4) H(Y|X) = -∑_i ∑_j p(a_i, b_j) log p(b_j|a_i) and H(X|Y) = -∑_i ∑_j p(a_i, b_j) log p(a_i|b_j);
(5) I(X;Y) = H(X) - H(X|Y) = H(Y) - H(Y|X).

4.3 Discrete channel without memory

Three groups of variables describe the channel:
(1) the channel input probability space {A, p(a_i)};
(2) the channel output probability space {B, p(b_j)};
(3) the channel transfer probabilities p(b_j|a_i).

So the channel can be represented by the triple {A, Q, B}, and the channel transfer matrix is Q = [p(b_j|a_i)].
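The quantities asked for in E.g. 4.3 can be computed mechanically from an input distribution and a transfer matrix. A sketch with illustrative numbers (the actual values from the slide are not reproduced here):

```python
import math

def entropy(p):
    """Shannon entropy in bits of a probability vector."""
    return -sum(q * math.log2(q) for q in p if q > 0)

def channel_quantities(px, Q):
    """Given input distribution px and transfer matrix Q[i][j] = p(b_j|a_i),
    return H(X), H(Y), H(Y|X) (noise entropy), H(X|Y) (doubt degree), I(X;Y)."""
    n, m = len(px), len(Q[0])
    pxy = [[px[i] * Q[i][j] for j in range(m)] for i in range(n)]   # joint p(a_i, b_j)
    py = [sum(pxy[i][j] for i in range(n)) for j in range(m)]       # output p(b_j)
    HX, HY = entropy(px), entropy(py)
    HYgX = sum(px[i] * entropy(Q[i]) for i in range(n))   # noise entropy H(Y|X)
    I = HY - HYgX                                         # I(X;Y) = H(Y) - H(Y|X)
    HXgY = HX - I                                         # doubt degree H(X|Y)
    return HX, HY, HYgX, HXgY, I
```

For an equiprobable binary source through a BSC with p = 0.1, this reproduces H(X) = 1 bit and I(X;Y) = 1 - H(0.1).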
When K = 1 it degenerates to the single-message channel, and when n = m = 2 it degenerates to the binary single-message channel. If it also satisfies symmetry, it constitutes the most commonly used BSC (Fig. 4.11, binary symmetric channel).

4.4 Channel capacity
4.4.1 The concept of channel capacity
4.4.2 Discrete memoryless channel and its channel capacity
4.4.3 Continuous channel and its channel capacity

4.4.1 The concept of channel capacity

The capacity of a channel is defined as the maximum value of the average mutual information:

  C = max_{p(x)} I(X;Y).

The unit of the channel capacity C is bit/symbol or nat/symbol. From the property mentioned before, we know that I(X;Y) is an upper convex function of the probability distribution p(x) of the input variable X. For a specific channel there always exists a source that maximizes the information of every message transmitted through the channel; that is, the maximum of I(X;Y) exists. This probability distribution p(x) is called the optimum input distribution.

4.4.2 Discrete memoryless channel and its channel capacity

Classification of the discrete message-sequence channel. For a discrete memoryless channel, according to "Property 4" of the mutual information I(X;Y) of message sequences, we have

  I(X^K; Y^K) ≤ ∑_{k=1}^{K} I(X_k; Y_k).

Note: only when the source is memoryless may the equality in this formula be satisfied.
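The capacity definition C = max_{p(x)} I(X;Y) can be approximated by a direct search over input distributions. A brute-force sketch for a two-input channel (an illustrative method, not the one the slides use):

```python
import math

def entropy(p):
    """Shannon entropy in bits of a probability vector."""
    return -sum(q * math.log2(q) for q in p if q > 0)

def mutual_info(px, Q):
    """I(X;Y) for input distribution px and transfer matrix Q[i][j] = p(b_j|a_i)."""
    m = len(Q[0])
    py = [sum(px[i] * Q[i][j] for i in range(len(px))) for j in range(m)]
    return entropy(py) - sum(px[i] * entropy(Q[i]) for i in range(len(px)))

def capacity_2input(Q, steps=10000):
    """Approximate C = max_w I(X;Y) for a 2-input channel by a grid search on w."""
    best = 0.0
    for k in range(steps + 1):
        w = k / steps
        best = max(best, mutual_info([w, 1 - w], Q))
    return best
```

For a symmetric matrix Q the search confirms the optimum input distribution is equiprobable; for general matrices an iterative scheme such as the Blahut-Arimoto algorithm converges faster, but the grid search suffices as a check.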
So we can get the following deduction, which yields the formula of the channel capacity C: for a discrete memoryless channel, C_K = max I(X^K; Y^K) = K·C with C = max_{p} I(p;Q).

Theorem of the discrete memoryless channel. Assume that the transmission probability matrix of a discrete memoryless channel is Q. The sufficient conditions under which the input letter probability distribution p* can make the mutual information I(p;Q) achieve its maximum value are

  I(x = a_k; Y) = C  for every a_k with p*(a_k) > 0,
  I(x = a_k; Y) ≤ C  for every a_k with p*(a_k) = 0,

where I(x = a_k; Y) = ∑_j p(b_j|a_k) log [ p(b_j|a_k) / p(b_j) ] is the average mutual information when source letter a_k is sent, and C is the channel capacity of this channel.

Understanding this theorem:
Firstly, under this kind of distribution, each letter whose probability is above zero provides mutual information C, and each letter whose probability is zero provides mutual information lower than or equal to C.
Secondly, only under this kind of distribution may I(p;Q) obtain the maximum value C.
Thirdly, I(X;Y) is the average of I(x = a_k; Y); that is, it satisfies the equation

  I(X;Y) = ∑_k p(a_k) I(x = a_k; Y).

(1) If we want to enhance I(X;Y), enhancing p(a_k) for letters with large I(x = a_k; Y) may be a good idea.
(2) However, once p(a_k) is enhanced, I(x = a_k; Y) may be reduced.
(3) Adjusting p(a_k) repeatedly makes all the I(x = a_k; Y) equal to C.
(4) At that point I(X;Y) = C.

The theorem only provides a sufficient condition; it does not give the concrete distribution or the value of C. But it may help to get the value of C for several kinds of channels in simple situations.

E.g. 4.4
Assume the transmission matrix of a binary discrete symmetric channel is

  [ 1-p   p  ]
  [  p   1-p ].

(1) If the input probability distribution is as given, please calculate I(X;Y).
(2) Please calculate the capacity of the channel, and the input probability distribution at which the capacity is reached.

Solution: (1) I(X;Y) = H(Y) - H(p) for the given input distribution. (2) By symmetry the capacity is reached with equiprobable inputs, and C = 1 - H(p) bit/symbol.

Application (Example 3.6): for a symmetric channel, C = log m - H_mi, where m represents the number of output symbols and H_mi is the entropy of a row vector of the channel matrix.

4.4.3 Continuous channel and its channel capacity
- Characteristic of the continuous channel
- Analog channel
- Basic knowledge of the additive channel
- Shannon formula
- Usage of the Shannon formula

Characteristic of the continuous channel:
Characteristic 1: the time is discrete, the value range is continuous.
Characteristic 2: at each moment it is a single random variable whose value is continuous.

Basic knowledge of the additive channel (Fig. 4.22): Y = X + N, where
X: channel input; N: channel noise; Y: channel output.
If two of X, Y, N are Gaussian, then the third is also Gaussian. The differential entropy of a random variable satisfying the Gaussian distribution depends only on its variance and has nothing to do with the average value.

Theorem: when a generally stationary random-process source with limited frequency band F and limited time T passes through an additive white Gaussian channel with limited noise power P_N, the channel capacity is

  C_T = F·T·log2(1 + P_S / P_N)  bits.

This is the famous Shannon formula for the continuous channel. When T = 1, the capacity is

  C = F·log2(1 + P_S / P_N)  bit/s.

Proof of the Shannon formula: assume Y = X + N, where X and N are independent random variables. Since h(Y|X) = h(N), we have I(X;Y) = h(Y) - h(N), and by the maximum-entropy theorem under limited average power, h(Y) is largest when Y is Gaussian. Due to the limited frequency band F, according to the Nyquist sampling theorem the continuous signal X(t,w) is equivalent to 2F discrete samples per second; considering a time duration T gives 2FT samples (Fig. 4.23).

Usage of the Shannon formula