Chapter 4: Channel Capacity (lecture slides)

Review: properties of the mutual information function.

Property 1: relationship between the average mutual information and the channel input probability distribution. I(X;Y) is an upper (∩) convex function of the channel input probability distribution p(x). (Sketch: I(X;Y) versus p(x).)
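As a worked form of Property 1 (a standard statement; the slide's own formula was lost in extraction), the ∩-convexity in the input distribution reads, in LaTeX:

    % For a fixed channel Q = p(y|x), any two input distributions p_1, p_2,
    % and any 0 <= \lambda <= 1, mixing inputs never loses mutual information:
    I\bigl(\lambda p_1 + (1-\lambda) p_2;\, Q\bigr)
      \;\ge\; \lambda\, I(p_1; Q) + (1-\lambda)\, I(p_2; Q)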

Review: Property 2, the relationship between the average mutual information and the channel transition probability distribution.

Property 2: I(X;Y) is a lower (∪) convex function of the channel transition probability distribution p(y|x). (Sketch: I(X;Y) versus p(y|x).)

What is a channel? The channel is a carrier transmitting messages, a passage through which the signal passes. The information is abstract, but the channel is concrete. For instance: if two people converse, the air is the channel; if the two call each other, the telephone line is the channel; if we watch television or listen to the radio, the space between the receiver and the transmitter is the channel.

4.1 The model and classification of the channel

In this part we mainly introduce two topics: channel models and channel classifications.

4.1.1 Channel models

We can treat the channel as a converter that transfers events; the channel model can be indicated as in the figure: the input X enters the channel p(y|x) and the output Y leaves it.

The Binary Symmetric Channel (BSC) is the simplest channel model; a BSC is shown in the slide's figure, with both crossover transitions having the same error probability.

DMC: we assume that the channel and the modulation are memoryless. The inputs and outputs can then be related by a set of conditional probabilities p(y_j|x_i). This channel is known as a Discrete Memoryless Channel (DMC).

4.1.2 Channel classifications

Channels can be classified into several types; the slide's chart includes, among others, tropospheric-scatter and ionospheric-scatter channels.

4.2 Channel doubt degree and average mutual information
4.2.1 Channel doubt degree
4.2.2 Average mutual information
4.2.3 Properties of the mutual information function
4.2.4 Relationship between entropy, channel doubt degree and average mutual information

4.2.1 Channel doubt degree

Assume r.v. X indicates the input set of the channel, and r.v. Y indicates the output set of the channel. The channel doubt degree (equivocation) is

    H(X|Y) = -\sum_i \sum_j p(x_i y_j) \log p(x_i | y_j).

The meaning of the channel doubt degree H(X|Y) is the average uncertainty that still remains about the source X when the receiving terminal gets the message Y. In fact, this uncertainty comes from the noise in the channel. If the average uncertainty of the source X is H(X), then on getting the output message Y we obtain more or less information that eliminates part of the uncertainty of X. So we have the following concept of average mutual information.

4.2.2 Average mutual information

Since we have H(X|Y) <= H(X), define

    I(X;Y) = H(X) - H(X|Y).

The average mutual information is the entropy of the source X minus the channel doubt degree: when the receiver gets a message Y, I(X;Y) is the average information about X it gets from every symbol received.

4.2.3 Properties of the mutual information function

Property 1: relationship between mutual information and the channel input probability distribution. I(X;Y) is an upper (∩) convex function of the channel input probability distribution p(x). This is shown in Fig. 4.5 and Fig. 4.6. (Fig. 4.5: I(X;Y) is an upper convex function of p(x). Fig. 4.6: message passing through the channel.)

E.g. 4.1 Consider a binary channel with input probability distribution [ω, 1-ω] and channel matrix

    [ 1-p    p  ]
    [  p    1-p ],

where p is the probability of transmission error. Then the mutual information is

    I(X;Y) = H(Y) - H(Y|X) = H(ω(1-p) + (1-ω)p) - H(p),

with H(.) the binary entropy function, and maximizing over ω we get

    max over ω of I(X;Y) = 1 - H(p), reached at ω = 1/2.

The average mutual information diagram is shown in Fig. 4.7 (mutual information of the binary symmetric channel). From the diagram we can see that when the input symbols satisfy the equal-probability distribution, the average mutual information I(X;Y) reaches its maximum value, and only at this time does the receiver get the largest information from every symbol received.

Property 2: relationship between mutual information and the channel transition probability distribution. I(X;Y) is a lower (∪) convex function of the channel transition probability distribution p(y|x). (Fig. 4.8: I(X;Y) is a lower convex function of p(y|x).)

E.g. 4.2 (follow-up of E.g. 4.1) For the same binary channel, when the source distribution [ω, 1-ω] is fixed, the average mutual information

    I(X;Y) = H(ω(1-p) + (1-ω)p) - H(p)

is a lower convex function of the error probability p, as the diagram shows (mutual information of a fixed binary source).

From the diagram we can see that once the binary source is fixed, when the channel property p changes we get different mutual information I(X;Y). When p = 1/2, I(X;Y) = 0: the receiver gets the least information from this channel, all the information is lost on the way, and this channel has the strongest noise.

Property 3: if the channel input is discrete and without memory, we have the inequality

    I(X;Y) >= \sum_{k=1}^{K} I(X_k; Y_k).

Property 4: if the channel is discrete and without memory, we have (remember this result)

    I(X;Y) <= \sum_{k=1}^{K} I(X_k; Y_k).

Here X = (X_1, ..., X_K) and Y = (Y_1, ..., Y_K) are the input and output sequences.

4.2.4 Relationship between entropy, channel doubt degree and average mutual information

    I(X;Y) = H(X) - H(X|Y) = H(Y) - H(Y|X) = H(X) + H(Y) - H(XY).
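As a numerical sketch of E.g. 4.1 and E.g. 4.2 (illustrative Python, not part of the slides; the names H2 and I_binary are chosen here):

    import numpy as np

    def H2(q):
        """Binary entropy in bits."""
        q = np.clip(q, 1e-12, 1 - 1e-12)
        return -q * np.log2(q) - (1 - q) * np.log2(1 - q)

    def I_binary(w, p):
        """I(X;Y) = H(Y) - H(Y|X) for the binary symmetric channel:
        input distribution [w, 1-w], transmission error probability p."""
        return H2(w * (1 - p) + (1 - w) * p) - H2(p)

    # Property 1: for fixed p, I is ∩-convex in w and maximal at w = 1/2.
    print(I_binary(0.5, 0.1), 1 - H2(0.1))   # both ~0.531 bit/symbol
    # Property 2: for fixed w, I is ∪-convex in p and zero at p = 1/2.
    print(I_binary(0.5, 0.5))                 # ~0.0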

E.g. 4.3 There is a source X. Its messages pass through a channel with noise; the symbols received by the other end of the channel form Y. Given the source's probability space and the channel's transfer matrix, please calculate:

(1) the self-information contained in each source symbol, i.e. of the events x = x_i;
(2) the information about X the receiver gets when it observes each message y_j;
(3) the entropy of the source X and of the received Y;
(4) the channel doubt degree H(X|Y) and the noise entropy H(Y|X);
(5) the average mutual information got by the receiver when it receives Y.

Solution (the numerical values depend on the given source and matrix; the quantities are computed in this order):

(1) I(x_i) = -\log p(x_i);
(2) I(x_i; y_j) = \log [ p(x_i|y_j) / p(x_i) ], with p(y_j) = \sum_i p(x_i) p(y_j|x_i);
(3) H(X) = -\sum_i p(x_i) \log p(x_i) and H(Y) = -\sum_j p(y_j) \log p(y_j);
(4) H(Y|X) = -\sum_i \sum_j p(x_i) p(y_j|x_i) \log p(y_j|x_i) and H(X|Y) = H(X) + H(Y|X) - H(Y);
(5) I(X;Y) = H(X) - H(X|Y).

4.3 Discrete channel without memory

Three groups of variables describe the channel:
(1) the channel input probability space {X, p(x)};
(2) the channel output probability space {Y, p(y)};
(3) the channel transfer probabilities p(y|x).

So the channel can be represented by {X, p(y|x), Y}. This can be indicated by the illustration on the slide, and the channel transfer matrix is

    P = [ p(b_1|a_1)  p(b_2|a_1)  ...  p(b_m|a_1) ]
        [ p(b_1|a_2)  p(b_2|a_2)  ...  p(b_m|a_2) ]
        [    ...         ...      ...     ...     ]
        [ p(b_1|a_n)  p(b_2|a_n)  ...  p(b_m|a_n) ]
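A minimal sketch of these quantities for a DMC (the input distribution and matrix below are placeholders of my own, since the slide's numbers for E.g. 4.3 were lost):

    import numpy as np

    def dmc_quantities(px, P):
        """px: input distribution (n,); P: transfer matrix (n, m), rows p(y|x).
        Returns H(X), H(Y), H(Y|X), H(X|Y), I(X;Y) in bits."""
        def H(d):
            d = d[d > 0]
            return -np.sum(d * np.log2(d))
        py = px @ P                                   # p(y_j) = sum_i p(a_i) p(b_j|a_i)
        HY_X = np.sum(px * np.array([H(row) for row in P]))  # noise entropy
        HX, HY = H(px), H(py)
        HX_Y = HX + HY_X - HY                         # channel doubt degree
        return HX, HY, HY_X, HX_Y, HX - HX_Y          # last entry is I(X;Y)

    px = np.array([0.5, 0.5])                         # placeholder source
    P = np.array([[0.9, 0.1], [0.2, 0.8]])            # placeholder channel
    print(dmc_quantities(px, P))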

When K = 1, it degenerates to the single-message channel; and when n = m = 2, it degenerates to the binary single-message channel. If it additionally satisfies symmetry, it constitutes the most commonly used BSC. (Fig. 4.11: binary message symmetric channel.)

4.4 Channel capacity
4.4.1 The concept of channel capacity
4.4.2 Discrete channel without memory and its channel capacity
4.4.3 Continuous channel and its channel capacity

4.4.1 The concept of channel capacity

The capacity of a channel is defined as the maximum value of the average mutual information,

    C = max over p(x) of I(X;Y).

The unit of the channel capacity C is bit/symbol or nat/symbol. From the property mentioned before, we know I(X;Y) is an upper (∩) convex function of the probability distribution p(x) of the input variable X. For a specific channel, there always exists a source which maximizes the information of every message transmitted through the channel; that means the maximum of I(X;Y) exists. The maximizing probability distribution p(x) is called the optimum input distribution.

4.4.2 Discrete channel without memory and its channel capacity

Classification of the discrete message-sequence channel: a discrete channel without memory satisfies the relationship

    p(y|x) = \prod_{k=1}^{K} p(y_k|x_k).

According to Property 4 of the mutual information I(X;Y) of the message sequence, for the discrete channel without memory we have

    I(X;Y) <= \sum_{k=1}^{K} I(X_k; Y_k).

Note: only when the source is without memory may the equality in this formula be satisfied.

So we can get the following deduction, which yields the formula of the channel capacity C (worked out below).
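A worked form of the deduction (standard reasoning; the slide's own derivation was lost in extraction), in LaTeX:

    % Memoryless channel (Property 4) plus memoryless source (equality case):
    I(X;Y) \le \sum_{k=1}^{K} I(X_k;Y_k) \le K \max_{p(x)} I(X_k;Y_k) = K C,
    % so per channel use the capacity is
    C = \max_{p(x)} I(X;Y) \quad \text{(bit/symbol)}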

Theorem (discrete channel without memory). Assuming that the transmission probability matrix of the discrete channel without memory is Q, the sufficient conditions under which the input letter probability distribution p* can make the mutual information I(p; Q) achieve the maximum value are

    I(x = a_k; Y) = C   for every a_k with p*(a_k) > 0,
    I(x = a_k; Y) <= C  for every a_k with p*(a_k) = 0,

where

    I(x = a_k; Y) = \sum_j p(b_j|a_k) \log [ p(b_j|a_k) / p(b_j) ]

is the average mutual information when source letter a_k is sent, and C is the channel capacity of this channel.

Understanding this theorem. Firstly, under this kind of distribution, each letter whose probability is above zero provides mutual information C, and each letter whose probability is zero provides mutual information lower than or equal to C. Secondly, only under this kind of distribution may I(p; Q) obtain the maximum value C. Thirdly, I(X;Y) is the average of I(x = a_k; Y); that is to say, it satisfies this equation:

    I(X;Y) = \sum_k p(a_k) I(x = a_k; Y).

(1) If we want to enhance I(X;Y), enhancing p(a_k) may be a good idea.
(2) However, once p(a_k) is enhanced, I(x = a_k; Y) may be reduced.
(3) So adjust the p(a_k) repeatedly, to make the I(x = a_k; Y) all equal to C.
(4) At this time I(X;Y) = C.

The theorem only provides a sufficient condition for a distribution to make I(p; Q) reach the maximum; it does not give the concrete distribution or the value of C. But it may help to get the value of C for several kinds of channels in simple situations. (The adjustment idea of steps (1)-(3) can be mechanized, as sketched below.)
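The repeated adjustment in steps (1)-(3) is essentially the Blahut-Arimoto algorithm; the slides do not name or prescribe it, so the following Python sketch is only an illustration of the idea:

    import numpy as np

    def blahut_arimoto(Q, iters=200):
        """Capacity (bit/symbol) of a DMC with transition matrix Q (n, m),
        rows Q[k] = p(y | x = a_k). Re-weights p(a_k) until every used
        letter contributes the same per-letter mutual information C."""
        n, _ = Q.shape
        p = np.full(n, 1.0 / n)            # start from the uniform input
        for _ in range(iters):
            py = p @ Q                     # current output distribution
            # D[k] = I(x = a_k; Y) in nats, the per-letter mutual information
            with np.errstate(divide="ignore", invalid="ignore"):
                logratio = np.where(Q > 0, np.log(Q / py), 0.0)
            D = np.sum(Q * logratio, axis=1)
            w = p * np.exp(D)              # letters above average gain weight
            p = w / w.sum()
        C_nats = np.log(w.sum())           # converges to max I(X;Y)
        return C_nats / np.log(2), p

    Q = np.array([[0.9, 0.1], [0.1, 0.9]])     # BSC with p = 0.1
    C, p_star = blahut_arimoto(Q)
    print(C, p_star)                           # ~0.531 bit/symbol, p* = [0.5, 0.5]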

E.g. 4.4 Assume the transmission matrix of a binary discrete symmetric channel has diagonal entries 1 - p and off-diagonal entries p.

(1) If the input probability distribution is [ω, 1 - ω], please calculate I(X;Y).
(2) Please calculate the capacity of the channel, and the input probability distribution when reaching the capacity of the channel.

Solution:
(1) As in E.g. 4.1, I(X;Y) = H(ω(1-p) + (1-ω)p) - H(p);
(2) C = max over ω of I(X;Y) = 1 - H(p) bit/symbol, reached at the equiprobable distribution ω = 1/2.

Application Example 3.6. For a symmetric channel the capacity can be written directly as

    C = \log m - H_mi,

where m represents the size of the output symbol set and H_mi is the entropy of a row vector of the channel matrix (all rows of a symmetric channel matrix have the same entropy). A numerical check is sketched below.
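A quick check of the symmetric-channel formula against E.g. 4.4 (illustrative values; symmetric_capacity is a name chosen here, not the slides'):

    import numpy as np

    def symmetric_capacity(Q):
        """C = log2(m) - H(row) in bit/symbol, for a symmetric channel
        matrix Q whose rows are permutations of one another."""
        m = Q.shape[1]
        row = Q[0][Q[0] > 0]
        H_row = -np.sum(row * np.log2(row))
        return np.log2(m) - H_row

    # BSC with p = 0.1: C = 1 - H2(0.1), about 0.531 bit/symbol
    print(symmetric_capacity(np.array([[0.9, 0.1], [0.1, 0.9]])))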

4.4.3 Continuous channel and its channel capacity

Characteristic of the continuous channel
Analog channel
Basic knowledge of the additive channel
Shannon formula
Usage of the Shannon formula

Characteristic of the continuous channel.
Characteristic 1: the time is discrete, while the value range is continuous.
Characteristic 2: at each moment, it is a single random variable whose value is continuous.

Basic knowledge of the additive channel (Fig. 4.22: additive channel). The channel is

    Y = X + N,

where X is the channel input, N is the channel noise, and Y is the channel output. If two of X, Y, N are Gaussian distributions, then the other is also a Gaussian distribution. The differential entropy of a r.v. satisfying the Gaussian distribution only concerns its variance and has nothing to do with the average value:

    h = (1/2) \log (2 \pi e \sigma^2).

Theorem. When a generally stationary random-process source with limited frequency (F) and time (T) passes through a white Gaussian channel which has limited noise power (P_N), the channel capacity is

    C_T = F T \log_2 (1 + P_S / P_N)  bit.

This is the famous Shannon formula for the continuous channel. When T = 1, the capacity is

    C = F \log_2 (1 + P_S / P_N)  bit/s.

Proof of the Shannon formula. Assume Y = X + N, where X and N are independent r.v.'s, so that h(Y|X) = h(N) and

    I(X;Y) = h(Y) - h(Y|X) = h(Y) - h(N).

Since, by the biggest-entropy theorem of limited average power (the Gaussian distribution maximizes differential entropy under a power constraint),

    h(Y) <= (1/2) \log 2\pi e (P_S + P_N)   and   h(N) = (1/2) \log 2\pi e P_N,

we have, per sample,

    I(X;Y) <= (1/2) \log_2 (1 + P_S / P_N)  bit.

Due to the limited frequency F, and according to the Nyquist sample theorem, the continuous signal X(t, ω) can be equivalent to 2F discrete samples per second. That is,

    C = 2F * (1/2) \log_2 (1 + P_S / P_N) = F \log_2 (1 + P_S / P_N)  bit/s,

and considering the time duration T,

    C_T = F T \log_2 (1 + P_S / P_N)  bit.   (Fig. 4.23: Shannon formula.)

Usage of the Shannon formula.
