




These are the companion solutions to the English original edition of the textbook; the problem ordering differs from the translated Chinese edition, but the content is the same (the Chinese translation adds extra problems).

2.2 Entropy of functions. Let $X$ be a random variable taking on a finite number of values. What is the (general) inequality relationship of $H(X)$ and $H(Y)$ if (a) $Y = 2^X$? (b) $Y = \cos X$?

Solution: Let $y = g(x)$. Then
$$p(y) = \sum_{x:\,g(x)=y} p(x).$$
Consider any set of $x$'s that map onto a single $y$. For this set,
$$\sum_{x:\,g(x)=y} p(x)\log p(x) \le \sum_{x:\,g(x)=y} p(x)\log p(y) = p(y)\log p(y),$$
since $\log$ is a monotone increasing function and $p(x) \le \sum_{x:\,g(x)=y} p(x) = p(y)$. Extending this argument to the entire range of $X$ (and $Y$), we obtain
$$H(X) = -\sum_x p(x)\log p(x) \ge -\sum_y p(y)\log p(y) = H(Y),$$
with equality iff $g$ is one-to-one with probability one.
(a) $Y = 2^X$ is one-to-one, and hence the entropy, which is just a function of the probabilities, does not change, i.e., $H(X) = H(Y)$.
(b) $Y = \cos X$ is not necessarily one-to-one. Hence all that we can say is that $H(X) \ge H(Y)$, with equality if cosine is one-to-one on the range of $X$.

2.16 Example of joint entropy. Let $p(x,y)$ be given by

             y = 0    y = 1
   x = 0      1/3      1/3
   x = 1       0       1/3

Find
(a) $H(X)$, $H(Y)$.
(b) $H(X|Y)$, $H(Y|X)$.
(c) $H(X,Y)$.
(d) $H(Y) - H(Y|X)$.
(e) $I(X;Y)$.
(f) Draw a Venn diagram for the quantities in (a) through (e).

Solution:
(a) $H(X) = \frac{2}{3}\log\frac{3}{2} + \frac{1}{3}\log 3 = 0.918$ bits; by symmetry $H(Y) = 0.918$ bits as well.
(b) $H(X|Y) = \frac{1}{3}H(X|Y=0) + \frac{2}{3}H(X|Y=1) = \frac{1}{3}(0) + \frac{2}{3}(1) = 0.667$ bits; likewise $H(Y|X) = 0.667$ bits.
(c) $H(X,Y) = 3 \times \frac{1}{3}\log 3 = \log 3 = 1.585$ bits.
(d) $H(Y) - H(Y|X) = 0.918 - 0.667 = 0.251$ bits.
(e) $I(X;Y) = H(Y) - H(Y|X) = 0.251$ bits.
(f) See Figure 1.

Fig. 1. Venn diagram relating $H(X)$, $H(Y)$, $H(X|Y)$, $H(Y|X)$, $H(X,Y)$, and $I(X;Y)$. (Figure not reproduced here.)

2.29 Inequalities. Let $X$, $Y$ and $Z$ be joint random variables. Prove the following inequalities and find conditions for equality.
(a) $H(X,Y|Z) \ge H(X|Z)$.
(b) $I(X,Y;Z) \ge I(X;Z)$.
(c) $H(X,Y,Z) - H(X,Y) \le H(X,Z) - H(X)$.
(d) $I(X;Z|Y) \ge I(Z;Y|X) - I(Z;Y) + I(X;Z)$.

Solution:
(a) Using the chain rule for conditional entropy,
$$H(X,Y|Z) = H(X|Z) + H(Y|X,Z) \ge H(X|Z),$$
with equality iff $H(Y|X,Z) = 0$, that is, when $Y$ is a function of $X$ and $Z$.
(b) Using the chain rule for mutual information,
$$I(X,Y;Z) = I(X;Z) + I(Y;Z|X) \ge I(X;Z),$$
with equality iff $I(Y;Z|X) = 0$, that is, when $Y$ and $Z$ are conditionally independent given $X$.
(c) Using first the chain rule for entropy and then the definition of conditional mutual information,
$$H(X,Y,Z) - H(X,Y) = H(Z|X,Y) \le H(Z|X) = H(X,Z) - H(X),$$
with equality iff $I(Y;Z|X) = 0$, that is, when $Y$ and $Z$ are conditionally independent given $X$.
(d) Using the chain rule for mutual information both ways,
$$I(X,Y;Z) = I(X;Z) + I(Y;Z|X) = I(Y;Z) + I(X;Z|Y),$$
so $I(X;Z|Y) = I(X;Z) + I(Y;Z|X) - I(Y;Z)$, and therefore this inequality is actually an equality in all cases.

4.5 Entropy rates of Markov chains.
(a) Find the entropy rate of the two-state Markov chain with transition matrix
$$P = \begin{bmatrix} 1-p_{01} & p_{01} \\ p_{10} & 1-p_{10} \end{bmatrix}.$$
(b) What values of $p_{01}$, $p_{10}$ maximize the rate of part (a)?
(c) Find the entropy rate of the two-state Markov chain with transition matrix
$$P = \begin{bmatrix} 1-p & p \\ 1 & 0 \end{bmatrix}.$$
(d) Find the maximum value of the entropy rate of the Markov chain of part (c). We expect that the maximizing value of $p$ should be less than $1/2$, since the 0 state permits more information to be generated than the 1 state.

Solution:
(a) The stationary distribution is easily calculated:
$$\mu_0 = \frac{p_{10}}{p_{01}+p_{10}}, \qquad \mu_1 = \frac{p_{01}}{p_{01}+p_{10}}.$$
Therefore the entropy rate is
$$H(X_2|X_1) = \mu_0 H(p_{01}) + \mu_1 H(p_{10}) = \frac{p_{10}H(p_{01}) + p_{01}H(p_{10})}{p_{01}+p_{10}}.$$
(b) The entropy rate is at most 1 bit because the process has only two states. This rate can be achieved if (and only if) $p_{01} = p_{10} = 1/2$, in which case the process is actually i.i.d. with $\Pr(X_i = 0) = \Pr(X_i = 1) = 1/2$.
(c) As a special case of the general two-state Markov chain, the stationary distribution is $\mu_0 = \frac{1}{1+p}$, $\mu_1 = \frac{p}{1+p}$, and since $H(1) = 0$ the entropy rate is
$$H(X_2|X_1) = \mu_0 H(p) + \mu_1 \cdot 0 = \frac{H(p)}{1+p}.$$
(d) By straightforward calculus (setting $\frac{d}{dp}\frac{H(p)}{1+p} = 0$), we find that the maximum value of the entropy rate of part (c) occurs at $p^* = (3-\sqrt{5})/2 = 0.382$, which is indeed less than $1/2$. The maximum value is
$$\frac{H(p^*)}{1+p^*} = \log_2\frac{1+\sqrt{5}}{2} = 0.694 \text{ bits},$$
the logarithm of the golden ratio.

5.4 Huffman coding. Consider the random variable

  X:     x1     x2     x3     x4     x5     x6     x7
  p(x):  0.49   0.26   0.12   0.04   0.04   0.03   0.02

(a) Find a binary Huffman code for $X$.
(b) Find the expected code length for this encoding.
(c) Find a ternary Huffman code for $X$.

Solution:
(a) Applying the Huffman algorithm (repeatedly merging the two least probable nodes) gives codeword lengths $(1, 2, 3, 5, 5, 5, 5)$. One valid assignment is

  x1: 1    x2: 00    x3: 011    x4: 01000    x5: 01001    x6: 01010    x7: 01011

(b) The expected length of the codewords for the binary Huffman code is
$$L = 0.49(1) + 0.26(2) + 0.12(3) + (0.04+0.04+0.03+0.02)(5) = 2.02 \text{ bits}.$$
(Compare the entropy, $H(X) = 2.01$ bits.)
(c) For the ternary Huffman code we merge three nodes at a time; no dummy symbol is needed since $7 - 1$ is divisible by $3 - 1$. The resulting lengths are $(1, 1, 2, 2, 3, 3, 3)$, e.g.

  x1: 0    x2: 1    x3: 20    x4: 21    x5: 220    x6: 221    x7: 222

with expected length $0.75(1) + 0.16(2) + 0.09(3) = 1.34$ ternary digits.

5.9 Optimal code lengths that require one bit above entropy. The source coding theorem shows that the optimal code for a random variable $X$ has an expected length less than $H(X) + 1$. Give an example of a random variable for which the expected length of the optimal code is close to $H(X) + 1$, i.e., for any $\epsilon > 0$, construct a distribution for which the optimal code has $L > H(X) + 1 - \epsilon$.

Solution: There is a trivial example that requires almost 1 bit above its entropy. Let $X$ be a binary random variable with probability of $X = 1$ close to 1. Then the entropy of $X$ is close to 0, but the length of its optimal code is 1 bit, which is almost 1 bit above its entropy.

5.25 Shannon code. Consider the following method for generating a code for a random variable $X$ which takes on $m$ values $\{1, 2, \dots, m\}$ with probabilities $p_1, p_2, \dots, p_m$. Assume that the probabilities are ordered so that $p_1 \ge p_2 \ge \cdots \ge p_m$. Define
$$F_i = \sum_{k=1}^{i-1} p_k,$$
the sum of the probabilities of all symbols less than $i$. Then the codeword for $i$ is the number $F_i \in [0,1)$ rounded off to $l_i$ bits, where $l_i = \lceil \log \frac{1}{p_i} \rceil$.
(a) Show that the code constructed by this process is prefix-free and the average length satisfies $H(X) \le L < H(X) + 1$.
(b) Construct the code for the probability distribution (0.5, 0.25, 0.125, 0.125).

Solution:
(a) Since $l_i = \lceil \log \frac{1}{p_i} \rceil$, we have
$$\log\frac{1}{p_i} \le l_i < \log\frac{1}{p_i} + 1,$$
which implies that
$$H(X) \le L = \sum_i p_i l_i < H(X) + 1.$$
By the choice of $l_i$, we have $2^{-l_i} \le p_i < 2^{-(l_i-1)}$. Thus $F_j$, $j > i$, differs from $F_i$ by at least $2^{-l_i}$, and will therefore differ from $F_i$ in at least one place in the first $l_i$ bits of the binary expansion of $F_i$. Thus the codeword for $j$, $j > i$, which has length $l_j \ge l_i$, differs from the codeword for $i$ at least once in the first $l_i$ places. Thus no codeword is a prefix of any other codeword.
(b) We build the following table:

  Symbol  Probability  F_i (decimal)  F_i (binary)  l_i  Codeword
  1       0.5          0.0            0.0           1    0
  2       0.25         0.5            0.10          2    10
  3       0.125        0.75           0.110         3    110
  4       0.125        0.875          0.111         3    111
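As a quick numerical cross-check of Problem 2.16 above, here is a minimal Python sketch (not part of the original solutions manual) that computes every quantity in parts (a) through (e) directly from the joint pmf:

```python
import numpy as np

# Joint pmf from Problem 2.16: rows are x = 0, 1; columns are y = 0, 1.
p = np.array([[1/3, 1/3],
              [0.0, 1/3]])

def H(probs):
    """Entropy in bits; zero-probability entries contribute nothing."""
    probs = probs[probs > 0]
    return -np.sum(probs * np.log2(probs))

px, py = p.sum(axis=1), p.sum(axis=0)        # marginals of X and Y
Hx, Hy, Hxy = H(px), H(py), H(p.ravel())
print(f"H(X)   = {Hx:.3f} bits")             # 0.918
print(f"H(Y)   = {Hy:.3f} bits")             # 0.918
print(f"H(X|Y) = {Hxy - Hy:.3f} bits")       # 0.667
print(f"H(Y|X) = {Hxy - Hx:.3f} bits")       # 0.667
print(f"H(X,Y) = {Hxy:.3f} bits")            # 1.585 = log2(3)
print(f"I(X;Y) = {Hx + Hy - Hxy:.3f} bits")  # 0.251
```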
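The calculus in Problem 4.5(d) can be sanity-checked numerically. The sketch below evaluates the part-(c) rate $H(p)/(1+p)$ on a fine grid and compares the grid maximizer against the closed-form $p^* = (3-\sqrt{5})/2$:

```python
import numpy as np

# Entropy rate of the part-(c) chain as a function of p: H(p) / (1 + p).
ps = np.linspace(1e-6, 1 - 1e-6, 1_000_001)
rates = -(ps * np.log2(ps) + (1 - ps) * np.log2(1 - ps)) / (1 + ps)

i = np.argmax(rates)
print(f"grid maximum: p = {ps[i]:.4f}, rate = {rates[i]:.4f} bits")
print(f"closed form:  p = {(3 - np.sqrt(5))/2:.4f}, "
      f"rate = {np.log2((1 + np.sqrt(5))/2):.4f} bits")
```

Both lines report $p \approx 0.3820$ and a rate of $\approx 0.6942$ bits, matching $\log_2$ of the golden ratio.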
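For Problem 5.4, the binary Huffman construction is easy to reproduce with a priority queue. This is an illustrative sketch, not the manual's tree: the bit patterns it prints may differ from the assignment shown above (a Huffman code is unique only up to relabeling of branches), but the expected length of 2.02 bits is the same.

```python
import heapq
from itertools import count

def huffman(pmf):
    """Binary Huffman code for a dict {symbol: probability}."""
    tie = count()  # tie-breaker so the heap never compares dicts
    heap = [(p, next(tie), {s: ""}) for s, p in pmf.items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        p0, _, c0 = heapq.heappop(heap)  # two least probable subtrees
        p1, _, c1 = heapq.heappop(heap)
        merged = {s: "0" + w for s, w in c0.items()}
        merged.update({s: "1" + w for s, w in c1.items()})
        heapq.heappush(heap, (p0 + p1, next(tie), merged))
    return heap[0][2]

pmf = {"x1": 0.49, "x2": 0.26, "x3": 0.12, "x4": 0.04,
       "x5": 0.04, "x6": 0.03, "x7": 0.02}
code = huffman(pmf)
print(code)
L = sum(pmf[s] * len(w) for s, w in code.items())
print(f"L = {L:.2f} bits")  # 2.02
```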
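The table in Problem 5.25(b) can likewise be generated mechanically. A small sketch of the construction described in the problem (cumulative $F_i$ truncated to $\lceil \log_2(1/p_i) \rceil$ bits):

```python
from math import ceil, log2

def shannon_code(probs):
    """Shannon code for probabilities sorted in decreasing order."""
    codewords, F = [], 0.0
    for p in probs:
        l = ceil(log2(1 / p))   # l_i = ceil(log 1/p_i)
        word, frac = "", F
        for _ in range(l):      # first l bits of F's binary expansion
            frac *= 2
            bit, frac = divmod(frac, 1)
            word += str(int(bit))
        codewords.append(word)
        F += p                  # F_{i+1} = F_i + p_i
    return codewords

print(shannon_code([0.5, 0.25, 0.125, 0.125]))  # ['0', '10', '110', '111']
```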
3.5 AEP. Let $X_1, X_2, \dots$ be independent identically distributed random variables drawn according to the probability mass function $p(x)$, $x \in \{1, 2, \dots, m\}$. Thus $p(x_1, \dots, x_n) = \prod_{i=1}^n p(x_i)$. We know that $-\frac{1}{n}\log p(X_1, \dots, X_n) \to H(X)$ in probability. Let $q(x_1, \dots, x_n) = \prod_{i=1}^n q(x_i)$, where $q$ is another probability mass function on $\{1, 2, \dots, m\}$.
(a) Evaluate $\lim -\frac{1}{n}\log q(X_1, \dots, X_n)$, where $X_1, X_2, \dots$ are i.i.d. $\sim p(x)$.

Solution: Since the $X_i$ are i.i.d., so are $\log q(X_1), \log q(X_2), \dots$, and hence we can apply the strong law of large numbers to obtain
$$\lim -\frac{1}{n}\log q(X_1, \dots, X_n) = \lim -\frac{1}{n}\sum_i \log q(X_i) = -E[\log q(X)] \;\text{ w.p. 1} = \sum_x p(x)\log\frac{1}{q(x)} = H(p) + D(p\|q).$$

8.1 Preprocessing the output. One is given a communication channel with transition probabilities $p(y|x)$ and channel capacity $C = \max_{p(x)} I(X;Y)$. A helpful statistician preprocesses the output by forming $\tilde{Y} = g(Y)$. He claims that this will strictly improve the capacity.
(a) Show that he is wrong.
(b) Under what conditions does he not strictly decrease the capacity?

Solution:
(a) The statistician calculates $\tilde{Y} = g(Y)$. Since $X \to Y \to \tilde{Y}$ forms a Markov chain, we can apply the data processing inequality. Hence for every distribution on $X$,
$$I(X; \tilde{Y}) \le I(X; Y).$$
Let $\tilde{p}(x)$ be the distribution on $X$ that maximizes $I(X; \tilde{Y})$. Then
$$\tilde{C} = \max_{p(x)} I(X; \tilde{Y}) = I(X; \tilde{Y})\big|_{\tilde{p}(x)} \le I(X; Y)\big|_{\tilde{p}(x)} \le \max_{p(x)} I(X; Y) = C.$$
Thus, the statistician is wrong and processing the output does not increase capacity.
(b) We have equality in the above sequence of inequalities only if we have equality in the data processing inequality, i.e., for the distribution that maximizes $I(X; \tilde{Y})$, we have $X \to \tilde{Y} \to Y$ forming a Markov chain.

8.3 An additive noise channel. Find the channel capacity of the following discrete memoryless channel: $Y = X + Z$, where
$$\Pr\{Z = 0\} = \Pr\{Z = a\} = \tfrac{1}{2}.$$
The alphabet for $X$ is $\{0, 1\}$. Assume that $Z$ is independent of $X$. Observe that the channel capacity depends on the value of $a$.

Solution: A sum channel. We have to distinguish various cases depending on the value of $a$.
Case $a = 0$: In this case $Y = X$, and $\max I(X;Y) = \max H(X) = 1$. Hence the capacity is 1 bit per transmission.
Case $a \ne 0, \pm 1$: In this case $Y$ has four possible values, $0, 1, a, 1+a$. Knowing $Y$, we know the $X$ which was sent, and hence $H(X|Y) = 0$. Hence the capacity is also 1 bit per transmission.
Case $a = 1$: In this case $Y$ has three possible output values, $0, 1, 2$, and the channel is identical to the binary erasure channel with erasure probability $1/2$. The capacity of this channel is $1 - 1/2 = 1/2$ bit per transmission.
Case $a = -1$: This is similar to the case $a = 1$, and the capacity is also 1/2 bit per transmission.

8.5 Channel capacity. Consider the discrete memoryless channel $Y = X + Z \pmod{11}$, where
$$Z = \begin{pmatrix} 1 & 2 & 3 \\ 1/3 & 1/3 & 1/3 \end{pmatrix}$$
and $X \in \{0, 1, \dots, 10\}$. Assume that $Z$ is independent of $X$.
(a) Find the capacity.
(b) What is the maximizing $p^*(x)$?

Solution: The capacity of the channel is
$$C = \max_{p(x)} I(X;Y) = \max_{p(x)} \big(H(Y) - H(Y|X)\big) = \max_{p(x)} H(Y) - H(Z) = \log 11 - \log 3,$$
which is obtained when $Y$ has a uniform distribution, which occurs when $X$ has a uniform distribution.
(a) The capacity of the channel is $C = \log_2\frac{11}{3} \approx 1.87$ bits/transmission.
(b) The capacity is achieved by a uniform distribution on the inputs: $p(X = i) = 1/11$ for $i = 0, 1, \dots, 10$.

8.12 Time-varying channels. Consider a time-varying discrete memoryless channel. Let $Y_1, Y_2, \dots, Y_n$ be conditionally independent given $X_1, X_2, \dots, X_n$, with conditional distribution given by $p(y|x) = \prod_{i=1}^n p_i(y_i|x_i)$, where each $p_i(y_i|x_i)$ is a binary symmetric channel with crossover probability $p_i$. Let $X = (X_1, \dots, X_n)$, $Y = (Y_1, \dots, Y_n)$. Find $\max_{p(x)} I(X;Y)$.

Solution:
$$I(X;Y) = H(Y) - H(Y|X) = H(Y) - \sum_i H(Y_i|X_i) \le \sum_i H(Y_i) - \sum_i H(p_i) \le \sum_{i=1}^n \big(1 - H(p_i)\big),$$
with equality if $X_1, \dots, X_n$ are chosen i.i.d. uniform on $\{0,1\}$, which makes each $Y_i$ uniform as well. Hence
$$\max_{p(x)} I(X;Y) = \sum_{i=1}^n \big(1 - H(p_i)\big).$$
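A simulation makes the limit in Problem 3.5 concrete. In this sketch the pmfs p and q are arbitrary illustrative choices (they are not from the original problem); the empirical average of $-\frac{1}{n}\log q(X^n)$ should settle near $H(p) + D(p\|q)$:

```python
import numpy as np

rng = np.random.default_rng(0)
p = np.array([0.5, 0.3, 0.2])   # true source pmf (illustrative values)
q = np.array([0.2, 0.3, 0.5])   # mismatched pmf (illustrative values)

n = 500_000
x = rng.choice(len(p), size=n, p=p)
empirical = -np.mean(np.log2(q[x]))           # -(1/n) log q(X_1, ..., X_n)
Hp = -np.sum(p * np.log2(p))                  # H(p)
Dpq = np.sum(p * np.log2(p / q))              # D(p||q)
print(f"empirical    : {empirical:.4f} bits")
print(f"H(p)+D(p||q) : {Hp + Dpq:.4f} bits")  # the two nearly agree
```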
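The capacity claims in Problems 8.3 and 8.5 can be confirmed by brute force, as a complement to the analytical arguments above: sweep the input distribution for the erasure case of 8.3, and evaluate the uniform input for 8.5. This is an illustrative sketch, not the manual's method:

```python
import numpy as np

def mutual_information(px, W):
    """I(X;Y) in bits for input pmf px and channel matrix W[x, y]."""
    py = px @ W
    safeW = np.where(W > 0, W, 1.0)  # placeholder where W = 0
    logratio = np.where(W > 0, np.log2(safeW / py), 0.0)
    return float(np.sum(px[:, None] * W * logratio))

# Problem 8.3 with a = 1: Y = X + Z, Z uniform on {0, 1}; outputs 0, 1, 2.
W = np.array([[0.5, 0.5, 0.0],    # x = 0
              [0.0, 0.5, 0.5]])   # x = 1
qs = np.linspace(0.001, 0.999, 999)
I = [mutual_information(np.array([1 - q, q]), W) for q in qs]
k = int(np.argmax(I))
print(f"8.3 (a=1): C ~ {I[k]:.4f} bits at Pr(X=1) ~ {qs[k]:.3f}")  # 0.5 at 1/2

# Problem 8.5: Y = X + Z (mod 11), Z uniform on {1, 2, 3}.
m = 11
W11 = np.zeros((m, m))
for x in range(m):
    for z in (1, 2, 3):
        W11[x, (x + z) % m] = 1 / 3
uniform = np.full(m, 1 / m)
print(f"8.5: I at uniform input = {mutual_information(uniform, W11):.4f} bits,"
      f" log2(11/3) = {np.log2(11 / 3):.4f}")
```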
10.2 A channel with two independent looks at $Y$. Let $Y_1$ and $Y_2$ be conditionally independent and conditionally identically distributed given $X$.
(a) Show that $I(X; Y_1, Y_2) = 2I(X; Y_1) - I(Y_1; Y_2)$.
(b) Conclude that the capacity of the channel $X \to (Y_1, Y_2)$ is less than twice the capacity of the channel $X \to Y_1$.

Solution:
(a) Using conditional independence of $Y_1$ and $Y_2$ given $X$,
$$I(X; Y_1, Y_2) = H(Y_1, Y_2) - H(Y_1, Y_2 | X) = H(Y_1) + H(Y_2) - I(Y_1; Y_2) - H(Y_1|X) - H(Y_2|X)$$
$$= I(X; Y_1) + I(X; Y_2) - I(Y_1; Y_2) = 2I(X; Y_1) - I(Y_1; Y_2),$$
where the last step uses the fact that $Y_1$ and $Y_2$ are conditionally identically distributed given $X$.
(b) The capacity of the single-look channel $X \to Y_1$ is $C_1 = \max_{p(x)} I(X; Y_1)$. The capacity of the channel $X \to (Y_1, Y_2)$ is
$$C_2 = \max_{p(x)} I(X; Y_1, Y_2) = \max_{p(x)} \big[2I(X; Y_1) - I(Y_1; Y_2)\big] \le \max_{p(x)} 2I(X; Y_1) = 2C_1.$$

10.3 The two-look Gaussian channel. Consider the ordinary Shannon Gaussian channel with two correlated looks at $X$, i.e., $Y = (Y_1, Y_2)$, where
$$Y_1 = X + Z_1, \qquad Y_2 = X + Z_2,$$
with a power constraint $P$ on $X$, and $(Z_1, Z_2) \sim \mathcal{N}_2(0, K)$, where
$$K = \begin{bmatrix} N & \rho N \\ \rho N & N \end{bmatrix}.$$
Find the capacity $C$ for
(a) $\rho = 1$
(b) $\rho = 0$
(c) $\rho = -1$

Solution: It is clear that the input distribution that maximizes the capacity is $X \sim \mathcal{N}(0, P)$. Evaluating the mutual information for this distribution,
$$C = \max I(X; Y_1, Y_2) = h(Y_1, Y_2) - h(Y_1, Y_2 | X) = h(Y_1, Y_2) - h(Z_1, Z_2).$$
Now since
$$(Z_1, Z_2) \sim \mathcal{N}\!\left(0, \begin{bmatrix} N & \rho N \\ \rho N & N \end{bmatrix}\right),$$
we have
$$h(Z_1, Z_2) = \frac{1}{2}\log(2\pi e)^2 |K_Z| = \frac{1}{2}\log(2\pi e)^2 N^2(1-\rho^2).$$
Since $Y_1 = X + Z_1$ and $Y_2 = X + Z_2$, we have
$$(Y_1, Y_2) \sim \mathcal{N}\!\left(0, \begin{bmatrix} P+N & P+\rho N \\ P+\rho N & P+N \end{bmatrix}\right),$$
and
$$h(Y_1, Y_2) = \frac{1}{2}\log(2\pi e)^2 |K_Y| = \frac{1}{2}\log(2\pi e)^2 \big((P+N)^2 - (P+\rho N)^2\big).$$
Hence
$$C = \frac{1}{2}\log\frac{(P+N)^2 - (P+\rho N)^2}{N^2(1-\rho^2)} = \frac{1}{2}\log\left(1 + \frac{2P}{N(1+\rho)}\right).$$
(a) $\rho = 1$. In this case, $C = \frac{1}{2}\log\left(1 + \frac{P}{N}\right)$, which is the capacity of a single-look channel.
(b) $\rho = 0$. In this case, $C = \frac{1}{2}\log\left(1 + \frac{2P}{N}\right)$, which corresponds to using twice the power in a single-look channel.
(c) $\rho = -1$. In this case, $C = \infty$: since $Z_1 = -Z_2$, averaging the two looks gives $(Y_1 + Y_2)/2 = X$ exactly, so the noise cancels and $X$ can be recovered perfectly.
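Finally, the closed form for Problem 10.3 can be checked against the determinant expression directly. P and N below are illustrative values, and since $\rho = \pm 1$ makes $K_Z$ singular, the sketch approaches those endpoints from inside $(-1, 1)$:

```python
import numpy as np

def two_look_capacity(P, N, rho):
    """C = (1/2) log2(|K_Y| / |K_Z|) for the two-look Gaussian channel."""
    KZ = np.array([[N, rho * N], [rho * N, N]])
    KY = np.array([[P + N, P + rho * N], [P + rho * N, P + N]])
    return 0.5 * np.log2(np.linalg.det(KY) / np.linalg.det(KZ))

P, N = 1.0, 1.0  # illustrative power and noise levels
for rho in (0.999999, 0.0, -0.999999):
    closed = 0.5 * np.log2(1 + 2 * P / (N * (1 + rho)))
    print(f"rho = {rho:+.6f}: det form = {two_look_capacity(P, N, rho):.4f}, "
          f"closed form = {closed:.4f} bits")
# rho -> 1 recovers (1/2) log2(1 + P/N); rho -> -1 diverges (infinite capacity)
```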