Decoding of Linear Block Codes

Hard Decision Decoding of Linear Block Codes

Consider an (n, k, d) linear block code C used over a binary symmetric channel (BSC): a message m_i is encoded as a codeword C = (C_{n-1}, ..., C_0), sent over the BSC, and the received vector y is fed to the channel decoder. The crossover (error) probability of the BSC is p; assume p < 1/2 and that all transmitted messages are equally likely. We want to design an optimal decoder and analyze its performance.

Because of noise, the received vector y = (y_{n-1}, ..., y_0) may differ from the transmitted codeword C. The difference vector e = y - C = (e_{n-1}, ..., e_0) is called an error vector; e_j = 0 with probability 1 - p and e_j = 1 with probability p. Since there are 2^k possible error vectors consistent with y (one per codeword), the decoder cannot decide with certainty which codeword was actually transmitted.
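A minimal sketch of this channel model in Python (the function name `bsc` and the toy codeword are illustrative, not from the slides): each bit of the transmitted codeword is flipped independently with probability p, and the error vector is the mod-2 difference between y and C.

```python
import random

def bsc(codeword, p, rng=random.Random(0)):
    """Binary symmetric channel: flip each bit independently with probability p."""
    return [bit ^ (rng.random() < p) for bit in codeword]

c = [1, 0, 1, 1, 0, 0, 1]               # some transmitted codeword
y = bsc(c, p=0.1)                       # received vector
e = [ci ^ yi for ci, yi in zip(c, y)]   # error vector e = y - C (mod 2)
```

With p = 0 the channel is transparent; with p close to 1/2 the output carries almost no information about the input.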

Since the transmitted messages are equally likely, maximizing the a posteriori probability P(C_i | y) is equivalent to maximizing the likelihood

  P(y | C_i) = p^{d_H(y, C_i)} (1 - p)^{n - d_H(y, C_i)},

where d_H(y, C_i) is the Hamming distance between y and C_i. Because p < 1/2, this likelihood decreases as d_H(y, C_i) grows.

Minimum Hamming Distance Decoding Rule

The optimal decision rule now becomes the minimum Hamming distance decision rule.

The optimal decoder decodes y as the nearest codeword: Ĉ = argmin_{C_i ∈ C} d_H(y, C_i). In other words, the decoder picks the error vector ê of least weight among those consistent with y and forms the estimate Ĉ = y + ê.

Error Detection Capability

Let C be a block code with length n and minimum Hamming distance d.
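As a sketch, the rule can be implemented by exhaustive search over the codebook; here we use the (5,2) code {00000, 01011, 10101, 11110} that appears in the standard-array example later in these slides.

```python
def hamming_distance(u, v):
    """Number of positions in which u and v differ."""
    return sum(a != b for a, b in zip(u, v))

def nearest_codeword(y, codebook):
    """Minimum Hamming distance decoding: optimal for a BSC with p < 1/2."""
    return min(codebook, key=lambda c: hamming_distance(y, c))

# The (5,2) code {00000, 01011, 10101, 11110}, d_min = 3.
codebook = [(0,0,0,0,0), (0,1,0,1,1), (1,0,1,0,1), (1,1,1,1,0)]
```

A single-bit corruption of 01011, say y = 01001, is at distance 1 from 01011 and at least 2 from every other codeword, so it decodes back correctly.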

Suppose C is used only for error detection, i.e., the receiver merely tests whether the received vector is a codeword. If it is not, the receiver declares that an error has occurred. The code C can detect up to d - 1 errors. When more than d - 1 errors occur, the receiver may be fooled, since it is possible for an error vector of weight d to transform one codeword into another codeword.

Thus C is capable of detecting up to d - 1 errors.

Error Correction Capability

We associate with each codeword C_i a ball of radius t centered at C_i, where t = ⌊(d - 1)/2⌋. When C is used only for error correction, it can correct up to t errors.

(Figure: disjoint balls of radius t around the codewords C_1, C_2, C_3, ..., C_i.) All these balls are non-overlapping. If C_i is transmitted and t or fewer errors occur, then y lies inside the ball centered at C_i and is closer to C_i than to any other codeword, so nearest-neighbor decoding corrects these errors. On the other hand, if more than t errors occur, y may be closer to some other codeword, and in that case the decoder will be fooled.

The block code can also be used for both error correction and error detection. Suppose d = 7; then C can correct up to 3 errors. We may adopt a different decoding strategy that increases the error detection capability at the expense of the error correction capability: for example, C can be used to correct up to 2 errors and at the same time detect up to 4 errors.
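A sketch of such a strategy (the name `bounded_distance_decode` and the None convention are illustrative): decode to the nearest codeword only if it lies within distance t_c, and otherwise flag a detected but uncorrected error.

```python
def bounded_distance_decode(y, codebook, tc):
    """Correct up to tc errors; beyond that, report detection (None).
    With tc + td <= d - 1, any pattern of up to td errors is never
    silently miscorrected into the wrong codeword."""
    dist = lambda u, v: sum(a != b for a, b in zip(u, v))
    best = min(codebook, key=lambda c: dist(y, c))
    return best if dist(y, best) <= tc else None

# The (5,2) code with d_min = 3 from the later example.
codebook = [(0,0,0,0,0), (0,1,0,1,1), (1,0,1,0,1), (1,1,1,1,0)]
```

With tc = 1 a single error is corrected; with tc = 0 the same received vector is only flagged, trading correction for detection.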

Error Detection and Error Correction

(Figure: two codewords C_i and C_j at distance d, each surrounded by a correction ball of radius t_c; a detection radius t_d around C_i stays clear of the ball around C_j.) In general, a block code C with minimum distance d can simultaneously correct t_c errors and detect t_d errors as long as t_c + t_d ≤ d - 1.

For any two distinct codewords C_i and C_j, the ball of radius t_d centered at C_i and the ball of radius t_c centered at C_j are disjoint.

Syndrome Decoding

A brute-force implementation of nearest-neighbor decoding requires up to 2^k comparisons. A more efficient method is syndrome decoding.

Given the received vector y, one first computes s = H y^T = H (C_j + e)^T = H e^T, where H is a parity check matrix for C. The (n-k)-dimensional vector s = (s_{n-k-1}, s_{n-k-2}, ..., s_0) is called the syndrome of y. Since s depends only on e, the syndrome provides information about the possible error vector e.
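A sketch of the computation for the (7,4) Hamming code. This particular systematic H, with bit order (y_6, ..., y_0), is one conventional choice consistent with the generator polynomial g(x) = x^3 + x + 1 used later in these slides.

```python
# One systematic parity check matrix for the (7,4) Hamming code.
H = [
    (1, 1, 1, 0, 1, 0, 0),
    (0, 1, 1, 1, 0, 1, 0),
    (1, 1, 0, 1, 0, 0, 1),
]

def syndrome(y):
    """s = H y^T over GF(2); s depends only on the error vector e."""
    return tuple(sum(h * b for h, b in zip(row, y)) % 2 for row in H)
```

A valid codeword has syndrome (0, 0, 0); adding a codeword to y leaves the syndrome unchanged, which is exactly why s carries information only about e.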

There is a one-to-one correspondence between syndromes and sets of possible error vectors. Given the received vector y, the set of all possible error vectors is y + C = {y + C_i : C_i ∈ C}, called a coset of C. Thus the coset containing y is exactly the set of all possible error vectors with respect to y.

Property: two vectors y_1 and y_2 have the same syndrome if and only if y_1 and y_2 lie in the same coset. There is therefore a one-to-one correspondence between syndromes and cosets. Each coset contains 2^k vectors, which implies that there are 2^{n-k} cosets and hence 2^{n-k} syndromes.

In view of the nearest-neighbor decoding rule, we obtain the following decoding algorithm:

Step 1: Given a received vector y, compute the syndrome s of y.
Step 2: Find the least-weight vector ê in the coset identified by the syndrome s.
Step 3: Decode y as Ĉ = y + ê.

The least-weight vector in a coset is called the coset leader of the coset.
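The three steps can be sketched as follows, with the (7,4) Hamming code's H as an assumed example; the syndrome-to-coset-leader table is precomputed once, here by brute force over all 2^n vectors.

```python
from itertools import product

H = [(1, 1, 1, 0, 1, 0, 0),
     (0, 1, 1, 1, 0, 1, 0),
     (1, 1, 0, 1, 0, 0, 1)]          # (7,4) Hamming code, one systematic choice

def syndrome(v):
    return tuple(sum(h * b for h, b in zip(row, v)) % 2 for row in H)

# Offline: for each of the 2^(n-k) syndromes, keep the least-weight vector.
leader = {}
for v in product((0, 1), repeat=7):
    s = syndrome(v)
    if s not in leader or sum(v) < sum(leader[s]):
        leader[s] = v

def decode(y):
    e_hat = syndrome(y)              # Step 1: compute the syndrome
    e_hat = leader[e_hat]            # Step 2: look up the coset leader
    return tuple(yi ^ ei for yi, ei in zip(y, e_hat))   # Step 3: y + e_hat
```

Here there are 2^{n-k} = 8 cosets; the leaders are the all-zero vector and the seven weight-1 vectors, so every single-bit error pattern is corrected.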

Standard array for syndrome decoding (used when n - k is small):

  Coset leader      Codewords                                              Syndrome
  0                 C_1          ...   C_{2^k - 1}                         s_0
  e_1               C_1 + e_1    ...   C_{2^k - 1} + e_1                   s_1
  ...               ...                ...                                 ...
  e_{2^{n-k} - 1}   C_1 + e_{2^{n-k} - 1}   ...   C_{2^k - 1} + e_{2^{n-k} - 1}   s_{2^{n-k} - 1}

The first row consists of the codewords themselves (coset leader 0); the left column lists the coset leaders. When y is received, its position in the standard array is located.

The decoder then decides that the error vector is the left-most vector (the coset leader) of the row containing y, and y is decoded as the codeword at the top of the column containing y.

Example

Consider the (5,2) code C = {00000, 01011, 10101, 11110}, which has d_min = 3. Its standard array is:

  00000  01011  10101  11110
  00001  01010  10100  11111
  00010  01001  10111  11100
  00100  01111  10001  11010
  01000  00011  11101  10110
  10000  11011  00101  01110
  11000  10011  01101  00110
  10010  11001  00111  01100

The first column lists the coset leaders; each row is one coset with its own syndrome. Since d_min = 3, only single errors are guaranteed correctable. How about when the actual error is (10100)? It is not a coset leader (it lies in the coset with leader 00001), so the decoder outputs the wrong codeword.

Syndrome Computing in the Case of Cyclic Codes

Let g(x) = x^{n-k} + g_{n-k-1} x^{n-k-1} + ... + g_1 x + g_0 be the generator polynomial of C, and let y = (y_{n-1}, ..., y_0) be the received vector. Associate with y the polynomial y(x) = y_{n-1} x^{n-1} + ... + y_1 x + y_0.

Assume that we compute the syndrome s = (s_{n-k-1}, ..., s_0) of y using the systematic parity check matrix H, and associate with s the polynomial s(x) = s_{n-k-1} x^{n-k-1} + ... + s_1 x + s_0. One can show that s(x) is the remainder obtained when y(x) is divided by g(x):

  y(x) = f(x) g(x) + s(x).
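A sketch of this division over GF(2), with coefficient lists ordered from the highest degree down, so that g(x) = x^3 + x + 1 is written [1, 0, 1, 1] (the function name `gf2_remainder` is illustrative):

```python
def gf2_remainder(y_bits, g_bits):
    """Remainder s(x) of y(x) / g(x) over GF(2),
    returned as the n-k lowest-order coefficients."""
    r = list(y_bits)
    for i in range(len(r) - len(g_bits) + 1):
        if r[i]:                         # leading term present: subtract x^i * g(x)
            for j, gj in enumerate(g_bits):
                r[i + j] ^= gj
    return r[len(r) - (len(g_bits) - 1):]
```

For a codeword polynomial the remainder is zero; for y = 1101101 from the example below it is s = 100.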

Example

The (7,4,3) binary Hamming code revisited: g(x) = x^3 + x + 1. What is H? Suppose y = 1 1 0 1 1 0 1; what is the syndrome s? It can be found either by the matrix multiplication s = H y^T or by dividing y(x) by g(x). (Figure: shift-register implementation of the division; the received bits are shifted in, the quotient bits are shifted out, and s remains in the register.)

Performance of Hard Decision Decoding

Syndrome decoding is correct exactly when the actual channel error vector is a coset leader. Over a BSC with error probability p, the block error probability is therefore

  P_E = 1 - sum_i p^{w(e_i)} (1 - p)^{n - w(e_i)},

where the e_i are the coset leaders and w(e_i) denotes the Hamming weight of e_i.
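The formula can be evaluated directly once the coset-leader weights are known. For the (7,4) Hamming code the leaders are the zero vector and the seven weight-1 vectors (a sketch; `block_error_prob` is an illustrative name):

```python
def block_error_prob(leader_weights, n, p):
    """P_E = 1 - sum_i p^w(e_i) (1-p)^(n - w(e_i)); decoding succeeds
    iff the channel error vector is one of the coset leaders."""
    return 1.0 - sum(p**w * (1 - p)**(n - w) for w in leader_weights)

# (7,4) Hamming code: coset-leader weights are 0, 1, 1, 1, 1, 1, 1, 1.
pe = block_error_prob([0] + [1] * 7, n=7, p=0.01)
```

This matches the closed form 1 - (1-p)^7 - 7p(1-p)^6, about 2 x 10^-3 at p = 0.01.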
