
Johannesson R., Zigangirov K.S. Fundamentals of Convolutional Coding

IEEE Press/John Wiley, 2015, 689 pp.
Our goal with this book is to present a comprehensive treatment of convolutional codes: their construction, their properties, and their performance. It is intended to serve as a graduate-level textbook, a resource for researchers in academia, and a reference of interest to industry researchers and designers.
This book project started in 1989 and the first edition was published in 1999. The work on the second edition began in 2009. By now the material presented here has been maturing in our minds for more than 40 years, which is close to our entire academic lives. We believed that the appearance of some of David Forney’s important structural results on convolutional encoders in a textbook was long overdue. For us, that and similar thoughts on other areas generated new research problems. Such interplays between research and teaching were delightful experiences. This second edition is the final result of those experiences.
Chapter 1 provides an overview of the essentials of coding theory. Capacity limits and potential coding gains, classical block codes, convolutional codes, Viterbi decoding, and codes on graphs are introduced.
In Chapter 2, we give formal definitions of convolutional codes and convolutional encoders. Various concepts of minimality are discussed in depth using illuminating examples.
Chapter 3 is devoted to the various distance measures of convolutional codes. Time-varying convolutional codes are introduced and upper and lower distance bounds are derived.
An in-depth treatment of Viterbi decoding is given in Chapter 4, including both error bounds and tighter error bounds for time-invariant convolutional codes, as well as a closed-form expression for the exact bit error probability. Both the BCJR (Bahl-Cocke-Jelinek-Raviv) and the one-way algorithms for a posteriori probability decoding are discussed. A simple upper bound on the bit error probability for extremely noisy channels explains why it is important that the constituent convolutional encoders are systematic when iterative decoding is considered. The chapter concludes with a treatment of tail-biting codes, including their BEAST (Bidirectional Efficient Algorithm for Searching Trees) decoding.
In Chapter 5, we derive various random ensemble bounds for the decoding error probability. As an application we consider quantization of channel outputs.
Chapter 6 is devoted to list decoding of convolutional codes, which is thoroughly analyzed. Once again we reach the important conclusion that there are situations in which it matters that a convolutional encoder is systematic.
In Chapter 7, we discuss a subject that is close to our hearts, namely sequential decoding; both our theses were on that subject. We describe and analyze the stack algorithm, the Fano algorithm, and Creeper. James Massey regarded the Fano algorithm as the cleverest algorithm of them all!
Chapters 8 and 9 rectify the lack of a proper treatment of low-density parity-check (LDPC) codes and turbo codes in the first edition, where these important areas received only a modest section. These codes revolutionized the world of coding theory at the end of the previous millennium. In Chapter 8, LDPC block codes, which were invented by Robert Gallager and appeared in his thesis, are discussed. They are then generalized to LDPC convolutional codes, which were invented by the second author and his graduate student Alberto Jiménez-Feltström; these are discussed in depth together with bounds on their distances. Iterative decoding is introduced and iterative limits and thresholds are derived. The chapter concludes with an introduction to the related braided block codes. Turbo codes are treated in Chapter 9 together with bounds on their distances and iterative decoding. Moreover, the braided block codes are generalized to their convolutional counterpart.
In Chapter 10, we introduce two efficient algorithms, FAST (Fast Algorithm for Searching Trees) and BEAST, for determining code distances. FAST was designed to determine the Viterbi spectrum of convolutional encoders, while BEAST can also determine the spectral components of block codes. Extensive lists are given of the best known code generators with respect to free distance, number of nearest neighbors, number of information bit errors, Viterbi spectrum, distance profile, and minimum distance. These lists contain both nonsystematic and systematic generators.
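To make the central object concrete, here is a minimal encoding sketch of our own (it is not an example taken from the book): the classic rate-1/2, memory-2 feedforward convolutional encoder with generators (7, 5) in octal, terminated by a zero tail. The generator choice and the tail are assumptions made only for this illustration.

def conv_encode(bits, g1=0b111, g2=0b101, memory=2):
    """Rate-1/2 feedforward convolutional encoder, octal generators (7, 5)."""
    state = 0                                     # the last `memory` information bits
    out = []
    for b in list(bits) + [0] * memory:           # zero tail returns the encoder to the all-zero state
        reg = (b << memory) | state               # current bit followed by the state bits
        out.append(bin(reg & g1).count("1") % 2)  # first code bit: u_t + u_{t-1} + u_{t-2}
        out.append(bin(reg & g2).count("1") % 2)  # second code bit: u_t + u_{t-2}
        state = reg >> 1                          # shift the register one position
    return out

print(conv_encode([1, 0, 1, 1]))                  # -> [1,1, 1,0, 0,0, 0,1, 0,1, 1,1]

This particular encoder has free distance 5, which is one reason it appears in virtually every introductory treatment of Viterbi decoding.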
In Appendix A we demonstrate how to minimize two examples of convolutional encoders and in Appendix B we present Wald’s identity and related results that are necessary for our analyses in Chapters 3-7.
For simplicity’s sake, we restricted ourselves to binary convolutional codes. In most of our derivations of the various bounds we only considered the binary symmetric channel (BSC). Although the BSC is inferior from a practical communications point of view, we believe that its pedagogical advantages outweigh that disadvantage.
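For background (a standard textbook fact, not a result of this particular book): the BSC with crossover probability p flips each transmitted bit independently with probability p, and its capacity is

  C_{\mathrm{BSC}}(p) = 1 - h(p), \qquad h(p) = -p \log_2 p - (1-p) \log_2 (1-p),

so a crossover probability of roughly 0.11 already reduces the capacity to one half bit per channel use.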
Each chapter ends with some comments, mainly historical in nature, and sets of problems that are highly recommended. Many of them were used by us as exam questions. Note that sections marked with an asterisk (*) can be skipped at the first reading without loss of continuity.
Convolutional encoders—Structural properties
Distance properties of convolutional codes
Decoding of convolutional codes
Random ensemble bounds for decoding error probability
List decoding
Sequential decoding
Low-density parity-check codes
Turbo coding
Convolutional codes with good distance properties
A: Minimal encoders
B: Wald’s identity