
Joint Source-Channel Coding (eBook)

eBook Download: EPUB
2022
John Wiley & Sons (publisher)
9781118693797 (ISBN)

Joint Source-Channel Coding - Andres Kwasinski, Vinay Chande
Joint Source-Channel Coding

Consolidating knowledge on Joint Source-Channel Coding (JSCC), this book provides an indispensable resource on a key area of performance enhancement for communications networks

Presenting in one volume the key theories, concepts and important developments in the area of Joint Source-Channel Coding (JSCC), this book provides the fundamental material needed to enhance the performance of digital and wireless communication systems and networks.

It comprehensively introduces JSCC technologies for communications systems, including coding and decoding algorithms, and emerging applications of JSCC in current wireless communications. The book covers the full range of theoretical and technical areas before concluding with a section considering recent applications and emerging designs for JSCC. A methodical reference for academic and industrial researchers, development engineers, system engineers, system architects and software engineers, this book:

  • Explains how JSCC leads to high performance in communication systems and networks
  • Consolidates key material from multiple disparate sources
  • Is an ideal reference for graduate-level courses on digital or wireless communications, as well as courses on information theory
  • Targets professionals involved with digital and wireless communications and networking systems


Andres Kwasinski, Rochester Institute of Technology, USA

Dr. Kwasinski received his Ph.D. degree in Electrical and Computer Engineering from the University of Maryland in 2004. He is currently a Professor with the Department of Computer Engineering, Rochester Institute of Technology, Rochester, New York. Prior to this he was with Texas Instruments Inc., the Department of Electrical and Computer Engineering at the University of Maryland, and Lucent Technologies. Dr. Kwasinski has been a member of the IEEE Signal Processing Magazine Editorial Board, as Associate Editor and Area Editor for over twelve years. He was Editor for the IEEE Transactions on Wireless Communications and IEEE Wireless Communications Letters, the Globecom 2010 Workshop Co-Chair and the Chair of the IEEE Multimedia Technical Committee Interest Group on Distributed and Sensor Networks for Mobile Media Computing and Applications. He is a Senior Member of the IEEE.

Vinay Chande, Qualcomm Inc., USA

Vinay Chande holds a Ph.D. in Electrical Engineering from the University of Maryland and received his engineering education at the Indian Institute of Technology, Mumbai. Dr. Chande works as a Systems Engineer in Wireless Research and Development at Qualcomm Technologies Inc. His current work gives him the opportunity to participate in and witness advances in millimeter-wave radio bands, unlicensed spectrum access, and machine learning for Industrial IoT.



1
Introduction and Background


This textbook is about jointly performing source and channel coding for an information source. The idea of coding is arguably the most significant contribution from Claude Shannon in his mathematical theory of communication [1] that gave birth to information theory. Up until Shannon's work, communication engineers believed that to improve reliability in transmitting a message, all they could do was increase the transmit power or repeat the transmission many times until it was received free of errors. Instead, Shannon postulated that a message could be transmitted essentially free of errors as long as the transmission rate did not exceed the capacity of the communication channel. All that was required to achieve this feat was a suitable coding mechanism. With coding, the message from the source is mapped into a signal that is matched to the channel characteristics. The ideal coding operation is able both to represent the message from the source in the most efficient form (removing redundant information) and to add redundant information in a controlled way that enables the receiver to combat errors introduced in the channel. The efficient representation of the source message is called source coding, and the controlled introduction of redundant information to combat communication errors is called channel coding.

In [1], Shannon showed that under some ideal conditions, for example, for point‐to‐point “memoryless” channels, asymptotically in the codeword length there is no performance difference whether the coding operation is performed as a single block or as a concatenation of a source coding operation and a separate channel coding operation. This is typically described as a separation theorem. The separation theorems are a class of asymptotic results of profound importance. It may not be an exaggeration to say that these results motivated the development of today's digital revolution, where every form of information – media, gaming, text, video, multidimensional graphics, and data of all forms – can be encoded without worrying about the medium or channel over which it will be shared, and shared without concern for how it was encoded. Nevertheless, there are multiple practical scenarios where the ideal conditions given by Shannon do not hold. In these cases, we pay a performance penalty for performing source and channel coding as two separate operations instead of jointly. This book focuses on some of these scenarios and helps the reader study the theory and techniques associated with performing source and channel coding as a single operation. Before delving into these topics, this chapter provides an introduction and the background needed for the rest of this book.

1.1 Simplified Model for a Communication System


A communication system enables transmission of information so that a message that originates at one place is reproduced exactly, or approximately, at another location. To develop his mathematical theory of communication [1, 2], Claude Shannon considered a simplified model for a communication system, as shown in Figure 1.1. In this model, a communication system is formed by five basic elements:

  • An information source, which generates a message that contains some information to be transmitted to the receiving end of the communication chain
  • A transmitter, which converts the message into a signal that can propagate through a communication medium
  • A channel, which constitutes the medium through which the signal propagates between the transmitter and receiver. This propagating signal is subject to different distortions and impairments, such as selective attenuation, delays, erasure, and the addition of noise.
  • A receiver, which attempts to recover the original message (possibly affected by distortion and impairments) by reversing the sequence of operations done at the transmitter while also attempting to correct or compensate for the distortion effects introduced in the channel
  • A destination, which receives the transmitted version of the message and makes sense of it.

The design of a communication system focuses on the transmitter and receiver. By designing the transmitter output to match the channel characteristics and requirements, the transmitter converts, or maps, the source message into a signal appropriate for transmission. This mapping operation is called encoding. The reverse operation, where a source message is estimated from the encoded signal, is called decoding. For reasons that will be seen later, the mapping operation at the encoder is frequently divided into two stages. The first stage, called source encoding, aims to represent the source output in a compact and efficient way that will require as few communication resources as possible while achieving some level of fidelity for the source message recovered after source decoding at the receiver. The compact and efficient representation that results from source encoding is matched to the source but is likely not matched to the channel characteristics and requirements. For example, the representation may be so compact that each of its parts may be critical for the recovery of the source message at the required level of fidelity, so any error introduced during transmission renders the received message completely unrecoverable. Because of this, the second stage of the transmitter, called channel encoding, has the function of converting the compact source representation into a signal matched to the channel. This likely results in a representation that is less compact but more resilient to the effects of channel impairments. On the receiver side, it is natural to think of the structure also separated into a sequence of decoding stages because the goal is to reverse the sequence of encoding operations that were performed at the transmitter.
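The two-stage transmitter and mirrored receiver described above can be sketched with a deliberately toy example: a small prefix code acting as the source encoder and a 3x repetition code acting as the channel encoder, connected by a binary symmetric channel. The codes, names, and parameter values below are our own illustrative choices, not designs from this book.

```python
import random

# Toy sketch of the cascaded transmitter/receiver described above.
# Source coding: a prefix-free code giving shorter codewords to more
# probable symbols. Channel coding: a 3x repetition code. The channel
# is a binary symmetric channel that flips each bit with some probability.

SOURCE_CODE = {"a": "0", "b": "10", "c": "110", "d": "111"}
SOURCE_DECODE = {v: k for k, v in SOURCE_CODE.items()}

def source_encode(message):
    # Compact representation matched to the source statistics.
    return "".join(SOURCE_CODE[s] for s in message)

def channel_encode(bits):
    # Controlled redundancy matched to the channel: send each bit 3 times.
    return "".join(b * 3 for b in bits)

def channel(bits, flip_prob, rng):
    # Binary symmetric channel: each bit is flipped with probability flip_prob.
    return "".join("10"[int(b)] if rng.random() < flip_prob else b for b in bits)

def channel_decode(bits):
    # Majority vote over each group of three received bits.
    return "".join("1" if bits[i:i + 3].count("1") >= 2 else "0"
                   for i in range(0, len(bits), 3))

def source_decode(bits):
    # Parse the prefix-free code bit by bit.
    message, buf = [], ""
    for b in bits:
        buf += b
        if buf in SOURCE_DECODE:
            message.append(SOURCE_DECODE[buf])
            buf = ""
    return "".join(message)

rng = random.Random(7)
sent = "abacada"
received = source_decode(channel_decode(
    channel(channel_encode(source_encode(sent)), 0.05, rng)))
print(sent, "->", received)
```

Any single bit flip within a group of three is corrected by the majority vote, so with a 5% flip probability the short message above is usually, though not always, recovered exactly.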

Figure 1.1 A block diagram of a general communication system.

Designing a communication system entails designing the transmitter and thus the mapper from the source message to the channel signal. Would it be better to design the mapping as a single operation from source directly to the channel? Or, would it be better to design the mapping as a first source encoding stage followed by the channel encoding stage? Is there any performance advantage with either approach? One answer to these questions is given in an asymptotic sense by Shannon's source–channel separation theorem, which states that under some conditions, there is no difference in performance between a transmitter designed as a single mapper and a transmitter designed as the two cascaded source and channel encoding stages. This result is appealing from a designer's perspective. It appears to be a simpler divide‐and‐conquer approach, involving design of the source and channel coder–decoder pairs (codecs) separately. Nevertheless, the tricky element of Shannon's source–channel separation theorem is that the conditions under which it holds are difficult to find in practical scenarios. To discuss this, we first need to establish some key principles from information theory. The next sections provide an overview of important information theory concepts that will be used throughout the rest of this book.

1.2 Entropy and Information


The concept of information as understood in Shannon's information theory originates in Hartley's paper [3] and provides a measure of how much the uncertainty associated with a random phenomenon (mathematically characterized as the outcome of a random variable) is reduced by the observation of a particular outcome. The information provided by an outcome $x$ of a discrete random variable $X$ is defined as:

(1.1)  $I(x) = -\log p(x)$

where $p(x)$ is the probability of the outcome $x$ and the logarithm can, in principle, be of any base, but it is most frequently taken as base 2 (in which case the unit of information is the bit, a condensed merging of the words binary and digit). Intuitively, the rarer an event is, the more information its occurrence provides.
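As a quick numerical illustration of (1.1), the short sketch below (our own, not from the book) computes self-information in bits for two outcome probabilities:

```python
import math

# Self-information per (1.1), using a base-2 logarithm so the unit is the bit.
def information(p):
    return -math.log2(p)

print(information(0.5))    # outcome of a fair coin flip: 1.0 bit
print(information(1 / 8))  # a 1-in-8 outcome: 3.0 bits
```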

The notion of information as introduced in (1.1) provides a measure that is related to a specific outcome of a random variable. To measure the uncertainty associated with a random variable itself, Shannon introduced the concept of entropy (in an information theoretic context) as the expectation of the information function associated with the random variable. The entropy of a discrete random variable $X$ is then defined as:

(1.2)  $H(X) = -\sum_{x} p(x) \log p(x)$

where $p(x)$ is, in this context, the probability mass function (PMF) of $X$. The entropy of a random variable can be interpreted as the mean value of the information provided by all its outcomes.
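Definition (1.2) translates directly into a few lines of code. The sketch below (our own illustration) evaluates the entropy of a PMF, skipping zero-probability outcomes since $p \log p \to 0$ as $p \to 0$:

```python
import math

# Entropy of a discrete random variable given its PMF, per (1.2), in bits.
# Zero-probability outcomes are skipped: p*log(p) -> 0 as p -> 0.
def entropy(pmf):
    return sum(-p * math.log2(p) for p in pmf if p > 0)

print(entropy([0.25, 0.25, 0.25, 0.25]))  # uniform over 4 outcomes: 2.0 bits
print(entropy([1.0, 0.0, 0.0, 0.0]))      # deterministic outcome: 0.0 bits
```

A uniform PMF maximizes entropy over a given alphabet, while a deterministic random variable carries no uncertainty at all.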

In the case of a continuous random variable $X$ with probability density function (PDF) $f(x)$, the concept of entropy has been extended to become the differential entropy:

(1.3)  $h(X) = -\int_{-\infty}^{\infty} f(x) \log f(x) \, dx$

For example, it can be shown that the differential entropy of a zero-mean Gaussian random variable with variance $\sigma^2$ is $\frac{1}{2}\log\!\left(2\pi e \sigma^2\right)$ [4].
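The Gaussian closed form can be checked numerically. The sketch below (our own, with an arbitrary choice of $\sigma$) compares $\frac{1}{2}\log_2(2\pi e \sigma^2)$ against a Monte Carlo estimate of $\mathbb{E}[-\log_2 f(X)]$:

```python
import math
import random

# Differential entropy of a zero-mean Gaussian with variance sigma^2:
# closed form (1/2)*log2(2*pi*e*sigma^2), checked by Monte Carlo sampling.
sigma = 2.0
closed_form = 0.5 * math.log2(2 * math.pi * math.e * sigma ** 2)

def log2_pdf(x):
    # log2 of the Gaussian PDF f(x) = exp(-x^2 / (2 sigma^2)) / (sigma sqrt(2 pi))
    return (-math.log2(sigma * math.sqrt(2 * math.pi))
            - (x * x) / (2 * sigma * sigma) * math.log2(math.e))

rng = random.Random(0)
samples = [rng.gauss(0.0, sigma) for _ in range(100_000)]
estimate = -sum(log2_pdf(x) for x in samples) / len(samples)
print(closed_form, estimate)  # the two values agree to roughly two decimals
```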

By using joint probability distributions, one can extend the definition of entropy to calculate the joint entropy of multiple random variables. In the case of two discrete random variables $X$ and $Y$, their joint entropy is

(1.4)  $H(X, Y) = -\sum_{x} \sum_{y} p(x, y) \log p(x, y)$

where $p(x, y)$ is the joint PMF and $p(x)$ and $p(y)$ are the marginal PMFs....
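Analogously to the scalar case, (1.4) is a sum over the joint PMF. The toy sketch below (our own) evaluates it for two independent fair bits, for which $H(X, Y) = H(X) + H(Y) = 2$ bits:

```python
import math

# Joint entropy of two discrete random variables from their joint PMF, per (1.4).
# Rows index outcomes of X, columns index outcomes of Y.
def joint_entropy(joint_pmf):
    return sum(-p * math.log2(p) for row in joint_pmf for p in row if p > 0)

# Two independent fair bits: H(X, Y) = H(X) + H(Y) = 1 + 1 bits.
joint = [[0.25, 0.25],
         [0.25, 0.25]]
print(joint_entropy(joint))  # 2.0 bits
```

For fully dependent variables the joint entropy collapses toward the entropy of a single variable, since observing one outcome determines the other.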

Publication date (per publisher): 8 November 2022
Series: IEEE Press / Wiley - IEEE
Language: English
