Hugendubel.info - The B2B Online Bookshop


Entropy and Information Theory

Previously published in hardcover
Book, softcover (paperback)
409 pages
English
Springer, published 18.09.2014, 2nd edition
Available formats
Book, hardcover: EUR 202.50
Book, softcover (paperback): EUR 181.89

Product

Blurb: This fully updated new edition of the classic work on information theory presents a detailed analysis of the Shannon source and channel coding theorems before moving on to address sources, channels, codes, and the properties of information and distortion measures.
Details
ISBN/GTIN: 978-1-4899-8132-5
Product type: Book
Binding: Softcover, Paperback
Publisher: Springer
Year of publication: 2014
Publication date: 18.09.2014
Edition: 2nd edition
Pages: 409
Language: English
Weight: 670 g
Illustrations: XXVII, 409 p.
Item no.: 32977158

Contents/Reviews

Table of contents
Preface
Introduction
Information Sources
Pair Processes: Channels, Codes, and Couplings
Entropy
The Entropy Ergodic Theorem
Distortion and Approximation
Distortion and Entropy
Relative Entropy
Information Rates
Distortion vs. Rate
Relative Entropy Rates
Ergodic Theorems for Densities
Source Coding Theorems
Coding for Noisy Channels
Bibliography
References
Index

Author

Robert M. Gray is the Alcatel-Lucent Technologies Professor of Communications and Networking in the School of Engineering and Professor of Electrical Engineering at Stanford University. For over four decades he has done research, taught, and published in the areas of information theory and statistical signal processing. He is a Fellow of the IEEE and of the Institute of Mathematical Statistics. He has won several professional awards, including a Guggenheim Fellowship, the Society Award and Education Award of the IEEE Signal Processing Society, the Claude E. Shannon Award from the IEEE Information Theory Society, the Jack S. Kilby Signal Processing Medal, Centennial Medal, and Third Millennium Medal from the IEEE, and a Presidential Award for Excellence in Science, Mathematics and Engineering Mentoring (PAESMEM). He is a member of the National Academy of Engineering.