Shlomo Shamai
Distinguished Professor, William Fondiller Professor of Telecommunications, Department of Electrical Engineering, Technion - Israel Institute of Technology. Ph.D. - Technion - Israel Institute of Technology.


Hamid Jafarkhani
Chancellor’s Professor, Electrical Engineering and Computer Science, University of California, Irvine. Ph.D. - University of Maryland at College Park.


Nambi Seshadri
Former CTO, Mobile & Wireless, Broadcom Corporation. Ph.D. - Rensselaer Polytechnic Institute.


Kees Schouhamer Immink
President and founder of Turing Machines Inc., a Dutch-based research and consulting firm that contributes to science and technology. Ph.D. - Eindhoven University of Technology.


Alexander Vardy
Jack K. Wolf Professor of Electrical and Computer Engineering at the University of California San Diego. Ph.D. - Tel Aviv University.


Vincent Poor
Michael Henry Strater University Professor of Electrical Engineering, Princeton University, Princeton, New Jersey. Ph.D. - Princeton University.


Georgios Giannakis
Endowed Chair in Wireless Telecommunications, McKnight Presidential Chair in ECE, and Director of the Digital Technology Center, University of Minnesota, Minneapolis. Ph.D. - University of Southern California.


Abbas El Gamal
Hitachi America Professor of Engineering, Department of Electrical Engineering at Stanford University. Ph.D. - Stanford University.


Thomas Kailath
Hitachi America Emeritus Professor of Engineering, Department of Electrical Engineering at Stanford University. Sc.D. - MIT.


David Forney
Adjunct Professor Emeritus, Department of Electrical Engineering and Computer Science, MIT. Sc.D. - MIT.

Note: the following books are not recommended by Professor Forney. They are books that have been used as reference texts in one or more courses he has taught.

Muriel Medard
Cecil H. Green Professor in the Electrical Engineering and Computer Science (EECS) Department at MIT. Sc.D. - MIT.

Note: the following books are not recommended by Professor Medard. They are books that have been used as reference texts in one or more courses she has taught.

Robert Gallager
Professor Emeritus of Electrical Engineering and Computer Science at MIT. Sc.D. - MIT.

Note: the following books are not recommended by Professor Gallager. They are books that have been used as reference texts in one or more courses he has taught.

Bin Yu
Chancellor’s Distinguished Professor and Class of 1936 Second Chair in the Departments of Statistics and Electrical Engineering and Computer Sciences at UC Berkeley. Ph.D. - UC Berkeley.

Note: the following books are not recommended by Professor Yu. They are books that have been used as reference texts in one or more courses she has taught.

David Tse
Thomas Kailath and Guanghan Xu Professor in the School of Engineering, Department of Electrical Engineering, Stanford University. Ph.D. - MIT.

Note: the following books are not recommended by Professor Tse. They are books that have been used as reference texts in one or more courses he has taught.

Jurgen Schmidhuber
Scientific Director of the Dalle Molle Institute for Artificial Intelligence Research (IDSIA) in Lugano, Ticino, in southern Switzerland, and Professor of Artificial Intelligence (Ordinarius) in the Faculty of Computer Science at the University of Lugano. Ph.D. - Technical University of Munich.

Note: the following books are not recommended by Professor Schmidhuber. They are books that have been used as reference texts in one or more courses he has taught.

Information theory is a sub-field of electrical engineering and computer science that deals with the quantification, storage, and communication of information. It was developed in 1948 by Claude Shannon, an engineer and mathematician at Bell Labs, in his seminal paper “A Mathematical Theory of Communication”.

One of the key concepts in information theory is entropy, which is a measure of the uncertainty or randomness of a set of data. Shannon’s entropy formula provides a way to calculate the amount of information contained in a message, and it forms the basis for data compression algorithms.
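
As a minimal sketch of Shannon’s formula, H(X) = -Σ p(x) log2 p(x), the Python snippet below computes the entropy of a probability distribution; the four-symbol distribution used here is an arbitrary example, not data from any real source:

```python
import math

def shannon_entropy(probabilities):
    """Shannon entropy in bits: H(X) = -sum(p * log2(p)) over nonzero probabilities."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# Hypothetical four-symbol source with skewed probabilities.
print(shannon_entropy([0.5, 0.25, 0.125, 0.125]))  # 1.75 bits per symbol

# A uniform distribution over four symbols maximizes the entropy.
print(shannon_entropy([0.25, 0.25, 0.25, 0.25]))   # 2.0 bits per symbol
```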

Another important concept in information theory is the channel capacity, which is the maximum amount of information that can be transmitted over a communication channel. Shannon’s channel capacity theorem shows that there is a fundamental limit to the amount of information that can be transmitted over a channel with a certain level of noise.
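
For the additive white Gaussian noise (AWGN) channel, this limit is given by the Shannon-Hartley formula C = B · log2(1 + S/N). The short Python sketch below evaluates it for an illustrative, made-up bandwidth and signal-to-noise ratio:

```python
import math

def awgn_capacity(bandwidth_hz, snr_linear):
    """Shannon-Hartley capacity in bits/second: C = B * log2(1 + S/N)."""
    return bandwidth_hz * math.log2(1 + snr_linear)

# Illustrative numbers: a 1 MHz channel at 20 dB SNR (linear SNR = 100).
snr_linear = 10 ** (20 / 10)
print(awgn_capacity(1e6, snr_linear) / 1e6)  # about 6.66 Mbit/s
```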

Information theory has many practical applications in electrical engineering and computer science, including data compression, error-correcting codes, and communication systems. For example, data compression algorithms such as Huffman coding and Lempel-Ziv-Welch (LZW) are based on entropy and are widely used to reduce the size of digital files.
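
To illustrate the idea behind entropy coding, here is a minimal Huffman-coding sketch in Python; the function name and the input string are purely illustrative, and the code accumulates codeword tables rather than building an explicit tree, for brevity:

```python
import heapq
from collections import Counter

def huffman_codes(text):
    """Build a Huffman code (symbol -> bit string) from symbol frequencies."""
    freq = Counter(text)
    # Heap entries: (frequency, tie-breaker id, {symbol: code-so-far}).
    heap = [(f, i, {sym: ""}) for i, (sym, f) in enumerate(freq.items())]
    heapq.heapify(heap)
    next_id = len(heap)
    while len(heap) > 1:
        f1, _, left = heapq.heappop(heap)
        f2, _, right = heapq.heappop(heap)
        # Prefix '0' to codes in one subtree and '1' in the other, then merge.
        merged = {s: "0" + c for s, c in left.items()}
        merged.update({s: "1" + c for s, c in right.items()})
        heapq.heappush(heap, (f1 + f2, next_id, merged))
        next_id += 1
    return heap[0][2]

codes = huffman_codes("abracadabra")
encoded = "".join(codes[ch] for ch in "abracadabra")
print(codes)                 # frequent symbols get shorter codewords
print(len(encoded), "bits")  # versus 88 bits for 8-bit ASCII
```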

Error-correcting codes, such as Hamming codes and Reed-Solomon codes, are used to detect and correct errors in digital communications. These codes use concepts from information theory to add redundant data to a message, which can be used to detect and correct errors.
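
As a concrete, simplified example of this redundancy, the sketch below implements the classic Hamming(7,4) code in Python: four data bits are encoded into seven, and any single flipped bit can be located and corrected. The bit ordering and the injected error are illustrative choices, not a production decoder:

```python
def hamming74_encode(d):
    """Encode 4 data bits as a 7-bit Hamming codeword [p1, p2, d1, p3, d2, d3, d4]."""
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4
    p2 = d1 ^ d3 ^ d4
    p3 = d2 ^ d3 ^ d4
    return [p1, p2, d1, p3, d2, d3, d4]

def hamming74_decode(c):
    """Correct a single bit error (if any) and return the 4 data bits."""
    c = list(c)
    # Each syndrome bit is a parity check over the positions it covers.
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    error_pos = s1 + 2 * s2 + 4 * s3  # 0 means no detected error
    if error_pos:
        c[error_pos - 1] ^= 1  # flip the erroneous bit back
    return [c[2], c[4], c[5], c[6]]

codeword = hamming74_encode([1, 0, 1, 1])
codeword[5] ^= 1                   # inject a single bit error
print(hamming74_decode(codeword))  # [1, 0, 1, 1] recovered
```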

In the field of communication systems, information theory is used to design and analyze wireless communication systems, such as cellular networks, WiFi, and satellite communication systems. These systems use concepts such as modulation, channel coding, and multiple access techniques to enable efficient and reliable communication.
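
As a small illustration of the kind of analysis involved, the NumPy sketch below simulates uncoded BPSK modulation over an AWGN channel and estimates the bit error rate; the helper name, bit count, and SNR values are arbitrary choices for the example:

```python
import numpy as np

rng = np.random.default_rng(0)

def bpsk_ber(num_bits, snr_db):
    """Estimate the bit error rate of uncoded BPSK over an AWGN channel."""
    bits = rng.integers(0, 2, num_bits)
    symbols = 2 * bits - 1                     # map 0/1 to -1/+1
    snr_linear = 10 ** (snr_db / 10)
    noise_std = np.sqrt(1 / (2 * snr_linear))  # unit symbol energy assumed
    received = symbols + noise_std * rng.standard_normal(num_bits)
    decisions = (received > 0).astype(int)     # threshold detector
    return np.mean(decisions != bits)

# Illustrative run: the error rate drops rapidly as Eb/N0 increases.
for snr_db in (0, 4, 8):
    print(snr_db, "dB ->", bpsk_ber(100_000, snr_db))
```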

Finally, information theory has also been used as a tool in various other fields, such as cryptography, linguistics, neuroscience, and genetics. Its principles provide a mathematical framework for understanding the limits of data compression, cryptography, and communication, and they have been used to study the structure of languages, the workings of the brain, and the evolution of genetic information.