Abstract:
This is an updated and extended version of the author's 2002 book A First Course in Information Theory. The current book consists of two parts. Part I contains 16 chapters, 14 of which are essentially identical to those of the first version and do not address networks. The two new chapters cover differential entropy and continuous-alphabet (mainly Gaussian) channels. The second part is of primary interest; it provides a comprehensive, state-of-the-art presentation of the theory of network coding. Much of this material is new relative to the 2002 book and did not even exist at that time; the rest has been substantially updated. The book is well written. A commendable new feature is that each chapter ends with a short, clear summary. As before, each chapter is complemented with problems. This is a valuable book that many scientists will use as a reference. It can also serve as a textbook and looks particularly suitable for special-purpose courses.