2021 Faculty Courses School of Computing Undergraduate major in Mathematical and Computing Science

Information Theory

Academic unit or major
Undergraduate major in Mathematical and Computing Science
Instructor(s)
Yoshiyuki Kabashima
Class Format
Lecture
Media-enhanced courses
-
Day of week/Period
(Classrooms)
1-2 Tue (W934) / 1-2 Fri (W934)
Class
-
Course Code
MCS.T333
Number of credits
2
Course offered
2021
Offered quarter
3Q
Syllabus updated
Jul 10, 2025
Language
Japanese

Syllabus

Course overview and goals

Though it may be so obvious that we are normally not even aware of it, substances in the real world naturally possess attributes, such as weight and length, that humans can quantify physically. Physics, chemistry, and biology have developed into disciplines that discuss nature quantitatively and objectively by focusing on these attributes. Can a similar discipline be developed for "information", which exists only in the abstract world? One answer is "information theory". By focusing on the "code length" required to record and transmit information, information, too, can be discussed quantitatively and objectively. Specifically, lecture topics include information source modeling, self-information and entropy, source coding, and channel coding.

Course description and aims

Attainment target: By the end of the course, students will be able to handle "information" quantitatively using notions of information quantity.
Theme: The purpose of this course is for students to understand the following three topics: 1) notions of information quantities, such as self-information, entropy, joint entropy, conditional entropy, and mutual information; 2) the elements of source coding; and 3) the elements of channel coding.

Keywords

self-information, entropy, mutual information, source coding, channel coding

Competencies

  • Specialist skills
  • Intercultural skills
  • Communication skills
  • Critical thinking skills
  • Practical and/or problem-solving skills

Class flow

This course is conducted online via Zoom.

Course schedule/Objectives

Class 1

What is information theory?

Gain an overview of information theory

Class 2

Models of information source

Understand statistical properties of information source, representative models

Class 3

Entropy (1): Derivation of entropy, properties of entropy

Understand definition of entropy, derivation of related equations and inequalities.
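
As an illustrative aside (not part of the course materials), the entropy introduced in this class can be computed directly from a source's symbol probabilities; the example distributions below are my own:

```python
import math

def entropy(probs):
    """Shannon entropy H(X) = -sum p * log2(p), in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A biased binary source, P(0) = 0.9, P(1) = 0.1 (illustrative values)
print(entropy([0.9, 0.1]))  # about 0.47 bits
# Entropy is maximized by the uniform distribution
print(entropy([0.5, 0.5]))  # exactly 1 bit
```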

Class 4

Entropy (2): Extension of entropy, entropy of Markov information source

Understand entropy of extended source, joint entropy, conditional entropy, entropy of Markov information source
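
A small numerical check of the chain rule H(X, Y) = H(X) + H(Y|X) covered here; the joint distribution below is invented for the example:

```python
import math

def H(probs):
    """Entropy in bits of a collection of probabilities."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# An invented joint distribution p(x, y) on {0,1} x {0,1}
pxy = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}
px = [0.5, 0.5]            # marginal distribution of X
h_x = H(px)                # H(X) = 1 bit
h_xy = H(pxy.values())     # joint entropy H(X, Y)
h_y_given_x = h_xy - h_x   # chain rule: H(Y|X) = H(X, Y) - H(X)
print(h_x, h_xy, h_y_given_x)
```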

Class 5

Data compression/source coding

Understand examples of coding and decoding, desired properties, code tree and prefix condition

Class 6

Kraft inequality and lower bound of average code length

Understand the Kraft inequality and the lower bound on average code length
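
The Kraft inequality from this class can be checked numerically: a binary prefix code with codeword lengths l_i exists iff the sum of 2^(-l_i) is at most 1. The length sets below are illustrative:

```python
def kraft_sum(lengths, r=2):
    """Kraft sum for codeword lengths l_i over an r-ary code alphabet.
    A prefix code with these lengths exists iff the sum is <= 1."""
    return sum(r ** (-l) for l in lengths)

print(kraft_sum([1, 2, 3, 3]))  # 1.0: satisfied with equality (a complete code)
print(kraft_sum([1, 1, 2]))     # 1.25: violated, no binary prefix code exists
```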

Class 7

Source coding theorem

Understand Shannon codes and Fano codes, the source coding theorem, and Huffman codes

Class 8

Huffman coding

Learn the Huffman coding algorithm. Understand the optimality of Huffman coding.
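
A compact sketch of the Huffman algorithm using a heap of partial codebooks; this is one of several standard implementations, not the course's own, and the symbol probabilities are illustrative:

```python
import heapq
from itertools import count

def huffman_code(freqs):
    """Build a binary Huffman code; returns {symbol: codeword}."""
    tiebreak = count()  # makes heap comparisons well-defined on equal weights
    heap = [(w, next(tiebreak), {s: ""}) for s, w in freqs.items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        w1, _, c1 = heapq.heappop(heap)  # two least-probable subtrees
        w2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + c for s, c in c1.items()}
        merged.update({s: "1" + c for s, c in c2.items()})
        heapq.heappush(heap, (w1 + w2, next(tiebreak), merged))
    return heap[0][2]

probs = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}
code = huffman_code(probs)
# For these dyadic probabilities the average length equals the entropy
avg_len = sum(p * len(code[s]) for s, p in probs.items())
print(code, avg_len)  # avg_len = 1.75 bits
```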

Class 9

Models of communication channel. Channel capacity

Know statistical properties of channel models, representative channel models. Understand the notion of channel capacity

Class 10

Channel capacity. Channel coding/error correcting codes

Understand evaluation of channel capacity for representative models. Understand channel coding and its relevant parameters, decoding and error rates
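
One of the representative models here, the binary symmetric channel (BSC) with crossover probability eps, has the closed-form capacity C = 1 - H2(eps). A small sketch (the function names are mine):

```python
import math

def h2(p):
    """Binary entropy function H2(p), in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(eps):
    """Capacity of the binary symmetric channel: C = 1 - H2(eps) bits/use."""
    return 1.0 - h2(eps)

print(bsc_capacity(0.0))  # noiseless channel: 1 bit per use
print(bsc_capacity(0.5))  # completely noisy channel: 0 bits per use
print(bsc_capacity(0.1))  # about 0.53 bits per use
```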

Class 11

Asymptotic equipartition property and typical sequences

Understand asymptotic equipartition property and typical set, typical set and source coding theorem, jointly typical set
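
The asymptotic equipartition property can also be observed numerically: for an i.i.d. source, -(1/n) log2 p(X^n) concentrates around H(X) as n grows. The parameters below are illustrative:

```python
import math
import random

def empirical_rate(n, p=0.3, seed=0):
    """-(1/n) log2 p(X^n) for an i.i.d. Bernoulli(p) sequence of length n.
    By the AEP this converges to the binary entropy H(p)."""
    rng = random.Random(seed)
    k = sum(1 for _ in range(n) if rng.random() < p)  # number of ones drawn
    log_p = k * math.log2(p) + (n - k) * math.log2(1 - p)
    return -log_p / n

h = -(0.3 * math.log2(0.3) + 0.7 * math.log2(0.7))  # H(0.3), about 0.881 bits
print(h, empirical_rate(100), empirical_rate(100000))
```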

Class 12

Channel coding theorem (I)

Understand the channel coding theorem

Class 13

Channel coding theorem (II). Gambling and information theory

Understand the converse theorem to the channel coding theorem. Understand the relation between gambling and information theory

Study advice (preparation and review)

To enhance effective learning, students are encouraged to spend approximately 100 minutes preparing for each class and another 100 minutes reviewing the class content afterwards (including assignments), referring to the textbooks and other course materials.

Textbook(s)

Slides for lectures will be distributed via OCW-i.

Reference books, course materials, etc.

Shigeichi Hirasawa, Introduction to Information Theory, Baifukan (in Japanese). ISBN: 978-4-563-01486-5
Thomas M. Cover and Joy A. Thomas, Elements of Information Theory (2nd Edition), John Wiley & Sons. ISBN: 978-0-471-24195-9

Evaluation methods and criteria

Students' knowledge of information quantities, skills for handling them, and understanding of their application such as data compression and channel coding will be assessed based on report assignments.

Related courses

  • MCS.T212 : Fundamentals of Probability
  • MCS.T223 : Mathematical Statistics
  • MCS.T312 : Markov Analysis
  • MCS.T332 : Data Analysis

Prerequisites

Students must have successfully completed both Fundamentals of Probability (MCS.T212) and Mathematical Statistics (MCS.T223), or have equivalent knowledge.