

Machine Learning (ICT)

Academic unit or major
Undergraduate major in Information and Communications Engineering
Instructor(s)
Itsuo Kumazawa / Hiroki Nakahara
Class Format
Lecture
Media-enhanced courses
-
Day of week/Period
(Classrooms)
7-8 Mon (W242) / 7-8 Thu (W242)
Class
-
Course Code
ICT.S311
Number of credits
2
Course offered
2021
Offered quarter
3Q
Syllabus updated
Jul 10, 2025
Language
Japanese

Syllabus

Course overview and goals

Various machine learning techniques are studied, together with the mathematics needed to understand them and the programming techniques needed to put them into practice. The techniques covered are (1) multi-layer neural networks, (2) convolutional neural networks (CNN), and (3) other popular non-neural machine learning techniques. The mathematics behind these techniques, such as differentiation, gradient descent, the chain rule, backpropagation, and the nonlinear functions used as activation functions, is studied. Programming techniques and libraries of standard functions for efficient programming and implementation are also practiced.
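As an illustration of how gradient descent, the chain rule, and backpropagation fit together, the following is a minimal NumPy sketch written for this page rather than taken from the course materials: a tiny two-layer network with sigmoid activations is trained on the XOR problem, with all gradients written out by hand.

    import numpy as np

    rng = np.random.default_rng(0)

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    # XOR inputs and targets
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    y = np.array([[0], [1], [1], [0]], dtype=float)

    # Parameters of a 2-3-1 network, initialized at random
    W1, b1 = rng.normal(size=(2, 3)), np.zeros((1, 3))
    W2, b2 = rng.normal(size=(3, 1)), np.zeros((1, 1))
    lr = 1.0  # learning rate (gradient-descent step size)

    for step in range(10000):
        # Forward pass
        h = sigmoid(X @ W1 + b1)       # hidden-layer activations, shape (4, 3)
        p = sigmoid(h @ W2 + b2)       # network outputs, shape (4, 1)
        loss = np.mean((p - y) ** 2)   # squared-error loss

        # Backward pass: the chain rule applied layer by layer (backpropagation)
        dp = 2.0 * (p - y) / len(X)    # dL/dp
        dz2 = dp * p * (1.0 - p)       # dL/dz2, using sigmoid'(z) = p (1 - p)
        dW2, db2 = h.T @ dz2, dz2.sum(axis=0, keepdims=True)
        dh = dz2 @ W2.T                # error propagated back to the hidden layer
        dz1 = dh * h * (1.0 - h)
        dW1, db1 = X.T @ dz1, dz1.sum(axis=0, keepdims=True)

        # Gradient-descent update
        W1 -= lr * dW1; b1 -= lr * db1
        W2 -= lr * dW2; b2 -= lr * db2

    print(np.round(p, 2))  # outputs should approach [0, 1, 1, 0]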

Course description and aims

Typical machine learning techniques, namely (1) multi-layer neural networks, (2) convolutional neural networks (CNN), and (3) other popular non-neural machine learning techniques, are studied together with their mathematical background so that students can implement them by programming.

Keywords

Neural Networks, Deep Learning, Convolutional Neural Network (CNN), Backpropagation, Gradient descent, Minimization techniques for loss functions, Bayes estimation, Principal component analysis, Boosting techniques, Support Vector Machine, k-means method.

Competencies

  • Specialist skills
  • Intercultural skills
  • Communication skills
  • Critical thinking skills
  • Practical and/or problem-solving skills

Class flow

In the first half of the course (seven classes), the basic principles of multi-layer neural networks and machine learning are studied, together with the basic mathematics needed to understand them. The convolutional neural network (CNN), which is the core technique of deep neural networks, and its learning mechanism are explained theoretically. Programming techniques for deep learning are practiced. Exercises and an examination are held in the eighth class.
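To make the roles of convolution and pooling concrete, here is a small NumPy sketch, written for this page rather than taken from the course: a single-channel convolution (computed as cross-correlation, as CNN libraries do), a ReLU nonlinearity, and non-overlapping max pooling.

    import numpy as np

    def conv2d(image, kernel):
        # Valid convolution (no padding, stride 1), computed as cross-correlation
        kh, kw = kernel.shape
        oh, ow = image.shape[0] - kh + 1, image.shape[1] - kw + 1
        out = np.zeros((oh, ow))
        for i in range(oh):
            for j in range(ow):
                out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
        return out

    def max_pool(feature_map, size=2):
        # Non-overlapping max pooling with a size x size window
        h, w = feature_map.shape
        h, w = h - h % size, w - w % size          # drop any ragged edge
        blocks = feature_map[:h, :w].reshape(h // size, size, w // size, size)
        return blocks.max(axis=(1, 3))

    def relu(x):
        return np.maximum(x, 0)

    # Toy 6x6 "image" and a 3x3 vertical-edge kernel
    image = np.arange(36, dtype=float).reshape(6, 6)
    kernel = np.array([[-1, 0, 1]] * 3, dtype=float)

    features = max_pool(relu(conv2d(image, kernel)))
    print(features.shape)  # (2, 2): 6x6 -> conv -> 4x4 -> pool -> 2x2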

Course schedule/Objectives

Class 1: Background and a summary of the first half of the course. Biological neural networks and their engineering models; computation and programming of the models. | Objective: background and basic knowledge.
Class 2: Basic mathematics for the computation, learning, and programming of multi-layer neural networks, part 1 (activation functions, softmax, logistic regression, gradient descent, the chain rule). | Objective: study basic mathematical techniques for analyzing multi-layer neural networks.
Class 3: Basic mathematics for the computation, learning, and programming of multi-layer neural networks, part 2 (backpropagation and its recursive computation). | Objective: study the basic mathematics of backpropagation.
Class 4: Programming techniques for multi-layer neural networks and their learning. | Objective: study programming techniques for multi-layer neural networks and their learning.
Class 5: Computation in convolutional neural networks (convolution, pooling, softmax, and their roles) and techniques to improve performance (generalization capability and avoiding overfitting). | Objective: study the computation techniques of convolutional neural networks.
Class 6: Mathematics for training convolutional neural networks (gradient descent and backpropagation). | Objective: study the basic mathematics of convolutional neural networks.
Class 7: Tips for programming convolutional neural networks. | Objective: study programming techniques for convolutional neural networks.
Class 8: Introduction to machine learning, and Python programming for machine learning.
Class 9: Least-squares method, overfitting, sparse learning, and robust learning.
Class 10: Classification programming with the scikit-learn library: logistic regression, support vector machines (SVM), and decision trees. (A hedged sketch follows after this schedule.)
Class 11: Clustering, including the k-means method. Pre- and post-processing of a dataset: L1 regularization, feature scaling, and missing values.
Class 12: Maximum likelihood estimation, the EM algorithm, Bayesian inference, and confidence values.
Class 13: Data compression, principal component analysis (PCA), linear discriminant analysis (LDA), and kernel PCA.
Class 14: Ensemble learning, majority voting, random forests, bagging, the bootstrap method, and AdaBoost.
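The following is a hedged scikit-learn sketch of the kinds of models named in Classes 10 to 13. It is not the course's official exercise material; the dataset (the bundled iris data) and all parameter values are illustrative assumptions.

    import numpy as np
    from sklearn.datasets import load_iris
    from sklearn.model_selection import train_test_split
    from sklearn.preprocessing import StandardScaler
    from sklearn.linear_model import LogisticRegression
    from sklearn.svm import SVC
    from sklearn.tree import DecisionTreeClassifier
    from sklearn.cluster import KMeans
    from sklearn.decomposition import PCA

    X, y = load_iris(return_X_y=True)
    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.3, random_state=0, stratify=y)

    # Pre-processing: feature scaling fitted on the training split only
    scaler = StandardScaler().fit(X_train)
    X_train_s, X_test_s = scaler.transform(X_train), scaler.transform(X_test)

    # Classification (Class 10): logistic regression, SVM, decision tree
    classifiers = [("logistic regression", LogisticRegression(max_iter=1000)),
                   ("SVM", SVC(kernel="rbf")),
                   ("decision tree", DecisionTreeClassifier(max_depth=3))]
    for name, clf in classifiers:
        clf.fit(X_train_s, y_train)
        print(name, "test accuracy:", clf.score(X_test_s, y_test))

    # Clustering (Class 11): k-means does not use the labels
    kmeans = KMeans(n_clusters=3, n_init=10, random_state=0).fit(X_train_s)
    print("k-means cluster sizes:", np.bincount(kmeans.labels_))

    # Dimensionality reduction (Class 13): PCA down to two components
    pca = PCA(n_components=2).fit(X_train_s)
    print("explained variance ratios:", pca.explained_variance_ratio_)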

Study advice (preparation and review)

To enhance effective learning, students are encouraged to spend approximately 100 minutes preparing for class and another 100 minutes reviewing class content afterward (including assignments) for each class.
They should do so by referring to textbooks and other course material.

Textbook(s)

Textbook for the mathematical basis of deep learning: Itsuo Kumazawa, "Learning and Neural Networks," Morikita Publishing Co.

Reference books, course materials, etc.

  • A book for deep learning programming exercises: Takashi Kanamaru, "Machine Learning with Raspberry Pi," Kodansha Publishing Co.
  • S. Raschka and V. Mirjalili, "Python Machine Learning (Second Ed.)," Packt Publishing.
  • C. M. Bishop, "Pattern Recognition and Machine Learning," Springer.

Evaluation methods and criteria

Evaluation is based on a final examination and programming exercises.

Related courses

  • LAS.M102 : Linear Algebra I / Recitation
  • LAS.M101 : Calculus I / Recitation
  • LAS.M105 : Calculus II

Prerequisites

Basic knowledge of differential and integral calculus.