MACHINE LEARNING

Pasquale FOGGIA

0623200005
DIPARTIMENTO DI INGEGNERIA DELL'INFORMAZIONE ED ELETTRICA E MATEMATICA APPLICATA
EQF7
INFORMATION ENGINEERING FOR DIGITAL MEDICINE
2022/2023



COMPULSORY
YEAR OF COURSE 1
YEAR OF DIDACTIC SYSTEM 2022
SPRING SEMESTER
CFU  HOURS  ACTIVITY
3    24     LESSONS
3    24     LAB
3    24     EXERCISES


Objectives
THE COURSE IS AIMED AT PROVIDING THE STUDENT WITH THEORETICAL, METHODOLOGICAL AND TECHNOLOGICAL KNOWLEDGE OF MACHINE LEARNING AND OF THE ANALYSIS OF LARGE DATA SETS, INCLUDING BOTH TRADITIONAL TECHNIQUES AND INNOVATIVE PARADIGMS SUCH AS DEEP LEARNING.

KNOWLEDGE AND UNDERSTANDING
PARADIGMS OF STRUCTURAL LEARNING, STATISTICAL LEARNING AND NEURAL LEARNING. UNSUPERVISED LEARNING. DEEP LEARNING. PARADIGMS AND TOOLS FOR BIG DATA ANALYSIS.


APPLYING KNOWLEDGE AND UNDERSTANDING
DESIGN AND REALIZATION OF SOLUTIONS TO LEARNING AND DATA ANALYTICS PROBLEMS BY INTEGRATING EXISTING TOOLS AND EFFECTIVELY TUNING THEIR OPERATING PARAMETERS.
Prerequisites
THE COURSE REQUIRES BASIC KNOWLEDGE OF THE PYTHON PROGRAMMING LANGUAGE.
Contents
Learning Unit 1: Foundations
(HOURS OF LECTURES/EXERCISES/LABORATORY 6/0/4)
- 1 (2 HOURS Lecture): Definition of Machine Learning. Historical outline. Machine Learning tasks: Supervised Learning, Unsupervised Learning, Semi-Supervised Learning, Reinforcement Learning.
- 2 (2 HOURS Lecture): Training data. Data kinds: numerical, categorical, structured data. Learning as an optimization problem. Parameters and hyper-parameters. Overfitting. Bias and variance errors. The “no free lunch” theorem.
- 3 (2 HOURS Laboratory): Google Colab. The numpy library.
- 4 (2 HOURS Lecture): Performance evaluation. Test set. Hyper-parameter tuning. Validation set. K-Fold Cross Validation. Data augmentation. Regularization. The curse of dimensionality. Performance evaluation metrics. Accuracy. Precision. Recall. Receiver Operating Characteristic (ROC) curve (a minimal metric-computation sketch follows this unit).
- 5 (2 HOURS Laboratory): Exercise on a simple classification problem.
KNOWLEDGE AND UNDERSTANDING: Fundamental learning paradigms.
APPLIED KNOWLEDGE AND UNDERSTANDING: Designing and implementing simple learning solutions.
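
To make the metrics of lecture 4 concrete, here is a minimal sketch of how accuracy, precision and recall can be computed with the numpy library introduced in the laboratory; the label arrays are hypothetical example data, not course material.

    import numpy as np

    # Hypothetical ground-truth labels and binary predictions (1 = positive class).
    y_true = np.array([1, 0, 1, 1, 0, 0, 1, 0])
    y_pred = np.array([1, 0, 0, 1, 0, 1, 1, 0])

    tp = np.sum((y_pred == 1) & (y_true == 1))  # true positives
    fp = np.sum((y_pred == 1) & (y_true == 0))  # false positives
    fn = np.sum((y_pred == 0) & (y_true == 1))  # false negatives

    accuracy = np.mean(y_pred == y_true)
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)

    print(f"accuracy={accuracy:.2f} precision={precision:.2f} recall={recall:.2f}")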

Learning Unit 2: Introduction to artificial neural networks. The MLP, LVQ and SOM networks.
(HOURS OF LECTURES/EXERCISES/LABORATORY 10/0/8)
- 6 (2 HOURS Lecture): Biological neurons. Artificial neural networks. The McCulloch and Pitts neuron. Rosenblatt’s Perceptron. Combination of neurons. Feed-forward, lateral-connection and recurrent architectures. Fully-connected and sparsely-connected networks. Multi-Layer Perceptrons.
- 7 (2 HOURS Lecture): The universal approximation theorem. Training an MLP. Gradient Descent. The Back Propagation Algorithm.
- 8 (2 HOURS Lecture): Stochastic Gradient Descent. Early Stopping. Momentum. Adaptive Learning Rate. Regularization. Sigmoid and Tanh activation functions. MLP as binary classifiers. Binary cross-entropy loss. MLP as multi-class classifiers. Categorical cross-entropy loss.
- 9 (2 HOURS Laboratory): The Keras framework.
- 10 (2 HOURS Laboratory): Exercise on the use of the MLP as a classifier (a minimal Keras sketch follows this unit).
- 11 (2 HOURS Laboratory): Exercise on the use of the MLP as a regressor.
- 12 (2 HOURS Lecture): Competitive Neural Networks. Learning Vector Quantization. Supervised and Unsupervised Learning with LVQ.
- 13 (2 HOURS Lecture): The manifold learning problem. Self-Organizing Maps. Training of a SOM.
- 14 (2 HOURS Laboratory): Exercise on LVQ and SOM networks.
KNOWLEDGE AND UNDERSTANDING: The neural learning paradigm. Architecture and operation of the MLP, LVQ and SOM networks.
APPLIED KNOWLEDGE AND UNDERSTANDING: Designing and implementing machine learning solutions based on the neural paradigm. Using the Keras framework for implementing and tuning neural networks.
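
Since the laboratories of this unit use Keras, the following is a minimal sketch of an MLP binary classifier in the spirit of lectures 8-10; the synthetic data, network size and hyper-parameter values are illustrative assumptions, not the actual lab assignment.

    import numpy as np
    from tensorflow import keras

    # Synthetic two-class data (illustrative only): 200 samples, 4 features.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 4)).astype("float32")
    y = (X[:, 0] + X[:, 1] > 0).astype("float32")  # a simple separable rule

    # MLP with one hidden layer; sigmoid output for binary classification.
    model = keras.Sequential([
        keras.Input(shape=(4,)),
        keras.layers.Dense(16, activation="tanh"),
        keras.layers.Dense(1, activation="sigmoid"),
    ])

    # Binary cross-entropy loss, trained with stochastic gradient descent;
    # 20% of the data is held out as a validation set.
    model.compile(optimizer=keras.optimizers.SGD(learning_rate=0.1),
                  loss="binary_crossentropy", metrics=["accuracy"])
    model.fit(X, y, epochs=20, batch_size=16, validation_split=0.2, verbose=0)
    print(model.evaluate(X, y, verbose=0))  # [loss, accuracy]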

Learning Unit 3: Deep Learning.
(HOURS OF LECTURES/EXERCISES/LABORATORY 10/2/10)
- 15 (2 HOURS Lecture): Third-generation neural networks. Vanishing gradient and the ReLU activation function. Sparse connections. Weight sharing. Deep architectures. Advantages of deep networks. Representation learning. Transfer learning.
- 16 (2 HOURS Lecture): Convolutional Neural Networks. Operation and structure of a convolutional layer. Stride and padding. Pooling layers. Dropout layers. Output layer of a CNN.
- 17 (2 HOURS Laboratory): Exercises on CNN.
- 18 (2 HOURS Laboratory): Exercises on CNN - part 2.
- 19 (2 HOURS Lecture): Advanced aspects of the Keras framework. The computational graph. Non-sequential models. Weight sharing.
- 20 (2 HOURS Lecture): Customization of loss functions. Generators. Data augmentation for images.
- 21 (2 HOURS Laboratory): Exercise on fine tuning and data augmentation.
- 22 (2 HOURS Lecture): Learning strategies for deep networks. Learning degradation. Skip connections and residual learning. Batch normalization. Greedy supervised pre-training. Auxiliary heads.
- 23 (2 HOURS Laboratory): Exercise on skip connections and residual learning (see the sketch after this unit).
- 24 (2 HOURS Laboratory): Exercise on residual learning with fine tuning.
- 25 (2 HOURS Exercises): Project Work presentation.
KNOWLEDGE AND UNDERSTANDING: Deep Learning Models and Architectures, with special reference to convolutional networks. Techniques to improve the training of deep networks.
APPLIED KNOWLEDGE AND UNDERSTANDING: Designing and implementing machine learning solutions based on deep neural networks, including solutions based on fine tuning of existing networks.
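
As a sketch of how the non-sequential models of lecture 19 can express the skip connections of lecture 22, here is a minimal residual block written with the Keras functional API; the input shape and filter counts are arbitrary illustrative choices.

    from tensorflow import keras
    from tensorflow.keras import layers

    # A minimal residual block: two convolutions plus an identity skip connection.
    inputs = keras.Input(shape=(32, 32, 16))      # illustrative feature-map size
    x = layers.Conv2D(16, 3, padding="same", activation="relu")(inputs)
    x = layers.BatchNormalization()(x)
    x = layers.Conv2D(16, 3, padding="same")(x)   # no activation before the sum
    x = layers.Add()([x, inputs])                 # the skip connection (residual learning)
    x = layers.Activation("relu")(x)
    outputs = layers.GlobalAveragePooling2D()(x)

    model = keras.Model(inputs, outputs)
    model.summary()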

Learning Unit 4: Advanced architectures
(HOURS OF LECTURES/EXERCISES/LABORATORY 10/2/10)
- 26 (2 HOURS Lecture): Transposed Convolution. MobileNets. Autoencoders. Variational Autoencoders.
- 27 (2 HOURS Laboratory): Exercise on autoencoders.
- 28 (2 HOURS Lecture): Reinforcement learning. Problem definition. Episodes. The state-action function Q. The Q Learning algorithm (a minimal tabular sketch follows this unit). The actor-critic model. Replay buffer.
- 29 (2 HOURS Laboratory): Exercise on Q Learning.
- 30 (2 HOURS Lecture): Discriminative and generative models. Generative Adversarial Networks. GAN training. Loss functions for the discriminator and the generator.
- 31 (2 HOURS Laboratory): Exercise on GAN.
- 32 (2 HOURS Lecture): Recurrent Neural Networks. Unfolding. Learning tasks: sequence-to-sequence, sequence-to-value, value-to-value, sequence-to-sequence of different length. Back Propagation Through Time. LSTM and GRU architectures.
- 33 (2 HOURS Laboratory): Exercise on LSTM.
- 34 (2 HOURS Lecture): Limitations of recurrent networks. The Transformer architecture. The attention module. Encoder block. Decoder block. The BERT Language Model.
- 35 (2 HOURS Laboratory): Exercise on Transformers.
- 36 (2 HOURS Exercises): Check on project work.
KNOWLEDGE AND UNDERSTANDING: Reinforcement learning and the Q Learning algorithm. Advanced architectures for Deep Learning: autoencoders, Generative Adversarial Networks, RNNs and Transformers.
APPLIED KNOWLEDGE AND UNDERSTANDING: Designing and realizing learning solutions based on advanced architectures, with specific reference to reinforcement learning, generative models and solutions for sequential data processing.
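
To make the Q Learning update of lecture 28 concrete, below is a minimal tabular sketch on a hypothetical toy chain environment; the environment, its reward and the hyper-parameter values are assumptions made purely for illustration.

    import numpy as np

    # Toy chain environment: states 0..4, actions 0 (left) and 1 (right);
    # reaching state 4 yields reward 1 and ends the episode.
    n_states, n_actions = 5, 2
    Q = np.zeros((n_states, n_actions))     # the state-action value table
    alpha, gamma, epsilon = 0.1, 0.9, 0.1   # learning rate, discount, exploration

    rng = np.random.default_rng(0)
    for episode in range(500):
        s = 0
        while s != 4:
            # Epsilon-greedy action selection.
            a = int(rng.integers(n_actions)) if rng.random() < epsilon else int(np.argmax(Q[s]))
            s_next = max(s - 1, 0) if a == 0 else s + 1
            r = 1.0 if s_next == 4 else 0.0
            # Q Learning update: Q(s,a) += alpha * (r + gamma * max_a' Q(s',a') - Q(s,a))
            Q[s, a] += alpha * (r + gamma * np.max(Q[s_next]) - Q[s, a])
            s = s_next

    print(Q)  # the learned values should prefer action 1 (right) in every state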


TOTAL HOURS LECTURES/EXERCISES/LABORATORY: 36/4/32
Teaching Methods
THE COURSE COMPRISES THEORETICAL LECTURES, IN-CLASS EXERCISES AND PRACTICAL LABORATORY SESSIONS.
Verification of learning
THE EXAM CONSISTS OF THE DISCUSSION OF A TEAM PROJECT WORK (TEAMS OF 3-4 STUDENTS) AND AN ORAL INTERVIEW. THE DISCUSSION OF THE PROJECT WORK AIMS AT EVALUATING THE ABILITY TO BUILD A SIMPLE APPLICATION OF THE TOOLS PRESENTED IN THE COURSE TO A PROBLEM ASSIGNED BY THE TEACHER, AND INCLUDES A PRACTICAL DEMONSTRATION OF THE REALIZED APPLICATION, A PRESENTATION OF A QUANTITATIVE EVALUATION OF ITS PERFORMANCE AND A DESCRIPTION OF THE TECHNICAL CHOICES INVOLVED IN ITS REALIZATION. THE INTERVIEW EVALUATES THE KNOWLEDGE AND UNDERSTANDING OF THE THEORETICAL TOPICS, TOGETHER WITH THE CANDIDATE'S EXPOSITION ABILITY.
Texts
"DEEP LEARNING", IAN GOODFELLOW AND YOSHUA BENGIO AND AARON COURVILLE, MIT PRESS.

LECTURE NOTES AND OTHER MATERIAL PROVIDED DURING THE COURSE

SUPPLEMENTARY TEACHING MATERIAL WILL BE AVAILABLE ON THE UNIVERSITY E-LEARNING PLATFORM (HTTP://ELEARNING.UNISA.IT) ACCESSIBLE TO STUDENTS USING THEIR OWN UNIVERSITY CREDENTIALS.
More Information
THE COURSE IS HELD IN ENGLISH