Machine Learning & Neural Networks 26223

Centre: Faculty of Informatics
Degree: Degree in Artificial Intelligence (Grado en Inteligencia Artificial)
Academic course: 2023/24
Academic year: 3
No. of credits: 6
Languages: Spanish, Basque
Code: 26223

Teaching

Distribution of hours by type of teaching
Study type | Hours of face-to-face teaching | Hours of non-classroom-based work by the student
Lecture-based | 40 | 40
Applied laboratory-based groups | 20 | 50

Teaching guide

Description and Contextualization of the Subject

This is a mandatory course within the Artificial Intelligence degree and an optional one within the Computer Science Engineering degree. It is taken during the third year of the AI degree and the fourth year of the CSE degree. The course belongs to the machine learning and data science fields, and its objective is to enable students to understand, develop, and implement models and algorithms capable of learning autonomously. Students will learn the main paradigms of deep learning and the classes of problems where they can be applied. In particular, the course focuses on methods based on neural networks: computational models inspired by the neural organization and mechanisms of the brain.

The course will introduce the most relevant types of neural networks, explaining the rationale behind their conception and their scope of application. Special consideration will be given to deep neural networks as state-of-the-art machine learning techniques. Students will be taught how to recognize the problems where neural networks can be applied, identify the appropriate class of neural network, and use and/or extend available implementations to solve the problem. While the course will cover the necessary theoretical knowledge, it will devote a significant portion of the time to practical classes, with the goal of implementing, in Python, neural network approaches that solve real-world machine learning problems.

Skills/Learning outcomes of the subject

- Identify the main types of problems in deep learning and the most common methods used to address them

- Learn the main paradigms of neural networks and their characteristic principles

- Formulate and implement deep learning approaches to real-world problems using open-source Python-based tools

- Recognize the most appropriate deep neural network models for different types of problems

- Get a basic understanding of the most successful domains of application, hot topics, and open problems in deep learning

Theoretical and practical content

Course outline (Tentative)



1. Introduction to machine learning and DL

1. The pervasiveness of machine learning techniques in the real-world

2. Deep Learning: Biological and mathematical foundations

3. Mathematical background for DL

4. Learning paradigms: Supervised learning, unsupervised learning, transfer learning

5. Linear and logistic regression



2. Multi-layer perceptron

1. Motivation and applications

2. Structure of MLPs

3. Weights and biases

4. Non-linearities

5. Forward and backward pass

6. Backpropagation

7. Gradient descent
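The chain from forward pass through backpropagation to gradient descent (topics 5-7 above) can be sketched for the simplest possible network: a single sigmoid neuron trained with cross-entropy loss. This is a minimal plain-Python illustration; the toy data set and the learning rate are arbitrary choices for demonstration, not part of the course materials:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Toy data: the neuron should learn to output 1 iff x > 0.
data = [(-2.0, 0.0), (-1.0, 0.0), (1.0, 1.0), (2.0, 1.0)]
w, b = 0.0, 0.0   # weight and bias, initialised to zero
lr = 0.5          # learning rate (gradient-descent step size)

def loss():
    # Mean cross-entropy loss over the toy data set.
    return -sum(y * math.log(sigmoid(w * x + b))
                + (1 - y) * math.log(1.0 - sigmoid(w * x + b))
                for x, y in data) / len(data)

initial_loss = loss()
for _ in range(200):
    dw = db = 0.0
    for x, y in data:
        p = sigmoid(w * x + b)          # forward pass: prediction
        dw += (p - y) * x / len(data)   # backward pass: dL/dw via the chain rule
        db += (p - y) / len(data)       # backward pass: dL/db
    w -= lr * dw                        # gradient-descent update
    b -= lr * db
```

A multi-layer perceptron repeats exactly this pattern layer by layer, with the chain rule propagating the error backwards through each non-linearity.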



3. Training MLP networks

1. Overfitting/underfitting

2. Regularization

3. Model selection

4. Optimization

5. Scheduling



4. Convolutional Neural Networks (CNN)

1. Motivation and applications

2. Definition and properties

3. The convolutional operator

4. Pooling

5. CNNs in practice



5. Recurrent Neural Networks (RNN)

1. Motivation and applications

2. Definition and properties

3. Backpropagation through time

4. LSTM and GRU

5. RNNs in practice
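The defining recurrence of a recurrent cell, h_t = tanh(w_x x_t + w_h h_{t-1} + b), can be sketched with scalar inputs and a scalar hidden state. This is a plain-Python illustration only; the parameter values are arbitrary, and LSTM/GRU cells replace the plain tanh update with gated variants:

```python
import math

def rnn_step(x_t, h_prev, w_x=0.5, w_h=0.8, b=0.0):
    """One step of a minimal (scalar) recurrent cell: h_t = tanh(w_x*x_t + w_h*h_prev + b)."""
    return math.tanh(w_x * x_t + w_h * h_prev + b)

# Unroll the recurrence over a short input sequence.
h = 0.0
states = []
for x in [1.0, 0.0, -1.0]:
    h = rnn_step(x, h)   # the hidden state carries information forward in time
    states.append(h)
```

Backpropagation through time is ordinary backpropagation applied to this unrolled computation, with gradients flowing backwards through every time step.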



6. The Transformers architecture

1. Motivation and applications

2. Attention mechanism

3. Transformers in practice
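The attention mechanism at the core of the Transformer (topic 2 above) can be sketched in a few lines. This is a plain-Python illustration of scaled dot-product attention, softmax(QK^T / sqrt(d)) V, for a single head; the example matrices are arbitrary, and real implementations batch this with tensor operations:

```python
import math

def softmax(xs):
    m = max(xs)                         # subtract the max for numerical stability
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def attention(Q, K, V):
    """Scaled dot-product attention: each query attends over all keys."""
    d = len(Q[0])
    out = []
    for q in Q:
        # Similarity of this query with every key, scaled by sqrt(d).
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d)
                  for k in K]
        weights = softmax(scores)       # attention weights sum to 1
        # Output is the attention-weighted average of the value vectors.
        out.append([sum(w * v[j] for w, v in zip(weights, V))
                    for j in range(len(V[0]))])
    return out

# A query that matches the first key almost exclusively selects its value.
Q = [[10.0, 0.0]]
K = [[10.0, 0.0], [0.0, 10.0]]
V = [[1.0, 0.0], [0.0, 1.0]]
result = attention(Q, K, V)
```

Multi-head attention runs several such computations in parallel on learned projections of Q, K, and V, then concatenates the results.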



7. Generative methods

1. Motivation and applications

2. Autoencoders

3. Variational autoencoders

4. Generative Adversarial Networks (GANs)



8. Graph Neural Networks

1. Motivation and applications

2. GNNs in practice



9. Advanced topics in neural network research

1. Neuro-evolution: the automatic generation of neural network architectures by means of artificial evolution

2. Spiking networks and polychronous chains

3. Adversarial networks

4. Other advanced topics

Methodology

The course will comprise approximately equal numbers of lectures and practical classes; the latter will consist of solving exercises in Python. In the laboratory sessions, students will work both in teams and individually to find appropriate solutions to practical problems using deep learning methods. As part of the continuous assessment, students will implement, using the available Python software (e.g., scikit-learn, PyTorch), solutions to practical problems and present them as projects.



The course will have a virtual space in eGela where links to useful software, bibliography, and problem repositories will be available. The course lecture slides will also be available from the virtual classroom in eGela. Students can use the forum in the virtual space to discuss the problems. Students are also advised to create and use GitHub accounts for the completion of the individual and team projects.

Assessment systems

  • Continuous Assessment System
  • Final Assessment System
  • Tools and qualification percentages:
    • See sections below (%): 100

Ordinary Call: Orientations and Disclaimer

The student can choose between global assessment (at the end of the semester) or continuous assessment. The continuous assessment is optional and requires the student's active participation in the learning process and regular work throughout the semester. Students are therefore expected to attend class, take part in the discussions, study the theoretical aspects of the subject, complete the exercise sets, and work with the computers in the laboratory sessions.



The continuous assessment method reflects not only a student's performance on exams, but also the quality of their active participation in class, the laboratory tasks, the clarity of the written reports, the communication skills shown in the oral presentations, and so on.



Continuous assessment is chosen at the beginning of the semester, and the student can make the definitive decision (at 60-80% of the semester) after the teacher has reviewed the student's performance. The student has to fill in a form stating the percentage of the assessment completed and the mark obtained up to that point.



If there is no confirmation of final registration for the continuous assessment, it is assumed that the student waives it.



CONTINUOUS EVALUATION



A: Written examination -> 6/10

B: Projects -> 4/10



NOTE: The written examination and the projects are compulsory, and a minimum grade of 4.5 (out of 10) is required in each of them.



FINAL EVALUATION



A: Written examination -> 7/10

B: Projects -> 3/10



NOTE: The written examination and the projects are compulsory, and a minimum grade of 5 (out of 10) is required in each of them.

Extraordinary Call: Orientations and Disclaimer

Final written examination -> 10/10



NOTE: The written examination is compulsory, and a minimum grade of 5 (out of 10) is required.

Bibliography

Basic bibliography

Main bibliography



- Deep Learning. Ian Goodfellow and Yoshua Bengio and Aaron Courville. 2016. https://www.deeplearningbook.org/

- Dive into Deep Learning. Aston Zhang and Zachary C. Lipton and Mu Li and Alexander J. Smola. 2020. http://d2l.ai/

- Neural networks for pattern recognition. Christopher M. Bishop.

In-depth bibliography

Additional bibliography:

- Machine learning. A probabilistic perspective. Kevin P. Murphy
- Pattern recognition and machine learning. Christopher M. Bishop.
- Bayesian Networks and Decision Graphs. Finn V. Jensen. Springer-Verlag. 2001.
- The Synaptic Organization of the Brain. Gordon M. Shepherd. Fifth Edition.
- Introduction to the theory of neural computation. Hertz, J., Krogh, A., Palmer, R. G. Addison-Wesley, 1991
- 30 years of adaptive neural networks: Perceptron, Madaline, and backpropagation., Widrow, B., Lehr, M.A., Proceedings of IEEE, 78(9), 1415-1442, 1990.
- Learning with Kernels, Scholkopf and Smola, 2002
- The Elements of Statistical Learning. Hastie, Tibshirani, and Friedman. 2001.
- The Nature of Statistical Learning Theory. Vapnik, V.

Journals

Proceedings of NeurIPS: https://nips.cc/
Proceedings of ICLR: https://iclr.cc/
IEEE Transactions on Pattern Analysis and Machine Intelligence (TPAMI)
IEEE Transactions on Neural Networks

Web addresses

NYU Deep Learning: https://nyudatascience.medium.com/yann-lecuns-deep-learning-course-at-cds-is-now-fully-online-accessible-to-all-787ddc8bf0af
Berkeley Deep Learning: https://t.co/0PaBOJElo9
Stanford Convolutional Neural Networks for Visual Recognition: http://cs231n.stanford.edu/
Stanford Natural Language Processing with Deep Learning: http://web.stanford.edu/class/cs224n/
Machine learning with graphs (graph convolutional layers): https://www.youtube.com/watch?v=aBHC6xzx9YI&list=PLoROMvodv4rPLKxIpqhjhPgdQy7imNkDn&index=2

NN software
Pytorch: https://pytorch.org/
scikit-learn: https://scikit-learn.org/stable/
scikit-neuralnetwork: https://github.com/rsantana-isg/scikit-neuralnetwork

Groups

01 Lecture group (Spanish - Morning)

Calendar

Weeks 1-15: 09:00-10:30 (1); 12:00-13:30 (2)

Teaching staff

01 Applied laboratory-based groups-1 (Spanish - Morning)

Calendar

Weeks 1-15: 10:30-12:00 (1)

Teaching staff

46 Lecture group (Basque - Afternoon)

Calendar

Weeks 1-15: 14:00-15:30 (1); 17:00-18:30 (2)

Teaching staff

46 Applied laboratory-based groups-1 (Basque - Afternoon)

Calendar

Weeks 1-15: 15:30-17:00 (1)

Teaching staff