Established by: Faculty Board of Science and Technology, 2023-09-07
Contents
The course is about neural networks and gives an introduction to the field of deep learning. The content includes the components used to construct deep neural networks, e.g., activation functions, loss functions, regularization techniques (e.g., normalization and dropout), optimization methods (specifically variants of stochastic gradient descent), and network architectures. Deep generative models are also covered. The students learn to apply their knowledge by implementing and training modern network architectures and deep learning methods on large data sets.
The course is split into two modules:
Theory, 5.5 credits
Laboration, 2.0 credits
Expected learning outcomes
Knowledge and understanding
After completing the course, the student should be able to:
(FSR 1) thoroughly describe central concepts in deep learning, for example depth, learning rate, hyperparameters, early stopping, overfitting, and regularization;
(FSR 2) thoroughly describe and explain the purpose of the components in a neural network, for example nodes, activation function, dropout, loss function, and residual connections;
(FSR 3) describe the stochastic gradient descent method and the backpropagation algorithm, as well as the hyperparameters that affect the result of training;
(FSR 4) thoroughly describe common architectures and models, and account for the main advantages and disadvantages of the various architectures.
Competence and skills
After completing the course, the student should be able to:
(FSR 5) independently design network architectures and select activation functions and other hyperparameters to appropriately solve a given problem;
(FSR 6) use modern software libraries to build, train, and use deep neural networks for a given application;
(FSR 7) scientifically evaluate and compare the performance and generalization of deep learning models.
Judgement and approach
After completing the course, the student should be able to:
(FSR 8) judge the appropriateness of using various deep learning models to solve a given problem;
(FSR 9) discuss how deep learning can affect people, environment, and society;
(FSR 10) discuss the current state of the deep learning research area and its place within the broader field of machine learning, as well as outline how it relates to artificial intelligence;
(FSR 11) critically review results and statements related to deep learning.
Required knowledge
At least 90 ECTS credits, including at least 60 ECTS credits in computing science, or at least 120 ECTS credits within a study programme. At least 7.5 ECTS credits in programming; 7.5 ECTS credits in data structures and algorithms; 7.5 ECTS credits in linear algebra; 7.5 ECTS credits in mathematical analysis (predominantly differential calculus); 7.5 ECTS credits in mathematical statistics and probability theory; and 7.5 ECTS credits in machine learning. Proficiency in English equivalent to the level required for basic eligibility for higher studies.
Form of instruction
Instruction is in the form of lectures and programming exercises. In addition to scheduled activities, self-study of the material is required.
Examination modes
Module 1 (Theory, 5.5 credits) is assessed by a written exam held in an examination hall. The module uses the grade scale Pass with distinction (VG), Pass (G), or Fail (U).
Module 2 (Laboration, 2.0 credits) covers FSR 5-7 and is assessed by written assignments. The module uses the grade scale Pass (G) or Fail (U).
The course as a whole uses the grade scale Pass with distinction (VG), Pass (G), or Fail (U) and is determined by the grade on Module 1.
Adapted examination
The examiner can decide to deviate from the specified forms of examination. Individual adaptation of the examination shall be considered based on the needs of the student. The examination is adapted within the constraints of the expected learning outcomes. A student who needs an adapted examination shall request adaptation from the Department of Computing Science no later than 10 days before the examination. The examiner decides on the adaptation of the examination, and the student is then notified.
Other regulations
If the syllabus has expired or the course has been discontinued, a student who at some point registered for the course is guaranteed at least three examination opportunities (including the regular examination) according to this syllabus for a maximum period of two years from the date the syllabus expired or the course was discontinued.
Literature
Valid from:
2024 week 1
Prince, Simon J.D. Understanding Deep Learning. MIT Press, 2023. http://udlbook.com
Mandatory
Reading instructions: The book will be published in December 2023. It can either be bought in printed form or downloaded as a PDF (for free).