Established by: Faculty Board of Science and Technology, 2024-02-14
Contents
This course investigates how large language models (LLMs) might contribute to solving long-standing problems in data management. Specifically, it addresses three related questions:
How can LLMs assist in building chat interfaces over SQL databases?
How can LLMs assist in the conceptual modelling and definition of SQL databases?
How can LLMs assist in data integration where multiple databases are made interoperable?
Since these questions can only be addressed with an understanding of both LLMs and basic data management, the first half of the course is devoted to these foundations. It starts with feed-forward neural networks, recurrent neural networks (RNNs), long short-term memory networks (LSTMs) and Seq2Seq models, and then covers Transformers, BERT and GPT. After this we briefly review the main concepts in data management, including conceptual modelling via entity-relationship diagrams (ERDs), basic SQL and typical architectures. The second half of the course then turns to the three questions above of how LLMs might address long-standing data management problems. This includes direct few-shot approaches, approaches based on LangChain and vector databases, and finally retrieval-augmented generation (RAG) approaches. Additional approaches may also be covered.
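To give a flavour of the few-shot approach mentioned above, the sketch below assembles a text-to-SQL prompt from a database schema and two worked examples. The schema, the example pairs and the call_llm placeholder are illustrative assumptions rather than course material; any LLM client could be plugged in.

    # Illustrative sketch of a few-shot prompt for translating a natural-language
    # question into SQL. The schema, examples and call_llm() are hypothetical
    # placeholders, not part of the course material.

    SCHEMA = """
    CREATE TABLE employee (id INT PRIMARY KEY, name TEXT, dept TEXT, salary INT);
    """

    FEW_SHOT_EXAMPLES = [
        ("How many employees are there?",
         "SELECT COUNT(*) FROM employee;"),
        ("What is the average salary per department?",
         "SELECT dept, AVG(salary) FROM employee GROUP BY dept;"),
    ]

    def build_prompt(question: str) -> str:
        """Assemble the schema, worked examples and new question into one prompt."""
        parts = ["Translate the question into SQL for this schema:", SCHEMA]
        for q, sql in FEW_SHOT_EXAMPLES:
            parts.append(f"Question: {q}\nSQL: {sql}")
        parts.append(f"Question: {question}\nSQL:")
        return "\n\n".join(parts)

    def call_llm(prompt: str) -> str:
        """Placeholder for a call to any LLM API; returns the model's SQL answer."""
        raise NotImplementedError("Plug in your preferred LLM client here.")

    if __name__ == "__main__":
        print(build_prompt("Which department has the highest average salary?"))

Roughly speaking, the LangChain- and RAG-based approaches in the second half build on the same idea, adding retrieval of relevant schema elements and orchestration of multi-turn dialogue.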
Expected learning outcomes
Knowledge and understanding
After completing the course, the student should be able to:
(FSR 1) Explain the technical underpinnings of neural networks, including common activation functions, calculation of loss, and gradient descent (illustrated by the sketch after this list).
(FSR 2) Explain the components, processes and workflow of LSTMs, Seq2Seq models, Transformers, BERT and GPT.
(FSR 3) Describe few-shot, vector-database, RAG-based and other approaches to long-standing data management problems.
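The following minimal PyTorch sketch shows the ingredients named in FSR 1 in one place: an activation function, a loss calculation and a single gradient-descent step. The network size, learning rate and random data are arbitrary placeholders, not prescribed course material.

    # Minimal illustration of FSR 1: an activation function (ReLU), a loss
    # function (mean squared error) and one gradient-descent update in PyTorch.
    import torch
    import torch.nn as nn

    torch.manual_seed(0)
    model = nn.Sequential(nn.Linear(4, 8), nn.ReLU(), nn.Linear(8, 1))
    loss_fn = nn.MSELoss()
    optimizer = torch.optim.SGD(model.parameters(), lr=0.01)

    x = torch.randn(16, 4)            # 16 random examples with 4 features each
    y = torch.randn(16, 1)            # random regression targets

    prediction = model(x)             # forward pass
    loss = loss_fn(prediction, y)     # calculate the loss
    loss.backward()                   # backpropagate gradients
    optimizer.step()                  # one gradient-descent step
    print(f"training loss: {loss.item():.4f}")

Swapping in a different activation function or loss function changes only a single line of this sketch.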
Competence and skills
After completing the course, the student should be able to:
(FSR 4) Set up and run basic neural network learning problems using PyTorch.
(FSR 5) Set up and run Transformer-based learning problems using PyTorch (see the sketch after this list).
(FSR 6) Set up LangChain to orchestrate chat-like dialogues over SQL databases.
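As an indication of the level intended for FSR 5, the sketch below sets up a small Transformer encoder in PyTorch and passes a dummy batch through it. The model dimensions and random input are illustrative assumptions; the course exercises may use different configurations.

    # Sketch of setting up a small Transformer encoder in PyTorch (cf. FSR 5).
    # All dimensions and the dummy batch are arbitrary illustrative choices.
    import torch
    import torch.nn as nn

    d_model, n_heads, n_layers = 64, 4, 2
    encoder_layer = nn.TransformerEncoderLayer(
        d_model=d_model, nhead=n_heads, batch_first=True)
    encoder = nn.TransformerEncoder(encoder_layer, num_layers=n_layers)

    tokens = torch.randn(8, 10, d_model)    # batch of 8 sequences, 10 tokens each
    encoded = encoder(tokens)               # contextualised token representations
    print(encoded.shape)                    # torch.Size([8, 10, 64])

The same torch.nn module also provides decoder layers and a full encoder-decoder nn.Transformer class.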
Judgement and approach
After completing the course, the student should be able to:
(FSR 7) Understand and discuss the concepts and terminology of state-of-the-art LLMs.
(FSR 8) Critically evaluate proposed LLM-based solutions to data management problems.
(FSR 9) Develop an ability to distinguish fact from fantasy in this fast-moving field.
Required Knowledge
At least 30 ECTS credits in Computing Science or Mathematics, including completed courses in programming (ideally Python), data structures and algorithms, databases, calculus, and linear algebra.
Form of instruction
This is a distance course that requires no physical presence on campus. Lectures are held over Zoom and course material is delivered via Canvas. Lectures are recorded, so attendance at lectures is flexible. Students answer problems and demonstrate their programming solutions in recitation sessions, also held over Zoom. Attendance at recitations is mandatory, but multiple session times are offered to accommodate student schedules. The course ends with a take-home exam that students upload to Canvas. All work is completed by each student individually.
Examination modes
Students must present their work at four separate Zoom-based recitations. There is also a take-home final exam, which students upload to Canvas for grading. The grading scale is Pass with distinction (VG), Pass (G) or Fail (U).
Adapted examination
The examiner can decide to deviate from the specified forms of examination. Individual adaptation of the examination shall be considered based on the needs of the student. The examination is adapted within the constraints of the expected learning outcomes. A student who needs an adapted examination shall request the adaptation from the Department of Computing Science no later than 10 days before the examination. The examiner decides on the adaptation of the examination, and the student is notified.
Other regulations
If the syllabus has expired or the course has been discontinued, a student who registered for the course at some point is guaranteed at least three examination opportunities (including the regular examination) according to this syllabus for a maximum period of two years from the expiry of the syllabus or the discontinuation of the course.
Literature
Valid from: 2024 week 22
Goodfellow, Ian; Bengio, Yoshua; Courville, Aaron. Deep Learning. Cambridge, MA: MIT Press, 2016. xxii, 775 pages. ISBN: 9780262035613. Mandatory. Reading instructions: https://www.deeplearningbook.org/
In addition, a number of scientific articles will be used.