Foundations of Deep Learning

Many view the advent of deep learning as a revolution that has fundamentally transformed modern ML and AI. This module covers the basics of deep learning in preparation for more advanced modules in the Master’s programme. It starts with a quick recap of ML fundamentals, namely training, generalisation, overfitting, cross-validation, regularisation, and hyper-parameter optimisation. The following topics specific to neural networks are then covered: multi-layer perceptrons, deep feedforward neural networks, gradient-based training and backpropagation, convolutional neural networks, recurrent neural networks, attention mechanisms, autoencoders, and deep generative models.
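To give a flavour of the gradient-based training and backpropagation covered in the module, the following is a minimal illustrative sketch (not part of the module materials): a one-hidden-layer perceptron trained by gradient descent, with the backpropagation updates derived by hand, learning the XOR function. All names and hyper-parameter choices here are illustrative assumptions.

```python
import numpy as np

# Illustrative sketch: a multi-layer perceptron trained with
# hand-derived backpropagation on XOR. Architecture, learning
# rate, and iteration count are arbitrary choices for the demo.

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)  # XOR targets

# One hidden layer of 8 tanh units, one sigmoid output unit.
W1 = rng.normal(0, 1, (2, 8)); b1 = np.zeros(8)
W2 = rng.normal(0, 1, (8, 1)); b2 = np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 1.0
for _ in range(10000):
    # Forward pass
    h = np.tanh(X @ W1 + b1)           # hidden activations
    p = sigmoid(h @ W2 + b2)           # output probabilities
    # Backward pass: gradients of the binary cross-entropy loss.
    # For sigmoid + cross-entropy, dL/d(pre-activation) = p - y.
    dlogits = (p - y) / len(X)
    dW2 = h.T @ dlogits; db2 = dlogits.sum(0)
    dh = (dlogits @ W2.T) * (1 - h**2)  # tanh'(a) = 1 - tanh(a)^2
    dW1 = X.T @ dh; db1 = dh.sum(0)
    # Gradient descent update
    W2 -= lr * dW2; b2 -= lr * db2
    W1 -= lr * dW1; b1 -= lr * db1

pred = (sigmoid(np.tanh(X @ W1 + b1) @ W2 + b2) > 0.5).astype(int)
print(pred.ravel())
```

The same computation is what automatic differentiation in modern frameworks performs implicitly; deriving it manually, as above, is the standard way backpropagation is first taught.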

Upon completion of the module, the student will be able to:

  • give a broad account of the history of deep learning, and how various advances in the field are interconnected;
  • describe, summarise, and compare standard deep neural network architectures, in preparation for more advanced modules in the programme;
  • build and implement these standard networks in practice to perform a variety of supervised and unsupervised learning tasks;
  • analyse, critically assess, and explain the observed performance of these networks under a variety of settings.