Parsing, understanding, and generating natural human language is a crucial component of AI. Advances in deep learning are beginning to enable impressive end-to-end language understanding systems. Topics in this module will include word embeddings and representations (e.g. word2vec), part-of-speech tagging and syntactic parsing, topic modelling, language modelling with recurrent and convolutional neural networks, machine translation with seq2seq models and attention, and sentence classification in applications like sentiment labelling and language identification.
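To give a flavour of the first topic, word embeddings represent words as dense vectors so that semantically related words lie close together. The sketch below uses small hand-made vectors purely for illustration (the words and 4-dimensional values are invented; real word2vec embeddings are learned from large corpora and typically have 100-300 dimensions) and compares them with cosine similarity:

```python
import numpy as np

# Toy, hand-made embeddings for illustration only -- real embeddings
# (e.g. from word2vec) are learned from text, not written by hand.
embeddings = {
    "king":  np.array([0.8, 0.6, 0.1, 0.2]),
    "queen": np.array([0.7, 0.7, 0.1, 0.3]),
    "apple": np.array([0.1, 0.0, 0.9, 0.8]),
}

def cosine_similarity(u, v):
    """Cosine of the angle between two vectors; 1.0 means same direction."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

# Related words should have a higher similarity than unrelated ones.
sim_royal = cosine_similarity(embeddings["king"], embeddings["queen"])
sim_fruit = cosine_similarity(embeddings["king"], embeddings["apple"])
print(sim_royal > sim_fruit)  # prints True for these toy vectors
```

With learned embeddings, the same similarity computation underpins tasks covered later in the module, such as finding nearest-neighbour words or features for sentence classification.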
Upon completion of the module the student will be able to:
- summarise, implement and critically assess both basic and state-of-the-art techniques for solving a variety of learning problems in Natural Language Processing;
- formulate and interpret the mathematical theory underlying common Natural Language Processing approaches;
- apply the tools and techniques needed to build practical solutions to Natural Language Processing problems;
- discuss problems and approaches at the cutting edge of Natural Language Processing research, from a machine learning and AI perspective.