Summary
Natural Language Processing (NLP) has become one of the most important technologies in Artificial Intelligence. Automatic methods for processing natural language now find application in almost every aspect of our communication, in person or online, and in particular through social media. The increased use of Neural Networks (NNs) has played an important role in the most recent progress of NLP, as NN techniques have delivered improved performance in applications ranging from language modelling (next-word prediction) and speech recognition to machine translation and sentiment analysis. This module provides a thorough introduction to this cutting-edge approach to developing NLP systems.
Aims
Students should expect to learn how to implement and train neural network models for a variety of NLP applications, such as language modelling, sentiment analysis, machine translation, and dialogue systems.
Syllabus
1. Introduction to Neural Networks and Backpropagation
2. Sequence Modelling in NLP with Recurrent Neural Networks
3. Word embeddings
4. More advanced RNNs: LSTMs and GRUs
5. Neural networks for text classification and sentiment analysis
6. Machine translation with the encoder-decoder architecture and attention
7. Transformer architectures and BERT
8. Social media processing
9. Information extraction and named entity recognition
10. Anaphora resolution (coreference)
11. Dialogue systems
Learning Outcomes in Subject-Specific Skills
By the end of the module the student will be able to:
1. Understand the NN approaches most widely used in NLP, such as Recurrent NNs, Convolutional NNs, LSTMs, and the Encoder-Decoder architecture.
2. Understand how such architectures are used in NLP tasks such as language modelling, sentiment analysis, machine translation, coreference resolution, and dialogue systems.