Advanced Machine Learning with Large Language Models | WSQ Courses

The field of natural language processing (NLP) has been transformed by large language models, such as GPT-3, GPT-4, ChatGPT, and BERT, which have achieved state-of-the-art performance on a wide range of NLP tasks. In this course, you will learn the theory and practice of advanced machine learning with large language models.

You will start by understanding the basics of the Transformer architecture, the foundation of large language models. You will then explore pre-training and fine-tuning techniques, including the use of the Hugging Face Transformers API to load, train, and evaluate pre-trained models. You will learn about the latest advances in large language models, such as GPT-3, GPT-4, and ChatGPT, and compare their performance with other pre-trained models such as BERT.

You will also get hands-on experience implementing custom Transformer models, and you will explore their applications in practical NLP tasks, such as text generation, sentiment analysis, named entity recognition, question answering systems, and chatbots.
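To give a flavour of what implementing a custom Transformer involves, the scaled dot-product attention at the core of the architecture can be sketched in plain Python (a minimal, dependency-free illustration for intuition only; real models use tensor libraries and add multi-head projections, masking, and batching):

```python
import math

def softmax(xs):
    # Numerically stable softmax over a list of floats
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def attention(Q, K, V):
    """Scaled dot-product attention: softmax(Q K^T / sqrt(d_k)) V.

    Q, K, V are lists of equal-length vectors (rows).
    Returns one output row per query row.
    """
    d_k = len(K[0])
    outputs = []
    for q in Q:
        # Similarity of this query to every key, scaled by sqrt(d_k)
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d_k) for k in K]
        weights = softmax(scores)
        # Output is the attention-weighted average of the value rows
        outputs.append([sum(w * v[j] for w, v in zip(weights, V))
                        for j in range(len(V[0]))])
    return outputs

# Tiny example: 2 queries attending over 3 key/value pairs
Q = [[1.0, 0.0], [0.0, 1.0]]
K = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
V = [[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]]
result = attention(Q, K, V)
```

Each output row is a convex combination of the value rows, weighted by how strongly the corresponding query matches each key, which is the mechanism the course builds on for full Transformer blocks.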


All participants will receive a Certificate of Completion from Tertiary Courses after achieving at least 75% attendance.

Funding and Grant Applications

No funding is available for this course.
