Tutorial: Text Classification using GPT2 and PyTorch


Apr 09, 09:30 AM PDT
  • Virtual AICamp
  • 374 RSVP
Description
Text classification is a very common problem when working with text data. We've all seen and know how to use encoder Transformer models like BERT and RoBERTa for text classification, but did you know you can also use a decoder Transformer model like GPT2 for text classification?

In this tutorial, I will walk you through how to use GPT2 from HuggingFace for text classification. We will start by downloading a custom dataset, installing the required components, and selecting a pre-trained model, then train the model. Finally, we will evaluate the results and discuss how to optimize further.
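As a preview of the approach, here is a minimal sketch (not the speaker's exact code) of GPT2 text classification with the HuggingFace transformers library and PyTorch. The toy texts, label count, and hyperparameters are illustrative placeholders; the tutorial's own dataset and training setup may differ.

# Minimal GPT2 text-classification sketch with HuggingFace transformers.
# Toy data and hyperparameters below are illustrative assumptions.
import torch
from transformers import GPT2Tokenizer, GPT2ForSequenceClassification

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
# GPT2 has no padding token by default; reuse the EOS token for padding.
tokenizer.pad_token = tokenizer.eos_token

model = GPT2ForSequenceClassification.from_pretrained("gpt2", num_labels=2)
model.config.pad_token_id = tokenizer.pad_token_id

texts = ["I loved this movie!", "What a waste of time."]  # toy examples
labels = torch.tensor([1, 0])  # 1 = positive, 0 = negative

batch = tokenizer(texts, padding=True, truncation=True,
                  max_length=64, return_tensors="pt")

optimizer = torch.optim.AdamW(model.parameters(), lr=5e-5)
model.train()
outputs = model(**batch, labels=labels)  # one illustrative training step
outputs.loss.backward()
optimizer.step()

model.eval()
with torch.no_grad():
    logits = model(**batch).logits
print(logits.argmax(dim=-1))  # predicted class ids

Note that because GPT2 is a decoder model, the classification head reads the representation of the last non-padding token, which is why the pad token must be configured explicitly.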

Speaker

George Mihaila

George is a PhD candidate in the Department of Computer Science at the University of North Texas. He worked three summers at State Farm as a Data Scientist (DS) and Machine Learning (ML) Engineer, and spent two years combined as a DS and ML Engineer at the University of North Texas High Performance Computing Center. He has more than five years of combined experience in Natural Language Processing (NLP), Computer Vision (CV), Deep Learning (DL), Reinforcement Learning (RL), and Machine Learning Operations (MLOps).
Watch Recording
*Recordings are hosted on YouTube; clicking the link opens the YouTube page.