Transformers-for-NLP-2nd-Edition


@copyright 2022, Packt Publishing, Denis Rothman

Contact me on LinkedIn for any questions you have
Get the book on Amazon

Under-the-hood workings of transformers, fine-tuning GPT-3 models, DeBERTa, vision models, and the start of the Metaverse, using a variety of NLP platforms: Hugging Face, OpenAI API, Trax, and AllenNLP

Getting started

You can run these notebooks on cloud platforms like Google Colab or your local machine. Note that some chapters require a GPU to run in a reasonable amount of time, so we recommend one of the cloud platforms as they come pre-installed with CUDA.
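If you are unsure whether your runtime has a GPU, the short check below (a generic PyTorch snippet, not taken from the book's notebooks) shows what the notebooks will be able to use:

```python
# Generic GPU check for Colab, Kaggle, Gradient, SageMaker Studio Lab, or a local machine.
import torch

if torch.cuda.is_available():
    print("CUDA GPU detected:", torch.cuda.get_device_name(0))
else:
    print("No CUDA GPU detected; GPU-dependent chapters will run very slowly on CPU.")
```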

Running on a cloud platform

To run a notebook on a cloud platform, click one of its badges in the repository's table; every notebook has launch badges for Google Colab, Kaggle, Gradient, and SageMaker Studio Lab. The chapters and their notebooks are:

Getting Started with the Architecture of the Transformer Model
  • Multi_Head_Attention_Sub_Layer.ipynb
  • positional_encoding.ipynb
Fine-Tuning BERT Models
  • BERT_Fine_Tuning_Sentence_Classification_GPU.ipynb
Pretraining a RoBERTa Model from Scratch
  • KantaiBERT.ipynb
Downstream NLP Tasks with Transformers
  • Transformer_tasks.ipynb
Machine Translation with the Transformer
  • Trax_translation.ipynb
The Rise of Suprahuman Transformers with GPT-3 Engines
  • Fine_tuning_GPT_3.ipynb
  • Getting_Started_GPT_3.ipynb
Applying Transformers to Legal and Financial Documents for AI Text Summarization
  • Summerizing_Text_with_T5.ipynb
Matching Tokenizers and Datasets
  • Tokenizers.ipynb
  • Training_OpenAI_GPT_2_CH09.ipynb
Semantic Role Labeling with BERT-Based Transformers
  • SRL.ipynb
Let Your Data Do the Talking: Story, Questions, and Answers
  • Haystack_QA_Pipeline.ipynb
  • QA.ipynb
Detecting Customer Emotions to Make Predictions
  • SentimentAnalysis.ipynb
Analyzing Fake News with Transformers
  • Fake_News.ipynb
Interpreting Black Box Transformer Models
  • BertViz.ipynb
  • Understanding_GPT_2_models_with_Ecco.ipynb
From NLP to Task-Agnostic Transformer Models
  • Vision_Transformers.ipynb
  • DALL_E.ipynb
The Emergence of Transformer-Driven Copilots
  • Domain_Specific_GPT_3_Functionality.ipynb
  • KantaiBERT_Recommender.ipynb
  • Vision_Transformer_MLP_Mixer.ipynb
Appendix III: Generic Text Completion with GPT-2
  • OpenAI_GPT_2.ipynb
Appendix IV: Custom Text Completion with GPT-2
  • Training_OpenAI_GPT_2.ipynb
Bonus
  • Q&A_DR.ipynb

Key Features

Implement models, such as BERT, Reformer, and T5, that outperform classical language models
Compare NLP applications using GPT-3, GPT-2, and other transformers
Analyze advanced use cases, including polysemy, cross-lingual learning, and computer vision

Book Description

Transformers are a game-changer for natural language understanding (NLU) and have become one of the pillars of artificial intelligence.

Transformers for Natural Language Processing, 2nd Edition, investigates deep learning for machine translation, speech-to-text, text-to-speech, language modeling, question answering, and many more NLP domains with transformers.

An Industry 4.0 AI specialist needs to be adaptable; knowing just one NLP platform is not enough anymore. Different platforms have different benefits depending on the application, whether it's cost, flexibility, ease of implementation, results, or performance. In this book, we analyze numerous use cases with Hugging Face, Google Trax, OpenAI, and AllenNLP.

This book takes transformers' capabilities further by combining multiple NLP techniques, such as sentiment analysis, named entity recognition, and semantic role labeling, to analyze complex use cases, such as dissecting fake news on Twitter. Also, see how transformers can create code using just a brief description.
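As a taste of the kind of pipeline composition the use-case chapters build up to, the sketch below chains two off-the-shelf Hugging Face pipelines (sentiment analysis and named entity recognition) on a sample headline; the default checkpoints and text are illustrative assumptions, not the exact models or code from the book.

```python
# Illustrative sketch only: chain sentiment analysis and NER on one piece of text.
from transformers import pipeline

tweet = "BREAKING: scientists announce that chocolate cures all known diseases!"

sentiment = pipeline("sentiment-analysis")            # default DistilBERT SST-2 checkpoint
ner = pipeline("ner", aggregation_strategy="simple")  # default BERT-based NER checkpoint

print("Sentiment:", sentiment(tweet))
print("Entities:", ner(tweet))
```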

By the end of this NLP book, you will understand transformers from a cognitive science perspective and be proficient in applying pretrained transformer models to various datasets.

What you will learn

Discover new ways of performing NLP techniques with the latest pretrained transformers
Grasp the workings of the original Transformer, GPT-3, BERT, T5, DeBERTa, and Reformer
Create language understanding Python programs using concepts that outperform classical deep learning models
Apply Python, TensorFlow, and PyTorch programs to sentiment analysis, text summarization, speech recognition, machine translation, and more (a short summarization sketch follows this list)
Measure the productivity of key transformers to define their scope, potential, and limits in production
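For instance, a text-summarization run in the spirit of the legal and financial summarization chapter can be sketched with the public t5-small checkpoint; the model choice and sample text are illustrative assumptions, not the book's exact setup.

```python
# Hedged sketch: summarize a short legal-style paragraph with a small T5 checkpoint.
from transformers import pipeline

summarizer = pipeline("summarization", model="t5-small")

document = (
    "The parties agree that any dispute arising out of this agreement shall be "
    "settled by binding arbitration and that the prevailing party shall be "
    "entitled to recover reasonable attorneys' fees and costs."
)

summary = summarizer(document, max_length=40, min_length=10, do_sample=False)
print(summary[0]["summary_text"])
```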

Who This Book Is For

If you want to learn about and apply transformers to your natural language (and image) data, this book is for you.

A good understanding of NLP, Python, and deep learning is required to get the most out of this book. That said, many of the platforms covered provide interactive user interfaces, so readers with a general interest in NLP and AI can still follow several chapters.

Table of Contents

1. What are Transformers?
2. Getting Started with the Architecture of the Transformer Model
3. Fine-Tuning BERT Models
4. Pretraining a RoBERTa Model from Scratch
5. Downstream NLP Tasks with Transformers
6. Machine Translation with the Transformer
7. The Rise of Suprahuman Transformers with GPT-3 Engines
8. Applying Transformers to Legal and Financial Documents for AI Text Summarization
9. Matching Tokenizers and Datasets
10. Semantic Role Labeling with BERT-Based Transformers
11. Let Your Data Do the Talking: Story, Questions, and Answers
12. Detecting Customer Emotions to Make Predictions
13. Analyzing Fake News with Transformers
14. Interpreting Black Box Transformer Models
15. From NLP to Task-Agnostic Transformer Models
16. The Emergence of Transformer-Driven Copilots
Appendix I: Terminology of Transformer Models
Appendix II: Hardware Constraints for Transformer Models
And more!