
Implementation of Transformer for Neural Machine Translation in PyTorch and Hugging Face (Work in Progress).


b-turan/transformer_pytorch


transformer_pytorch

This is a PyTorch implementation of Neural Machine Translation based on the Transformer architecture, using Hugging Face's T5 model. For a detailed explanation of the model, please refer to the blog post or the arXiv paper. The code builds on the PyTorch tutorial and the Hugging Face tutorial, and is structured in a modular way so that it can easily be extended to other tasks.

Example Usage

First install the requirements:

conda env create --file env.yml -n transformer_pytorch
conda activate transformer_pytorch

Then run the training script:

python transformer_tutorial.py

Example Execution of main.py

python main.py --epochs 30 --train --debug

Extension

You can play around with the flags to include more data points or to train on a different language pair. The number of samples can be reduced to an arbitrary integer, e.g., --n_samples 1000000.
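The exact flag set lives in main.py; the sketch below is a hypothetical reconstruction of that interface using only the flags shown in this README (--epochs, --train, --debug, --n_samples). The defaults and help strings are assumptions, not taken from the repository.

```python
import argparse

# Hypothetical reconstruction of main.py's CLI from the flags shown above;
# the defaults are assumptions, not taken from the repository.
parser = argparse.ArgumentParser(description="Transformer NMT training")
parser.add_argument("--epochs", type=int, default=30,
                    help="number of training epochs")
parser.add_argument("--train", action="store_true",
                    help="run the training loop")
parser.add_argument("--debug", action="store_true",
                    help="enable verbose debugging output")
parser.add_argument("--n_samples", type=int, default=None,
                    help="cap the number of training samples, e.g. 1000000")

# Mirrors the example invocation: python main.py --epochs 30 --train --debug
args = parser.parse_args(["--epochs", "30", "--train", "--debug"])
print(args.epochs, args.train, args.debug, args.n_samples)
```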
