SimulTransBaseline

This is sample code for the AutoSimulTrans Workshop (https://autosimtrans.github.io), based on PaddlePaddle (https://github.com/paddlepaddle/paddle) with dynamic graphs. It implements Transformer-based wait-k training and decoding as proposed in the paper STACL: Simultaneous Translation with Implicit Anticipation and Controllable Latency (https://arxiv.org/abs/1810.08398).
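
As a quick illustration of the wait-k policy: the decoder waits for the first k source tokens and then alternates between reading one source token and writing one target token, so at target step t it can only see the first t + k source tokens. The sketch below is a simplified, hypothetical example, not the training or decoding code in this repository; predict_next_token stands in for a real incremental Transformer decoder.

    # Simplified wait-k decoding loop (illustration only, not this repository's code).
    # `predict_next_token` is a hypothetical incremental decoder: given the visible
    # source prefix and the target generated so far, it returns the next target token.
    def wait_k_decode(source_tokens, k, predict_next_token, eos="</s>", max_len=256):
        target = []
        for step in range(max_len):
            # At target step t, only the first t + k source tokens are readable.
            visible_source = source_tokens[: min(step + k, len(source_tokens))]
            token = predict_next_token(visible_source, target)
            if token == eos:
                break
            target.append(token)
        return target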

The following is the code structure:

.
├── utils                # Utilities
├── gen_data.sh          # Script to download and BPE-preprocess the WMT18 zh-en corpus
├── predict.py           # Inference code
├── reader.py            # Data reader
├── stream_reader.py     # Stream data reader
├── README.md            # Documentation
├── train.py             # Training
├── model.py             # Transformer model and beam (greedy) search
└── transformer.yaml     # Configuration

Dependencies

  1. jieba==0.37
  2. sacremoses==0.0.38
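
Both can be installed with pip, for example:

    pip install jieba==0.37 sacremoses==0.0.38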

Quick Start

Installation

  1. Paddle

    This project depends on the PaddlePaddle 1.7 develop version. Please refer to the PaddlePaddle Installation Manual for installation instructions.

    You can also simply install it with pip; a typical command (the exact version or wheel for the develop build may differ) is:
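
    # Example pip install of PaddlePaddle (CPU build); pick the wheel that
    # matches your environment per the installation manual.
    pip install paddlepaddle==1.7.0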

  2. Download code

    git clone https://github.com/PaddlePaddle/models.git
    cd models/dygraph/transformer
