# DistilBART

You can fine-tune a DistilBART model from Hugging Face by setting `model=BART`, `model_path=<hf-or-local-path>`, and `dataset=<dataset-name>`.

Example usage:

```bash
python run_textbox.py \
    --model=BART \
    --model_path=sshleifer/distilbart-cnn-12-6 \
    --dataset=samsum
```
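
For reference, the same checkpoint can also be loaded outside TextBox with the Hugging Face `transformers` library to sanity-check its summarization output before fine-tuning. The snippet below is a minimal sketch, assuming `transformers` and a PyTorch backend are installed; it is not part of the TextBox workflow itself.

```python
# Minimal sketch (assumption: transformers + PyTorch installed); loads the same
# distilbart checkpoint directly and runs a quick summarization check.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_name = "sshleifer/distilbart-cnn-12-6"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)

# A short dialogue in the style of the SAMSum dataset.
text = ("Amanda: I baked cookies. Do you want some? "
        "Jerry: Sure! Amanda: I'll bring you some tomorrow.")

inputs = tokenizer(text, return_tensors="pt", truncation=True)
summary_ids = model.generate(**inputs, max_length=60, num_beams=4)
print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))
```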