Add use_flash_attention_2 param to the Transformers model loader (#4373)

Merged
oobabooga merged 8 commits into oobabooga:dev from fenglui:main on Nov 4, 2023

add use_flash_attention_2 to param for Model loader Transformers#4373
oobabooga merged 8 commits intooobabooga:devfrom
fenglui:main
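For context, a minimal sketch of what the new loader option amounts to under the hood, assuming the Transformers API contemporary with this PR (~v4.34/4.35), where Flash Attention 2 is requested via the `use_flash_attention_2` kwarg of `from_pretrained`. The model name below is illustrative, not taken from the PR:

```python
import torch
from transformers import AutoModelForCausalLM

# Load a model with Flash Attention 2 enabled; the option this PR exposes
# in the Transformers loader UI passes through to from_pretrained.
model = AutoModelForCausalLM.from_pretrained(
    "meta-llama/Llama-2-7b-hf",   # illustrative; needs an FA2-compatible architecture
    torch_dtype=torch.float16,    # FA2 requires fp16 or bf16 weights
    use_flash_attention_2=True,   # the flag added as a loader parameter here
)
```

In later Transformers releases this kwarg was deprecated in favor of `attn_implementation="flash_attention_2"`.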

Commits

The 8 commits in this PR were made on Oct 24, Oct 27, Oct 28, and Nov 4, 2023.