add use_flash_attention_2 to param for Model loader Transformers #4373

Merged

oobabooga merged 8 commits into oobabooga:dev from fenglui:main on Nov 4, 2023
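For context, this PR exposes Transformers' `use_flash_attention_2` flag as an option on the Transformers model loader. Below is a minimal sketch of how such a flag is typically forwarded to `from_pretrained`; the `load_model` wrapper and `opts` dict here are illustrative placeholders, not the exact text-generation-webui code. It assumes transformers >= 4.34 (where `use_flash_attention_2` was introduced) and the `flash-attn` package installed.

```python
# Illustrative sketch: forwarding a use_flash_attention_2 loader option
# to Transformers. The load_model wrapper and opts dict are hypothetical;
# the actual webui loader is structured differently.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

def load_model(model_id: str, opts: dict):
    kwargs = {
        "torch_dtype": torch.float16,  # FlashAttention-2 requires fp16/bf16
        "device_map": "auto",
    }
    if opts.get("use_flash_attention_2"):
        # Passed straight through to from_pretrained. Note: on
        # transformers >= 4.36 the equivalent is
        # attn_implementation="flash_attention_2".
        kwargs["use_flash_attention_2"] = True
    model = AutoModelForCausalLM.from_pretrained(model_id, **kwargs)
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    return model, tokenizer
```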
Commits
Commits on Oct 23, 2023
Commits on Oct 24, 2023
Commits on Oct 27, 2023
Commits on Oct 28, 2023
Commits on Nov 4, 2023