
Add Support for QLoRA/QA-LoRA weights which are not merged #3225

Open
orellavie1212 opened this issue Mar 6, 2024 · 2 comments

Comments

@orellavie1212 (Contributor)

Currently only the original LoRA is supported as a non-fused adapter. I hope support can be added for QLoRA/QA-LoRA adapters as well, without fusing them with the base model.
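
For reference, this is roughly how vLLM already serves a plain LoRA adapter unmerged (a minimal sketch; the base model and adapter path are placeholders, not from this issue):

```python
from vllm import LLM, SamplingParams
from vllm.lora.request import LoRARequest

# Plain LoRA adapters can already be served without merging:
# the adapter weights stay separate and are applied per request.
llm = LLM(model="meta-llama/Llama-2-7b-hf", enable_lora=True)

outputs = llm.generate(
    ["Write a haiku about GPUs."],
    SamplingParams(max_tokens=64),
    # (adapter name, unique int id, path to adapter) -- placeholder path
    lora_request=LoRARequest("my_adapter", 1, "/path/to/lora_adapter"),
)
print(outputs[0].outputs[0].text)
```

The request is for the same per-request adapter path to work when the base model is quantized (QLoRA) or when the adapter is quantization-aware (QA-LoRA).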

@chenqianfzh (Contributor)

Hi, I am working on adding QLoRA support to vLLM.

The first model to support would probably be timdettmers/qlora-alpaca-13b (and some other QLoRA models published by timdettmers on Hugging Face).
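
Outside vLLM, a QLoRA adapter like that one is typically applied unmerged via PEFT over a 4-bit bitsandbytes base, roughly as in the sketch below (the base-model repo is an assumption; use whichever base the adapter was actually trained against):

```python
import torch
from transformers import AutoModelForCausalLM, BitsAndBytesConfig
from peft import PeftModel

# QLoRA keeps the base model in 4-bit NF4 and trains higher-precision
# LoRA adapters on top; serving it "not merged" means preserving that split.
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.bfloat16,
)

base = AutoModelForCausalLM.from_pretrained(
    "huggyllama/llama-13b",  # assumed base model for this adapter
    quantization_config=bnb_config,
    device_map="auto",
)

# Attach the adapter without merging; dequantizing and merging into the
# base weights is exactly what this issue asks vLLM to avoid.
model = PeftModel.from_pretrained(base, "timdettmers/qlora-alpaca-13b")
```

Supporting this in vLLM would mean keeping the quantized base weights and the adapter weights separate at inference time rather than materializing merged full-precision weights.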


This issue has been automatically marked as stale because it has not had any activity within 90 days. It will be automatically closed if no further activity occurs within 30 days. Leave a comment if you feel this issue should remain open. Thank you!

github-actions bot added the stale label on Oct 30, 2024