Any plan to support trl-peft load_in_8bit for training.py? #21

Open
the-unsoul opened this issue Apr 24, 2023 · 1 comment

Comments

@the-unsoul commented Apr 24, 2023

Hello,

I am fairly new to LLMs in general (I only started studying them two weeks ago), so please excuse me if I say or ask something silly.

I stumbled upon this blog post from Hugging Face:
https://huggingface.co/blog/trl-peft

After a quick check, it seems that training.py currently does not support load_in_8bit.
Is there any specific reason not to support it?

(I would also like to try adding such support to flan-alpaca.)

@chiayewken
Collaborator

Hi, it should be possible to support load_in_8bit for training since we already support LoRA. Contributions are welcome!
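
For reference, here is a minimal sketch of how 8-bit loading could be combined with LoRA, following the approach in the linked trl-peft blog post. It uses transformers, bitsandbytes, and peft; the checkpoint name and LoRA hyperparameters below are placeholders, not the repository's actual training.py configuration:

```python
# Hypothetical sketch: load a seq2seq model in 8-bit and attach LoRA adapters.
# Requires bitsandbytes to be installed; names and hyperparameters are placeholders.
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer
from peft import LoraConfig, get_peft_model, prepare_model_for_int8_training

model_name = "google/flan-t5-base"  # placeholder checkpoint
tokenizer = AutoTokenizer.from_pretrained(model_name)

# load_in_8bit quantizes the base weights via bitsandbytes;
# device_map="auto" places the layers across available devices.
model = AutoModelForSeq2SeqLM.from_pretrained(
    model_name, load_in_8bit=True, device_map="auto"
)

# Prepare the quantized model for training (casts norms, enables gradient checkpointing).
model = prepare_model_for_int8_training(model)

# Attach LoRA adapters to the attention projections (T5 uses "q" and "v").
lora_config = LoraConfig(
    r=16,
    lora_alpha=32,
    target_modules=["q", "v"],
    lora_dropout=0.05,
    bias="none",
    task_type="SEQ_2_SEQ_LM",
)
model = get_peft_model(model, lora_config)
model.print_trainable_parameters()  # only the LoRA weights remain trainable
```

The wrapped model can then be passed to the existing training loop as usual; only the LoRA parameters receive gradients while the quantized base weights stay frozen.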
