
Is LoRA training necessary? #19

Open
Feynman1999 opened this issue Jul 16, 2024 · 1 comment

Comments

@Feynman1999

Is LoRA training necessary? What would happen if it were changed to full-parameter fine-tuning? How do you view this?

@cswry
Owner

cswry commented Oct 24, 2024

Training with LoRA can help preserve the generation capabilities as much as possible, whereas full-parameter fine-tuning may result in a slight decrease in generation performance.
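To illustrate the point above, here is a minimal, hypothetical LoRA sketch (not this repository's actual code, which uses a diffusion UNet): the pretrained weight `W` stays frozen, and only a low-rank update `B @ A` is trained. Because `B` is zero-initialized, the layer's output at the start of fine-tuning is identical to the pretrained model's, which is why the original generation capability is preserved; full-parameter fine-tuning has no such anchor. The dimensions and rank below are illustrative.

```python
import numpy as np

class LoRALinear:
    """Frozen linear layer plus a trainable low-rank update (LoRA)."""

    def __init__(self, d_in, d_out, rank=8, alpha=16, seed=0):
        rng = np.random.default_rng(seed)
        # Frozen pretrained weight: never updated during fine-tuning.
        self.W = rng.standard_normal((d_out, d_in))
        # Trainable low-rank factors: A is small-random, B is zero,
        # so B @ A == 0 at initialization and the layer starts out
        # exactly equal to the pretrained layer.
        self.A = 0.01 * rng.standard_normal((rank, d_in))
        self.B = np.zeros((d_out, rank))
        self.scale = alpha / rank

    def forward(self, x):
        # Frozen path plus scaled low-rank path.
        return x @ self.W.T + (x @ self.A.T) @ self.B.T * self.scale

    def trainable_params(self):
        return self.A.size + self.B.size

    def full_params(self):
        return self.W.size

layer = LoRALinear(d_in=1024, d_out=1024, rank=8)
# Only ~1.6% of the full layer's parameters are trainable.
print(layer.trainable_params(), layer.full_params())
```

The low-rank update touches only 2 * rank * d parameters per square layer (16,384 here) instead of d * d (1,048,576), so the fine-tuned model stays close to the pretrained one in parameter space, which matches the owner's observation that LoRA better preserves generation quality.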
