Max Recursion Error when using with LoRA #122
I've identified that the error happens exactly at the line in the LoRA (peft) library that runs `if getattr(model, "is_gradient_checkpointing", True):`.
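For context, here is a minimal self-contained sketch of the failure mode. `NaiveWrapper` is a hypothetical stand-in, not tensor_parallel's actual class; it only illustrates how a delegating `__getattr__` on an `nn.Module` wrapper can recurse without bound when peft probes for a missing attribute:

```python
import torch.nn as nn

class NaiveWrapper(nn.Module):
    """Hypothetical delegating wrapper; NOT tensor_parallel's actual class."""

    def __init__(self, module: nn.Module):
        super().__init__()
        self.wrapped_module = module  # nn.Module stores this in self._modules

    def __getattr__(self, name):
        # Only called when normal lookup fails. Because nn.Module keeps
        # submodules in self._modules rather than self.__dict__, the
        # "self.wrapped_module" access below also misses normal lookup,
        # re-enters this __getattr__, and recurses without bound.
        return getattr(self.wrapped_module, name)

model = NaiveWrapper(nn.Linear(4, 4))
try:
    # peft runs essentially this check; with the buggy wrapper it blows
    # the recursion limit instead of falling back to the default.
    getattr(model, "is_gradient_checkpointing", True)
except RecursionError as e:
    print("RecursionError:", e)
```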
Ar-Kareem added a commit to Ar-Kareem/tensor_parallel that referenced this issue on Oct 2, 2023.
I think I fixed it.
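I haven't confirmed the exact contents of the commit below, but a common fix for this pattern is to route lookups through `nn.Module.__getattr__` first, so the wrapped module can be resolved without re-entering the override. A sketch under that assumption (`FixedWrapper` is hypothetical, not the commit's actual diff):

```python
import torch.nn as nn

class FixedWrapper(nn.Module):
    """Hypothetical fix for the sketch above; not the commit's actual diff."""

    def __init__(self, module: nn.Module):
        super().__init__()
        self.wrapped_module = module

    def __getattr__(self, name):
        try:
            # Let nn.Module resolve parameters/buffers/submodules first;
            # this finds "wrapped_module" itself and breaks the cycle.
            return super().__getattr__(name)
        except AttributeError:
            # Delegate everything else to the wrapped module; a genuinely
            # missing attribute now raises AttributeError, which getattr's
            # default argument handles correctly.
            return getattr(super().__getattr__("wrapped_module"), name)

model = FixedWrapper(nn.Linear(4, 4))
print(getattr(model, "is_gradient_checkpointing", True))  # -> True
```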
Ar-Kareem added a commit to Ar-Kareem/tensor_parallel that referenced this issue on Oct 2, 2023.
I get the following error when attempting to use LoRA with Llama 2, caused by the `peft` module executing `if getattr(model, "is_gradient_checkpointing", True):`.

Below is the minimal reproducible example that breaks when using tensor parallel and works when disabling it. When setting `USE_TENSOR_PARALLEL = False` the code works, but when setting `USE_TENSOR_PARALLEL = True` I get the maximum recursion error.
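Since the snippet itself isn't shown above, here is a hedged reconstruction of what such a reproducer plausibly looks like. The model ID, LoRA hyperparameters, and the `tp.tensor_parallel` call are assumptions rather than the reporter's exact code:

```python
# Hedged reconstruction of the missing reproducer; model ID and LoRA
# hyperparameters below are assumptions, not the reporter's exact code.
import tensor_parallel as tp
from peft import LoraConfig, get_peft_model
from transformers import AutoModelForCausalLM

USE_TENSOR_PARALLEL = True  # False -> works; True -> RecursionError

model = AutoModelForCausalLM.from_pretrained("meta-llama/Llama-2-7b-hf")
if USE_TENSOR_PARALLEL:
    # Shard the model across the available GPUs.
    model = tp.tensor_parallel(model)

config = LoraConfig(
    r=8,
    lora_alpha=16,
    target_modules=["q_proj", "v_proj"],
    task_type="CAUSAL_LM",
)
# peft internally evaluates
#   if getattr(model, "is_gradient_checkpointing", True):
# which re-enters the tensor_parallel wrapper's __getattr__ and recurses.
model = get_peft_model(model, config)
```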