ST to 4.0.1 & lint & model_card & fix mps cached_contrastive #107
Conversation
Thank you for handling the PR on ST and also adding it here! FWIU, what you did should work (did you test it?), the only nitpick I have is that now we should remove the
I suppose this is because the PR has been included in the version that you want to upgrade to in this PR. Edit: also, obviously, thanks for the contributions, really appreciate it!
Always glad to help! I'm not too sure about the MPS implementation—the values looked ok, but I can't guarantee their correctness. Therefore, I switched to raising an error for MPS.
Devices with MPS don't have any other graphics card anyway.
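For context, a minimal sketch of what such a device guard could look like; the function name and error message are illustrative, not PyLate's actual implementation:

```python
import torch


def check_cached_contrastive_device(device: torch.device) -> None:
    """Illustrative guard: the cached contrastive loss has not been validated
    on MPS, so refuse to run there rather than risk silently wrong values."""
    if device.type == "mps":
        raise ValueError(
            "Cached contrastive losses are not supported on MPS devices; "
            "please use CUDA or CPU instead."
        )
```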
Yes, I did:

from pylate import models
from pylate.models import PylateModelCardData  # assumed import path for the model card data class

model = models.ColBERT(
    model_name_or_path="colbert-ir/colbertv2.0",
    model_card_data=PylateModelCardData(language="de", model_name="testing"),
)
model.push_to_hub(
    "samheym/pylate-test", private=True, train_datasets=[], exist_ok=True
)
No, I have time to wait for this. Just out of curiosity, what are you checking manually? Is there any option to include these tests in the CI?
Not much, just trying to launch some larger-scale training (nothing fancy, just MS MARCO) and checking the results, to make sure there isn't any huge regression.
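As a side note on the CI question above: hardware-dependent checks like this are often gated behind a pytest skip condition, so CI only runs them when a suitable device is available. A rough sketch, not PyLate's actual test setup (the test name and body are placeholders):

```python
import pytest
import torch

# Gate expensive, hardware-dependent checks so CI only runs them when a GPU
# is present; otherwise they are reported as skipped rather than failing.
requires_cuda = pytest.mark.skipif(
    not torch.cuda.is_available(),
    reason="large-scale regression check needs a CUDA GPU",
)


@requires_cuda
def test_no_training_regression():
    # Placeholder body: a scaled-down training run whose final metric would
    # be compared against a known-good baseline.
    ...
```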
Hello,
Hello! I haven't looked at this PR in much detail, but I do want to share that v4.0 was designed not to introduce any breaking changes, especially if you don't use CrossEncoder (training). Nice work here @sam-hey!
Yeah no, it's just me being paranoid, but I did some unrelated training and took the opportunity to test the memory fix / sanity-check the training, and it's fine!
Co-authored-by: Antoine Chaffin <[email protected]>
Thanks a lot to you @tomaarsen @NohTow, it's always a pleasure working with you! Best

LGTM, thanks!
test_contrastive_training failing on mps device #106