[Feature request] ColBERT V2 #14
Comments

Hi @raphaelsty,
First of all, thanks a lot for this project. I really appreciate its simplicity and effectiveness.
Question: do you have any plans to implement ColBERT V2?
Best wishes.

Hi @filippo82, I think it would be cool to add a distillation loss, of course. I plan to improve the loss function of the train module in the coming weeks; there won't be any breaking change. Before releasing the centroids algorithm of ColBERTv2, I plan to release another method to accelerate the ColBERT retriever. It will be really fast and still accurate; work in progress.

Hi @raphaelsty 👋🏻 Thanks a lot, and sorry for the slooow reply. Let me know if there is any way I can help with testing/debugging.

Hi @filippo82, I released neural-cherche 1.1.0, which improves loss stability and brings better default parameters to the models. I also released neural-tree in order to accelerate ColBERT. Feel free to open a PR in neural-cherche if you are interested in knowledge distillation.
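For reference, here is a minimal sketch of the kind of ColBERTv2-style knowledge-distillation loss discussed above, written in plain PyTorch rather than against neural-cherche's actual API; the function names, tensor shapes, and the cross-encoder teacher are illustrative assumptions, not the library's interface.

```python
import torch
import torch.nn.functional as F


def colbert_maxsim_scores(query_embeddings, document_embeddings):
    """Late-interaction (MaxSim) relevance scores.

    query_embeddings:    (batch, query_len, dim)
    document_embeddings: (batch, n_docs, doc_len, dim)
    returns:             (batch, n_docs)
    """
    # Similarity between every query token and every document token.
    sim = torch.einsum("bqd,bntd->bnqt", query_embeddings, document_embeddings)
    # For each query token, keep its best-matching document token, then sum over query tokens.
    return sim.max(dim=-1).values.sum(dim=-1)


def distillation_loss(student_scores, teacher_scores, temperature=1.0):
    """KL divergence between the teacher's and the student's score distributions
    over each query's candidate documents (ColBERTv2-style distillation)."""
    student_log_probs = F.log_softmax(student_scores / temperature, dim=-1)
    teacher_probs = F.softmax(teacher_scores / temperature, dim=-1)
    return F.kl_div(student_log_probs, teacher_probs, reduction="batchmean")


# Example with random tensors: 2 queries, 4 candidate documents each.
queries = torch.randn(2, 32, 128)        # (batch, query_len, dim)
documents = torch.randn(2, 4, 180, 128)  # (batch, n_docs, doc_len, dim)
teacher_scores = torch.randn(2, 4)       # e.g. cross-encoder teacher scores
loss = distillation_loss(colbert_maxsim_scores(queries, documents), teacher_scores)
```

The idea, as in the ColBERTv2 paper, is to soften the teacher's scores into a distribution over candidates and train the late-interaction student to match it, rather than relying only on a contrastive loss over hard labels.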