
[Feature request] ColBERT V2 #14

Open
filippo82 opened this issue Feb 17, 2024 · 3 comments

Comments

@filippo82

Hi @raphaelsty,

first of all, thanks a lot for this project. I really appreciate its simplicity and effectiveness.

Question: do you have any plans to implement ColBERT V2?

Best wishes.

@raphaelsty
Owner

raphaelsty commented Feb 18, 2024

Hi @filippo82, I think it would be cool to add the distillation loss, of course. I plan to improve the loss function of the train module in the coming weeks; there won't be any breaking changes.
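For readers unfamiliar with what "distillation loss" means here: ColBERTv2 trains the (student) retriever to match the score distribution a stronger cross-encoder teacher assigns to a set of candidate passages, via a KL-divergence objective. A minimal numpy sketch of that objective, assuming raw (unnormalized) scores per query and candidate set; all names here are illustrative, not part of neural-cherche's API:

```python
import numpy as np

def kl_distillation_loss(student_scores, teacher_scores, temperature=1.0):
    """KL(teacher || student) over one query's candidate passages.

    Both inputs are 1-D arrays of raw relevance scores for the same
    candidate set; they are turned into distributions with a softmax.
    """
    def log_softmax(scores):
        x = np.asarray(scores, dtype=float) / temperature
        x = x - x.max()  # numerical stability
        return x - np.log(np.exp(x).sum())

    log_p_student = log_softmax(student_scores)
    log_p_teacher = log_softmax(teacher_scores)
    p_teacher = np.exp(log_p_teacher)
    # Penalizes the student most where the teacher is confident.
    return float((p_teacher * (log_p_teacher - log_p_student)).sum())

# Identical score distributions give zero loss; any mismatch is positive.
loss = kl_distillation_loss(student_scores=[0.5, 2.0, -1.0],
                            teacher_scores=[3.0, 0.2, -2.0])
```

In an actual training loop the loss would be averaged over the batch and backpropagated through the student's scoring function; the teacher's scores are precomputed and fixed.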

Before releasing the centroids algorithm of ColBERTv2, I plan to release another method to accelerate the ColBERT retriever. It will be really fast and still accurate; work in progress.
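As context for the "centroids algorithm" mentioned above: ColBERTv2 compresses its token-level embeddings by assigning each vector to its nearest k-means centroid and storing only the centroid id plus a (quantized) residual. A rough numpy sketch of that assign/reconstruct step, with residuals kept at full precision for simplicity; function names are hypothetical:

```python
import numpy as np

def compress(embeddings, centroids):
    """Map each embedding to its nearest centroid id and keep the residual."""
    # Squared distance of every embedding to every centroid (broadcasting).
    distances = ((embeddings[:, None, :] - centroids[None, :, :]) ** 2).sum(-1)
    codes = distances.argmin(axis=1)          # nearest-centroid id per embedding
    residuals = embeddings - centroids[codes]  # what the centroid fails to explain
    return codes, residuals

def decompress(codes, residuals, centroids):
    """Reconstruct embeddings from centroid ids and residuals."""
    return centroids[codes] + residuals

embeddings = np.array([[1.0, 0.0], [0.0, 1.0], [0.9, 0.1]])
centroids = np.array([[1.0, 0.0], [0.0, 1.0]])
codes, residuals = compress(embeddings, centroids)
reconstructed = decompress(codes, residuals, centroids)
```

In ColBERTv2 the residuals are additionally quantized to 1-2 bits per dimension, which is where the storage savings come from; with full-precision residuals, as here, the round trip is lossless.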

@filippo82
Author

Hi @raphaelsty 👋🏻 thanks a lot for your reply, and sorry for the slow response on my end.

Let me know if there is any way I can help with testing/debugging.

@raphaelsty
Copy link
Owner

raphaelsty commented Mar 5, 2024

Hi @filippo82, I released neural-cherche 1.1.0, which improves loss stability and brings better default parameters to the models. I also released neural-tree in order to accelerate ColBERT.

Feel free to open a PR in neural-cherche if you are interested in Knowledge Distillation.
