Conversation
This PR is exactly what I need! I have a custom AI provider with 100+ models, and new models are added daily. Manual configuration for every model is completely unsustainable.
@rekram1-node Thanks for working on this. What do you think about adding functionality for plugins to hook into the detection mechanism? My use case is the following: at my company's LLM inference server, we have a separate custom endpoint that provides all the necessary model information. So if there were a plugin API to hook into the auto-detection mechanism, we could write a plugin that calls that endpoint instead. I have actually already tried to do something similar.
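To make the idea concrete, here is a minimal sketch of what such a plugin hook could look like. Everything here is an assumption for illustration: the hook name `onDetectModels`, the endpoint URL, and the payload shapes are invented, not part of this PR or any existing plugin API.

```typescript
// Shape the detector would consume (assumed, not the PR's actual type).
interface DetectedModel {
  id: string;
  contextWindow: number;
  supportsTools: boolean;
}

// Shape our internal inference server's custom endpoint returns (assumed).
interface EndpointEntry {
  name: string;
  max_context: number;
  tool_calling: boolean;
}

// Map the custom endpoint's payload onto the detector's model list.
export function toDetectedModels(entries: EndpointEntry[]): DetectedModel[] {
  return entries.map((e) => ({
    id: e.name,
    contextWindow: e.max_context,
    supportsTools: e.tool_calling,
  }));
}

// A plugin could then register a hook roughly like this
// (hook registration API is hypothetical):
//
// plugin.onDetectModels(async () => {
//   const res = await fetch("https://llm.internal/models/info");
//   return toDetectedModels(await res.json());
// });
```

The point is that the built-in probing logic and a company-specific endpoint could both feed the same `DetectedModel` list, so the rest of the detection pipeline would not need to care where the metadata came from.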