First off, thanks for open-sourcing such a cool project! I'm coming from n8n, and at a glance PySpur seems to have fewer artificial limitations funneling folks into an enterprise plan, which is very awesome.
I tried modifying `/pyspur/backend/pyspur/nodes/_model_info.py` by inserting `AZURE_GPT_o1 = "azure/o1"` at line 104. However, after rebuilding the container and opening the LLMModels dropdown, my new entry doesn't appear, so I can't call the Azure o1 endpoint. Could you kindly provide some guidance on how we could consume additional models from our Azure endpoint, such as o1 or DeepSeek-R1?
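For context, here's a minimal sketch of the pattern I suspect is at play. This is NOT pyspur's actual source (the class layout, metadata table, and field names below are all my assumptions): in many codebases the frontend dropdown is built from a separate metadata/registry mapping, so adding an entry to the model enum alone would never reach the UI.

```python
from enum import Enum

# Hypothetical illustration only -- not pyspur's real code.
class LLMModels(Enum):
    AZURE_GPT_4O = "azure/gpt-4o"
    AZURE_GPT_O1 = "azure/o1"  # the new entry I added

# Guess: a separate metadata table that the dropdown is actually built from.
# If the new model is missing here, the enum entry never surfaces in the UI.
MODEL_METADATA = {
    LLMModels.AZURE_GPT_4O: {"label": "Azure GPT-4o"},
    LLMModels.AZURE_GPT_O1: {"label": "Azure o1"},  # would this also be needed?
}

def dropdown_options():
    """Build (value, label) pairs the way a frontend dropdown might consume them."""
    return [
        (model.value, MODEL_METADATA[model]["label"])
        for model in LLMModels
        if model in MODEL_METADATA  # models without metadata are silently skipped
    ]
```

If something like this is the case, could you point me to the second place (metadata dict, frontend constants, etc.) that also needs the new model registered?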