New Feature Request: Support for additional Azure hosted models (Openai's O1, and DeepseekR1 for example) #287

Description

@ipdeft

First off, thanks for open-sourcing such a cool project. I'm coming from n8n, and at a glance PySpur seems to have fewer artificial limitations designed to funnel folks into an enterprise plan, which is very awesome.

I tried modifying /pyspur/backend/pyspur/nodes/_model_info.py and inserted AZURE_GPT_o1 = "azure/o1" at line 104. However, after rebuilding the container and opening the LLMModels dropdown, my modification doesn't appear, so I can't call the Azure O1 endpoint. Could you kindly provide some guidance on how to consume additional models from our Azure endpoint, such as O1 or DeepSeek R1?
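For context, here is a minimal sketch of the kind of change I attempted. This is an assumption about how `_model_info.py` is structured (a string enum of model identifiers consumed by LiteLLM-style routing, where the `azure/` prefix selects the Azure provider and the part after the slash must match the deployment name in the Azure resource); the exact class and member names here are hypothetical:

```python
from enum import Enum


class LLMModels(str, Enum):
    """Hypothetical sketch of PySpur's model-name enum."""

    # Existing-style entry (assumed): "azure/<deployment-name>" routes
    # the call to the Azure provider in LiteLLM-style clients.
    AZURE_GPT_4O = "azure/gpt-4o"

    # New entries I would like to add. The segment after "azure/" must
    # match the deployment name configured on the Azure resource, which
    # may differ from the base model name.
    AZURE_GPT_o1 = "azure/o1"
    AZURE_DEEPSEEK_R1 = "azure/DeepSeek-R1"
```

If the dropdown is populated from a list generated at frontend build time rather than read live from the backend, then rebuilding only the backend container would explain why the new entry doesn't show up; that's my working guess, but I'd appreciate confirmation.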
