
[Bug]: Ollama, embed - no model selection - iPadOS/iOS #1202

@serbans99

Description


On iPad/iPhone, when using a remote Ollama server for embedding, the model selection shows "no models currently available". The same environment on macOS yields the expected drop-down to select models.
The same Ollama environment works as expected for chat on iPad/iPhone.

Steps to Reproduce

  1. Open Obsidian.
  2. Click "load smart environment".
  3. Go to the plugin settings / Smart Sources.
  4. Set the embedding model to Ollama (local) and enter the Ollama host http://IP:port/.
  5. Toggle between Ollama (local) and Transformers to force a reload of the model list.
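As a sanity check that the remote Ollama host is reachable and actually reports models, the model list can be fetched and parsed directly (a minimal sketch, assuming the standard Ollama `/api/tags` endpoint; the host URL below is a placeholder, not from the report):

```python
import json
from urllib.request import urlopen

def list_models(payload: dict) -> list[str]:
    """Extract model names from an Ollama /api/tags response."""
    return [m["name"] for m in payload.get("models", [])]

# Ollama's /api/tags returns JSON of the shape {"models": [{"name": ...}, ...]}.
sample = {"models": [{"name": "nomic-embed-text:latest"}]}
print(list_models(sample))  # ['nomic-embed-text:latest']

if __name__ == "__main__":
    # Replace with the host configured in the plugin settings.
    # with urlopen("http://IP:port/api/tags") as resp:
    #     print(list_models(json.load(resp)))
    pass
```

Running this request from the same network as the iPad can help distinguish a host-reachability problem from a client-side issue in the plugin on iPadOS.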

Expected Behavior

A list of available embedding models should be displayed.

Actual Behavior

"no models currently available" in the model selection dropdown

Screenshots

NA

Operating System

iPadOS 18.6

Obsidian Version

1.9.12

Smart Connections Version

3.0.78

AI Provider/Model

ollama/nomic-embed-text

Other Enabled Plugins

Self-hosted LiveSync; all .smart-env folders are excluded from syncing.

Additional Context

Transformers (on-device) embedding works.

Metadata

Assignees

No one assigned

Labels

bug (Something isn't working)

Projects

No projects

Milestone

No milestone

Relationships

None yet

Development

No branches or pull requests
