
Cannot use the baml playground and a local client while connected to a machine via ssh #1342

Open
this-josh opened this issue Jan 17, 2025 · 5 comments

Comments

@this-josh

From VS Code I run Remote-SSH: Connect to Host...

Then I navigate to my BAML folder and set the client to:

client<llm> Ollama {
  provider "openai-generic"
  options {
    base_url "http://localhost:11434/v1"
    model "llama3.3:latest"
  }
}

Now pressing Run in the playground gives this error:

vaibhav_resume
Unable to run
Unspecified error code: 2 reqwest::Error { kind: Request, source: "JsValue(TypeError: Failed to fetch\nTypeError: Failed to fetch\n    at __wbg_fetch_1e4e8ed1f64c7e28 (https://vscode-remote+ssh-002dremote-002b2a0c-003a5bc0-003a40-003a2e26-003ac27-003a598a-003ab050-003a306d.vscode-resource.vscode-cdn.net/home/jak121/.vscode-server/extensions/boundary.baml-extension-0.72.1/web-panel/dist/assets/baml_schema_build.js:2758:17)\n    at baml_schema_build.wasm.reqwest::wasm::client::js_fetch::h702657c4e8b70698 (wasm://wasm/baml_schema_build.wasm-02c1bb5a:wasm-function[8226]:0x785c64)\n    at baml_schema_build.wasm.reqwest::wasm::client::fetch::{{closure}}::h9dcb23421a40299d (wasm://wasm/baml_schema_build.wasm-02c1bb5a:wasm-function[402]:0x207ed0)\n    at baml_schema_build.wasm.baml_runtime::internal::llm_client::primitive::request::make_request::{{closure}}::h750be6e03bf00365 (wasm://wasm/baml_schema_build.wasm-02c1bb5a:wasm-function[424]:0x21c239)\n    at baml_schema_build.wasm.<baml_runtime::internal::llm_client::primitive::LLMPrimitiveProvider as baml_runtime::internal::llm_client::traits::WithStreamable>::stream::{{closure}}::h8e98f766dfb8589c (wasm://wasm/baml_schema_build.wasm-02c1bb5a:wasm-function[143]:0x341b0)\n    at baml_schema_build.wasm.baml_runtime::BamlRuntime::run_test::{{closure}}::{{closure}}::{{closure}}::h23629bb0be43580f (wasm://wasm/baml_schema_build.wasm-02c1bb5a:wasm-function[148]:0x53d71)\n    at baml_schema_build.wasm.wasm_bindgen_futures::future_to_promise::{{closure}}::{{closure}}::h0ef435efcf095f59 (wasm://wasm/baml_schema_build.wasm-02c1bb5a:wasm-function[264]:0x1690a7)\n    at baml_schema_build.wasm.wasm_bindgen_futures::queue::QueueState::run_all::h783ac55f5c0fa9d6 (wasm://wasm/baml_schema_build.wasm-02c1bb5a:wasm-function[4859]:0x689a5f)\n    at baml_schema_build.wasm.wasm_bindgen_futures::queue::Queue::new::{{closure}}::hc075082cb8ddbf92 (wasm://wasm/baml_schema_build.wasm-02c1bb5a:wasm-function[14598]:0x8237ff)\n    at baml_schema_build.wasm.<dyn core::ops::function::FnMut<(A,)>+Output = R as wasm_bindgen::closure::WasmClosure>::describe::invoke::h2aa238002d130c19 (wasm://wasm/baml_schema_build.wasm-02c1bb5a:wasm-function[14597]:0x8237f1))" }

If I take the generated curl request

curl -X POST 'http://localhost:11434/v1/chat/completions' -H "content-type: application/json" -d "{
  \"model\": \"llama3.3:latest\",
  \"messages\": [
    {
      \"role\": \"system\",
      \"content\": \"Extract from this content:\nVaibhav Gupta\[email protected]\n\nExperience:\n- Founder at BoundaryML\n- CV Engineer at Google\n- CV Engineer at Microsoft\n\nSkills:\n- Rust\n- C++\n\nAnswer in JSON using this schema:\n{\n  name: string,\n  email: string,\n  experience: string[],\n  skills: string[],\n}\"
    }
  ],
  \"stream\": true
}"

I can run it fine in my VS Code (SSH) terminal. Also, if I open VS Code directly on the remote machine and use the playground, it works fine.

So this issue seems to be specific to the BAML playground when running in a VS Code Remote-SSH session.

@aaronvg
Contributor

aaronvg commented Jan 17, 2025

Try changing this VS Code setting:

baml.enablePlaygroundProxy = false
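
In settings.json that would look something like this (a minimal sketch; whether it goes in your user or remote settings depends on your setup):

{
  // Turn off the BAML playground's request proxy.
  "baml.enablePlaygroundProxy": false
}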

@this-josh
Author

Hi @aaronvg, thanks for getting back to me. I've just tried this and it hasn't worked; I get what seems to be the same error message:

Unspecified error code: 2 reqwest::Error { kind: Request, source: "JsValue(TypeError: Failed to fetch\nTypeError: Failed to fetch\n    at __wbg_fetch_c5d6726a1da3618f (https://vscode-remote+ssh-002dremote-002bcv-002diits-002dw01-002edept-002eic-002eac-002euk.vscode-resource.vscode-cdn.net/home/jak121/.vscode-server/extensions/boundary.baml-extension-0.73.4/web-panel/dist/assets/baml_schema_build.js:2758:17)\n    at baml_schema_build.wasm.reqwest::wasm::client::js_fetch::h0af2ae822c41a29d (wasm://wasm/baml_schema_build.wasm-02d549f2:wasm-function[8513]:0x7c37e4)\n    at baml_schema_build.wasm.reqwest::wasm::client::fetch::{{closure}}::h22fbbcbe205f38bf (wasm://wasm/baml_schema_build.wasm-02d549f2:wasm-function[403]:0x20c3da)\n    at baml_schema_build.wasm.baml_runtime::internal::llm_client::primitive::request::make_request::{{closure}}::hfbb3a3d328000cd5 (wasm://wasm/baml_schema_build.wasm-02d549f2:wasm-function[426]:0x221752)\n    at baml_schema_build.wasm.<baml_runtime::internal::llm_client::primitive::LLMPrimitiveProvider as baml_runtime::internal::llm_client::traits::WithStreamable>::stream::{{closure}}::h8c8151155b30a105 (wasm://wasm/baml_schema_build.wasm-02d549f2:wasm-function[143]:0x3461b)\n    at baml_schema_build.wasm.baml_runtime::BamlRuntime::run_test::{{closure}}::{{closure}}::{{closure}}::h4ab0d045b542953a (wasm://wasm/baml_schema_build.wasm-02d549f2:wasm-function[151]:0x5f9c7)\n    at baml_schema_build.wasm.wasm_bindgen_futures::future_to_promise::{{closure}}::{{closure}}::h5f746656074539c4 (wasm://wasm/baml_schema_build.wasm-02d549f2:wasm-function[268]:0x17084d)\n    at baml_schema_build.wasm.wasm_bindgen_futures::queue::QueueState::run_all::h783ac55f5c0fa9d6 (wasm://wasm/baml_schema_build.wasm-02d549f2:wasm-function[5031]:0x6be960)\n    at baml_schema_build.wasm.wasm_bindgen_futures::queue::Queue::new::{{closure}}::hc075082cb8ddbf92 (wasm://wasm/baml_schema_build.wasm-02d549f2:wasm-function[14967]:0x863cba)\n    at baml_schema_build.wasm.<dyn core::ops::function::FnMut<(A,)>+Output = R as wasm_bindgen::closure::WasmClosure>::describe::invoke::h2aa238002d130c19 (wasm://wasm/baml_schema_build.wasm-02d549f2:wasm-function[14966]:0x863cac))" }

Request options: {"model":"qwen2.5:14b-instruct-q6_K"}

@aaronvg
Copy link
Contributor

aaronvg commented Jan 24, 2025

Did you set the OLLAMA_ORIGINS=* env var?

There's an env var that allows Ollama to bypass CORS.
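
For example, something like this before starting the server (the quotes keep the shell from expanding the *):

# Tell Ollama to accept requests from any Origin header,
# so the playground webview isn't blocked by CORS.
export OLLAMA_ORIGINS='*'
ollama serve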

@phiwi

phiwi commented Feb 5, 2025

I am experiencing exactly the same situation. Starting Ollama with

singularity run --env 'OLLAMA_ORIGINS=*' --env OLLAMA_KEEP_ALIVE=1h --nv ollama.sif

didn't help.
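
One way to check whether the variable is actually being picked up (a sketch, assuming Ollama's default port) is to send a preflight-style request and look for an Access-Control-Allow-Origin header in the response:

curl -i -X OPTIONS 'http://localhost:11434/v1/chat/completions' \
  -H 'Origin: https://example.com' \
  -H 'Access-Control-Request-Method: POST'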

@aaronvg
Contributor

aaronvg commented Feb 5, 2025

If you join our Discord, I'm happy to debug with you during our office hours with screen share: https://discord.com/invite/yzaTpQ3tdT
