docs/source/workflows/llms/using-local-llms.md
7 additions & 5 deletions
@@ -32,10 +32,11 @@ Regardless of the model you choose, the process is the same for downloading the
### Install the Simple Web Query Example
- First, ensure the current working directory is the root of the NeMo Agent toolkit repository. Then, install the simple web query example so we have the `webpage_query` tool available.
+ First, ensure the current working directory is the root of the NeMo Agent toolkit repository. Then, install NAT and the simple web query example.
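For context, the install step the updated line refers to would look roughly like the sketch below. The example path `examples/getting_started/simple_web_query` and the use of `uv pip` are assumptions about the toolkit's usual layout, not values taken from this PR; adjust them to match the repository.

```bash
# Run from the root of the NeMo Agent toolkit repository.
uv pip install -e .                                           # install NAT itself (editable install)
uv pip install -e examples/getting_started/simple_web_query   # install the simple web query example (assumed path)
```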
@@ -142,10 +143,11 @@ vLLM provides an [OpenAI-Compatible Server](https://docs.vllm.ai/en/latest/getti
### Install the Simple Web Query Example
- First, ensure the current working directory is the root of the NeMo Agent toolkit repository. Then, install the simple web query example so we have the `webpage_query` tool available.
+ First, ensure the current working directory is the root of the NeMo Agent toolkit repository. Then, install NAT and the simple web query example.
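The section this hunk touches relies on vLLM's OpenAI-compatible server, as the hunk header notes. A minimal sketch of starting one is shown below; the model name and port are placeholders, not values from this PR.

```bash
# Start vLLM's OpenAI-compatible server on a placeholder model and port;
# substitute the model and port your cluster actually uses.
vllm serve meta-llama/Llama-3.1-8B-Instruct --port 8000
```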
Edit the `.tmp/sizing_calc/config-sizing-calc.yml` file by adding a `base_url` parameter to the `llms.nim_llm` section that points at your cluster. Then, if needed, change `llms.nim_llm.model_name`.
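A sketch of what that edit could look like, assuming the usual NAT config layout; the `_type`, model name, and URL below are placeholders rather than values taken from this PR.

```yaml
llms:
  nim_llm:
    _type: nim                                   # assumed type key; keep whatever the file already uses
    model_name: meta/llama-3.1-8b-instruct        # placeholder; change if your cluster serves a different model
    base_url: http://your-cluster-host:8000/v1    # placeholder base_url pointing at your cluster
```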