2.3.9 Satellite: cmdh
Handle: `cmdh`
URL: -
```bash
# [Optional] Pre-build the service image
harbor build cmdh
```
`cmdh` is a CLI-only service that tries to generate a bash command from your input. You don't need to start it, as it's a CLI tool.
Be advised that it works best with larger models. Expect acceptable results from `codestral` and above.
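For example, assuming you use Harbor's built-in Ollama backend (`harbor ollama` forwards arguments to the Ollama CLI, as in the `harbor ollama list` example below), you can pull one of the recommended models up front:

```bash
# Pull a larger model that works well with cmdh
# (model choice is just an example; model selection is covered below)
harbor ollama pull codestral
```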
Note: Harbor runs a patched version of `cmdh` that uses Ollama's JSON format and has a slightly tweaked system prompt.
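For context, Ollama's JSON format is what the patch relies on: the API accepts a `format: "json"` parameter that constrains the model's output to valid JSON. A minimal illustration against a plain Ollama instance (you don't need to run this for `cmdh`; the port assumes Ollama's default, and the model name is only an example):

```bash
# Ask Ollama for a JSON-only response via its generate endpoint
curl http://localhost:11434/api/generate -d '{
  "model": "codestral",
  "prompt": "Suggest a bash command to count CPU cores. Reply as JSON.",
  "format": "json",
  "stream": false
}'
```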
```bash
# Invoke "harbor cmdh" and then type in
# your command, no quotes needed
harbor cmdh how many CPU cores do I have?
```
Harbor mounts `$PWD` to the service, so it can work with it directly. The `$PWD` inside the container will match that of the host, but be aware that only the child directories and files are actually available.
```bash
# Go where you'd run the command
user@os:~/.cache/huggingface$ harbor cmdh find all the ggufs here

# Example output:
✔ Retrieving command... find all the ggufs here
desired command: find / -type f -name "*.gguf"
assistant message: The 'find' command is a powerful utility that searches for files and directories in a specified path. In this case, it will search the entire file system from root (/) for any files with the extension '.gguf'. The options used ensure that only regular files are considered, not directories or other types of files.
? Choose an option: Run desired command
Running: find / -type f -name "*.gguf"
/home/user/.cache/huggingface/gguf/llama3-simpo-expo-q8_0.gguf
/home/user/.cache/huggingface/gguf/llama3-simpo-expo-q2_k.gguf
/home/user/.cache/huggingface/gguf/llama3-simpo-expo-q4_k.gguf
/home/user/.cache/huggingface/gguf/phi-3-mini-128k-instruct.Q4_K_M.gguf
/home/user/.cache/huggingface/Tess-v2.5-Phi-3-medium-128k-14B-Q6_K.gguf
```
You should select a model for `cmdh` to use.
```bash
# Set cmdh to use ollama (if not already set)
harbor cmdh host ollama

# See what's currently available
harbor ollama list

# See currently configured model
harbor cmdh model

# Set the model to run
harbor cmdh model codestral
```
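As a quick sanity check (the prompt below is only an example), confirm the setting took effect and try a simple request:

```bash
# Should now report codestral
harbor cmdh model

# Test run with the newly selected model
harbor cmdh list files larger than 1GB in the current directory
```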
`cmdh` should also work with OpenAI-compatible APIs.
```bash
# Note the casing - required by library
harbor cmdh host OpenAI

# Point cmdh to OpenAI API
harbor cmdh key <key>
harbor cmdh url <url>

# Set the model to run
harbor cmdh model gpt-3.5-turbo
```
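For instance, to point `cmdh` at OpenAI itself, a sketch assuming the service expects an OpenAI-style base URL (the key is a placeholder and the model name is only an example):

```bash
# Fill in your own API key; any chat-capable model your endpoint offers should work
harbor cmdh host OpenAI
harbor cmdh key sk-your-key-here
harbor cmdh url https://api.openai.com/v1
harbor cmdh model gpt-4o
```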