llama.cpp: server-cuda-b3963 (public, latest)
Install from the command line

Pull by tag:

$ docker pull ghcr.io/luoyu-intel/llama.cpp:server-cuda-b3963

Or pull a specific manifest by digest:

linux/amd64:
$ docker pull ghcr.io/luoyu-intel/llama.cpp:server-cuda-b3963@sha256:827f4ebe7262fb8a0b550a566b2542e03f18693309060d0232e008459ca4bb25

unknown/unknown:
$ docker pull ghcr.io/luoyu-intel/llama.cpp:server-cuda-b3963@sha256:e43b57dcd817b964c261e8a685054ec616b30c164c143a14b50bd67a0828af48
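As a usage sketch (not part of the registry listing), the pulled image can be started with GPU access roughly as follows. This assumes the image's entrypoint is the llama.cpp server binary, that the NVIDIA Container Toolkit is installed on the host, and that a GGUF model file exists at the mounted host path; the model path, port, and layer-offload count below are placeholders to adjust.

$ docker run --rm --gpus all -p 8080:8080 \
    -v /path/to/models:/models \
    ghcr.io/luoyu-intel/llama.cpp:server-cuda-b3963 \
    -m /models/your-model.gguf --host 0.0.0.0 --port 8080 -ngl 99

Once the container is running, the server's /health endpoint can be used to check that it is ready:

$ curl http://localhost:8080/health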