
llama.cpp server-cuda-b3963 (Public)

Install from the command line
$ docker pull ghcr.io/luoyu-intel/llama.cpp:server-cuda-b3963
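
To run the pulled server image, a command along the following lines is typical. This is a sketch only: the mounted model path, port mapping, and server arguments (-m, --host, --port) are assumptions for illustration and depend on your model file and setup; running with --gpus all also requires the NVIDIA Container Toolkit on the host.
$ docker run --gpus all -v /path/to/models:/models -p 8080:8080 ghcr.io/luoyu-intel/llama.cpp:server-cuda-b3963 -m /models/model.gguf --host 0.0.0.0 --port 8080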

Recent tagged image versions

  • Published about 1 year ago · Digest
    sha256:452a0d903bc3a0dbf428bc7ae8137195c7a0d78c4f10c01d2c69c161c8011d25
    2 Version downloads
  • Published about 1 year ago · Digest
    sha256:e266be2c3e3b84d89bb3bce4b9712575967d0ee4b6b86c6a5fe445c8332c7583
    2 Version downloads
  • Published about 1 year ago · Digest
    sha256:c290a1703b87533ef20a5db8a738e29b24c6c78b2983bd673dd3f39f1e3771b2
    2 Version downloads
  • Published about 1 year ago · Digest
    sha256:06253dd344a4a83d62948d822b9a038422e2f95a94ad5939ea01a9d22eb66871
    2 Version downloads
  • Published about 1 year ago · Digest
    sha256:ebc577823ed76310048fd6e89abef656ee66a943314001f4583ee2b23c8cfc29
    2 Version downloads

Details

Last published: 1 year ago
Total downloads: 797