[LLM] Support block_attention/cachekv quant for llama #10310

Triggered via pull request January 10, 2024 09:24
Status Success
Total duration 39m 10s
tests.yml

on: pull_request