[LLM] Support block_attention/cachekv quant for llama #10375
Triggered via pull request January 10, 2024 09:24
Status: Success
Total duration: 1m 49s
lint.yml

on: pull_request