
Commit

update supported_models.
Reinerzhou committed Nov 19, 2024
1 parent 6075cd6 commit 93a9a16
Showing 3 changed files with 4 additions and 2 deletions.
docs/en/supported_models/supported_models.md (1 addition, 1 deletion)
@@ -72,7 +72,7 @@ The TurboMind engine doesn't support window attention. Therefore, for models tha
  | DeepSeek-MoE | 16B | LLM | Yes | No | No | No | No |
  | DeepSeek-V2 | 16B, 236B | LLM | Yes | No | No | No | No |
  | MiniCPM3 | 4B | LLM | Yes | Yes | Yes | No | No |
- | MiniCPM-V-2_6 | 8B | LLM | Yes | No | No | No | No |
+ | MiniCPM-V-2_6 | 8B | LLM | Yes | No | No | No | Yes |
  | Gemma | 2B-7B | LLM | Yes | Yes | Yes | No | No |
  | Dbrx | 132B | LLM | Yes | Yes | Yes | No | No |
  | StarCoder2 | 3B-15B | LLM | Yes | Yes | Yes | No | No |
docs/zh_cn/supported_models/supported_models.md (1 addition, 1 deletion)
@@ -72,7 +72,7 @@ The turbomind engine does not support window attention. Therefore, for models that apply window att
  | DeepSeek-MoE | 16B | LLM | Yes | No | No | No | No |
  | DeepSeek-V2 | 16B, 236B | LLM | Yes | No | No | No | No |
  | MiniCPM3 | 4B | LLM | Yes | Yes | Yes | No | No |
- | MiniCPM-V-2_6 | 8B | LLM | Yes | No | No | No | No |
+ | MiniCPM-V-2_6 | 8B | LLM | Yes | No | No | No | Yes |
  | Gemma | 2B-7B | LLM | Yes | Yes | Yes | No | No |
  | Dbrx | 132B | LLM | Yes | Yes | Yes | No | No |
  | StarCoder2 | 3B-15B | LLM | Yes | Yes | Yes | No | No |
lmdeploy/pytorch/supported_models.py (2 additions, 0 deletions)
@@ -70,6 +70,8 @@
     PhiMoEForCausalLM=True,
     # mllama
     MllamaForConditionalGeneration=True,
+    # MiniCPM-V-2_6
+    MiniCPMVForCausalLM=True,
 )


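With `MiniCPMVForCausalLM` now registered in the PyTorch engine's supported-architecture table, MiniCPM-V-2_6 can be loaded through the PyTorch backend. Below is a minimal usage sketch, not part of this commit: the model path, prompt, image URL, and `session_len` value are illustrative and may need to be adjusted for your environment.

```python
# Hypothetical usage sketch: run MiniCPM-V-2_6 on the PyTorch engine via lmdeploy.
from lmdeploy import pipeline, PytorchEngineConfig
from lmdeploy.vl import load_image

# Select the PyTorch backend explicitly so the newly added architecture entry is used.
# 'openbmb/MiniCPM-V-2_6' and session_len=8192 are assumptions for this example.
pipe = pipeline('openbmb/MiniCPM-V-2_6',
                backend_config=PytorchEngineConfig(session_len=8192))

# Load an example image and send a (prompt, image) pair, as in lmdeploy's VLM usage.
image = load_image('https://raw.githubusercontent.com/open-mmlab/mmdeploy/main/tests/data/tiger.jpeg')
response = pipe(('describe this image', image))
print(response.text)
```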
