This repository has been archived by the owner on Jun 24, 2024. It is now read-only.

Use mlc-llm as backend #213

Open
shiqimei opened this issue May 11, 2023 · 1 comment
Labels
issue:enhancement (New feature or request) · topic:backend-support (Support for alternate non-GGML backends, or for particular GGML backend features)

Comments

@shiqimei

https://github.com/mlc-ai/mlc-llm

philpax added the issue:enhancement (New feature or request) label on May 11, 2023
@philpax
Collaborator

philpax commented May 11, 2023

MLC is very cool, but I'm not sure how much effort it would be to integrate. We'll have to implement support for custom backends first (#31).
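For anyone picking this up later: below is a minimal sketch of what the pluggable-backend abstraction tracked in #31 might look like, so that an mlc-llm backend could slot in alongside the existing GGML path. The trait and type names here are hypothetical and are not part of this codebase.

```rust
// Hypothetical sketch: `InferenceBackend` and `BackendError` do not exist in
// this repository; they only illustrate the kind of backend abstraction that
// #31 asks for, which an mlc-llm integration could then implement.

use std::path::Path;

/// Placeholder error type a backend might return.
#[derive(Debug)]
pub struct BackendError(pub String);

/// A pluggable inference backend: GGML today, potentially mlc-llm later.
pub trait InferenceBackend {
    /// Load model weights from disk into whatever representation the backend uses.
    fn load(&mut self, model_path: &Path) -> Result<(), BackendError>;

    /// Given the prompt so far, produce the next generated token as text,
    /// or `None` once generation is finished.
    fn next_token(&mut self, prompt: &str) -> Result<Option<String>, BackendError>;
}
```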

philpax added the topic:backend-support (Support for alternate non-GGML backends, or for particular GGML backend features) label on Jun 15, 2023
2 participants