
Conversation

mudler (Owner) commented Jun 20, 2025

So we can have meta packages such as "vllm" that automatically install the corresponding concrete package depending on the GPU currently detected in the system.
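
For illustration, here is a minimal Go sketch of how such a meta package could resolve to a concrete backend at install time. Everything in it is an assumption made for the example: the concrete backend names (`cuda12-vllm`, `rocm-vllm`, `cpu-vllm`), the vendor detection via `nvidia-smi`/`rocm-smi`, and the function names are hypothetical and not taken from this PR.

```go
// Illustrative sketch only: the names and detection logic below are
// assumptions for this example, not LocalAI's actual implementation.
package main

import (
	"fmt"
	"os/exec"
)

// detectGPUVendor guesses the GPU vendor by probing for common vendor tools
// on PATH. Real detection may inspect PCI devices, drivers, etc.
func detectGPUVendor() string {
	if _, err := exec.LookPath("nvidia-smi"); err == nil {
		return "nvidia"
	}
	if _, err := exec.LookPath("rocm-smi"); err == nil {
		return "amd"
	}
	return "cpu"
}

// metaBackends maps a meta package name to a concrete backend per vendor.
// The concrete names are hypothetical examples.
var metaBackends = map[string]map[string]string{
	"vllm": {
		"nvidia": "cuda12-vllm",
		"amd":    "rocm-vllm",
		"cpu":    "cpu-vllm",
	},
}

// resolveMetaBackend picks the concrete backend to install for a meta
// package, based on the detected GPU vendor.
func resolveMetaBackend(name string) (string, error) {
	candidates, ok := metaBackends[name]
	if !ok {
		return "", fmt.Errorf("unknown meta backend %q", name)
	}
	vendor := detectGPUVendor()
	concrete, ok := candidates[vendor]
	if !ok {
		return "", fmt.Errorf("no %q backend available for vendor %q", name, vendor)
	}
	return concrete, nil
}

func main() {
	backend, err := resolveMetaBackend("vllm")
	if err != nil {
		fmt.Println("resolution failed:", err)
		return
	}
	fmt.Println("installing backend:", backend)
}
```

In this sketch the CPU variant acts as a fallback when no GPU tooling is found, which keeps the meta package installable on systems without a supported GPU.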

netlify bot commented Jun 20, 2025

Deploy Preview for localai ready!

🔨 Latest commit: e909f5e
🔍 Latest deploy log: https://app.netlify.com/projects/localai/deploys/685aaf64d8776c0008f99859
😎 Deploy Preview: https://deploy-preview-5696--localai.netlify.app

mudler force-pushed the feat/backends_meta branch 2 times, most recently from d3cf9d7 to 91c4b90 on June 20, 2025 at 19:19
mudler force-pushed the feat/backends_meta branch from 91c4b90 to 9a34fc8 on June 20, 2025 at 20:08
mudler force-pushed the feat/backends_meta branch 6 times, most recently from 131b972 to fda7324 on June 24, 2025 at 13:44
mudler force-pushed the feat/backends_meta branch 2 times, most recently from b978b50 to e909f5e on June 24, 2025 at 14:00
mudler merged commit a6d9988 into master on Jun 24, 2025
28 checks passed
mudler deleted the feat/backends_meta branch on June 24, 2025 at 15:08
mudler added the enhancement label on Jun 26, 2025