How to Enable Multi-GPU Inference on a Single Machine with EXO #476

Open
svm87601 opened this issue Nov 20, 2024 · 0 comments

@svm87601
Hello! I am using EXO for distributed inference and would like to utilize multiple GPUs on a single machine to speed things up. I have successfully configured EXO to work with a single GPU, but I'm not sure how to enable multi-GPU usage on the same machine. Could someone please guide me on how to set it up so that multiple GPUs are used for inference?

Environment Information:

  • Operating System: Ubuntu 22.04
  • CUDA Version: 12.5
  • Driver Version: 555.42.02
  • GPUs: RTX 3090 × 2
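
For reference, here is the kind of setup I had in mind but have not yet tried: launch one exo node per GPU on the same host, pin each process to its GPU with CUDA_VISIBLE_DEVICES, and let exo's peer discovery join the two processes into a cluster. This is only a sketch under my own assumptions: the flag names `--node-port` and `--chatgpt-api-port` are guesses on my part, and I am assuming two local nodes can discover each other when bound to different ports, so please correct me if exo expects something different.

```python
# Hypothetical launcher: run one exo node per GPU on the same machine.
# Assumptions (not confirmed against the exo docs): each exo process honors
# CUDA_VISIBLE_DEVICES, and two nodes on one host can discover each other
# when they bind to different ports; the --node-port / --chatgpt-api-port
# flag names below are guesses and may differ in your exo version.
import os
import subprocess

GPU_IDS = ["0", "1"]      # the two RTX 3090s
BASE_NODE_PORT = 50051    # hypothetical base port for node-to-node traffic
BASE_API_PORT = 8000      # hypothetical base port for the HTTP API

procs = []
for i, gpu in enumerate(GPU_IDS):
    env = os.environ.copy()
    env["CUDA_VISIBLE_DEVICES"] = gpu  # pin this exo process to one GPU
    cmd = [
        "exo",
        "--node-port", str(BASE_NODE_PORT + i),        # assumed flag
        "--chatgpt-api-port", str(BASE_API_PORT + i),  # assumed flag
    ]
    procs.append(subprocess.Popen(cmd, env=env))

# Keep both nodes running; Ctrl+C stops them.
try:
    for p in procs:
        p.wait()
except KeyboardInterrupt:
    for p in procs:
        p.terminate()
```

If this is roughly the right approach, I would expect requests sent to either node's API port (8000 or 8001 in this sketch) to be sharded across both 3090s once the nodes see each other; if that is not how exo is meant to handle multiple GPUs on one host, I would appreciate pointers to the intended configuration.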