
Train the lora flux with 2 x 4090 #652

Closed

ilkergalipatak opened this issue Aug 6, 2024 · 1 comment

@ilkergalipatak
Hello, would 2 x Nvidia 4090s be enough to train the Flux model, or would more VRAM be needed during training? Can we split the required memory across the 2 GPUs?

@bghira (Owner) commented Aug 6, 2024

you can't currently do multi-GPU quantised training due to #644.

it'd take a lot more 4090s than that to train it without quantisation; you would need DeepSpeed ZeRO 2 or 3.

it's not something I'd recommend - a single A40 or A100-40 or A6000 or L40(S) would do better.
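For rough context (not stated in the thread): Flux.1-dev is on the order of 12B parameters, so the bf16 base weights alone are about 24 GB, which already saturates a single 4090 before gradients, optimizer state, or activations. A minimal sketch of what the ZeRO sharding mentioned above could look like, assuming the trainer is launched through Hugging Face accelerate; the stage, offload, and process values are illustrative, not a tested recipe, and this does not work around the quantised multi-GPU limitation in #644:

```yaml
# accelerate config sketch: DeepSpeed ZeRO stage 3 across 2 GPUs.
# Illustrative values only, not a verified recipe for this repo.
compute_environment: LOCAL_MACHINE
distributed_type: DEEPSPEED
deepspeed_config:
  zero_stage: 3                  # shard params, grads, and optimizer state
  offload_optimizer_device: cpu  # spill optimizer state to system RAM
  offload_param_device: none
  zero3_init_flag: true
  gradient_accumulation_steps: 1
mixed_precision: bf16
num_machines: 1
num_processes: 2                 # one rank per 4090
```

Launched with something like `accelerate launch --config_file deepspeed.yaml train.py` (file names hypothetical). Note that ZeRO stage 2 shards only gradients and optimizer state while still replicating the full weights on every GPU, whereas stage 3 also shards the parameters themselves, roughly 12 GB of bf16 weights per card here, which is what makes two 24 GB cards even plausible for an unquantised 12B model.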

bghira closed this as completed Aug 6, 2024