
Auto-CPU container exits while loading model #296

Closed
1 task done
schech1 opened this issue Jan 18, 2023 · 4 comments
Labels: awaiting-response (Waiting for the issuer to respond), bug (Something isn't working), Stale

Comments


schech1 commented Jan 18, 2023

Has this issue been opened before?

  • It is not in the FAQ, I checked.
  • [x] It is not in the issues, I searched.

Describe the bug

I start the container with:
docker compose --profile auto-cpu up --build
I have no GPU, therefore auto-cpu.

The container spins up but exits while loading the model:

Mounted .cache
Mounted LDSR
Mounted BLIP
Mounted Hypernetworks
Mounted VAE
Mounted GFPGAN
Mounted RealESRGAN
Mounted Deepdanbooru
Mounted ScuNET
Mounted .cache
Mounted StableDiffusion
Mounted embeddings
Mounted ESRGAN
Mounted config.json
Mounted SwinIR
Mounted MiDaS
Mounted ui-config.json
Mounted BSRGAN
Mounted Codeformer
Mounted extensions
+ python3 -u webui.py --listen --port 7860 --no-half --precision full
No module 'xformers'. Proceeding without it.
Warning: caught exception 'Found no NVIDIA driver on your system. Please check that you have an NVIDIA GPU and installed a driver from http://www.nvidia.com/Download/index.aspx', memory monitor disabled
Removing empty folder: /stable-diffusion-webui/models/BSRGAN
LatentDiffusion: Running in eps-prediction mode
DiffusionWrapper has 859.52 M params.
Loading weights [cc6cb27103] from /stable-diffusion-webui/models/Stable-diffusion/v1-5-pruned-emaonly.ckpt
exited with code 137

Which UI

auto-cpu

Hardware / Software

  • OS: Linux
  • OS version: Linux 6.0.0-0.deb11.6-amd64
  • WSL version (if applicable): -
  • Docker Version: 5:20.10.223-0debian-bullseye
  • Docker compose version: Compose version v2.14.1
  • Repo version: from master
  • RAM: 16GB
  • GPU/VRAM: No GPU -> (Intel NUC10i7FNK)

Steps to Reproduce

  1. Run docker compose --profile auto-cpu up --build
schech1 added the bug label on Jan 18, 2023

DevilaN (Contributor) commented Jan 18, 2023

Hi!
Exit code 137 in most cases means not enough RAM (an out-of-memory kill).
Your Linux box shares its RAM with Docker. What is your RAM usage when not using Docker? Please run the LANG=C free command and paste the results here.

PS. FYI, I also have 16GB of RAM and auto-cpu runs without any problems at all, so it should be doable. Heads up, and let's find out what's wrong.
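
Besides free, a few commands can confirm whether the kernel actually OOM-killed the container. This is a minimal sketch; the <container> placeholder is an assumption, substitute the real name or ID shown by docker ps -a:

LANG=C free -h          # overall RAM and swap usage on the host
docker ps -a            # find the exited webui container's name or ID
docker inspect --format '{{.State.OOMKilled}}' <container>    # "true" confirms an OOM kill
sudo dmesg | grep -iE 'killed process|out of memory'          # kernel log left by the OOM killer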

AbdBarho added the awaiting-response label on Jan 20, 2023

github-actions bot commented Feb 4, 2023

This issue is stale because it has been open 14 days with no activity. Remove stale label or comment or this will be closed in 7 days.

github-actions bot added the Stale label on Feb 4, 2023

github-actions bot commented

This issue was closed because it has been stalled for 7 days with no activity.

github-actions bot closed this as not planned (won't fix, can't repro, duplicate, stale) on Feb 12, 2023

eggplants commented

In my environment, the same error was resolved by increasing memory.

If you are using Docker Desktop, adjust the memory limit under Settings > Resources > Memory.
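
On a native Linux host like the reporter's there is no Docker Desktop memory slider; the container shares the host's RAM directly. One possible workaround (an assumption, not something confirmed in this thread) is to add swap so the multi-gigabyte checkpoint can finish loading:

sudo fallocate -l 8G /swapfile   # create an 8 GB swap file (size chosen as an example)
sudo chmod 600 /swapfile         # restrict permissions as required by swapon
sudo mkswap /swapfile            # format it as swap
sudo swapon /swapfile            # enable it immediately
free -h                          # verify the extra swap is visible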

nviraj mentioned this issue on Jul 1, 2023