154 commits
ae56c00
define model configuration classes
Aug 12, 2023
b5d97b1
blackify
Aug 12, 2023
e8edb0d
add ABC for config storage
Aug 12, 2023
e8815a1
rename ModelConfig to ModelConfigFactory
Aug 12, 2023
32958db
add YAML file storage backend
Aug 13, 2023
b2894b5
add class docstring and blackify
Aug 13, 2023
6c9b9e1
Merge branch 'main' into lstein/model-manager-refactor
lstein Aug 13, 2023
0c74300
change paths to str to make json serializable
Aug 13, 2023
5434dcd
fix test to work with string paths
Aug 13, 2023
1ea0ccb
add SQL backend
Aug 13, 2023
51e84e6
Merge branch 'main' into lstein/model-manager-refactor
lstein Aug 13, 2023
81da3d3
change model field name "hash" to "id"
Aug 13, 2023
155d9fc
Merge branch 'lstein/model-manager-refactor' of github.com:invoke-ai/…
Aug 13, 2023
c56fb38
added ability to force config class returned by make_config()
Aug 13, 2023
7db71ed
rename modules
Aug 15, 2023
1c7d9db
start installer module
Aug 15, 2023
e83d005
module skeleton written
Aug 15, 2023
916cc26
partial rewrite of checkpoint template creator
Aug 17, 2023
0deb3f9
Merge branch 'main' into lstein/model-manager-refactor
Aug 20, 2023
1784aeb
fix flake8 errors
Aug 20, 2023
f023e34
added main templates
Aug 21, 2023
6f9bf87
reimplement and clean up probe class
Aug 23, 2023
4b3d54d
install ABC written
Aug 23, 2023
9adc897
added install module
Aug 23, 2023
055ad01
merge with main; resolve conflicts
Aug 23, 2023
93cef55
blackify
Aug 23, 2023
97f2e77
make ModelSearch pydantic
Aug 24, 2023
8396bf7
Merge branch 'main' into lstein/model-manager-refactor
Aug 30, 2023
e6512e1
add ABC for download manager
Aug 30, 2023
869f310
download of individual files working
Sep 2, 2023
8fc2092
added download manager service and began repo_id download
Sep 4, 2023
d1c5990
merge and resolve conflicts
Sep 4, 2023
8f51adc
chore: black
psychedelicious Sep 5, 2023
57552de
threaded repo_id download working; error conditions not tested
Sep 5, 2023
ca6d248
resolve merge conflicts
Sep 5, 2023
e907417
add unit tests for queued model download
Sep 6, 2023
404cfe0
add download manager to invoke services
Sep 6, 2023
626acd5
remove unnecessary HTTP probe for repo_id model component sizes
Sep 6, 2023
3448eda
fix progress reporting for repo_ids
Sep 6, 2023
82499d4
fix various typing errors in api dependencies initialization
Sep 7, 2023
11ead34
fix flake8 warnings
Sep 7, 2023
d979c50
Merge branch 'main' into lstein/model-manager-refactor
lstein Sep 7, 2023
c9a016f
more flake8 fixes
Sep 7, 2023
b09e012
Merge branch 'lstein/model-manager-refactor' of github.com:invoke-ai/…
Sep 7, 2023
79b2423
last flake8 fix - why is local flake8 not identical to git flake8?
Sep 7, 2023
a7aca29
implement regression tests for pause/cancel/error conditions
Sep 7, 2023
2165d55
add checks for malformed URLs and malicious content dispositions
Sep 8, 2023
b7ca983
blackify
Sep 8, 2023
598fe81
wire together download and install; now need to write install events
Sep 9, 2023
64424c6
install of repo_ids records author, tags and license
Sep 9, 2023
3582cfa
make download manager optional in InvokeAIServices during development
Sep 9, 2023
b2892f9
incorporate civitai metadata into model config
Sep 10, 2023
b7a6a53
fix flake8 warnings
Sep 10, 2023
8636015
increase download chunksize for better speed
Sep 10, 2023
8052f2e
Merge branch 'main' into lstein/model-manager-refactor
Sep 10, 2023
f454304
make it possible to pause/resume repo_id downloads
Sep 10, 2023
b583bdd
loading works -- web app broken
Sep 11, 2023
7430d87
loader working
Sep 11, 2023
6d8b2a7
pytests mostly working; model_manager_service needs rewriting
lstein Sep 12, 2023
4b932b2
refactor create_download_job; override probe info in install call
lstein Sep 13, 2023
27dcd89
merge with main; model_manager_service.py needs to be rewritten
Sep 14, 2023
ac88863
fix exception traceback reporting
lstein Sep 14, 2023
171d789
model loader autoscans models_dir on initialization
lstein Sep 14, 2023
716a1b6
model_manager_service now mostly type correct
lstein Sep 15, 2023
a033ccc
blackify
lstein Sep 15, 2023
3529925
services rewritten; starting work on routes
Sep 15, 2023
b7789bb
list_models() API call now working
Sep 16, 2023
08952b9
Merge branch 'main' into lstein/model-manager-refactor
Sep 16, 2023
b9a90fb
blackify and isort
Sep 16, 2023
db7fdc3
fix more isort issues
Sep 16, 2023
c090c5f
update_model and delete_model working; convert is WIP
Sep 16, 2023
dc68347
loading and conversions of checkpoints working
Sep 16, 2023
c029534
all methods in router API now tested and working
Sep 16, 2023
539776a
import_model API now working
Sep 17, 2023
e880f4b
add logs to confirm that event info is being sent to bus
Sep 17, 2023
f0ce559
add install job control to web API
Sep 17, 2023
238d7fa
add models.yaml conversion script
Sep 17, 2023
d051c08
attempt to fix flake8 lint errors
Sep 17, 2023
151ba02
fix models.yaml version assertion error in pytests
Sep 17, 2023
d1382f2
fasthash produces same results on windows & linux
lstein Sep 19, 2023
0c88491
Merge branch 'main' into lstein/model-manager-refactor
Sep 19, 2023
73bc088
blackify
Sep 19, 2023
de666fd
move incorrectly placed models into correct directory at startup time
Sep 19, 2023
ed91f48
TUI installer more or less working
lstein Sep 21, 2023
3402cf6
preserve description in metadata when installing a starter model
Sep 21, 2023
3199409
TUI installer functional; minor cosmetic work needed
Sep 21, 2023
30aea54
remove debug statement
Sep 21, 2023
c9cd418
add/delete from command line working; training words downloaded
Sep 21, 2023
07ddd60
fix install of models with relative paths
Sep 22, 2023
d2cdbe5
configure script now working
Sep 23, 2023
d5d517d
correctly download the selected version of a civitai model
Sep 23, 2023
ab58eb2
resolve conflicts with ip-adapter change
Sep 23, 2023
6edee2d
automatically convert models.yaml to new format
Sep 23, 2023
8bc1ca0
allow priority to be set at install job submission time
Sep 24, 2023
f9b92dd
resolve conflicts with get_logger() code changes from main
Sep 24, 2023
ac46340
merge with main & resolve conflicts
Sep 25, 2023
effced8
added `cancel_all` and `prune` model install operations to router API
Sep 25, 2023
1d6a4e7
add tests for model installation events
Sep 26, 2023
2e9a7b0
Merge branch 'main' into lstein/model-manager-refactor
lstein Sep 26, 2023
0b75a4f
resolve merge conflicts
Sep 28, 2023
81fce18
reorder pytests to prevent fixture race condition
Sep 28, 2023
2f16a2c
fix migrate script and type mismatches in probe, config and loader
Sep 29, 2023
3b832f1
fix one more type mismatch in probe module
Sep 29, 2023
4555aec
remove unused code from invokeai.backend.model_manager.storage.yaml
Sep 29, 2023
cbf0310
add README explaining reorg of tests directory
Sep 29, 2023
208d390
almost all type mismatches fixed
Sep 29, 2023
807ae82
more type mismatch fixes
Sep 30, 2023
acaaff4
make model merge script work with new model manager
Sep 30, 2023
c025c9c
speed up model scanning at startup
Sep 30, 2023
230ee18
do not ignore keyboard interrupt while scanning models
Sep 30, 2023
c91429d
merge with main
Oct 3, 2023
63f6c12
make merge script read invokeai.yaml when default root passed
Oct 3, 2023
48c3d92
make textual inversion training work with new model manager
Oct 3, 2023
062a6ed
prevent crash on windows due to lack of os.pathconf call
Oct 3, 2023
e3912e8
replace config.ram_cache_size with config.ram and similarly for vram
Oct 3, 2023
459f023
multiple minor fixes
Oct 4, 2023
4624de0
Merge branch 'main' into lstein/model-manager-refactor
lstein Oct 4, 2023
de90d40
Merge branch 'lstein/model-manager-refactor' of github.com:invoke-ai/…
Oct 4, 2023
16ec7a3
fix type mismatches in download_manager service
Oct 4, 2023
a180c0f
check model hash before and after moving in filesystem
Oct 4, 2023
cb0fdf3
refactor model install job class hierarchy
Oct 4, 2023
cd5d3e3
refactor model_manager_service.py into small functional modules
Oct 5, 2023
9cbc62d
fix reorganized module dependencies
Oct 5, 2023
8e06088
refactor services
Oct 6, 2023
6303f74
allow user to select main database or external file for model record/…
Oct 7, 2023
00e85bc
make autoimport directory optional, defaulting to inactive
Oct 7, 2023
4421638
fix conversion call
Oct 7, 2023
432231e
merge with main
Oct 7, 2023
7f68f58
restore printing of version when invokeai-web and invokeai called wit…
Oct 7, 2023
5106054
support clipvision image encoder downloading
Oct 7, 2023
a64a34b
add support for repo_id subfolders
Oct 8, 2023
e5b2bc8
refactor download queue jobs
Oct 8, 2023
bccfe8b
fix some type mismatches introduced by reorg
Oct 8, 2023
ce2baa3
port support for AutoencoderTiny models
Oct 8, 2023
a80ff75
Update invokeai/app/invocations/model.py
lstein Oct 9, 2023
fe10386
address all PR 4252 comments from ryan through October 5
Oct 9, 2023
3644d40
Merge branch 'lstein/model-manager-refactor' of github.com:invoke-ai/…
Oct 9, 2023
3962914
merge with main
Oct 9, 2023
33d4756
improve selection of huggingface repo id files to download
Oct 9, 2023
4149d35
refactor installer class hierarchy
Oct 9, 2023
e50a257
merge with main
Oct 9, 2023
4bab724
fix broken import
Oct 9, 2023
67607f0
fix issues with module import order breaking pytest node tests
Oct 10, 2023
71e7e61
add documentation for model record service and loader
Oct 10, 2023
76aa19a
first draft of documentation finished
Oct 11, 2023
e079cc9
add back source URL validation to download job hierarchy
Oct 12, 2023
0a0412f
restore CLI to broken state
Oct 12, 2023
a2079bd
Update docs/installation/050_INSTALLING_MODELS.md
lstein Oct 12, 2023
aace679
Update invokeai/app/services/model_convert.py
lstein Oct 12, 2023
b708aef
misc small fixes requested by Ryan
Oct 12, 2023
5f80d4d
Merge branch 'lstein/model-manager-refactor' of github.com:invoke-ai/…
Oct 12, 2023
a51b165
clean up model downloader status locking to avoid race conditions
Oct 12, 2023
0f9c676
remove download queue change_priority() calls completely
Oct 12, 2023
53e1199
prevent potential infinite recursion on exceptions raised by event ha…
Oct 12, 2023
1,214 changes: 1,214 additions & 0 deletions docs/contributing/MODEL_MANAGER.md

Large diffs are not rendered by default.

1 change: 1 addition & 0 deletions docs/contributing/contribution_guides/development.md
@@ -14,6 +14,7 @@ Once you're setup, for more information, you can review the documentation specif
* #### [InvokeAI Architecture](../ARCHITECTURE.md)
* #### [Frontend Documentation](./contributingToFrontend.md)
* #### [Node Documentation](../INVOCATIONS.md)
* #### [InvokeAI Model Manager](../MODEL_MANAGER.md)
* #### [Local Development](../LOCAL_DEVELOPMENT.md)


19 changes: 14 additions & 5 deletions docs/features/CONFIGURATION.md
@@ -207,11 +207,8 @@ if INVOKEAI_ROOT is `/home/fred/invokeai` and the path is

| Setting | Default Value | Description |
|----------|----------------|--------------|
| `autoimport_dir` | `autoimport/main` | At startup time, read and import any main model files found in this directory |
| `lora_dir` | `autoimport/lora` | At startup time, read and import any LoRA/LyCORIS models found in this directory |
| `embedding_dir` | `autoimport/embedding` | At startup time, read and import any textual inversion (embedding) models found in this directory |
| `controlnet_dir` | `autoimport/controlnet` | At startup time, read and import any ControlNet models found in this directory |
| `conf_path` | `configs/models.yaml` | Location of the `models.yaml` model configuration file |
| `autoimport_dir` | `autoimport/main` | At startup time, read and import any main model files found in this directory (not recommended)|
| `model_config_db` | `auto` | Location of the model configuration database. Specify `auto` to use the main invokeai.db database, or specify a `.yaml` or `.db` file to store the data externally (see the example below).|
| `models_dir` | `models` | Location of the directory containing models installed by InvokeAI's model manager |
| `legacy_conf_dir` | `configs/stable-diffusion` | Location of the directory containing the .yaml configuration files for legacy checkpoint models |
| `db_dir` | `databases` | Location of the directory containing InvokeAI's image, schema and session database |
@@ -234,6 +231,18 @@ Paths:
# controlnet_dir: null
```
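
The `model_config_db` setting described in the table above takes either `auto` or a file path. Below is a minimal sketch of the three forms, assuming the key lives in the same `Paths` section shown above; the placement and the file names are illustrative only and are not confirmed by this diff:

```yaml
Paths:
  # keep model records in the main invokeai.db database (the default)
  model_config_db: auto

  # or store model records in an external SQLite file
  # model_config_db: databases/model_records.db

  # or store them in a standalone YAML file
  # model_config_db: configs/model_records.yaml
```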

### Model Cache

These options control the size of various caches that InvokeAI uses
during the model loading and conversion process. All units are in GB. An
example configuration snippet appears after the table.

| Setting | Default Value | Description |
|----------|----------------|--------------|
| `disk` | `20.0` | Before loading a model into memory, InvokeAI converts .ckpt and .safetensors models into diffusers format and saves them to disk. This option controls the maximum size of the directory in which these converted models are stored. If set to zero, then only the most recently-used model will be cached. |
| `ram` | `6.0` | After loading a model from disk, it is kept in system RAM until it is needed again. This option controls how much RAM is set aside for this purpose. Larger amounts allow more models to reside in RAM and for InvokeAI to quickly switch between them. |
| `vram` | `0.25` | This allows smaller models to remain in VRAM, speeding up execution modestly. It should be a small number. |
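
A sketch of how these limits might appear in `invokeai.yaml`; the `ModelCache` section name and the nesting shown are assumptions made for illustration and are not confirmed by this diff:

```yaml
ModelCache:
  disk: 20.0   # max GB of converted .ckpt/.safetensors models kept on disk
  ram: 6.0     # GB of system RAM reserved for recently used models
  vram: 0.25   # GB of VRAM in which small models may remain resident
```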


### Logging

These settings control the information, warning, and debugging
33 changes: 30 additions & 3 deletions docs/installation/050_INSTALLING_MODELS.md
@@ -123,11 +123,20 @@ installation. Examples:
# (list all controlnet models)
invokeai-model-install --list controlnet

# (install the model at the indicated URL)
# (install the diffusers model using its hugging face repo_id)
invokeai-model-install --add stabilityai/stable-diffusion-xl-base-1.0

# (install a diffusers model that lives in a subfolder)
invokeai-model-install --add stabilityai/stable-diffusion-xl-base-1.0:vae

# (install the checkpoint model at the indicated URL)
invokeai-model-install --add https://civitai.com/api/download/models/128713

# (delete the named model)
invokeai-model-install --delete sd-1/main/analog-diffusion
# (delete the named model if its name is unique)
invokeai-model-install --delete analog-diffusion

# (delete the named model using its fully qualified name)
invokeai-model-install --delete sd-1/main/test_model
```

### Installation via the Web GUI
@@ -141,6 +150,24 @@ left-hand panel) and navigate to *Import Models*
wish to install. You may use a URL, HuggingFace repo id, or a path on
your local disk.

There is special handling for CivitAI URLs, which lets you paste either
the URL of a CivitAI model page
(e.g. https://civitai.com/models/12345) or the direct download link
for a model (e.g. https://civitai.com/api/download/models/12345).

If the desired model is a HuggingFace diffusers model that is located
in a subfolder of the repository (e.g. vae), then append the subfolder
to the end of the repo_id like this:

```
# a VAE model located in subfolder "vae"
stabilityai/stable-diffusion-xl-base-1.0:vae

# version 2 of the model located in subfolder "v2"
monster-labs/control_v1p_sd15_qrcode_monster:v2

```

3. Alternatively, the *Scan for Models* button allows you to paste in
the path to a folder somewhere on your machine. The folder will be scanned for
importable models, and you will be prompted to add the ones of your choice.
15 changes: 13 additions & 2 deletions invokeai/app/api/dependencies.py
@@ -19,14 +19,17 @@
from invokeai.version.invokeai_version import __version__

from ..services.default_graphs import create_system_graphs
from ..services.download_manager import DownloadQueueService
from ..services.graph import GraphExecutionState, LibraryGraph
from ..services.image_file_storage import DiskImageFileStorage
from ..services.invocation_queue import MemoryInvocationQueue
from ..services.invocation_services import InvocationServices
from ..services.invocation_stats import InvocationStatsService
from ..services.invoker import Invoker
from ..services.latent_storage import DiskLatentsStorage, ForwardCacheLatentsStorage
from ..services.model_manager_service import ModelManagerService
from ..services.model_install_service import ModelInstallService
from ..services.model_loader_service import ModelLoadService
from ..services.model_record_service import ModelRecordServiceBase
from ..services.processor import DefaultInvocationProcessor
from ..services.sqlite import SqliteItemStorage
from ..services.thread import lock
@@ -127,8 +130,12 @@ def initialize(config: InvokeAIAppConfig, event_handler_id: int, logger: Logger
)
)

download_queue = DownloadQueueService(event_bus=events)
model_record_store = ModelRecordServiceBase.open(config, conn=db_conn, lock=lock)
model_loader = ModelLoadService(config, model_record_store)
model_installer = ModelInstallService(config, queue=download_queue, store=model_record_store, event_bus=events)

services = InvocationServices(
model_manager=ModelManagerService(config, logger),
events=events,
latents=latents,
images=images,
@@ -141,6 +148,10 @@ def initialize(config: InvokeAIAppConfig, event_handler_id: int, logger: Logger
configuration=config,
performance_statistics=InvocationStatsService(graph_execution_manager),
logger=logger,
download_queue=download_queue,
model_record_store=model_record_store,
model_loader=model_loader,
model_installer=model_installer,
session_queue=SqliteSessionQueue(conn=db_conn, lock=lock),
session_processor=DefaultSessionProcessor(),
invocation_cache=MemoryInvocationCache(max_cache_size=config.node_cache_size),