
[ModelPatcher] object patch not properly removed between runs #6303

Closed
huchenlei opened this issue Jan 1, 2025 · 5 comments · Fixed by #6357
Labels
Bug Something is confirmed to not be working properly. Important

Comments


huchenlei commented Jan 1, 2025

Expected Behavior

Object patches are properly removed between workflow executions.

Actual Behavior

Object patches are not removed when executing a different workflow.

Steps to Reproduce

  1. Install ComfyUI-layerdiffuse and patch PR: Fix SD15 memory sharing huchenlei/ComfyUI-layerdiffuse#119

  2. Start ComfyUI server

  3. Execute workflow 1 and observe that it executes normally: layer_diffusion_fg_example (1).json

  4. Execute workflow 2 and observe the error: layer_diffusion_cond_joint_bg (1).json

  5. Restart ComfyUI server

  6. Execute workflow 2 and observe that it executes normally

  7. Execute workflow 1 and observe the same error as in step 4 on workflow 2

Putting a breakpoint here https://github.com/huchenlei/ComfyUI-layerdiffuse/blob/ab409eefbba15ecc322a6ea830e077576e904326/lib_layerdiffusion/attention_sharing.py#L334

during the step 7 run shows that object_patches_backup is not restored after the previous run.

Debug Logs

N/A

Other

Ref issue: huchenlei/ComfyUI-layerdiffuse#115

Workflow 2 input image:
309219693-e7e2d80e-ffbe-4724-812a-5139a88027e3

@huchenlei huchenlei added Bug Something is confirmed to not be working properly. Important labels Jan 1, 2025
huchenlei (Collaborator, Author):

Note:

Previously the code called model_management.unload_model_clones to force-restore the original objects; however, that function was removed in #5450 without a clear replacement.

@huchenlei huchenlei changed the title [ModelPatcher] object patch not properly removed from run to run [ModelPatcher] object patch not properly removed between runs Jan 1, 2025

huchenlei commented Jan 3, 2025

Minimal reproduction:

object_patch_reproduction.json

Steps:

patch_error.mp4

Relevant code:

import torch

# Imports assumed for a standalone node file (module paths as in the ComfyUI repo):
import comfy.utils as utils
from comfy.model_patcher import ModelPatcher


class DummyPatch(torch.nn.Module):
    def __init__(self, module: torch.nn.Module, dummy_float: float = 0.0):
        super().__init__()
        self.module = module
        self.dummy_float = dummy_float

    def forward(self, *args, **kwargs):
        if isinstance(self.module, DummyPatch):
            raise Exception(f"Calling nested dummy patch! {self.dummy_float}")

        return self.module(*args, **kwargs)


class ObjectPatchNode:
    @classmethod
    def INPUT_TYPES(cls):
        return {
            "required": {
                "model": ("MODEL",),
                "target_module": ("STRING", {"multiline": True}),
            },
            "optional": {
                "dummy_float": ("FLOAT", {"default": 0.0 }),
            }
        }

    RETURN_TYPES = ("MODEL",)
    FUNCTION = "apply_patch"
    CATEGORY = "DevTools"
    DESCRIPTION = "A node that applies an object patch"

    def apply_patch(
        self, model: ModelPatcher, target_module: str, dummy_float: float = 0.0
    ) -> tuple[ModelPatcher]:
        module = utils.get_attr(model.model, target_module)
        work_model = model.clone()
        work_model.add_object_patch(target_module, DummyPatch(module, dummy_float))
        return (work_model,)

The issue is that when I apply a different patch by adjusting the dummy_float input, the previous patch is not removed, so the new DummyPatch ends up wrapping the old one.
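The failure mode can be sketched without ComfyUI. In this minimal mock (Inner, DummyModel, and the inline patch-installation step are hypothetical stand-ins for the model and ComfyUI's model-load path; get_attr mirrors comfy.utils.get_attr), reading the module with a raw attribute lookup while the previous run's patch is still installed produces a nested patch:

```python
class Inner:
    pass

class DummyPatch:
    def __init__(self, module, dummy_float=0.0):
        self.module = module
        self.dummy_float = dummy_float

class DummyModel:
    def __init__(self):
        self.layer = Inner()

def get_attr(obj, path):
    # mirrors comfy.utils.get_attr: raw walk of a dotted attribute path
    for name in path.split("."):
        obj = getattr(obj, name)
    return obj

model = DummyModel()

# Run 1: the node reads the module via get_attr and wraps it; the sampler
# later installs the patch onto the live model, which then stays loaded.
first = DummyPatch(get_attr(model, "layer"), 0.0)
model.layer = first

# Run 2 with a different dummy_float: get_attr now returns the *already
# patched* module, so the new patch nests the old one.
second = DummyPatch(get_attr(model, "layer"), 1.0)
assert isinstance(second.module, DummyPatch)  # nested patch, forward() would raise
```

This matches the error path in DummyPatch.forward above: the isinstance check fires exactly because the second run wrapped the first run's patch instead of the original module.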

huchenlei (Collaborator, Author):

Context: running with --disable-smart-memory does not trigger this bug.

huchenlei (Collaborator, Author):

From @Kosinkadink

I figured it out. Comfy code is working as intended, but the function you'd want to call in this case is ModelPatcher's get_model_object, ideally after you clone it (in case some other node pack does something dumb with the original ModelPatcher and pollutes it).
comfy.utils.get_attr gets whatever object is currently on that key, which, once the model is loaded from the previous sampling, is the DummyPatch applied on the run before (the model is still loaded).
model.get_model_object will instead reference the object_patches/object_patches_backup of the ModelPatcher, and will only fall back to comfy.utils.get_attr if there is nothing in that shared object_patches_backup dict.

Actionable:

  • Audit usage of comfy.utils.get_attr
  • Add docstring on comfy.utils.get_attr pointing to ModelPatcher.get_model_object
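The mechanism described above can be sketched as follows. This is a hedged, heavily simplified mock, not ComfyUI's actual code: MiniPatcher and load are hypothetical stand-ins for comfy.model_patcher.ModelPatcher and the model-load path, and the model is a plain namespace. The point it illustrates is that get_model_object consults object_patches and the shared object_patches_backup before falling back to a raw attribute read, so it returns the original module even while the patched model is still loaded:

```python
from types import SimpleNamespace

def get_attr(obj, path):
    # mirrors comfy.utils.get_attr: raw walk of a dotted attribute path
    for name in path.split("."):
        obj = getattr(obj, name)
    return obj

class MiniPatcher:
    # hypothetical, simplified stand-in for comfy.model_patcher.ModelPatcher
    def __init__(self, model):
        self.model = model
        self.object_patches = {}
        self.object_patches_backup = {}

    def clone(self):
        c = MiniPatcher(self.model)            # underlying model is shared
        c.object_patches = self.object_patches.copy()
        c.object_patches_backup = self.object_patches_backup  # shared dict
        return c

    def add_object_patch(self, name, obj):
        self.object_patches[name] = obj

    def get_model_object(self, name):
        # prefer pending patches, then the backup of the original,
        # and only then whatever currently sits on the live model
        if name in self.object_patches:
            return self.object_patches[name]
        if name in self.object_patches_backup:
            return self.object_patches_backup[name]
        return get_attr(self.model, name)

def load(patcher):
    # simplified model load: back up originals, then install patches
    for name, patch in patcher.object_patches.items():
        patcher.object_patches_backup.setdefault(name, get_attr(patcher.model, name))
        setattr(patcher.model, name, patch)

model = SimpleNamespace(layer="original")
base = MiniPatcher(model)

run1 = base.clone()
run1.add_object_patch("layer", ("patched", run1.get_model_object("layer")))
load(run1)  # model.layer is now the patch; the model stays loaded

run2 = base.clone()
assert run2.get_model_object("layer") == "original"  # sees the shared backup
assert get_attr(model, "layer") != "original"        # raw read sees the patch
```

In this sketch, swapping run1's get_model_object call for a raw get_attr would reproduce the nesting bug on the second run, which is why the audit above targets comfy.utils.get_attr call sites.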

huchenlei (Collaborator, Author):

A code search shows about 8 affected repos:

https://cs.comfy.org/search?q=context:global+get_attr%28&patternType=keyword&sm=0

