Model Pruning Callback Failing #10835

@aqibsaeed

Description

Hi,

I am trying to use ModelPruning callback as follows:

    from pytorch_lightning.callbacks import ModelPruning

    callbacks=[
        ModelPruning(
            pruning_fn="l1_unstructured",
            amount=0.01,
            use_global_unstructured=True,
        )
    ]

but after training for an epoch, the Trainer throws the following error (it only happens when using the ModelPruning callback):

File "/home/.conda/envs/dummy/lib/python3.8/site-packages/torch/nn/utils/convert_parameters.py", line 77, in _check_param_device
    if param.is_cuda:  # Check if in same GPU
AttributeError: 'bool' object has no attribute 'is_cuda'

I tried PyTorch 1.7.0 and 1.9.0, but the issue persists. Any idea what is causing this error?
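
For reference, the setup is roughly as in the minimal sketch below; the toy LightningModule and random dataset are just placeholders standing in for my actual network and data:

    import torch
    from torch.utils.data import DataLoader, Dataset
    import pytorch_lightning as pl
    from pytorch_lightning.callbacks import ModelPruning


    class RandomDataset(Dataset):
        # random features standing in for my real dataset
        def __init__(self, size, length):
            self.data = torch.randn(length, size)

        def __len__(self):
            return len(self.data)

        def __getitem__(self, index):
            return self.data[index]


    class ToyModel(pl.LightningModule):
        # toy single-layer model standing in for my real network
        def __init__(self):
            super().__init__()
            self.layer = torch.nn.Linear(32, 2)

        def forward(self, x):
            return self.layer(x)

        def training_step(self, batch, batch_idx):
            # dummy scalar loss, just to drive the optimizer
            return self(batch).sum()

        def configure_optimizers(self):
            return torch.optim.SGD(self.parameters(), lr=0.1)


    trainer = pl.Trainer(
        max_epochs=2,
        callbacks=[
            ModelPruning(
                pruning_fn="l1_unstructured",
                amount=0.01,
                use_global_unstructured=True,
            )
        ],
    )
    trainer.fit(ToyModel(), DataLoader(RandomDataset(32, 64), batch_size=8))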

Thanks.

cc @tchaton @rohitgr7 @carmocca
