
AttributeError: 'Parameter' object has no attribute 'hess' #5

Open
samiit opened this issue Dec 10, 2020 · 3 comments
Labels
question Further information is requested

Comments


samiit commented Dec 10, 2020

Hi David,

Thanks for this repo. I tried it as an optimizer on toy examples and it worked fine, but I am struggling to make it work for an object detection case. Specifically, I am trying to use it as the optimizer for YOLOv5 from Ultralytics.

After updating the train.py file there, I am facing some issues at training time. It uses a scheduler, and I was wondering whether that could be the source of the problem.
Here is part of the error log:

 File "train.py", line 309, in train
    optimizer.step()
  File "/usr/local/lib/python3.6/dist-packages/torch/optim/lr_scheduler.py", line 67, in wrapper
    return wrapped(*args, **kwargs)
  File "/usr/local/lib/python3.6/dist-packages/torch/autograd/grad_mode.py", line 26, in decorate_context
    return func(*args, **kwargs)
  File "/yolov5/ada_hessian.py", line 100, in step
    self.zero_hessian()
  File "/yolov5/ada_hessian.py", line 59, in zero_hessian
    if not isinstance(p.hess, float) and self.state[p]["hessian step"] % self.update_each == 0:
AttributeError: 'Parameter' object has no attribute 'hess'

Do you have any suggestions?

Regards,
Sam

davda54 (Owner) commented Dec 10, 2020

Hi,

Do you overwrite the model parameters after initializing the optimizer? The error on line 59 shouldn't normally occur, because every parameter is initialized with hess when the optimizer is created:

for p in self.get_params():
    p.hess = 0.0
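To illustrate why overwriting parameters breaks this, here is a minimal sketch (not the actual ada_hessian.py code; the model and layer names are hypothetical): attributes like hess are set on the specific Parameter objects that exist at optimizer creation, so any Parameter created afterwards never receives one.

```python
import torch
import torch.nn as nn

# Tag each parameter with a `hess` attribute, the way the
# optimizer's __init__ does for every parameter it is given.
model = nn.Linear(4, 2)
for p in model.parameters():
    p.hess = 0.0

# All parameters known at optimizer creation now carry `hess`.
assert all(hasattr(p, "hess") for p in model.parameters())

# If a parameter is replaced *after* this point (e.g. by
# re-initializing a layer, loading weights into fresh Parameters,
# or wrapping the model), the new Parameter object lacks `hess`:
model.weight = nn.Parameter(torch.zeros(2, 4))
print(hasattr(model.weight, "hess"))  # False -> AttributeError in step()
```

If YOLOv5's train.py rebuilds or swaps any parameters after the optimizer is constructed, that would produce exactly this AttributeError; constructing the optimizer last should avoid it.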

@davda54 added the "question" (Further information is requested) label Dec 10, 2020
samiit (Author) commented Dec 18, 2020

Hi David,

I am still checking this out; kindly don't close the issue yet. I hope to get back to you later today.

Sam

SamMohel commented

Did you solve it, please?
