local variable 'step_size_min' referenced before assignment in pysurvival #53

Open
dadekandrew2010 opened this issue Feb 18, 2022 · 3 comments


@dadekandrew2010

The error is raised by the step function in the rprop.py file of the installed torch package. You only need to initialize the variables that trigger the error at the top of this function, e.g. step_size_min = []; see the sketch after the snippet below.
def step(self, closure=None):
    # ......
    F.rprop(params,
            grads,
            prevs,
            step_sizes,
            step_size_min=step_size_min,
            step_size_max=step_size_max,
            etaminus=etaminus,
            etaplus=etaplus)
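
For reference, a minimal sketch of that workaround, assuming the torch ~1.10 layout of Rprop.step: the group['step_sizes'] unpacking and the F.rprop call are visible in the traceback further down, while the group['etas'] unpacking and the elided per-parameter bookkeeping are assumptions. Rather than a placeholder value, the existing unpacking is simply hoisted above the loop so the names are always bound:

import torch
from torch.optim import _functional as F  # as in torch 1.10's rprop.py

@torch.no_grad()
def step(self, closure=None):
    loss = None
    if closure is not None:
        with torch.enable_grad():
            loss = closure()

    for group in self.param_groups:
        params, grads, prevs, step_sizes = [], [], [], []
        # Hoisted out of the per-parameter loop: in the stock file these
        # names are only bound for parameters that have a gradient, so a
        # step where every p.grad is None reaches F.rprop() with
        # 'step_size_min' unbound, which is the UnboundLocalError
        # reported in this issue.
        step_size_min, step_size_max = group['step_sizes']
        etaminus, etaplus = group['etas']

        for p in group['params']:
            if p.grad is None:
                continue
            # ... original bookkeeping that fills params/grads/prevs/
            # step_sizes and increments state['step'] goes here ...

        F.rprop(params,
                grads,
                prevs,
                step_sizes,
                step_size_min=step_size_min,
                step_size_max=step_size_max,
                etaminus=etaminus,
                etaplus=etaplus)

    return loss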

@mcombalia

I also have this issue.

@Jaqen00

Jaqen00 commented Mar 9, 2022

I also have this issue.


UnboundLocalError                         Traceback (most recent call last)
Input In [3], in <cell line: 11>()
      8 neural_mtlr = NeuralMultiTaskModel(bins=100, structure=structure)
     10 # Fitting the model
---> 11 neural_mtlr.fit(X_train, T_train, E_train,
     12                 init_method = 'orthogonal', optimizer ='rprop', lr = 1e-4,
     13                 l2_reg = 1e-1,  l2_smooth = 1e-1,
     14                 batch_normalization = True,  bn_and_dropout = True,
     15                 dropout=0.5,  num_epochs = 500)

File ~/PycharmProjects/pysurvival/venv/lib/python3.8/site-packages/pysurvival/models/multi_task.py:369, in BaseMultiTaskModel.fit(self, X, T, E, init_method, optimizer, lr, num_epochs, dropout, l2_reg, l2_smooth, batch_normalization, bn_and_dropout, verbose, extra_pct_time, is_min_time_zero)
    366 Triangle = torch.FloatTensor(Triangle)
    368 # Performing order 1 optimization
--> 369 model, loss_values = opt.optimize(self.loss_function, model, optimizer, 
    370     lr, num_epochs, verbose,  X_cens=X_cens, X_uncens=X_uncens, 
    371     Y_cens=Y_cens, Y_uncens=Y_uncens, Triangle=Triangle, 
    372     l2_reg=l2_reg, l2_smooth=l2_smooth)
    374 # Saving attributes
    375 self.model = model.eval()

File ~/PycharmProjects/pysurvival/venv/lib/python3.8/site-packages/pysurvival/utils/optimization.py:189, in optimize(loss_function, model, optimizer_str, lr, nb_epochs, verbose, num_workers, **kargs)
    187     optimizer.step(closure)
    188 else:
--> 189     optimizer.step()
    190 loss = closure()
    191 loss_value = loss.item()

File ~/PycharmProjects/pysurvival/venv/lib/python3.8/site-packages/torch/optim/optimizer.py:88, in Optimizer._hook_for_profile.<locals>.profile_hook_step.<locals>.wrapper(*args, **kwargs)
     86 profile_name = "Optimizer.step#{}.step".format(obj.__class__.__name__)
     87 with torch.autograd.profiler.record_function(profile_name):
---> 88     return func(*args, **kwargs)

File ~/PycharmProjects/pysurvival/venv/lib/python3.8/site-packages/torch/autograd/grad_mode.py:28, in _DecoratorContextManager.__call__.<locals>.decorate_context(*args, **kwargs)
     25 @functools.wraps(func)
     26 def decorate_context(*args, **kwargs):
     27     with self.__class__():
---> 28         return func(*args, **kwargs)

File ~/PycharmProjects/pysurvival/venv/lib/python3.8/site-packages/torch/optim/rprop.py:109, in Rprop.step(self, closure)
    101         step_size_min, step_size_max = group['step_sizes']
    103         state['step'] += 1
    105     F.rprop(params,
    106             grads,
    107             prevs,
    108             step_sizes,
--> 109             step_size_min=step_size_min,
    110             step_size_max=step_size_max,
    111             etaminus=etaminus,
    112             etaplus=etaplus)
    114 return loss

UnboundLocalError: local variable 'step_size_min' referenced before assignment
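
Until this is fixed, a user-side workaround is to avoid torch's rprop code path altogether by passing a different optimizer string to fit(). A minimal sketch, assuming the same pysurvival API as in the traceback above and that 'adam' is among the optimizer strings pysurvival's optimization.py accepts; the structure and dummy data below are purely illustrative:

import numpy as np
from pysurvival.models.multi_task import NeuralMultiTaskModel

# Illustrative dummy survival data: features, event times, event indicators
X_train = np.random.rand(200, 10)
T_train = np.random.rand(200) * 100
E_train = np.random.randint(0, 2, 200)

# Illustrative network structure for the hidden layers
structure = [{'activation': 'ReLU', 'num_units': 128}]
neural_mtlr = NeuralMultiTaskModel(bins=100, structure=structure)

# Same call as in the traceback, but with optimizer='adam' instead of
# 'rprop', so torch/optim/rprop.py is never reached
neural_mtlr.fit(X_train, T_train, E_train,
                init_method='orthogonal', optimizer='adam', lr=1e-4,
                l2_reg=1e-1, l2_smooth=1e-1,
                batch_normalization=True, bn_and_dropout=True,
                dropout=0.5, num_epochs=500)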

@onacrame

onacrame commented Apr 3, 2022

I think this project died, so I wouldn't expect an update. It's a shame, as it seemed like a useful package.
