
seq2seq.BahdanauAttention raise TypeError: probability_fn is not an instance of str #2820

Open
BrandonStudio opened this issue Mar 19, 2023 · 9 comments

Comments

@BrandonStudio

System information

  • OS Platform and Distribution (e.g., Linux Ubuntu 16.04): both Windows and Colab
  • TensorFlow version and how it was installed (source or binary): stable latest, pip
  • TensorFlow-Addons version and how it was installed (source or binary): stable latest
  • Python version: 3.10 / 3.9
  • Is GPU used? (yes/no): yes

Describe the bug

Calling BahdanauAttention with the default probability_fn value ("softmax") raises a TypeError.

Tried to debug and found that probability_fn was already a function, not a string, when the type check ran.

Code to reproduce the issue

import tensorflow_addons as tfa
tfa.seq2seq.BahdanauAttention(1, 1, 1, normalize=False, name='BahdanauAttention')

Other info / logs

@We-here

We-here commented Mar 20, 2023

Hello, have you solved this issue? I also get the same error when initializing tfa.seq2seq.LuongAttention.

@bhack
Contributor

bhack commented Mar 20, 2023

Do you have a very minimal gist to run to reproduce this?

@BrandonStudio
Author

@We-here Unfortunately no, I have not found a way to bypass the type check.

@BrandonStudio
Author

@bhack I think the code above is enough. The exception is thrown when initializing the instance of the class, not afterwards.

@bhack
Contributor

bhack commented Mar 20, 2023

Yes, because _process_probability_fn transforms the str back into the function.

Can you send a PR to the specific test to help reproduce this case?

https://github.com/tensorflow/addons/blob/master/tensorflow_addons/seq2seq/tests/attention_wrapper_test.py

@seanpmorgan

@HybridNeos

Also seeing this issue on Google Colab.

  • Python version: 3.9
  • Tensorflow version: 2.11.0
  • Tensorflow-addons version: 0.19.0
  • Is GPU used? (yes/no): yes

@DangMinh24

@BrandonStudio I skip the type checking by commenting out @typechecked on AttentionMechanism and its derived classes.

@BrandonStudio
Author

@DangMinh24 Did you just modify the library source code?

@DangMinh24

@BrandonStudio Yeah, I modified the tensorflow-addons code in my local environment.
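As a self-contained illustration of what that local edit amounts to (again with hypothetical stand-in names, not the real library code): commenting out @typechecked is equivalent to replacing the checking decorator with an identity decorator, after which the initializer accepts the already-resolved callable:

```python
import math

def typechecked(func):
    # stand-in for typeguard.typechecked: enforces str at runtime
    def wrapper(self, probability_fn):
        if not isinstance(probability_fn, str):
            raise TypeError('type of argument "probability_fn" must be str')
        return func(self, probability_fn)
    return wrapper

def no_check(func):
    # identity decorator: the effect of commenting out @typechecked
    return func

class BaseMechanism:
    @no_check  # was: @typechecked
    def __init__(self, probability_fn):
        self.probability_fn = probability_fn

def softmax(xs):
    exps = [math.exp(x) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

mech = BaseMechanism(probability_fn=softmax)
print(callable(mech.probability_fn))  # True
```

Note this only suppresses the symptom; the underlying mismatch between the str annotation and the resolved callable remains in the library.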
