Renaming Custom Layer breaks TFMA Evaluator #154
Comments
Hi @abbyDC
Hi @pindinagesh! The link you attached doesn't show anything on my end when I click on it. May I ask for a working link so I can take a look at it? Thanks! :)
Sorry for the inconvenience,
Hi, yup, the link works now. I'll take a look at the post first to check which of the workarounds I have already tried.
Hi @abbyDC, could you please tell us the status of this issue?
Hello! Upon further investigation and experimentation, the problem still looks the same for me. Several things I've tried, similar to the issue above:
System information
Have I written custom code (as opposed to using a stock example script provided in TensorFlow Model Analysis): No
You can obtain the TensorFlow Model Analysis version with `python -c "import tensorflow_model_analysis as tfma; print(tfma.version.VERSION)"`.
Describe the problem
I have a custom layer named MultiHeadAttention. When I ran the TFX pipeline, it showed a warning that the name conflicts with the built-in MultiHeadAttention layer and suggested renaming it. When I rename it to CustomMultiHeadAttention, the TFX pipeline breaks, specifically in the Evaluator component. When I change nothing else in the code except reverting the name back to MultiHeadAttention, the Evaluator component runs fine, but then I run into problems when exporting, saving, and loading the model. What is the cause of this, or is it a bug in TFMA/TFX?
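One way to narrow this down (a suggestion, not something from the thread) is to try reloading the exported model outside the pipeline, since the Evaluator has to deserialize it the same way. A minimal sketch, assuming the renamed layer is importable from a module called `my_model` and the model was exported to `serving_model_dir` (both names are placeholders):

```python
import tensorflow as tf

# Hypothetical module and export path; substitute your own.
from my_model import CustomMultiHeadAttention

# If this load fails with an unknown-layer error, the Evaluator will
# likely fail the same way when it deserializes the model.
model = tf.keras.models.load_model(
    "serving_model_dir",
    custom_objects={"CustomMultiHeadAttention": CustomMultiHeadAttention},
)
```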
Source code / logs
Error when changing Custom Layer name from MultiHeadAttention -> CustomMultiHeadAttention
eval_config.py
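(The original file isn't shown; as a stand-in, here is a minimal sketch of what a TFMA eval config for a Keras model typically looks like. The label key, metric, and slicing choices below are assumptions, not the original settings.)

```python
import tensorflow_model_analysis as tfma

# Minimal eval config sketch: one model spec, one metric, overall slice.
eval_config = tfma.EvalConfig(
    model_specs=[tfma.ModelSpec(label_key="label")],
    metrics_specs=[
        tfma.MetricsSpec(
            metrics=[tfma.MetricConfig(class_name="BinaryAccuracy")]
        )
    ],
    slicing_specs=[tfma.SlicingSpec()],  # empty spec = overall slice
)
```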
Code snippet for the Evaluator component in the TFX pipeline
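(Also not shown; a typical wiring of the Evaluator component would look roughly like the sketch below, where `example_gen` and `trainer` stand in for the upstream components in the original pipeline.)

```python
from tfx.components import Evaluator

# Standard Evaluator wiring: examples from ExampleGen, the trained model
# from Trainer, and the eval config defined above.
evaluator = Evaluator(
    examples=example_gen.outputs["examples"],
    model=trainer.outputs["model"],
    eval_config=eval_config,
)
```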
MultiHeadAttention layer declaration snippet
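(As a stand-in for the missing declaration, here is a sketch of the usual pattern for a renamed custom layer: registering it for Keras serialization and implementing `get_config`, which is what lets downstream components such as the Evaluator rebuild the layer when loading the model. The layer body is illustrative, not the original code.)

```python
import tensorflow as tf

# Registering the class ties the name "Custom>CustomMultiHeadAttention"
# to this implementation, so deserialization can find it after renaming.
@tf.keras.utils.register_keras_serializable(package="Custom")
class CustomMultiHeadAttention(tf.keras.layers.Layer):
    def __init__(self, num_heads, key_dim, **kwargs):
        super().__init__(**kwargs)
        self.num_heads = num_heads
        self.key_dim = key_dim
        # Wrap the built-in attention layer under a distinct class name.
        self.attention = tf.keras.layers.MultiHeadAttention(
            num_heads=num_heads, key_dim=key_dim
        )

    def call(self, query, value):
        return self.attention(query, value)

    def get_config(self):
        # Persist constructor args so the layer can be rebuilt on load.
        config = super().get_config()
        config.update({"num_heads": self.num_heads, "key_dim": self.key_dim})
        return config
```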