Reduce model_confidence softmax warning in CI Tests #8146
Exalate commented: iurisevero commented: Hi, I would like to try to solve this issue. Is anyone working on it right now?
Exalate commented: iurisevero commented: So, I was working on it, and the first idea I had to solve the warning was to change `test_confidence_loss_settings` to:

```python
def test_confidence_loss_settings(
    component_config: Dict[Text, Any], raises_exception: bool
):
    component_config[SIMILARITY_TYPE] = INNER
    if component_config[MODEL_CONFIDENCE] == SOFTMAX:
        with pytest.warns(UserWarning):
            run_confidense_loss_settings(component_config, raises_exception)
    else:
        run_confidense_loss_settings(component_config, raises_exception)


def run_confidense_loss_settings(
    component_config: Dict[Text, Any], raises_exception: bool
):
    if raises_exception:
        with pytest.raises(InvalidConfigException):
            train_utils._check_confidence_setting(component_config)
    else:
        train_utils._check_confidence_setting(component_config)
```

I'd like to know, is it running away from the code pattern you follow? (I hope not.) Also, I noticed that tests with …
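[Editor's note: a minimal, self-contained sketch of how such a `pytest.warns` check can be pinned to the exact message with `match`. The `emit_softmax_warning` helper below is a hypothetical stand-in; only the warning text itself comes from the CI log quoted later in this issue.]

```python
import warnings

import pytest


def emit_softmax_warning() -> None:
    # Hypothetical stand-in for the code path in rasa/utils/train_utils.py that
    # emits the warning; the message text is copied from the CI log in this issue.
    warnings.warn(
        "model_confidence is set to `softmax`. It is recommended to try using "
        "`model_confidence=linear_norm` to make it easier to tune fallback thresholds.",
        category=UserWarning,
    )


def test_softmax_warning_message_is_asserted_explicitly() -> None:
    # pytest.warns accepts a `match` regex, so the test pins the exact warning
    # instead of accepting any UserWarning raised during the call.
    with pytest.warns(UserWarning, match="model_confidence is set to"):
        emit_softmax_warning()
```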
Exalate commented: iurisevero commented: I tested changing the … I believe the way to reduce the warnings now is to add the …
Exalate commented: Imod7 commented: Hey @iurisevero, thanks for your comments on this ticket! @twerkmeister any thoughts on this one?
Exalate commented: iurisevero commented: After lots of effort trying to catch all the warnings raised, I realized it was too much work for little progress, and that maybe just ignoring them at the pytest configuration level would be better. In some files it looks like it is just impossible to catch the warnings, so the `filterwarnings =` option felt very valid after some hours.
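[Editor's note: for illustration, a minimal sketch of the per-test variant of that `filterwarnings` idea using pytest's mark. The test name is hypothetical, and the filter string assumes the warning message quoted in the CI log below; the project-wide equivalent would be a `filterwarnings =` entry in the pytest configuration file.]

```python
import pytest


# The string follows pytest's "action:message" filter format; "ignore" silences
# any warning whose message starts with the given text.
@pytest.mark.filterwarnings("ignore:model_confidence is set to")
def test_training_without_softmax_warning_noise() -> None:
    ...  # a training call that would otherwise emit the softmax UserWarning
```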
Exalate commented: rgstephens commented: Why do I get this warning when …
Exalate commented: Imod7 commented: @iurisevero Thank you so much for your work on this one! @rgstephens Exactly! Based on this ticket I also see …, and it is also explicitly mentioned that a possible value is `linear_norm`. Maybe someone from the research team (@dakshvar22?) can give a more accurate answer on this one. Should `model_confidence` be set everywhere to `linear_norm`?
Exalate commented: dakshvar22 commented: We cannot set the default value of `model_confidence` to `linear_norm` before the next major release.
Exalate commented: iurisevero commented: As the `model_confidence` default cannot be changed, should the filter warning be implemented to reduce the warnings in CI tests? I think it is an alternative until the new major release.
Exalate commented: wochinge commented: Should be enough to use …
Exalate commented: iurisevero commented: Well, I tried to do it, and I failed... I couldn't fix all the warnings with just that. I'll look for the changes I made and push them to GitHub; it may help. Edit: All the work I've done on this issue is in this branch, the first and second commits to be more specific.
Exalate commented: wochinge commented: Could you open up a PR against the …?
Exalate commented: iurisevero commented: Sure. I'll just undo the revert first to recover the changes. I don't remember how far I went with the catches, but I'll try to take a look at it on weekdays.
Exalate commented: stale[bot] commented: This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.
Python version: 3.6, 3.7
Operating system (windows, osx, ...): Ubuntu, Windows
Issue: This issue is part of the effort to close issue #7738, hence to reduce the number of warnings that are raised when running all tests in the CI. From a first investigation, it seems that the warning this issue was opened for (UserWarning: model_confidence is set to `softmax`. It is recommended to try using `model_confidence=linear_norm` to make it easier to tune fallback thresholds.) had 150 occurrences, but in a more recent CI run it appears in only 81 tests.

Warning (including full traceback):

```
2021-03-09T05:38:31.3112981Z rasa/utils/train_utils.py:428: 81 tests with warnings
2021-03-09T05:38:31.3114236Z /home/runner/work/rasa/rasa/rasa/utils/train_utils.py:428: UserWarning: model_confidence is set to `softmax`. It is recommended to try using `model_confidence=linear_norm` to make it easier to tune fallback thresholds.
2021-03-09T05:38:31.3115429Z   category=UserWarning,
```
Possible Solutions:

- Setting `model_confidence` to `linear_norm` will eliminate the warning (see the sketch after the reproduction commands below).
- The recommended value for `model_confidence` might have changed, since this depends on the latest findings from the research team. So maybe rerun the CI tests to see the latest recommendation, or run one of the tests below.

Command or request that led to error:

```
pytest tests/utils/test_train_utils.py::test_confidence_loss_settings
pytest tests/utils/test_train_utils.py::test_confidence_similarity_settings
pytest tests/core/test_training.py::test_training_script_with_max_history_set
```
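[Editor's note: a sketch of what the two settings from the first solution look like in the dict-style component config used by the tests in this thread. Plain string keys and values are used here for illustration; the tests themselves use the corresponding constants.]

```python
# Triggers the UserWarning quoted above.
softmax_config = {"model_confidence": "softmax", "similarity_type": "inner"}

# The value recommended by the warning; using it eliminates the warning.
linear_norm_config = {"model_confidence": "linear_norm", "similarity_type": "inner"}
```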
Definition of Done:

- The `model_confidence` warning appears only in the cases where it is explicitly tested for `softmax`.
- If `model_confidence` takes a default value, then this should be the recommended one.