
Correct transformer_size parameter if needed for ResponseSelector during loading #10651

Merged · 3 commits · Jan 11, 2022

Conversation

jupyterjazz (Contributor)

Proposed changes:

Status (please check what you already did):

  • added some tests for the functionality
  • updated the documentation
  • updated the changelog (please check changelog for instructions)
  • reformat files using black (please check Readme for instructions)

@jupyterjazz jupyterjazz requested a review from a team as a code owner January 10, 2022 11:43
@jupyterjazz jupyterjazz requested review from a team and removed request for a team January 10, 2022 11:43
@jupyterjazz jupyterjazz changed the title from “Correct transformer_size parameter if needed for ResponseSelector` during loading” to “Correct transformer_size parameter if needed for ResponseSelector during loading” Jan 10, 2022
mleimeister (Contributor) left a comment:


Does the proposed fix in RasaSequenceLayer also take effect in the training phase of ResponseSelector, not just model loading? In that case the original function ResponseSelector._warn_and_correct_transfomer_size would be redundant and could be removed.

However, if this gets problematic with the ResponseSelector tests (afaik they check that this function is run as expected), it could also be dealt with in a follow-up ticket.

mleimeister (Contributor) commented Jan 10, 2022

For transparency, you could add to the docstring of RasaSequenceLayer._get_transformer_dimensions to clarify in which cases DEFAULT_TRANSFORMER_SIZE is used.
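To make the suggestion above concrete, a fallback like the one being discussed might look roughly as follows. This is an illustrative sketch only, not the actual Rasa implementation: the function name, config keys, and the value of `DEFAULT_TRANSFORMER_SIZE` are all assumptions for demonstration purposes.

```python
import logging

logger = logging.getLogger(__name__)

# Assumed default value, for illustration only.
DEFAULT_TRANSFORMER_SIZE = 256


def get_transformer_dimensions(config: dict) -> tuple:
    """Return (number of transformer layers, transformer size).

    Falls back to DEFAULT_TRANSFORMER_SIZE when transformer layers are
    requested but no valid size is configured -- the case the docstring
    suggestion above asks to document.
    """
    num_layers = config.get("number_of_transformer_layers", 0)
    size = config.get("transformer_size")
    if num_layers > 0 and (size is None or size < 1):
        logger.warning(
            "`transformer_size` is unset or invalid while "
            "`number_of_transformer_layers` > 0; falling back to the "
            "default size %d.",
            DEFAULT_TRANSFORMER_SIZE,
        )
        size = DEFAULT_TRANSFORMER_SIZE
    return num_layers, size
```

Because the correction happens where the dimensions are computed, any caller (whether constructing the layer during training or during loading) gets the same corrected value.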

jupyterjazz (Contributor, Author)

> Does the proposed fix in RasaSequenceLayer also take effect in the training phase of ResponseSelector, not just model loading? In that case the original function ResponseSelector._warn_and_correct_transfomer_size would be redundant and could be removed.
>
> However, if this gets problematic with the ResponseSelector tests (afaik they check that this function is run as expected), it could also be dealt with in a follow-up ticket.

Yes, it does, and as you predicted, removing the existing correction causes some problems in the tests. I'll create a separate ticket to address that.
Just to clarify: should we remove the whole function, or should we keep the warning and disable only the correction?

ancalita (Member) left a comment:


LGTM 👍🏼

mleimeister (Contributor)

> Just to clarify, should we remove the whole function or should we leave the warning and disable the correction?

Since the default is then set in RasaSequenceLayer, the warning could be moved there as well and the original function be removed completely. That way, the warning would be logged both when training and loading a model?
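The "move the warning to where the default is applied" idea can be sketched as below. All names here are hypothetical simplifications (the real `RasaSequenceLayer` is a Keras layer with a different constructor); the point is only that when the warning lives at the single correction site, every code path that builds the layer, whether training or loading, triggers it.

```python
import warnings

# Assumed default value, for illustration only.
DEFAULT_TRANSFORMER_SIZE = 256


class SequenceLayer:
    """Stand-in for the layer that owns the size correction and its warning."""

    def __init__(self, transformer_size=None, num_layers=2):
        if num_layers > 0 and (transformer_size is None or transformer_size < 1):
            warnings.warn(
                "Invalid `transformer_size`; using the default size "
                f"{DEFAULT_TRANSFORMER_SIZE}."
            )
            transformer_size = DEFAULT_TRANSFORMER_SIZE
        self.transformer_size = transformer_size


def train_model(config: dict) -> SequenceLayer:
    # The training path builds the layer ...
    return SequenceLayer(**config)


def load_model(config: dict) -> SequenceLayer:
    # ... and so does the loading path; both emit the same warning,
    # so no separate correction function is needed in ResponseSelector.
    return SequenceLayer(**config)
```

With this layout, removing the component-level correction function does not lose the user-facing warning, which is the trade-off being discussed above.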

mleimeister (Contributor)

LGTM 🚀
