Loading and predicting with graph-based model #9747

Conversation

joejuzl (Contributor) commented on Sep 29, 2021:

Proposed changes:
This implements most of the functionality for loading and predicting with the graph runner.

TODO:

  • Fix caching in tests using a fixture (fix cache usage in tests #9787)
  • Fix interactive.
  • yaml_store_reader in shared is using regex_message_handler.
  • Do we need to know what type of model we load (nlu/core)?
  • Some other TODOs in the code.
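For readers new to the graph architecture, here is a minimal toy sketch of what "predicting with a graph runner" means: a runner resolves a target node by recursively running its parent nodes, so a single prediction is one run of a dependency graph. All names below (ToyGraphRunner, the node functions) are illustrative assumptions, not the actual rasa.engine API.

```python
from typing import Any, Callable, Dict, List, Tuple

# Toy illustration only: node name -> (function, names of parent nodes).
# The real rasa.engine schema and runner look different; this just shows the
# idea of a prediction being "one run of a dependency graph".
Schema = Dict[str, Tuple[Callable[..., Any], List[str]]]


class ToyGraphRunner:
    def __init__(self, schema: Schema) -> None:
        self._schema = schema

    def run(self, inputs: Dict[str, Any], targets: List[str]) -> Dict[str, Any]:
        results: Dict[str, Any] = dict(inputs)

        def resolve(node: str) -> Any:
            # Run a node only once, after all of its parents have run.
            if node not in results:
                fn, parents = self._schema[node]
                results[node] = fn(*(resolve(p) for p in parents))
            return results[node]

        return {target: resolve(target) for target in targets}


# An "NLU" node feeding a "policy" node, run once per incoming message.
runner = ToyGraphRunner(
    {
        "nlu": (lambda text: {"intent": "greet" if "hello" in text else "other"}, ["message"]),
        "policy": (lambda parsed: f"utter_{parsed['intent']}", ["nlu"]),
    }
)
print(runner.run({"message": "hello there"}, targets=["policy"]))
# {'policy': 'utter_greet'}
```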

joejuzl force-pushed the 3.0-architecture-revamp/9604/prediction branch from 520d4aa to 5885e5b on September 29, 2021 at 15:16.
wochinge (Contributor) commented:

Wohooo! 🎉 Amazing work! Will have a look tomorrow 🔎

joejuzl (Contributor, Author) commented on Sep 30, 2021:

These are the public methods of the processor that end up running the graph (summarized in a sketch after this outline):

.handle_message
# Handle a new message - full prediction
    calls ._run_prediction_loop
        loops over:
            .predict_next_with_tracker_if_should
                calls ._predict_next_action_with_tracker
                    runs graph
            ._run_action
    _save_tracker

.predict_next_with_tracker_if_should
# predicts the next action, but first checks should_predict_another_action etc.; doesn't save the tracker
    checks should_predict_another_action
    calls ._predict_next_action_with_tracker
        runs graph


.predict_next_with_tracker
# predicts the next action but doesn't save the tracker.
    calls ._predict_next_action_with_tracker
        runs graph

.predict_next_for_sender_id
# predicts the next action but doesn't run it. Saves the tracker (but only the followup action is affected)
    calls .predict_next_with_tracker
        calls ._predict_next_action_with_tracker
            runs graph
    _save_tracker (what actually gets saved?)

.log_message
# processes a message and optionally saves it to the tracker:
    calls ._handle_message_with_tracker
        calls .parse_message
            runs graph
        updates tracker with result
    optionally _save_tracker

.trigger_external_user_uttered
# triggers external message with intent and entities specified
    adds message to tracker
    calls ._run_prediction_loop
        loops over:
            .predict_next_with_tracker_if_should
                calls ._predict_next_action_with_tracker
                    runs graph
            ._run_action
    _save_tracker

.parse_message
# processes a message without a tracker
    runs graph
    
.handle_reminder
    calls .trigger_external_user_uttered
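
To make the funnel explicit: every public entry point above bottoms out in one of two graph-running call sites, `_predict_next_action_with_tracker` (Core prediction) and `parse_message` (NLU parsing). The sketch below is a hedged paraphrase of that outline, not the real MessageProcessor code; the method names come from the outline, while signatures, return values, and the `graph_runner` attribute are stand-ins.

```python
# Hedged paraphrase of the call paths above; names come from the outline,
# everything else (signatures, graph_runner, return values) is a stand-in.
class ProcessorSketch:
    def __init__(self, graph_runner, max_predictions: int = 10) -> None:
        self.graph_runner = graph_runner
        self.max_predictions = max_predictions

    # --- the only two places a graph actually runs ---
    def _predict_next_action_with_tracker(self, tracker):
        return self.graph_runner.run({"tracker": tracker})

    def parse_message(self, message):
        return self.graph_runner.run({"message": message})

    # --- public entry points funnel into the two call sites above ---
    def predict_next_with_tracker(self, tracker):
        return self._predict_next_action_with_tracker(tracker)

    def predict_next_with_tracker_if_should(self, tracker):
        if not self.should_predict_another_action(tracker):
            return None
        return self._predict_next_action_with_tracker(tracker)

    def handle_message(self, message, tracker):
        tracker.append(self.parse_message(message))  # roughly log_message's contribution
        self._run_prediction_loop(tracker)
        self._save_tracker(tracker)

    def _run_prediction_loop(self, tracker):
        prediction = self.predict_next_with_tracker_if_should(tracker)
        while prediction is not None:
            self._run_action(prediction, tracker)
            prediction = self.predict_next_with_tracker_if_should(tracker)

    # --- stand-ins so the sketch is self-contained and terminates ---
    def should_predict_another_action(self, tracker):
        return len(tracker) < self.max_predictions

    def _run_action(self, prediction, tracker):
        tracker.append(prediction)

    def _save_tracker(self, tracker):
        pass


class _StubRunner:
    def run(self, inputs):
        return {"prediction": "action_listen"}


ProcessorSketch(_StubRunner()).handle_message("hello", tracker=[])
```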

Resolved review threads: rasa/core/agent.py, rasa/core/featurizers/tracker_featurizers.py
self.domain = domain
self.tracker_store = tracker_store
self.lock_store = lock_store
self.max_number_of_predictions = max_number_of_predictions
self.message_preprocessor = message_preprocessor
joejuzl (Author) commented:

We probably need to include this in the migration guide.

Resolved review threads: rasa/core/processor.py, rasa/core/run.py, rasa/server.py (×2), rasa/shared/core/events.py, rasa/telemetry.py
wochinge (Contributor) left a comment:

I'll continue tomorrow morning. So far it's been a very pleasant read. Considering that we are avoiding refactorings for now, just this state is already multiple factors better than what we had 💯

Resolved review threads: rasa/nlu/classifiers/keyword_intent_classifier.py, rasa/cli/shell.py, rasa/core/agent.py (×3)
@@ -557,47 +431,42 @@ async def parse_message_using_nlu_interpreter(
}

"""

processor = self.create_processor()
if not self.is_ready():
A contributor commented on this hunk:

Does it make sense to make this a decorator? Somewhat duplicate with ensure_loaded_agent in server 🤔
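
One possible shape for the suggested decorator, as a sketch only — the name ensure_agent_is_ready and the behaviour when the agent isn't ready (return None rather than raise) are assumptions, not an existing Rasa helper:

```python
import functools
from typing import Any, Callable, Optional


def ensure_agent_is_ready(f: Callable[..., Any]) -> Callable[..., Any]:
    """Skip the wrapped coroutine when no model is loaded (hypothetical sketch)."""

    @functools.wraps(f)
    async def decorated(self: Any, *args: Any, **kwargs: Any) -> Optional[Any]:
        if not self.is_ready():
            # could also raise here, mirroring ensure_loaded_agent in the server
            return None
        return await f(self, *args, **kwargs)

    return decorated
```

Applied to methods like parse_message_using_nlu_interpreter, it would replace the repeated `if not self.is_ready():` checks.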

Resolved review threads: rasa/core/agent.py (×2)
joejuzl linked an issue on Oct 1, 2021 that may be closed by this pull request.
joejuzl marked this pull request as ready for review on October 1, 2021 at 08:14.
joejuzl requested review from a team (as code owner) and from alopez on October 1, 2021 at 08:14.
wochinge (Contributor) left a comment:

Next batch of comments - I think I'm mostly missing the tests now

Resolved review threads: rasa/core/processor.py (×5), rasa/core/test.py
-    parsed_message = processor.interpreter.featurize_message(
-        Message(data={TEXT: text})
-    )
+    parsed_message = processor.parse_message(UserMessage(text=text))
A contributor commented on this diff:

Don't we only do this when running e2e, which means we already have the tokens from when we verified the NLU test result?

Resolved review thread: rasa/core/training/story_conflict.py
wochinge (Contributor) left a comment:

Amazing stuff!

Resolved review threads: rasa/core/test.py, rasa/engine/runner/dask.py, rasa/model_testing.py (×3), tests/core/test_run.py, tests/engine/test_validation.py, tests/test_server.py, tests/nlu/test_config.py
wochinge (Contributor) left a comment:

Great work! Requesting changes due to the raise at the server start

Resolved review threads: rasa/core/agent.py (×5), rasa/core/run.py
Resolved review threads: rasa/engine/storage/local_model_storage.py, rasa/engine/storage/storage.py
)
new_agent.lock_store = app.agent.lock_store
A contributor commented on this diff:

I don't really get why we have to handle the lock store separately, but it was done like that before 🤷🏻

wochinge (Contributor) left a comment:

💯 🎸 🚀

joejuzl merged commit 2e7b6ee into 3.0-architecture-revamp/9277/recipe on Oct 7, 2021.
joejuzl deleted the 3.0-architecture-revamp/9604/prediction branch on October 7, 2021 at 10:32.
Successfully merging this pull request may close: Integrate rasa train and rasa run with new model architecture.