
User Journeys 2022 (Draft)

Matt King edited this page May 27, 2023 · 1 revision

Introduction

This page outlines user journeys in the ARIA-AT App for the following user personas:

  • Test Admin
  • AT Developer

The specific scenarios and use cases are derived from the Working Mode. These user journeys will be used to prioritize the design and development of new features, user interfaces, and user flows for the ARIA-AT App in 2022.

Status: DRAFT

User Persona: Test Admin

Phase 1: Test Plan Research and Development

Scenario 1: Add draft test plan to Test Queue

  • Use Case Overview: When a Test Developer indicates a Test Plan is ready for community review, the Admin will then make it available through the Test Queue in the app.
  • Trigger: The Admin has been notified that a new Test Plan is ready for review and has been merged into the main branch.
  • Precondition: The Test Plan has been merged into main.
  • User Journey:
    1. Navigates to the Test Queue.
    2. Opens "Add Test Plans to the Test Queue" disclosure.
    3. Selects a Test plan from the Test Plan dropdown.
    4. Selects a Test plan version from the Test Plan Version dropdown.
    5. Selects an Assistive Technology from the dropdown.
    6. Selects a Browser from the dropdown.
    7. Clicks the "Add Test Plan to Test Queue" button.

Phase 2: Draft Test Plan Review

Scenario 2: Review Test Plan Conflicts where the root cause is tester interpretation

  • Use Case Overview: When at least two testers have fully executed a Test Plan in at least one browser and results from the different testers conflict with one another due to tester interpretation.
  • Trigger: The Admin has been notified (or noticed themself) that a Test Plan has been fully executed by at least two testers and there are conflicts.
  • Precondition: A Test Plan run is fully executed by at least two testers and the "Conflicts" label is displayed for the Test Plan in the Test Queue.
  • User Journey:
    1. Navigates to the Test Queue.
    2. Identifies the Test Plan with conflicts.
    3. Clicks the "Open run as" button.
    4. Selects a Tester who has completed the Test Run from the "Open run as" dropdown.
    5. Lands on the Test Run Page.
    6. Navigates through the Test Navigator to identify tests with conflicts.
    7. Clicks a Test with conflicts in the Test Navigator.
    8. Clicks the "Review Conflicts" button in the Alert displayed above the Test.
    9. Reviews conflicts details displayed in the modal.
    10. Facilitates conversation with testers to identify which result should be modified.
    11. Clicks the "Edit Results" button underneath the Test.
    12. Modifies the test results.
    13. Clicks the "Submit Results" or the "Next Test" button.

Scenario 3: Review Test Plan Conflicts where the root cause is an error in the test

  • Use Case Overview: When at least two testers have fully executed a Test Plan in at least one browser and results from the different testers conflict with one another due to an error in the test.
  • Trigger: The Admin has been notified (or noticed themself) that a Test Plan has been fully executed by at least two testers and there are conflicts.
  • Precondition: A Test Plan run is fully executed by at least two testers and the "Conflicts" label is displayed for the Test Plan in the Test Queue.
  • User Journey:
    1. Navigates to the Test Queue.
    2. Identifies the Test Plan with conflicts.
    3. Clicks the "Open run as" button.
    4. Selects a Tester who has completed the Test Run from the "Open run as" dropdown.
    5. Lands on the Test Run Page.
    6. Navigates through the Test Navigator to identify tests with conflicts.
    7. Clicks a Test with conflicts in the Test Navigator.
    8. Clicks the "Review Conflicts" button in the Alert displayed above the Test.
    9. Reviews conflicts details displayed in the modal.
    10. Clicks the "Raise an Issue for Conflict" button to create a GitHub Issue.
    11. Facilitates Community Group conversation about the failure.

Phase 3: Candidate Test Plan Review

Scenario 4: Promote a Draft Test Plan to Candidate Test Plan

  • Use Case Overview: When at least two testers have fully executed a Test Plan in at least one browser with each in-scope assistive technology and all review issues are closed.
  • Trigger: The Admin has been notified (or noticed themself) that a Test Plan no longer has conflicts.
  • Precondition: All review issues are closed and at least two testers have generated equivalent test results in at least one browser with each in-scope assistive technology.
  • User Journey:
    1. Navigates to the Test Queue.
    2. Identifies the Test Plan that went through the review.
    3. Clicks the "Mark as Candidate" button.
  • User Journey gaps and/or suggested changes: We need to change "Mark as in Review" to "Mark as Candidate".

Phase 4: Recommended Test Plan Reporting

Scenario 5: Promote a Candidate Test Plan to Recommended Test Plan

  • Use Case Overview: When a test plan has been in the Candidate phase for at least 120 days and there are no open test plan issues.
  • Trigger: The Admin has been notified (or noticed themself) that a Test Plan has been in the Candidate phase for at least 120 days.
  • Precondition: The test plan has been in the Candidate phase for at least 120 days and all issues have been resolved.
  • User Journey:
    1. Navigates to the Test Queue.
    2. Identifies the Candidate Test Plan that is ready to be moved to the Recommended phase.
    3. Clicks the "Mark as Recommended" button.
  • User Journey gaps and/or suggested changes: We need to change "Mark as Finalized" to "Mark as Recommended".

Scenario 6: Publish a Recommended Test Plan Report

  • Use Case Overview: When a test plan has been promoted to Recommended it is ready to be published.
  • Trigger: The Admin has promoted a Candidate Test Plan to Recommended.
  • Precondition: A Candidate Test Plan has been promoted to Recommended.
  • User Journey:
    1. Navigates to the Test Queue.
    2. Identifies Recommended Test Plan.
    3. Clicks the "Publish Report" button.
  • User Journey gaps and/or suggested changes: Implement a "Publish Report" action to make a Test Plan Report available on the Reports page.

Scenario 7: Add a new AT Version to the app

  • Use Case Overview: When a new version of an in-scope AT is released, the Test Admin needs to make it available for testers.
  • Trigger: A new version of an AT has been released.
  • Precondition: None.
  • User Journey:
    1. Navigates to the Test Queue.
    2. Opens "Manage Assistive Technology Versions" disclosure.
    3. Selects an AT from the dropdown.
    4. Clicks the "Add a New Version" link.
    5. Pastes the version number into the input field.
    6. Clicks the "Add Version" button.

User Persona: AT Developer

Phase 3: Candidate Test Plan Review

Scenario 1: Review the status of Candidate Test Plans

  • Use Case Overview: When an AT Developer wants to know which Candidate Test Plans are waiting for their review.
  • Precondition: The Test Plan needs to be promoted to Candidate.
  • User Journey:
    1. Navigates to the Test Reports page.
  • User Journey gaps and/or suggested changes:
    1. Incorporate “phase” labels for Test Plans on the Reports page so users can differentiate Candidate from Recommended Test Plans.
    2. Incorporate “status” labels for Test Plans on the Reports page so AT Developers know if their help is needed and to identify progress. The proposed labels are:
      1. Ready for review: When a new Test Plan is moved to the Candidate phase or when all issues have been resolved.
      2. X open issues: When an AT Developer requests changes or leaves feedback, both of these actions culminate in the creation of a new GitHub Issue. The number of open issues should be automatically updated in the app.
      3. Approved: When an AT Developer reviews a Candidate Test Plan and considers it ready to be moved to the Recommended phase by the Admin.
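The “X open issues” status above implies syncing issue counts from GitHub. One way to derive such a count from GitHub’s REST API issues payload is sketched below in Python; the label names and sample data are illustrative assumptions, not the app’s actual configuration.

```python
# Sketch: counting open issues that carry a given label, for the
# "X open issues" status indicator. Issue objects are shaped like items
# from GitHub's GET /repos/{owner}/{repo}/issues response.

def count_open_issues(issues, label):
    """Count issues that are open and carry the given label."""
    return sum(
        1
        for issue in issues
        if issue.get("state") == "open"
        and any(l.get("name") == label for l in issue.get("labels", []))
    )

# Abbreviated example payload (hypothetical data):
sample = [
    {"state": "open", "labels": [{"name": "Request Changes"}]},
    {"state": "closed", "labels": [{"name": "Request Changes"}]},
    {"state": "open", "labels": [{"name": "Feedback"}]},
]

print(count_open_issues(sample, "Request Changes"))  # → 1
```

In practice the app would periodically fetch the issues payload (or listen for webhook events) so the indicator updates when issues are opened or closed.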

Scenario 2: Mark the status of a Candidate Test Plan

  • Use Case Overview: When an AT Developer wants to mark their review of a Candidate Test Plan.
  • Precondition: The AT Developer has reviewed a Candidate Test Plan.
  • User Journey:
    1. Navigates to the Test Reports page.
    2. Identifies a Candidate Test Plan from the table.
    3. Clicks the Candidate Test Plan name.
    4. Scrolls through the Report to identify the AT/Browser combination to review.
    5. Marks their review with one of the three available statuses: Approved, Request Changes, or Leave Feedback.
  • User Journey gaps and/or suggested changes:
    1. Incorporate three actions for Candidate Test Plans with the following functionalities:
      1. Approved: This action will mark a Candidate Test plan as ready to be moved to the Recommended phase by the Admin.
      2. Request Changes:
        1. This option should be available in two places, at a high level so the AT Developer can request changes for a Test Plan in general and within the report’s detail page so they can do the same for a single test.
        2. When the AT Developer requests changes, they will be taken to a pre-populated GitHub Issue where they can elaborate. A “Request Changes” label will be automatically added to the GitHub Issue.
        3. Once the issue is saved, the “open issues” indicator in the app should reflect this. When an issue gets closed, the app should reflect this update as well.
      3. Leave Feedback:
        1. Like the “Request Changes” action, this option should also be available in two places, at a high level so the AT Developer can leave feedback for a Test Plan in general and within the report’s detail page so they can do the same for a single test.
        2. When this option is clicked, the AT Developer will be taken to a pre-populated GitHub Issue where they can elaborate on the feedback they want to provide. A “Feedback” label should be automatically added to the GitHub Issue. Once the issue is saved, the “open issues” indicator in the app should reflect this. When an issue gets closed, the app should reflect this update as well.
    2. There is currently a “Raise an issue” button when looking at a single test in a Test Plan Report. This should be removed to avoid confusion.
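The pre-populated GitHub Issue flow described above can be driven by GitHub’s standard query parameters on the “new issue” page (`title`, `body`, `labels`). A minimal sketch, assuming a hypothetical repository path and label name:

```python
from urllib.parse import urlencode

def prefilled_issue_url(repo, title, body, labels):
    """Build a GitHub 'new issue' URL with prefilled fields.

    GitHub reads `title`, `body`, and `labels` query parameters on
    its /issues/new page; `labels` is a comma-separated list.
    """
    query = urlencode({"title": title, "body": body, "labels": ",".join(labels)})
    return f"https://github.com/{repo}/issues/new?{query}"

# Hypothetical repository and label name, for illustration only:
url = prefilled_issue_url(
    "w3c/aria-at",
    "Changes requested: Checkbox Example (JAWS/Chrome)",
    "Describe the requested changes here.",
    ["Request Changes"],
)
print(url)
```

Opening this URL lands the AT Developer on a draft issue with the title, body, and label already filled in, so they only need to elaborate and save.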

Scenario 3: Review the report form (test run page) of a test plan.

  • Use Case Overview: When an AT Developer is reviewing a Test Plan report and wants to view the report’s form.
  • Precondition: The AT Developer is reviewing a Test Plan report.
  • User Journey:
    1. Navigates to the Test Reports page.
    2. Identifies a Candidate Test Plan from the table.
    3. Clicks the Candidate Test Plan name.
    4. Scrolls through the Report to identify the AT/Browser combination to review.
    5. Clicks the “Open Test Plan Run” button.
  • User Journey gaps and/or suggested changes:
    1. Add an option to the Reports page so an AT Developer can review a Test Run.
    2. This option could be available in two places.
      1. Under the AT and Browser details heading, which would take the AT Developer to the first test in the Test Run page.
      2. Under the Test Name heading in the details page, which will take the AT Developer to that particular test in the Test Run page.
    3. They should not be able to make any edits.

Scenario 4: Raise a bug with a Screen Reader

  • Use Case Overview: When an AT Developer encounters a bug in a screen reader while reviewing a Test Plan report
  • Precondition: The AT Developer is reviewing a Test Plan report
  • User Journey:
    1. Navigates to the Test Reports page.
    2. Identifies a Candidate Test Plan from the table.
    3. Clicks the Candidate Test Plan name.
    4. Scrolls through the Report to identify the AT/Browser combination to review.
    5. Goes through the tests in the Reports Table to review the passing and failing assertions.
    6. Clicks on one of the Test Names.
    7. Clicks the “Open Test” button.
    8. Identifies a screen reader bug.
    9. Clicks the “File Screen Reader Bug” button.
  • User Journey gaps and/or suggested changes:
    1. We need to incorporate the “File Screen Reader Bug” button, which should probably go where the current “Raise an issue” button is.
    2. This button should take the AT Developer to the Screen Reader’s repository.