Grader QA Feedback Automation

Streamline your Grader Feedback workflows with Automations to drive Grader accountability at scale

Written by Heli Munshi
Updated over a week ago


Grader QA was first introduced to Maestro users to provide a way for QA Managers to assess the integrity of their grading processes without needing to organize a Team Calibration exercise. Admins can schedule Automations to send samples of graded tickets to Benchmark Graders and have them reassess those graded tickets to validate a Grader's accuracy.

Now, a complementary automated solution is available to send feedback received from these Benchmark Graders to the Original Graders at scale. With the new Send Grader QA Feedback Automation, Admins can set a cadence for when Original Graders receive their feedback directly.

Why is this useful for Maestro Users?

Similar to its use with Agents, a call-to-action is a great way to drive engagement with feedback. Graders can log in to Maestro today and manually search for their feedback on their own. But some issues may arise:

  • Some Graders may be less diligent than others about checking for their feedback consistently

  • Graders are more likely to miss feedback if they don't know when it is ready for them

By implementing a more structured feedback workflow, Admins can maximize the impact of Grader QA by ensuring the Original Graders receive their feedback promptly and regularly so they can review it, reflect on it, and improve from it.

Admins can create an Automation to notify all of their Graders to check their new feedback on a regular basis so they can ask questions and course-correct more quickly. By closing the feedback loop on the Grader QA workflow, users can take the next step toward better Grader Alignment and more consistent grading.

Who Has Access to Grader QA Feedback Automation Configuration?

To create Grader QA Feedback Automations, a user must meet the following requirements:

  • The user must have access to create Automations

  • An Admin must have enabled Grader QA in the Maestro account by going to Settings > Other Settings > Additional Feature Settings

Additional Feature Settings

Navigating the Send Grader QA Feedback Automation

Go to Settings > Sharing Automations > Create New Automation and select Send Grader QA Feedback Assignments. The following prompt will appear.

Send Grader QA Feedback Automation

Create a title for your Automation to reflect the purpose and cadence of the feedback to be distributed. If you'd like feedback to be sent out weekly, "Weekly Grader QA Feedback" is a good start.

Choose the number of days/weeks/months for your lookback window – how far back the Automation will search for benchmarked tickets. If you set the Select Feedback parameter to a week, then the Automation will identify any graded tickets that were benchmarked within the past 7 days every time it runs. A custom date range is also available if desired.
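Maestro's internals are not public, but the lookback behavior described above can be modeled as a simple date filter. The sketch below is purely illustrative; the ticket data and field names are hypothetical.

```python
from datetime import datetime, timedelta

def select_benchmarked_tickets(tickets, run_time, lookback_days=7):
    """Return tickets benchmarked within the lookback window ending at run_time.

    Illustrative only: models a one-week "Select Feedback" setting as a
    date filter over hypothetical ticket records.
    """
    window_start = run_time - timedelta(days=lookback_days)
    return [t for t in tickets if window_start <= t["benchmarked_at"] <= run_time]

# Hypothetical example: the Automation runs on June 15 with a 7-day lookback.
run_time = datetime(2024, 6, 15)
tickets = [
    {"id": 1, "benchmarked_at": datetime(2024, 6, 14)},  # inside the window
    {"id": 2, "benchmarked_at": datetime(2024, 6, 1)},   # outside the window
]
selected = select_benchmarked_tickets(tickets, run_time)
# Only ticket 1 was benchmarked within the past 7 days, so only it is selected.
```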

set automation schedule

Veteran Automation users will notice that the configuration process is similar to other Automation types. After selecting the feedback window, set the schedule for the Automation. To ensure you collect all submitted feedback without resending anything already sent, we recommend that the run frequency match your “lookback window” under the “Select Feedback” section. For example, if you selected feedback from the last two weeks, the Automation should be set to run every two weeks.
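To see why matching the run frequency to the lookback window works, consider the windows each run covers: when the two values are equal, consecutive runs produce contiguous, non-overlapping date ranges, so no feedback is missed or sent twice. This sketch uses hypothetical dates and is not Maestro code.

```python
from datetime import date, timedelta

def run_windows(first_run, period_days, num_runs):
    """Yield the (window_start, window_end) lookback range for each scheduled run.

    Illustrative only: assumes the lookback window equals the run period.
    """
    for i in range(num_runs):
        run_day = first_run + timedelta(days=period_days * i)
        yield (run_day - timedelta(days=period_days), run_day)

# Hypothetical schedule: run every 14 days with a 14-day lookback.
windows = list(run_windows(date(2024, 6, 1), period_days=14, num_runs=3))
# Each window ends exactly where the next begins: no gaps, no overlap.
```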

automation notifications

Finally, select how you'd like Graders to receive their notifications to review their feedback, then save the Automation.

Note: Feedback notifications and the feedback Ticket Reviews are shared with the Original Grader only, not the Benchmark Grader.

Once saved, the Automation will be located on the main Sharing Automations page.

Like other Automation types, clicking the three vertical dots at the top right of the Automation reveals additional settings to control access and make modifications.

Admins – use the Update Access option to ensure only the right users have visibility into this Automation. If you do not want other users to adjust the Automation, make sure no other users are selected here.

Accessing Feedback from Grader QA Feedback Automation

Note: In order for a Grader to receive feedback, they must be included in the list of Graders who can view Grader QA benchmark scores. To confirm your configuration for this, go to Settings > User Roles > Role Permissions. Make sure the desired Graders are selected under this option:

Once the Automation is run, a separate Ticket Review (bundle of tickets) is created for each Grader who is being sent feedback.

Based on the Automation’s configuration, the Grader will receive an email notification with a link to this Ticket Review so they can view their feedback easily.

The alternative way for a Grader to review their feedback is to visit the “Ticket Reviews” page and filter by assignment type “Grader QA Feedback,” as shown here:

Clicking into a Ticket Review shows the list of benchmarked tickets for a particular Grader on the lefthand side and the actual tickets on the righthand side. The Grader QA Feedback Ticket Review's title includes the name of the Automation that generated it, the name of the Original Grader, and the lookback date range in which the Automation searched for benchmarked tickets.

For any given ticket, Graders can press See Scores for the option to view their original QA Score for the Agent or the Grader QA feedback that was left for them.

Common Questions

What feedback gets sent?

Any ticket with scores from Benchmark Graders is included in the Automation. For example, if a Benchmark Grader was assigned 10 Grader QA tickets in their Assignment and only completed 2 tickets by the time the Feedback Automation was run, the 2 completed tickets will be included for feedback. The remaining 8 tickets will be sent later once they are completed and the Feedback Automation is run again.
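The partial-completion behavior above can be summarized as a filter on completion status: only tickets the Benchmark Grader has finished scoring go out now, and the rest wait for a later run. This is an illustrative sketch with hypothetical data shapes, not Maestro code.

```python
# Hypothetical assignment: 10 Grader QA tickets, of which only 2 are completed
# by the time the Feedback Automation runs.
assigned = [{"id": i, "completed": i <= 2} for i in range(1, 11)]

included_now = [t["id"] for t in assigned if t["completed"]]
pending = [t["id"] for t in assigned if not t["completed"]]
# The 2 completed tickets are included in this run's feedback; the remaining
# 8 are picked up by a later run once their benchmark scores are submitted.
```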

Could a user have multiple Grader QA Feedback Automations?

Users are not blocked from creating more than one Grader QA Feedback Automation, but we do not recommend it. The best setup is a single Grader QA Feedback Automation that picks up ALL the feedback from Benchmark Graders and runs on a recurring schedule. As a reminder, we recommend that the run frequency match the lookback period.

What happens if a Benchmark Grader updates their score?

If a Benchmark Grader updates their benchmark score, it is treated the same way as a new benchmark score and will be sent as part of the feedback the next time the Grader QA Feedback Automation runs.
