“What is the productivity of my grading team? Who are our strictest and most lenient graders? How aligned are my graders?” These are just a few of the questions you can answer by using the Grader Performance Dashboard.
This dashboard gives you an in-depth overview of your grading team’s performance, measured by metrics like AGT (average grading time), alignment score, and assignment completion rate. You can find this dashboard under the “Dashboards” section of the “Reports” page.
Let’s go through each section of the dashboard to better understand the data and how it can be used.
ACCESS
The dashboard is accessible to users with the admin or limited admin role.
Please note that any changes made to a user's role or access can take 1-3 hours to sync through our reporting system before that user sees the updated information on this dashboard.
FILTERS
You can filter the dashboard in many different ways. One filter you’ll see here that doesn’t appear in other Maestro reports is the “Time Interval” filter. For charts that report a metric over time, this defines the time intervals the data is broken into - week over week, month over month, etc. Please note that a “week” in reporting runs Monday - Sunday.
Please also note that the charts on this dashboard use either the “reported as” date or the "assignment creation" date (not last graded, last updated, etc.). This is not configurable at the moment.
You can determine which time axis is being used for each chart by clicking the filter icon in the upper right corner of the chart.
"Reported as" is displayed as "reportingdate"
"Assignment creation" is displayed as "Assignment Created Date"
Be sure to hit “Apply Filters” after you update any filter.
Click “Save Filters” to save any combination of filters. When you come back to the dashboard in the future, you won’t need to refilter manually - you can simply open and apply a saved filter from the folder icon. NOTE: Saved filters are user-specific. Other users will not be able to see the filters you save.
I. OVERALL
This section provides a general, high-level overview of your grading team. Based on the selected filters, you can find out your team’s grading volume (number of tickets graded) and average grading time, as well as the ratio of agents to graders.
Please note that the grading volume/tickets graded only represents Agent QA (AQA) tickets, not Grader QA or Calibrations. This number also does not include any "N/A" scores.
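For illustration, here is a minimal sketch of the logic described above, assuming a hypothetical list of AQA grades (the field names, the way the agents-to-graders ratio is computed, and the numbers are all made up for this example, not Maestro internals):

```python
# Hypothetical AQA grades; a "score" of None stands in for an "N/A" score.
aqa_grades = [
    {"grader": "dana", "agent": "lee",  "score": 92,   "minutes": 6.0},
    {"grader": "dana", "agent": "omar", "score": None, "minutes": 4.5},  # N/A - excluded from volume
    {"grader": "raj",  "agent": "lee",  "score": 78,   "minutes": 8.0},
]

# Grading volume: AQA tickets graded with a numeric score (N/A excluded).
grading_volume = sum(1 for g in aqa_grades if g["score"] is not None)

# Agents-to-graders ratio (assumed here to be distinct agents / distinct graders).
agents = {g["agent"] for g in aqa_grades}
graders = {g["grader"] for g in aqa_grades}
ratio = len(agents) / len(graders)

print(grading_volume)                    # 2
print(f"{ratio:.1f} agents per grader")  # 1.0 agents per grader
```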
II. GRADER METRICS
This next table gives you all the major metrics about your graders and groups of graders.
Avg Grading Time (AGT) - this is the average amount of time it takes the grader to grade one Agent QA ticket in Maestro. It does not factor in any other grades a grader might give, such as Calibrations or Grader QA. It does include any kind of AQA ticket, whether it had a numeric score or an "N/A" attached to it.
(AGT for a group is calculated by summing the grading time for every ticket graded by that group and dividing by the total number of tickets graded - see the sketch at the end of this section)
GQA Alignment - This is the grader’s average Grader QA (GQA) alignment score. The average does not differentiate between the grader’s alignment as a benchmark grader and as an original grader.
(GQA Alignment of a group is calculated by summing the GQA of each user in that group and dividing by the number of users)
Avg QA Score Given - This is the average Agent QA score given by the grader. This does not factor in any other grades a grader might give such as Calibrations or Grader QA.
Total Grades - This is the number of tickets graded and does include AQA, Grader QA, and Calibrations. It also counts any grades with a score of "N/A".
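The two group-level roll-ups above work differently: group AGT is a ticket-weighted average (total grading time over total tickets), while group GQA Alignment is a simple average of one value per user. Here is a minimal sketch with made-up numbers to show the difference:

```python
# Group AGT: total grading time across every ticket graded by the group,
# divided by the total number of tickets graded (ticket-weighted).
tickets = [
    {"grader": "dana", "minutes": 6.0},
    {"grader": "dana", "minutes": 4.0},
    {"grader": "raj",  "minutes": 10.0},
]
group_agt = sum(t["minutes"] for t in tickets) / len(tickets)  # 20.0 / 3

# Group GQA Alignment: one alignment score per user, averaged across users,
# regardless of how many tickets each user graded (user-weighted).
alignment_by_user = {"dana": 0.90, "raj": 0.80}
group_alignment = sum(alignment_by_user.values()) / len(alignment_by_user)

print(round(group_agt, 2), round(group_alignment, 2))  # 6.67 0.85
```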
III. GRADING
This section lists out ticket assignments assigned for grading and then reports how many of those assignments were actually completed. Each ticket assigned counts as 1 assignment. An assignment is anything assigned via an automation, meaning randomly graded tickets and manually assigned tickets are not included in this table. Note that this only includes AQA tickets and does not count GQA or Calibrations.
The "assigned" column includes all AQA tickets assigned for grading. The "graded" column includes the count of those tickets which were graded and had a numeric score (as in, it won't count any "N/A" scores).
The date ranges you see represent the date the ticket assignment was created. This is important to note because the number you see here may not match the grading volume you see at the top of the dashboard. Essentially, this table is telling you "of the tickets assigned via automation during this time period, this is how many were graded at some point up to now," whereas the "grading volume" count at the top of the dashboard is saying "this is how many tickets were graded during the time period set in the filters, regardless of when they were assigned out for grading."
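To make that distinction concrete, here is a minimal sketch of the two counts using hypothetical assignments and dates (this mirrors the logic described above, not Maestro's internal implementation):

```python
from datetime import date

# Hypothetical assignments: when each was created, when (if ever) it was graded,
# and the score it received (None stands in for an "N/A" or ungraded ticket).
assignments = [
    {"assigned": date(2024, 3, 4),  "graded": date(2024, 3, 12), "score": 95},
    {"assigned": date(2024, 3, 5),  "graded": None,              "score": None},
    {"assigned": date(2024, 2, 26), "graded": date(2024, 3, 6),  "score": 88},
]

start, end = date(2024, 3, 4), date(2024, 3, 10)  # the week shown in the table

# Grading table: tickets ASSIGNED in the window, and how many of those
# ended up with a numeric score at any point up to now.
assigned = [a for a in assignments if start <= a["assigned"] <= end]
graded = [a for a in assigned if a["score"] is not None]
print(len(assigned), len(graded))  # 2 assigned, 1 graded

# Grading volume tile: tickets GRADED in the window, regardless of when assigned.
volume = [a for a in assignments if a["graded"] and start <= a["graded"] <= end]
print(len(volume))  # 1 (the ticket assigned back in February)
```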
In these next tables, you can confirm which graders/groups of graders tend to give the highest or lowest grades. Sometimes, this is just due to the types of tickets or agents these graders are grading. Other times, it’s an indicator of certain graders needing to adjust their grading standards.
See how often graders or groups are replacing tickets. To dig further into the reasons why, you can check out the Ticket Replacement Reasons Report!
This report compares how many tickets have been graded this week versus the prior week, split out by grader. “Current Week” means the Monday - Sunday of the week you are in; “Previous Week” means the Monday - Sunday before that. Therefore, if you are looking at this report on a Tuesday, “Current Week” would only show you data for Monday and Tuesday, whereas “Previous Week” would show you data for the previous full calendar week (Monday to Sunday). (See the sketch below for how these week boundaries fall.)
It will count only AQA scores and doesn't count any scores without a point value (as in, it will not count any tickets with an "N/A" score).
Note that this chart will completely ignore the date and time interval filters as it will always report on the current and previous week (reported as date). However, you can filter to certain rubrics or groups and this report will take those filters into account.
Tip: you can toggle the dots next to “current week” and “previous week” to dictate whether you want to see both or only one of the time frames on the chart.
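Here is a minimal sketch of how those Monday - Sunday week boundaries fall, using an arbitrary example date:

```python
from datetime import date, timedelta

today = date(2024, 3, 12)  # a Tuesday, for example

current_week_start = today - timedelta(days=today.weekday())  # this week's Monday
previous_week_start = current_week_start - timedelta(days=7)
previous_week_end = current_week_start - timedelta(days=1)    # last week's Sunday

# "Current Week" only has data through today (Monday and Tuesday so far);
# "Previous Week" covers the full prior Monday - Sunday.
print(current_week_start, "to", today)               # 2024-03-11 to 2024-03-12
print(previous_week_start, "to", previous_week_end)  # 2024-03-04 to 2024-03-10
```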
IV. APPEALS
This section gives you a brief summary of your appeals workflow to help you understand which graders, rubrics, and criteria are appealed the most, and how many of those appeals were valid and ended up approved. If you perform Grader QA, you will also be able to see the average general Grader QA Alignment score for each grader.
To dig further into appeals, check out the Appeals Dashboard! (you can find it under the same "Dashboards" section in Reporting)
V. GRADER QA
In this last section, you can confirm how your average Grader QA alignment is changing over time. Ideally, you should see alignment increasing (improving) over time. You can also compare it to other metrics like AGT and tickets graded to better understand the factors that may affect alignment, such as whether graders are starting to grade tickets faster and sacrificing accuracy for efficiency. Please note that the AGT and tickets graded charts in this section are not specific to Grader QA tickets; they represent the average grading time and count of tickets graded for Agent QA tickets.
That’s a wrap on the Grader Performance dashboard. You can find this dashboard under the “Dashboards” section of the “Reports” page. Give it a try and let us know what you think!