Calibration Alignment Reporting

Generate and Analyze Alignment Scores for Calibration Sessions

Written by Matt
Updated over 2 years ago

The new “Calibration Alignment” report is a heat map that displays how aligned graders are on the various calibrations they’ve completed.

[Image: calibration alignment report]

Why is Calibration Alignment Reporting Important?

Admins spearheading some of the most successful QA programs schedule regular calibration sessions with their graders to ensure they're all aligned in their grading. This increases the likelihood that agents are graded consistently and accurately, no matter who happens to grade them.

In addition to consistent feedback, calibration reporting can help leaders uncover where potential grading discrepancies occur most frequently, which leads to healthy discussions around how to improve overall grading alignment:

  • Which graders are least aligned?

  • Which rubric(s) or rubric criteria have the least alignment?

  • What is the best way to close the gap?

    • Work more closely with a particular grader on their interpretation of the rubric or rubric criteria and how to grade certain customer situations?

    • Modify the rubric or rubric criteria to provide further clarification?

Either way, this report gives leaders more ways to assess their overall QA process and determine actionable steps to optimize it. Understanding the WHY behind any misalignment could be the first step in improving QA's effectiveness for the team, as well as adding confidence in your QA metric and what it represents.


Where is Calibration Reporting Located?

On the Reporting page, you'll now see a folder labeled Calibrations on the left side of the screen. There are two reports in this folder at this time:

  • Calibration Alignment - the new Calibration solution designed as a heat map

  • Calibration - the old report that was originally located under the Grading Performance report folder

    • Note: The old Calibration report will be deprecated within the next 3 months

[Image: how to find calibration reporting]


Who has Access to Calibration Reporting?

Users who have View Calibrations set to "Own" or "Own + Final" on the Role Permissions page in User Roles can see the new calibration report. To summarize:

If View Calibrations is set to:

  • "Own + Final" - The user has access to the Calibration Alignment reporting at all levels (their own Alignment Scores and the User Groups/Graders/Rubrics they have access to").

    • We recommend Admins and Limited Admins be set to this option.

  • "Own" - The user can only see the Alignment Scores comparing their personal Individual Calibrations with the Final Calibrations for those tickets. Because of this, the User Groups option would be unavailable since the user would only have access to himself/herself/themselves.

    • We recommend Graders and Managers be set to this option if they take part in Calibration exercises.

  • "No" - The Calibration Alignment report is unavailable to the user and cannot be seen as a result.

[Image: role permissions for calibrations]


Navigating the Calibration Heat Map

The Calibration report comes with 5 filters:

  • Ticket Reviews: Defaults to the most recently completed Calibration Review. Multiple Calibration Review exercises can be selected if desired. The report will combine all of the Calibration exercises into one unified Alignment Score (instead of showing a separate Alignment Score per Calibration Ticket Review).

    • IMPORTANT - A Calibration Ticket Review won't appear in the report dropdown until the Final Calibrations for the tickets in the review are complete. Without them, there would be no baseline to compare graders' Individual Calibrations against.

  • User Groups: Configure your User Groups by going to Settings > User Roles > User Groups

  • Graders: Specific graders can be selected

  • Rubrics: Specific rubrics can be selected

  • Criteria: Specific criteria can be selected

User Groups/Graders represent the rows in your heat map, while the Rubrics/Criteria serve as your columns.

The default view shows the most recently completed Calibration Ticket Review at the rubric level for all User Groups. Admins can select specific User Groups to refine the report.

[Image: rubric alignment scores]

Each percentage represents an Alignment Score. For example: the combined Individual Calibrations for all graders in the All QA Specialists user group are 76% aligned with the Final Calibrations for those calibrated tickets.

As you can see above, hovering over one tile with an Alignment Score shows a prompt with options to get more granular:

  • View alignment discrepancies at the ticket level

  • View Alignment Scores at the rubric criteria level

  • View Alignment Scores at the grader level

[Image: criteria alignment scores]

Above is an example of the criteria view of the heat map. As we can see, the user group has an Alignment Score of 57% for the Merchant Growth rubric criterion, meaning their Individual Calibrations were only 57% aligned with the agreed-upon Final Calibration scores. Hovering over the tile again allows us to dive deeper and look at the grader level for this team.

[Image: alignment score heat map]

We can now see the Alignment Scores for the two graders on this team at the rubric criteria level. Although both graders' Alignment Scores are fairly low for this criterion, one grader has a much lower Alignment Score for Merchant Growth. This could lead to further investigation into why that is the case. Hovering over the 44% tile prompts the option to view the actual tickets that were calibrated during the exercise to get a better understanding of where the discrepancy is occurring. Below is one of those tickets.

[Image: final calibration]

Here is an example of a Final Calibration at the ticket level, comparing the Final Calibration score and response to one grader's Individual Calibration. If we look for our grader in question, we can see where they chose a different answer than the majority of their grading peers. Depending on how many points this criterion is worth relative to the other criteria in the rubric, the Alignment Score factors that in and "dings" the grader more for misaligning on a criterion that holds more weight (i.e., impacts the overall score more).

[Image: final calibration at ticket level]


How is the new Calibration Report Different from the Old Report?

Reason 1: Now Incorporating the Alignment Score as a Metric

The Alignment Score is not just a metric for GraderQA - it is a more accurate way to gauge misalignment for Calibrations as well. For Calibrations specifically, it is a percentage that shows how aligned a grader's Individual Calibration for a ticket is with the Final Calibration created by the grading team, which is typically initiated and led by an Admin.

(Aside - This workflow is separate from Benchmark Calibrations, which are typically created by a Benchmark Calibrator. Most teams use the workflow referenced above.)

An Alignment Score is NOT the percentage difference between the Individual Calibration Score for the ticket and the Final Calibration Score.

The reason for this is that the Individual Calibration and Final Calibration may have the same score, but obtained in different ways.

For example, an Individual Calibration and a Final Calibration could both have the same QA score of 80%. However, the criteria selected to add up to that 80% could differ across the two scores. This would mean that the scores are not truly aligned.

For this reason, we calculate Alignment at the criteria level and then weight it up to a Rubric Alignment Score. This is the core difference between the original Calibration report and the new Calibration report, which takes this logic into account for all calculations.
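To illustrate, here is a minimal Python sketch (the rubric, criterion names, and point values are all hypothetical; this is not Maestro's actual implementation) showing how two calibrations can total the same QA score while disagreeing at the criteria level:

```python
# Hypothetical 100-point rubric: criterion name -> max points
rubric = {"Greeting": 20, "Accuracy": 40, "Resolution": 40}

# Two calibrations of the same ticket that both total 80 points,
# but reach that total through different criteria
individual = {"Greeting": 20, "Accuracy": 40, "Resolution": 20}
final      = {"Greeting": 0,  "Accuracy": 40, "Resolution": 40}

print(sum(individual.values()) == sum(final.values()))  # True -- both 80%

# Criteria-level alignment: 1 - (points "off" / max points for the criterion)
alignment = {c: 1 - abs(individual[c] - final[c]) / rubric[c] for c in rubric}
print(alignment)  # {'Greeting': 0.0, 'Accuracy': 1.0, 'Resolution': 0.5}
```

Even though both scores are 80%, the grader is fully misaligned on one criterion and half-aligned on another, which a simple score difference would hide.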

Reason 2: Readjusting Individual Calibration/Final Calibration Comparisons at the Ticket Level

In the old report, Individual Calibrations were averaged separately from Final Calibrations and then compared. This means that if a grader only took part in 4 ticket calibrations, but 10 total calibrations were done on the same rubric, that grader's average Individual Calibrations would be compared to the average of all 10 Final Calibrations created for that rubric. In other words, the grader could be impacted by calibrations they were never a part of.

In the new report, we first determine a grader's alignment at the ticket level, so that each Individual Calibration is compared to the right Final Calibration to generate an Alignment Score for each Ticket. Then we take the average of the Ticket Alignment Scores to get an overall Alignment for a Calibration Review.

Here's an example. Suppose a Calibration exercise has 3 tickets.

  • Ticket #1 - Comparing Grader's Individual Calibration to Final Calibration for Ticket #1 = Grader's Alignment Score for Ticket #1

  • Ticket #2 - Comparing Grader's Individual Calibration to Final Calibration for Ticket #2 = Grader's Alignment Score for Ticket #2

  • Ticket #3 - Comparing Grader's Individual Calibration to Final Calibration for Ticket #3 = Grader's Alignment Score for Ticket #3

  • Average these 3 Alignment Scores together = Grader's Alignment Score for the Calibration Ticket Review

  • If 2 other graders (3 total) also participate in the Calibration exercise, their Alignment Scores are calculated the same way. Take the average of all 3 graders' Alignment Scores to get the Alignment Score for that User Group/group of graders.
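Here's a rough Python sketch of that averaging order, using made-up per-ticket Alignment Scores (the grader names and values are purely illustrative):

```python
# Hypothetical per-ticket Alignment Scores for 3 graders across the same
# 3-ticket Calibration Ticket Review (all values are illustrative)
ticket_alignments = {
    "Grader A": [0.90, 0.80, 0.70],
    "Grader B": [0.85, 0.75, 0.95],
    "Grader C": [0.60, 0.70, 0.80],
}

def mean(values):
    values = list(values)
    return sum(values) / len(values)

# Step 1: a grader's Alignment Score for the review is the average of their
# per-ticket scores (each Individual Calibration compared to the Final
# Calibration for that same ticket)
grader_scores = {g: mean(s) for g, s in ticket_alignments.items()}

# Step 2: the group's Alignment Score is the average of the graders' scores
group_score = mean(grader_scores.values())

for grader, score in grader_scores.items():
    print(f"{grader}: {score:.0%}")  # 80%, 85%, 70%
print(f"Group: {group_score:.1%}")   # 78.3%
```

Because each comparison happens ticket by ticket first, a grader who calibrated only some of the tickets is never measured against Final Calibrations they didn't participate in.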

Reason 3: Option to Only Show Data for Fully Completed Calibration Ticket Reviews

Admins can now view reporting in 2 ways:

  • View all Calibration Alignment Scores for all graders, whether or not they fully completed every ticket within a Calibration Ticket Review

  • ONLY show Calibration Alignment Scores for graders who've calibrated every ticket within the Calibration Ticket Review


Deeper Explanation - Calculating the Criteria Alignment Score

As an example, let's assume this scenario:

  • There is a criterion on your rubric worth 15 points: “Did the agent provide the appropriate resolution?”

  • The Individual Calibrator scores a 10 out of 15 on this question.

  • The Final Calibration score is a 7 out of 15 on this question.

Under these conditions:

  • The Individual Calibrator was “off” from the Final Calibration by 3 points, or 20%.

    This is calculated by dividing the amount “off” (3 points) by the total points available for this criterion (15 points).

  • Therefore, the Individual Calibrator is 80% aligned for this criterion.

  • The Criteria Alignment Score for this criterion is 80%.

This process is repeated for each criterion within the rubric.
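As a quick sketch, the per-criterion arithmetic above can be expressed in a few lines of Python (the function name is ours for illustration, not Maestro's):

```python
def criteria_alignment(individual_pts, final_pts, max_pts):
    """Alignment for one criterion: 1 minus the fraction of this
    criterion's available points that the Individual Calibration
    was "off" from the Final Calibration."""
    return 1 - abs(individual_pts - final_pts) / max_pts

# The example above: Individual Calibration of 10/15 vs. Final of 7/15
print(f"{criteria_alignment(10, 7, 15):.0%}")  # 80% (off by 3/15 = 20%)
```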


Deeper Explanation - Calculating the Rubric Alignment Score

Suppose the rubric has a maximum total of 80 points. Here is how you would calculate the Rubric Alignment Score:

  • The criterion above would account for ~18.75% (15/80) of the total score.

  • Multiply the Criteria Alignment Score by this weight to determine how much the criterion contributes toward the Rubric Alignment Score.

The math for step two:

  • 80% Alignment x 18.75% Weight = 15% Weighted Criteria Alignment Score.

This 15% is then added to all the other Weighted Criteria Alignment Scores from the rubric to get the Rubric Alignment Score.
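Here is a small Python sketch of that weighting step, continuing the example; the two additional criteria and their alignment scores are hypothetical, added only so the weights sum to the full 80 points:

```python
# (max points, Criteria Alignment Score) for each criterion in the rubric.
# The first entry is the 15-point criterion from the example (80% aligned);
# the other two are hypothetical, filling out the rubric's 80 total points.
criteria = [
    (15, 0.80),
    (40, 0.90),  # hypothetical
    (25, 0.60),  # hypothetical
]

total_points = sum(pts for pts, _ in criteria)  # 80

# Each criterion contributes (its alignment) x (its share of total points);
# the 15-point criterion contributes 0.80 x (15/80) = 15%
rubric_alignment = sum(align * (pts / total_points) for pts, align in criteria)
print(f"{rubric_alignment:.2%}")  # 78.75%
```

Weighting by each criterion's share of the total points is what makes a misalignment on a high-value criterion "ding" the grader more than the same point gap on a low-value one.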

Please see this spreadsheet for a more detailed example of how these metrics are calculated. Make a copy of the Google Sheet to test the calculations for yourself!


Can the Calibration Alignment Report be Exported?

Yes! As of 9/10/2021, users can now export this report as a CSV file.

[Image: calibration alignment report export]

Select Export Chart in the top-right corner. In the following prompt, title your export and enter the email addresses of the recipients. The notification can be configured to send via email or Slack (if integrated with Maestro). Press Export Chart.

[Image: export chart prompt]

Go to the Exports tab (top of your screen) and the option to download the chart will appear.

[Image: download export]

The downloaded CSV (one file) will look exactly like the chart you created in the app. Your settings in the Calibration Alignment report dictate which columns (Rubrics/Criteria) and rows (User Groups/Individual Graders) appear in your export. In the example below, the exported chart shows Alignment Scores for one group of graders for one rubric at the criteria level.
