Contact Center QA Guide: Building a World-Class Quality Assurance Program
About DMG Consulting LLC
DMG Consulting LLC is the leading analyst firm in the contact center and analytics markets. We are a strategic advisor to companies large and small. Our mission is to help companies build effective contact center and service environments that provide an outstanding customer experience. We achieve this goal by helping our clients leverage technology, process and people to optimize operational efficiency, sales and profits. Our actionable strategies and tactics effect change that enables companies to make strategic and tactical improvements with a rapid return on investment. Our customers include Global 2000 companies, government agencies, non‐profit organizations, and more than 150 contact center and analytics vendors.

DMG Consulting LLC’s business strategists have an average of 20 years of experience in customer experience management (CEM), customer relationship management (CRM), contact centers, building customer‐focused businesses and financial analysis. We understand the power of customer data and the contributions that quality management, recording, speech analytics, performance management, surveying, eLearning, coaching and workforce management systems make to the contact center and the entire enterprise. Our consulting experience with more than 2000 end‐user organizations and our hands‐on operational, technology and financial expertise give us deep insight into what customers want and need from enterprises and vendors.

All rights reserved. This report is protected by United States copyright law and may not be copied, reproduced, displayed, published, transmitted or distributed in whole or in part in any form or medium without the express written permission of DMG Consulting LLC. You may not alter or remove any trademark, copyright or other notice from this report.
Substantial effort went into verifying and validating the accuracy of the information contained within this Report; however, DMG Consulting LLC makes no warranties as to the accuracy or completeness of this information. DMG Consulting LLC is not liable for any damages, consequential or otherwise, arising from use of this information. Copyright © 2009 DMG Consulting LLC
© 2009 DMG Consulting LLC - i - April 2009 All rights reserved.
Table of Contents
Executive Summary ..... 1
1. Introduction ..... 2
2. What is Contact Center Quality Assurance? ..... 3
3. Quality Assurance Program Benefits ..... 6
4. Components of a QA Program ..... 7
5. Getting Started ..... 10
    5.1 Creating a Quality Assurance Leadership Team ..... 15
    5.2 Staffing QA Positions ..... 16
    5.3 Implementation Roadmap ..... 18
6. Developing a QA Evaluation Form ..... 21
    6.1 Form Categories ..... 24
    6.2 QA Evaluation Form Questions ..... 25
    6.3 Assigning Weights to QA Evaluation Forms ..... 26
    6.4 Validation ..... 27
    6.5 Example QA Evaluation Forms ..... 29
7. Quality Assurance Monitoring Criteria and Guidelines ..... 48
    7.1 Quality Monitoring Criteria and Guidelines ..... 48
    7.2 Why are Quality Monitoring Criteria and Guidelines Necessary? ..... 49
8. Calibration ..... 50
    8.1 What is Calibration? ..... 50
    8.2 Benefits of Calibration ..... 51
    8.3 The Calibration Process ..... 52
9. Quality Assurance Program Mechanics and Processes ..... 54
    9.1 Transaction Selection Criteria ..... 54
    9.2 Determining the Number and Frequency of Evaluations ..... 55
    9.3 Coaching ..... 56
    9.4 Evaluation Feedback and Escalation Process ..... 58
    9.5 Ongoing Training ..... 59
    9.6 Addressing Agent Performance Issues ..... 60
    9.7 Rewards and Recognition ..... 63
    9.8 Updating Procedures/Training ..... 64
    9.9 Monitoring Quality Assurance Reviewers ..... 65
    9.10 Reporting ..... 67
    9.11 QA Database ..... 74
10. Kicking Off the QA Program ..... 75
    10.1 Agent Training ..... 76
    10.2 QA Program Pilot ..... 78
11. Advanced Quality Assurance Initiatives ..... 79
    11.1 Surveying ..... 79
    11.2 Customer Experience Monitoring ..... 79
    11.3 First Call Resolution (FCR) ..... 80
12. Quality Management/Liability Recording Suites ..... 81
Appendix A: Procedure Format Sample ..... 85
Appendix B: Policy Format Sample ..... 88
Table of Figures
Figure 1: Quality Assurance Process ..... 5
Figure 2: Quality Assurance Program Development Roadmap ..... 10
Figure 3: QA Implementation Roadmap ..... 19
Figure 4: Common Quality Evaluation Form Sections/Categories ..... 24
Figure 5: Financial Services Customer Service QA Evaluation Form ..... 30
Figure 6: Healthcare Customer Service QA Evaluation Form ..... 34
Figure 7: Technical Support QA Evaluation Form ..... 38
Figure 8: Sample Customer Service Quality Monitoring Form ..... 42
Figure 9: New Order Precision Quality Monitoring Form ..... 45
Figure 10: The Calibration Process ..... 53
Figure 11: Coaching Methods ..... 56
Figure 12: Average Contact Center QA Score: December 2008 ..... 67
Figure 13: Contact Center Average Scores by Evaluation Section: December 2008 ..... 68
Figure 14: Average Scores by Evaluation Question: December 2008 ..... 70
Figure 15: Average QA Scores by Agent: December 2008 ..... 72
Figure 16: QA Scores by Agent by Evaluation: December 2008 ..... 72
Figure 17: Average QA Scores by Agent by Category: December 2008 ..... 73
Figure 18: QA Agent Training Outline ..... 76
Figure 19: Workforce Optimization Suites ..... 82
Executive Summary
Contact center quality assurance (QA) is an ongoing business activity that provides
valuable insights into agent, customer and contact center service delivery performance and
opportunities. The primary goal of any QA program is to verify and ensure that a company’s
contact center services are being performed in a manner that meets or exceeds internal
requirements and customer expectations. At a fundamental level, QA programs measure
how well agents comply with internal policies and procedures and interact with customers in
phone, email and chat sessions. The more advanced QA programs go well beyond these
basics. They combine the results of customer satisfaction surveys with internal
measurements to provide a 360‐degree view of the customer experience. Well‐designed and
effective QA programs demonstrate a company’s commitment to its customers and agents,
and are essential for building world‐class contact centers. This Guide is a “cookbook” for
building a strong, effective and well‐received QA program. It is ideal for managers and
supervisors in contact centers of any size who are either building their first quality assurance
program or want to enhance an existing program.
1. Introduction
Quality assurance isn’t an option for contact centers; it’s essential to the success of the
contact center: it drives customer and agent satisfaction, improves agent and supervisor productivity and
effectiveness, and keeps management in touch with their staff’s performance. To achieve the best
results and foster confidence in the program, managers must ensure that they are evaluating the
right components of agent performance during customer interactions and using appropriate
measurements and weights. Building an effective QA program is an iterative, multi‐step process
that requires senior management support, planning and input, and buy‐in from all levels of contact
center staff. Automation is helpful for formalizing, standardizing and institutionalizing the initiative,
but a QA program will succeed only if the staff is on board and believes in its value.
2. What is Contact Center Quality Assurance?
Contact center quality assurance, also known as Quality Management (QM), is a process
where managers, supervisors and QA specialists monitor and evaluate how well agents handle
customer transactions. The monitoring process includes a simultaneous review of a call and the
system screens used to handle the interaction. (If QA is being done on an email or a chat session, it
will review just the servicing screens and the content of the agent’s response.) QA can be done in a
real‐time mode where the supervisor or QA specialist “live monitors” calls. For live monitoring QA,
reviewers can either access calls through their automatic call distributor (ACD) technology, or can sit
next to agents and watch and evaluate them as they handle calls. QA is often performed on
recorded transactions.
There are pros and cons to both live monitoring and using recorded transactions. When live
monitoring, the QA specialist/supervisor can provide immediate feedback to the agent. This can
become an effective coaching opportunity, as the feedback is provided in real‐time when the call is
still fresh in the agent’s mind. On the other hand, doing QA from recorded transactions allows an
organization to schedule calls, making the process more efficient for the reviewer. It allows QA
specialists/supervisors to find interactions that require their attention, either because they are really
good or really bad, instead of wasting time on satisfactory calls. It allows the reviewer to be able to
carefully review the interaction/screens, including the ability to go back and re‐review a portion of
the interaction. And, lastly, it is an anonymous process, so agents are not aware that they are being
evaluated and will not necessarily be on their best behavior.
When doing QA, the reviewer completes an evaluation form that measures how the agent handled
each component of the call or online transaction. The form can be on paper or part of a QA system.
The fundamental purpose of QA programs is to measure how well agents adhere to contact center
departmental policies and procedures. Contact/call center managers have traditionally live
monitored or listened to recorded interactions in order to gauge agent performance and internally
evaluate the customer experience. To be effective, the program should be a formalized ongoing
process designed to:
1. Measure agent adherence to internal policies and procedures
2. Improve consistency and quality of customer interactions across all
channels (telephone, email, chat/IM, etc.)
3. Assess business execution – detect and fix broken or inefficient
policies, processes or operational issues throughout the company
4. Improve agent performance
5. Identify agent training needs
6. Identify policies or processes that frustrate and alienate customers
7. Maximize every customer interaction
8. Identify business trends
9. Improve the customer experience
The underpinnings of a good QA program are consistency, accurate measurement, and a cycle of
continuous feedback resulting in improvements. An effective QA program provides the contact
center with a vehicle for measuring the quality and consistency of service delivery, capturing
customer insights, and identifying trends, service issues and training/coaching opportunities to
improve agent performance and productivity. Quality assurance is a dynamic and iterative process
that must be adapted as a business changes. See Figure 1.
Figure 1: Quality Assurance Process
Source: DMG Consulting LLC, April 2009
Action Item: Identify the company’s reasons for performing contact center quality assurance. Build a program that delivers continuous feedback to the contact center and other operating departments.
3. Quality Assurance Program Benefits
When quality assurance programs are well designed and used consistently, they yield great
benefits for customers, enterprises, contact centers and agents. The benefits include:
• Improved agent productivity, by reducing the average handle time of transactions
• Reduced operating expenses
• Better transaction quality
• Increased customer satisfaction
• Enhanced customer experience
• Identification of business opportunities, cross‐sell, up‐sell, new products and services
• Enhanced operating policies and procedures
• Reduced enterprise risk
• Improved agent satisfaction and reduced attrition
• Automated reporting for tracking and trending
Action Item: Set up a process for capturing, quantifying and reporting the benefits from your QA program. Be sure to share successes with agents, supervisors, managers and senior executives on an ongoing basis, so that everyone appreciates the program’s contributions.
4. Components of a QA Program
While every company should customize its QA program to reflect its corporate culture
and values, the fundamental building blocks for contact center QA are standard across all
companies. Effective QA programs should include the following components:
• Procedures and policies: Document all existing transaction procedures and policies so that the
contact center staff knows the appropriate steps for handling all types of inquiries. QA
specialists/supervisors can also use the procedures and policies as a standard to make sure that
they are evaluating different transactions properly. It’s important to set up a process that keeps
the policies and procedures up‐to‐date.
• Quality monitoring criteria and guidelines: Specify the criteria to apply when evaluating
transactions and performing evaluations. The criteria should define what QA
specialists/supervisors are looking for in each type of contact center transaction. The easiest way
to create criteria is to use the contact center’s documented procedures and policies and note on
each the most important aspects for each type of transaction. It’s also a good idea to identify
agent actions that would cause them to lose points in a quality evaluation.
• Program mechanics: Define the mechanics of the QA program, including who (manager,
supervisor, team leader, QA specialist, trainer) is responsible for conducting evaluations, the
number and frequency of evaluations, how many coaching sessions per agent per month, and
how to select calls, emails and chat sessions for evaluation. (If a QA application is being used,
most of the mechanics will be automated, including the transaction selection process.)
• Training: Establish a closed‐loop training process that addresses new content, system issues,
updates and agent performance issues. The trainers and QA staff must work closely together to
ensure that the staff is fully trained. (In many small/mid‐sized contact centers, the same people
do both QA and training.) Prior to kicking off a QA program, all contact center staff – agents,
supervisors, QA specialists, trainers, managers – must be fully trained so that they know how to
handle all types of customer interactions. If new or enhanced procedures and policies are
drafted to support the QA program, they should be reviewed with the staff before starting the
QA program. In addition to agent procedural training, it’s important to build a training program
that introduces the new or enhanced QA program to the staff. The more informed the staff is
about the program, the more effective the initiative.
• Coaching: Provide frequent feedback to agents about their performance. Feedback should
address where agents are performing well and areas where they have opportunities to improve.
Coaching is one of the critical success factors in QA programs and plays a very important role in
agent satisfaction and retention. However, as it can be very challenging to provide negative
feedback to agents, it’s important to train the QA staff to produce effective coaching sessions
and to make sure that management delivers coaching sessions consistently.
• Calibration: Build consistency into the QA program through calibration. Calibration is the
process of teaching all people involved in performing QA evaluations how to score transactions
on a consistent and equitable basis. To make a QA program fair for agents, it’s essential for all
QA reviewers to agree on the meaning and value of each question in a monitoring form. To
achieve consistency, it’s important to run calibration sessions where all reviewers listen to the
same call, score it, identify variance in scoring approaches, reconcile their differences, and set a
standard measurement that all will use going forward. The only way to reach consensus is to run
calibration sessions. Calibration is an ongoing process and should be run on a monthly basis. It’s
also a good idea to involve agents in the calibration process so that they can appreciate the
challenges associated with consistently evaluating transactions.
• Evaluation feedback: Provide a process to facilitate two‐way communication between agents
and reviewers. Agents need to have a mechanism for responding to their quality evaluations so
that they feel empowered and not “put upon.” They also need a formal process for filing
complaints when they believe that a QA reviewer is not being fair or is not listening to their
input. QA reviewers should welcome discourse, as it will ultimately yield a better and more
effective program.
• Rewards and recognition: Recognizing and rewarding top performers is essential for the
success of a QA program. While agents should deliver outstanding performance because it’s
their job, recognizing when they do encourages them to keep up the good work and motivates
others to strive for recognition, as well. Rewards do not have to be large; they could include a
plaque, a parking spot, lunch with the CEO, a gift card, movie tickets, etc.
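The program mechanics described above can be made concrete in code. The sketch below randomly selects a fixed number of recorded interactions per agent for review, as a QA application would do automatically; the quota of five evaluations per agent, the record shape and the sample data are illustrative assumptions, not prescriptions from this Guide.

```python
import random

# Illustrative transaction selection: randomly pick a fixed number of
# recorded interactions per agent for review. The quota and data are
# hypothetical; a QA application would normally automate this step.
EVALS_PER_AGENT = 5

def select_for_review(transactions, evals_per_agent=EVALS_PER_AGENT, seed=None):
    """transactions: list of dicts with 'agent', 'channel' and 'id' keys.
    Returns {agent: [randomly selected transactions for that agent]}."""
    rng = random.Random(seed)
    by_agent = {}
    for t in transactions:
        by_agent.setdefault(t["agent"], []).append(t)
    return {agent: rng.sample(pool, min(evals_per_agent, len(pool)))
            for agent, pool in by_agent.items()}

# 30 recorded interactions spread across 3 agents and 3 channels
transactions = [{"id": i,
                 "agent": f"agent{i % 3}",
                 "channel": ("phone", "email", "chat")[(i // 3) % 3]}
                for i in range(30)]
selected = select_for_review(transactions, seed=7)
print({agent: len(picks) for agent, picks in selected.items()})
# {'agent0': 5, 'agent1': 5, 'agent2': 5}
```

Random selection keeps the sample fair to agents; a real QA application could also bias selection toward unusually long or short calls, as Section 9.1 discusses.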
Action Item: Implement an “Executive QA” program, where senior managers from sales, marketing, operations and all other supporting areas sit with contact center agents as they handle calls. This has proven to create customer‐focused awareness and foster collaboration between departments. (Rapid process change is facilitated when senior executives hear first‐hand the impact of their processes and programs on customers.) When senior managers take this program seriously, it has a very positive impact on agent morale and satisfaction because it underscores the importance of their job.
5. Getting Started
Figure 2 depicts a high‐level project plan for building a QA program. This figure provides the
high‐level steps that each organization should customize to meet the needs of their company. The
first step in the process is identifying and assigning resources to oversee and manage the program
on an ongoing basis. This step is critical because every company needs a champion to push the
program through, or it is unlikely to happen. (The project champion often becomes the manager or
supervisor responsible for the program.) It takes a significant amount of work to build a contact
center QA program and there must be an individual responsible for making sure that everyone
involved in the program is motivated and the project stays on track and on time.
Figure 2: Quality Assurance Program Development Roadmap
Source: DMG Consulting LLC, April 2009
Figure 2 lays out eleven steps:
1. Identify QA manager and reviewer staff
2. Communicate to all staff that a formal Quality Assurance program is under development
3. Develop the evaluation forms; determine section and question weights
4. Test the form using real calls, conduct calibration and refine the form as needed
5. Develop quality monitoring criteria and guidelines
6. Define the coaching and feedback process
7. Build Quality Assurance training program
8. Train all staff members on the Quality Assurance program
9. Pilot the program for 2 months and review results
10. Adjust program and repeat pilot as needed
11. Implement the formal QA program
The 11 high‐level steps for building a QA program are explained below. It’s important to note that
departmental procedures for handling all types of customer transactions and inquiries should be
drafted or enhanced prior to developing the QA process.
Step 1: Identify resources from the contact center to serve as the QA manager and quality
review staff. (The QA manager will also play a major role in building or enhancing the program.)
Ideally the manager should have prior experience in setting up and running a QA program and
conducting contact center QA evaluations. Minimally, the QA reviewer(s) must demonstrate
complete and accurate knowledge of all contact center policies, procedures and systems, as
well as excellent interpersonal and coaching skills. The number of resources dedicated to the
process varies based on the size of the contact center.
Step 2: The institution of a formal QA program should be communicated to all staff members as
early as possible. The success of the program depends upon agent cooperation, which will be
enhanced if agents are invited to participate in program development and all changes are clearly
communicated. Communication should include information about why the program is being
developed, roles and responsibilities, impact on agents (their reviews, raises, and incentives),
and program benefits.
Step 3: Develop the quality evaluation forms. Typically, managers create one for each channel
supported (phone, email, chat), and/or one for each call classification, e.g., service, sales,
collections, fraud, etc. Once the number and types of forms are decided, determine the sections
required in each form, e.g., opening, verification, problem resolution, communications, closing,
and then write the questions for each section. Once the sections and questions are drafted,
assign weights to each question and possibly to each section, reflecting their relative
importance. It’s easiest to use a 100‐point scale but this is not a requirement.
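As a sketch of how question weights roll up into a 100‐point evaluation score, the following Python fragment scores a hypothetical form. The sections, questions and weights are invented for illustration; they are not taken from the Guide's sample forms.

```python
# Hypothetical weighted evaluation form on a 100-point scale.
# Every section, question and weight below is an invented example.
EVALUATION_FORM = {
    "Opening":            [("Used approved greeting", 5),
                           ("Verified caller identity", 10)],
    "Problem Resolution": [("Diagnosed the issue correctly", 25),
                           ("Followed documented procedure", 20)],
    "Communication":      [("Used courteous, professional tone", 15),
                           ("Avoided jargon", 10)],
    "Closing":            [("Summarized the resolution", 10),
                           ("Offered further assistance", 5)],
}  # question weights sum to 100

def score_evaluation(form, answers):
    """answers maps question text -> True (met) or False (missed).
    Returns (total score, per-section earned points)."""
    total = 0
    by_section = {}
    for section, questions in form.items():
        earned = sum(weight for question, weight in questions
                     if answers.get(question, False))
        by_section[section] = earned
        total += earned
    return total, by_section

# Agent met every question except one worth 10 points
answers = {q: True for qs in EVALUATION_FORM.values() for q, _ in qs}
answers["Avoided jargon"] = False

total, by_section = score_evaluation(EVALUATION_FORM, answers)
print(total)                        # 90
print(by_section["Communication"])  # 15
```

Keeping weights on individual questions (rather than whole sections) makes it easy to report average scores per question, as shown in the reporting figures later in the Guide.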
Step 4: Using real calls, test the evaluation forms to ensure all call components that need to be
measured are captured and that the intended goals are achieved fairly and accurately. Conduct
calibration sessions involving supervisors and QA specialists to ensure that everyone
participating in the QA program is using the form correctly and with the same rigor, consistently
and fairly across all questions. It generally takes a few calibration sessions to finalize the
questions and weights for each evaluation form.
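One way to make the calibration exercise concrete: have every reviewer score the same call question by question, then compute the spread of scores per question. The questions with the widest spread are the ones to discuss and reconcile first. The reviewer names and point values below are invented for illustration.

```python
from statistics import pstdev

# Hypothetical calibration data: three reviewers score the same call.
# reviewer -> {question: points awarded}; all values are invented.
scores = {
    "Reviewer A": {"Greeting": 5, "Verification": 10, "Resolution": 20},
    "Reviewer B": {"Greeting": 5, "Verification": 5,  "Resolution": 20},
    "Reviewer C": {"Greeting": 5, "Verification": 0,  "Resolution": 15},
}

def disagreement_by_question(scores):
    """Population standard deviation of awarded points, per question."""
    questions = next(iter(scores.values())).keys()
    return {q: pstdev([r[q] for r in scores.values()]) for q in questions}

spread = disagreement_by_question(scores)
worst = max(spread, key=spread.get)
print(worst)  # Verification: widest spread, so reconcile it first
```

Reviewers agree completely on "Greeting" (zero spread), so calibration time is better spent on "Verification," where the scores range from 0 to 10 points.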
Step 5: Document the quality monitoring evaluation criteria and guidelines for how to apply
them.
Step 6: Define the agent coaching and feedback process. Coaching is considered the most
effective method for training adults; it should take place in a one‐on‐one personalized session
with an agent. During coaching sessions, agents should receive specific and targeted feedback
to learn what they are doing well as well as the areas where they can improve. This is a critical
part of successfully managing agents and motivating them to improve their performance.
Step 7: Build the QA training program. Training should include an in‐depth review of the quality
monitoring process and QA form, evaluation criteria and metrics/measurements, frequency and
number of evaluations conducted, scoring methodology, how to access, review and respond to
evaluations, and how to provide feedback to improve the program. Optionally, the QA training
program may also include a review of documented policies, procedures and guidelines that
govern agents’ behavior in processing transactions or advising customers.
Step 8: Train staff on the QA program. Roll out the training program so all agents and other
relevant contact center staff – managers, supervisors, trainers, QA specialists, agents, and
possibly participants from other departments – become familiar and comfortable with the new
program. If possible, conduct calibration sessions with agents during the training session to
demonstrate application of the quality criteria and use of the evaluation form. (Note: Be sure to
ask the agents whose calls will be used during calibration for their permission.) Agents who
understand how their calls are being evaluated and know what is expected of them are more
likely to perform well.
Step 9: Pilot the QA program and monitor results. The pilot provides an opportunity to see what
works and what needs to be enhanced. It also gives QA reviewers and agents an opportunity to
get accustomed to the program.
DMG recommends a 2‐ to 3‐month pilot to work the kinks out of the process and give everyone
time to get used to the new program before going live.
Step 10: Adjust the program throughout the pilot and, if changes are significant enough, it may
make sense to keep the pilot going longer. This step allows time to refine the program before it
is formally launched. Based on participant feedback and pilot results, the QA manager should
revise processes, evaluation criteria, evaluation forms, metrics, mechanics, etc. The pilot should
be run for as long as necessary to test all components of the QA program, including conducting
calibration exercises.
Step 11: Implement the QA program. Launch the program and begin to execute the QA
processes on a daily basis. QA reviewers should work closely with management, supervisors and
trainers to review quality assurance results and identify areas for training and development.
Periodic reviews to evaluate the QA process should be conducted to ensure compliance with
ever‐evolving procedures, policies and protocols. It is also a good idea to establish a reward
program or agent appreciation events, so that management can show special recognition for
outstanding employee performance based on QA program reviews.
After the QA program is implemented, contact centers should consider doing the following activities
on a monthly basis. They are not required but will help to enhance agent morale and improve
the results of your QA program.
• Conduct monthly calibration sessions with all supervisors/QA reviewers to ensure program
consistency.
• Conduct monthly agent team meetings to review QA results and do group training, as
necessary.
• Have QA reviewers and trainers meet on a monthly basis to review training needs and
other improvement opportunities uncovered in the QA process. (In many contact centers,
the QA specialists and trainers speak daily.)
• Publish a monthly QA newsletter (or other vehicle) with quality tips to help the
department achieve quality goals. The newsletter should also recognize top performers.
Action Item: Implement a process where top‐performing agents conduct side‐by‐side training sessions with agents who are not meeting their potential and are under‐performing. Agents learn best from their peers. Using top performers for this activity recognizes their outstanding performance and helps get them on board in support of management objectives.
5.1 Creating a Quality Assurance Leadership Team
Building an effective QA program is an iterative process that requires senior management
support, in addition to thorough planning, input and buy‐in from all levels of contact center staff. It’s
a best practice to form a QA leadership team with representation from all contact center
constituents including managers, supervisors, trainers, agents, and possibly other departments,
such as marketing and sales. This team could be led by the head of QA or the contact center
director. This is a good way to keep the head of the contact center engaged in the QA program on an
ongoing basis.
Action Item: Use the quality assurance monitoring process to identify areas for business process optimization. Establish a cross‐functional team to address contact center and enterprise business process opportunities identified during the quality monitoring process. This team should work together to change business processes that upset customers.
5.2 Staffing QA Positions
The success of any QA program is largely dependent upon the commitment and skills of the
quality assurance specialist who administers the program and coaches agents. Quality assurance
specialists must be highly skilled individuals who possess outstanding job knowledge and excellent
communication, interpersonal and coaching skills. Because a vast amount of job knowledge is
required for this role, most contact centers staff these positions by promoting agents who
demonstrate exceptional job knowledge and call handling skills. (Other contact centers transfer a
supervisor or manager into this position.) However, being an outstanding customer service or sales
representative is not the same as knowing how to deliver effective one‐on‐one feedback to coach
and motivate a contact center agent. The ability to coach is not innate, and generally has to be
taught. QA specialists should have the following skills and knowledge:
• Job knowledge: To accurately, effectively, and objectively evaluate agents, quality
assurance specialists must possess strong knowledge of all products, services, systems,
processes and procedures. If quality assurance specialists are hired externally, they should
be required to complete a comprehensive training course that includes in‐depth coverage
of all products, services, policies, procedures and guidelines that agents are required to
adhere to when processing transactions or advising customers.
• System knowledge: Quality assurance specialists must be fully trained and knowledgeable
about all systems that agents use to handle customer inquiries or process orders.
Additionally, they must be fully trained to navigate the quality management application so
that they can search, retrieve and play back calls, create evaluations, and respond to
agents’ evaluation feedback. QA specialists should also be trained to access and use
reports from the quality application as well as to create ad hoc reports.
• QA program mechanics: Quality assurance specialists must possess a strong understanding
of the quality monitoring process, including how to select calls/emails/chat sessions, the
number/frequency of evaluations that they are required to do, and how to coach agents.
• QA criteria/calibration: To maintain measurement integrity, it’s essential for all QA
specialists to uniformly apply quality monitoring criteria when evaluating calls. All QA
reviewers should be thoroughly trained to complete evaluations in a consistent manner.
The success and effectiveness of the program, and its reception by agents, depend upon
the ability to consistently, objectively and fairly evaluate contact center interactions. To
achieve these essential goals, it’s also recommended that calibration sessions be
conducted with all quality reviewers on a monthly basis.
• Coaching/motivation: QA specialists must demonstrate strong interpersonal and coaching
skills so that they can work one‐on‐one with agents to recognize areas of strength as well
as provide directed feedback on areas that require improvement. All quality assurance
specialists should be required to take courses in effective coaching methods and
motivation techniques.
Action Item: When staffing QA programs, use highly respected staff members with strong product, service and system knowledge and expertise in coaching. Agents are more welcoming of feedback from people they respect.
5.3 Implementation Roadmap
Once the leadership team is established and the quality manager and staff have been
appointed, a detailed breakdown of all program deliverables should be developed and assigned to
the responsible parties. Figure 3 provides a list of the required tasks, recommendations for
ownership of each initiative, and estimated time frames for completion. The tasks reflected in Figure
3 have already been discussed in other sections of this Guide. They are repeated here in order to put
them into a project plan and to show the responsible party and estimated time frame.
DMG recommends using this list of initiatives to build a detailed project plan. Your project plan will
likely include other steps and initiatives that are important for your organization. Once the project
plan is drafted, review it and get buy‐in from all involved parties, particularly those assigned tasks,
to make sure that they are committed and able to dedicate time to the projects. It’s also a good idea
to have a weekly project review meeting to assess progress and to address items that are slipping. It
is ideal, but not necessary, for senior contact center management to participate in the weekly
progress meeting. However, if a contact center manager is not able to participate in the weekly
meetings, the project manager should make an effort to keep him/her updated on the team’s
progress.
Figure 3: QA Implementation Roadmap
Task/Initiative | Responsibility | Timeframe | Status
Development Phase
Communicate to all staff that a quality assurance program is under development and will soon be implemented; communication should be frequent | Contact Center Director | 2 days
Promote or hire a QA specialist and quality reviewers | Contact Center Director | 3 weeks
Develop a call monitoring evaluation form (Note: In multi-channel contact centers, a QA form will need to be created and tested for each channel) | Quality Assurance Leadership Team | 2 weeks per form
Test the form using real calls to ensure that all call elements that need to be evaluated are captured | QA Manager and reviewers | 2 days
Document quality monitoring evaluation criteria and guidelines for how to apply them | Quality Assurance Leadership Team | 2 weeks
Determine weights for each section of the monitoring form | Quality Assurance Leadership Team | 1 week
Test the form(s) using real calls to validate sections, questions, weights and ranges | Quality Assurance Leadership Team | 1 week
Hold calibration sessions with agents, supervisors and managers to make sure everyone uses the evaluation form the same way | Quality Assurance Leadership Team | 2 weeks
Enhance QA evaluation form based on input from calibration team | QA Manager and reviewers | 3 days
Determine the volume of transactions (calls, emails, chat sessions, other) to be monitored per agent/month | Quality Assurance Leadership Team | 1 day
Decide who will be conducting the evaluations (quality assurance specialists, supervisors, or both) | Quality Assurance Leadership Team | 1 day
Conduct calibration sessions to ensure rating reliability and consensus of all reviewers | Quality Assurance Leadership Team | 2 weeks
Define the agent coaching process | QA Manager, contact center managers, supervisors and reviewers | 1 day
Develop a process with an escalation/review procedure for agents to provide feedback or dispute evaluations | Quality Assurance Leadership Team | 1 day
Develop reports to support the QA process; if using a QA application, build the reports in the application | Quality Assurance Leadership Team | 2 weeks
Test reports | QA Manager | 1 week
Figure 3: QA Implementation Roadmap (continued)
Task/Initiative | Responsibility | Timeframe | Status
Train all QA specialists to use the QA application and reports | QA Manager | 2 days
Develop QA training program | QA Manager and trainers | 3 weeks
Train all contact center staff on the quality assurance program | QA Manager and trainers | Variable depending on size of staff and program length
Kick off the program | Contact Center Director/QA Manager | 1 day
Pilot program for 2 to 3 months and review results | Quality Assurance Leadership Team | 2 – 3 months
Revise departmental Agent Performance Evaluation form to include quality monitoring metrics | Quality Assurance Leadership Team/HR | 1 week
Implement QA program | Contact Center Leadership Team | On-going
Establish a process for identifying and recognizing agents who achieve quality monitoring excellence | Contact Center Leadership Team | 2 weeks
Perform monthly calibration sessions | QA Manager, all QA reviewers | Monthly/on-going
Conduct monthly training sessions | Trainer | Monthly/on-going
Create and issue monthly QA newsletter | QA Manager | Monthly/on-going
Have QA staff meet with training to enhance training programs | QA Manager and trainers | Monthly/on-going
Action Item: Involve all levels of contact center staff in creating the program to avoid unnecessary skepticism and surprises. This will help agents appreciate the positive aspects of the QA program and speed up adoption.
6. Developing a QA Evaluation Form
When managers think about developing a QA program, one of the first things they must
consider is creating a QA evaluation form. The evaluation form is the most visible component of the
program and is necessary for all QA programs, whether an organization is doing QA manually or
using an automated system.
The QA evaluation form needs to capture all interaction components that a contact center wants to
measure. Its questions and weights should reflect the culture of the company and what is most
important to its service strategy. In general, QA evaluation forms contain the following
components:
1. Call/evaluation details
Examples: Name of agent, date of transaction, reviewer name, date of evaluation, call type,
customer identifier (account number, social security number, etc.)
2. Sections (skill categories)
Examples: Call opening/closing, verification, product/plan knowledge, procedure
knowledge, system knowledge, hold/mute/transfer, communication skills, resolution skills,
etc.
3. Questions (to objectively assess skill proficiencies)
Examples: Did the representative identify the caller according to the verification policy prior
to releasing information? Did the representative log a summary of the call according to the
policy?
4. Scoring
Point values for each question and section of the evaluation form; includes point values,
points available, and points earned. Point values should be assigned based on the relative
importance of each section and question to the business and the customer.
5. Coaching comments
Free‐form text box to allow reviewers to provide feedback on performance excellence or
opportunities.
6. Recommendations
Free‐form text box to allow reviewers to document an action or follow‐up items for agents
to complete. Examples: taking a specific eLearning course, reviewing specific procedures, etc.
7. Acknowledgement (optional)
Signature boxes so that the reviewer and agent can sign off and acknowledge that they had
a discussion.
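Taken together, the seven components above describe a simple nested structure: an evaluation that holds call details, scored sections containing questions, and free-form feedback. The sketch below is purely illustrative, assuming nothing about any particular QA application; all class and field names are hypothetical:

```python
# Hypothetical data structure mirroring the seven form components listed above.
from dataclasses import dataclass, field

@dataclass
class Question:          # 3. a question with its point value (4. scoring)
    text: str
    point_value: int

@dataclass
class Section:           # 2. a skill category, e.g. "Verification"
    name: str
    questions: list = field(default_factory=list)

@dataclass
class Evaluation:
    call_details: dict           # 1. agent name, dates, call type, customer ID
    sections: list               # 2-4. sections with questions and point values
    coaching_comments: str = ""  # 5. free-form feedback
    recommendations: str = ""    # 6. follow-up actions for the agent
    acknowledged: bool = False   # 7. optional reviewer/agent sign-off

# Example (hypothetical data):
form = Evaluation(
    call_details={"agent": "A. Smith", "call_type": "billing"},
    sections=[Section("Verification", [Question("Verified caller per policy", 7)])],
)
```

Modeling sections and questions separately makes it easy to reuse the same skeleton across channels while swapping in channel-specific questions and weights.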
After learning how to create a QA evaluation form and deciding what sections are required, it’s
relatively easy to modify an existing form or draft a new one. However, creating one for the first
time can be daunting. Therefore, we suggest getting started with an existing QA form and then
modifying it to meet the organization’s specific needs. The easiest way to obtain sample quality
evaluation forms is by asking a QA vendor, another contact center manager, or a consulting firm.
(Most vendors have sample forms and are generally happy to share them, particularly if they think
you are interested in purchasing their solution.)
While it’s relatively easy to obtain a sample QA evaluation form, the catch is that there is no "one
size fits all" form. For example, a technical help desk needs a different QA evaluation form from a
sales or customer service contact center. The QA evaluation form should be customized to meet the
needs of each particular contact center.
Here are a few best practices to employ when developing QA evaluation forms:
• Involve all levels of your staff – agents, supervisors, trainers and managers.
• Develop a separate QA form for each channel that you will be monitoring.
• Test your QA form(s) using real transactions and refine them before putting them into
production.
• Assign weights that correspond to the importance of all categories on your QA form.
Importance may be relative to the channel of communication, customer and/or the business.
• Include a comments box for each section of the evaluation so that coaching comments on
performance strengths and opportunities and agent feedback can be captured.
Action Item: To modify or create a QA evaluation form, start with an existing or sample form. Then listen to a selection of different call types. While “scoring” these calls, identify form components that need to be changed or added to fit your environment. (This process applies to emails, chats or other types of transactions received by the contact center.)
6.1 Form Categories
Figure 4 provides an example of sections or categories typically found in a QA form for calls.
It also includes example criteria that can be used to evaluate whether an agent properly performed a
particular skill. Depending on how the form is structured, sections can be established based on skills,
call flow, competencies, or call segments. In Figure 4 below, the sections are based on skills.
Figure 4: Common Quality Evaluation Form Sections/Categories
Source: DMG Consulting LLC, April 2009
Action Item: Be sure to include a section in the evaluation form to reflect how well the agent performed in regard to compliance or disclosure regulations (if applicable).
Quality Evaluation Form
Call Opening/Closing: Provided name/company name; thanked the customer for calling or purchasing [brand] products
Verification: Obtained verification prior to releasing information according to established data security procedures
Demonstration of Product Knowledge and Information: Demonstrated thorough knowledge of product/part/services/warranty, etc.
Demonstration of System Knowledge/Usage: Effectively accessed information on system; utilized all appropriate systems to obtain information
Hold/Transfer Procedure: Used hold/transfer effectively and only as necessary
Communication Skills: Spoke clearly and confidently
Demonstration of Resolution Skills: Accurately diagnosed problem/issue
Agent First Contact Resolution (FCR): Did the agent do everything possible to ensure a callback was not necessary?
Business Process FCR: Is there a policy/procedure/service issue that prevented First Contact Resolution?
6.2 QA Evaluation Form Questions
After identifying the right sections/skills to include in the QA form, the next step is to create
a concise list of questions that captures the elements associated with demonstrating competency in
each skill. The goal is to write questions that allow QA reviewers to completely and objectively
assess agent performance.
Once the initial list of questions is developed for each section of the form, have a team of supervisors
and QA reviewers test the form by trying it out on a few calls or emails. It will typically take at least
ten rewrites of the questions before an evaluation form is complete. This exercise helps surface gaps
and identifies overlaps and redundancies in questions. Revise the form based on test results, and
retest it until the form successfully addresses all requirements. (You will know the form works when
there is a place to address every aspect of a call or email.)
Action Item: Test the QA form with real calls or emails. Make sure there is a place on the form to address everything that can happen in a call or email. This will surface confusing or ambiguous components of the form.
6.3 Assigning Weights to QA Evaluation Forms
Once the form content is finalized, it’s time to select a scheme for assigning weights to the
questions and/or sections of the evaluation. The simplest approach is to use a 100-point scale.
While there are many scoring methodologies, the most basic way is to assign a point value to each
section on the form and distribute the points among the questions in that section. For example, the
communication skills section may be assigned a total of 20 points, which are distributed among the
four questions in the section. While points can be distributed equally throughout a section or the
entire QA evaluation form, we suggest that they be assigned based on the relative importance of
each section and each question. Importance may be relative to the channel of communication,
customer and/or the business. After preliminary weights are distributed, the form should be tested
again by evaluating and scoring actual transactions. Points are generally reconsidered and
redistributed during testing. It’s also important to consider that in some interactions, not all
components of the form are applicable (for example, not every call includes a hold or transfer). The
scoring on the forms shown in Section 6.5 was designed to include “Not Applicable” (N/A) as a
consideration; if a particular question on the form is not applicable to the interaction being
evaluated, the question is marked N/A. The points are not available to be achieved, nor do they
affect the total score. Overall evaluation scores are calculated as follows:
Total points achieved / Total points available = QM evaluation score
For example: 76 points achieved / 84 points available = 90.5%
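The scoring rule described above, including the handling of N/A questions, can be sketched in a few lines of code. This is an illustrative sketch only; the function name and the sample point values are hypothetical:

```python
# Illustrative sketch of the scoring rule described above.
# Each answer is "yes", "no" or "na"; N/A questions are excluded from
# both points available and points achieved, so they never affect the score.

def qm_score(answers):
    """answers: list of (point_value, answer) tuples for one evaluation."""
    available = sum(pts for pts, ans in answers if ans != "na")
    achieved = sum(pts for pts, ans in answers if ans == "yes")
    return round(100 * achieved / available, 1) if available else None

# Hypothetical evaluation with one N/A question (no hold/transfer on the call).
evaluation = [
    (3, "yes"),  # Greeting
    (7, "yes"),  # Verification
    (5, "no"),   # Inquiry resolution step missed
    (2, "na"),   # Hold/Mute/Transfer not applicable
]
print(qm_score(evaluation))  # 10 of 15 applicable points -> 66.7
```

Excluding N/A points from the denominator, rather than awarding them automatically, keeps agents from being rewarded or penalized for call elements that never occurred.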
Action Item: When assigning weights to the form, it’s also a good idea to identify and define the serious errors that will result in a full and automatic failure of an evaluation form section or an entire evaluation.
6.4 Validation
Validation of the metrics associated with each category of the quality assurance form is an
important step in developing the quality assurance program. Metrics are typically validated by
testing the proposed weights against baseline quality measurements that have already been
established.
When modifying an existing QA evaluation form, the new form can be validated by using it to re‐
evaluate transactions that had previously been scored using a different QA form. If no quality
measurement baseline exists, initial testing sessions should validate QA scores against known agent
performance: top performers should receive high scores, average performers should fall in the
mid‐range, and poor performers should have low scores.
Once the point values are validated and scores are coming out as expected, the next step is to determine how
QA evaluation scores equate to the performance ratings used by the organization in its annual
review process. For example:
Excellent: 89% and above
Good: 80‐88%
Average: 71‐79%
Needs Improvement: 70% and below
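The example bands above reduce to a simple threshold lookup; a minimal sketch (the function name is illustrative, and the boundaries are the ones shown above):

```python
# Map a QM evaluation score (0-100) to the example performance ratings above.
def rating(score):
    if score >= 89:
        return "Excellent"
    if score >= 80:
        return "Good"
    if score >= 71:
        return "Average"
    return "Needs Improvement"

print(rating(90.5))  # Excellent
print(rating(75))    # Average
```

Tightening the ranges later, as the Guide suggests, only requires changing the threshold constants.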
The scoring ranges above are typical for many contact center QA programs. The ranges should be
high enough that an excellent transaction is clearly identifiable, but not so high that few agents
can attain an excellent rating. As a result, it's often a good idea to start with one range and then,
as agents get accustomed to the QA program and their quality improves, tighten the ranges. For
example, when a QA program is first kicked off, excellent may be anything above 85%. However,
since most customers are not likely to consider 85% excellent, the range should be narrowed to
anything above either 89% or 90% a few months into the program.
Be sure to communicate to agents exactly what is happening with the ranges – starting easy to give
them a chance to become accustomed, and then bringing them to a level that will better meet
customer expectations – so that the staff does not think that management is just giving them a hard
time. Be sure to set expectations and communicate clearly at all stages of the program to get the
staff’s support.
Once the final program is rolled out, agents should have a two‐ to three‐month grace period to
become accustomed to the program, criteria and scoring before they are held accountable for QA
scores; this will minimize any claims of “unfairness” within the shop, as the staff will have had an
extended period of time to adjust to the program requirements. During this period, QA evaluations
and coaching sessions are performed, but are not counted toward agents’ annual evaluations.
Action Item: Review and enhance your QA evaluation forms periodically, at a minimum every 9 to 12 months, to keep them in sync with your business and customers’ expectations.
6.5 Example QA Evaluation Forms
This section includes five example QA forms. The first form, Figure 5, is for a financial services
customer service contact center. Figure 6 is a form intended for use in an inbound healthcare
customer service contact center. Figures 7, 8 and 9 depict forms intended for inbound technical support,
customer service with up‐sell/cross‐sell activities and new order processing, respectively. Use these
forms as examples for getting started, but customize them to reflect the needs and priorities of your
organization.
Figure 5: Financial Services Customer Service QA Evaluation Form
Description: This form is for use by an inbound financial services contact center that addresses
customer inquiries, questions or problems on their accounts, and/or services that are provided.
Evaluation Details | Call Details
Representative name: | Account number or customer ID:
Call date: | Caller:
Evaluator name: | Call category:
Evaluation date: | Call type:
Greeting (3 points)
Point Value | Yes | No | Points Available | Points Achieved
Used call greeting as defined in Greeting Policy 3
Point Totals
Coaching Comments:
Verification (10 points)
Point Value | Yes | No | N/A | Points Available | Points Achieved
Verified caller as defined in Verification Policy prior to releasing account information 7
Verified additional items pertaining to specific caller inquiry 3
Point Totals
Coaching Comments:
Product Knowledge (20 points)
Point Value | Yes | No | N/A | Points Available | Points Achieved
Accurately identified account, transaction, product or service 5
Provided complete information or instructions in accordance with established procedure 7
Provided accurate information or instructions in accordance with established procedure 8
Point Totals
Coaching Comments:
Inquiry Resolution (23 points)
Point Value | Yes | No | N/A | Points Available | Points Achieved
Accurately understood the nature of the caller’s inquiry 5
Effectively/accurately resolved inquiry/issue in accordance with established procedure 5
Completed fulfillment/referral/follow-up as promised/required 5
Provided appropriate alternatives relative to customer’s need/situation 3
Ensured caller fully understood explanation, process, time frames and/or next steps 5
Point Totals
Coaching Comments:
Compliance (8 points)
Point Value | Yes | No | N/A | Points Available | Points Achieved
Read mandatory disclosures as required and applicable 8
Point Totals
Coaching Comments:
Cross-sell/Up-sell (6 points)
Point Value | Yes | No | N/A | Points Available | Points Achieved
Recommended appropriate product/service to meet customer need or extend the relationship 3
Effectively tied product/service benefit to customer situation/need 3
Point Totals
Coaching Comments:
System Knowledge/Usage (7 points)
Point Value | Yes | No | N/A | Points Available | Points Achieved
Effectively accessed and utilized all appropriate systems, screens and fields to obtain information to resolve inquiry 5
Accurately utilized wrap-up 2
Point Totals
Coaching Comments:
Hold/Mute/Transfer (4 points)
Point Value | Yes | No | N/A | Points Available | Points Achieved
Utilized hold/mute as defined in Hold/Mute Policy 2
Performed transfer to the correct area, only as necessary and in accordance with the Transfer Policy 2
Point Totals
Coaching Comments:
Communication Skills (14 points)
Point Value | Yes | No | N/A | Points Available | Points Achieved
Maintained a courteous, pleasant, and respectful tone throughout the call 3
Conveyed information clearly and confidently and in a manner that was easily understood 3
Demonstrated effective listening skills 2
Expressed empathy and concern, as appropriate 1
Efficiently managed time and call flow (call management) 2
Communication Skills
Point Value | Yes | No | N/A | Points Available | Points Achieved
Demonstrated professional language and behavior (call etiquette) 3
Point Totals
Coaching Comments:
Closing (5 points)
Point Value | Yes | No | N/A | Points Available | Points Achieved
Used call closing as defined in Call Closing Policy 2
Accurately documented information in accordance with established procedure 3
Point Totals
Coaching Comments:
Total Points Available/Total Points Achieved
QM Evaluation Score
Date Reviewed with Representative Comments:
Recommendations:
Acknowledged by Representative
Acknowledged by Reviewer
Source: DMG Consulting LLC, April 2009
Figure 6: Healthcare Customer Service QA Evaluation Form
Description: This form is for use by an inbound customer service organization that addresses calls
about insurance claims. The calls may address program eligibility, claims status and dispositioning,
and program benefits and coverage.
Evaluation Details | Call Details
Representative name: | HCID or SSN:
Call date: | Caller:
Evaluator name: | Call category:
Evaluation date: | Call type:
Greeting (5 points available)
Point Value | Yes | No | Points Available | Points Achieved
Used call greeting as defined in Greeting Policy 2
Documented caller name and phone number 3
Point Totals
Coaching Comments:
Verification (10 points available)
Point Value | Yes | No | N/A | Points Available | Points Achieved
Verified caller as defined in Verification Policy 5
Verified eligibility as defined in Eligibility Policy 3
Asked caller for member ID or patient’s date of birth and name 1
Asked caller for address on file 1
Point Totals
Coaching Comments:
Plan/Benefit Knowledge (25 points available)
Point Value | Yes | No | N/A | Points Available | Points Achieved
Accurately identified plan type, benefit, or claim 5
Demonstrated thorough knowledge of plan type, benefit or claim information 5
Provided complete information or instructions in accordance with established procedure 7
Provided accurate information or instructions in accordance with established procedure 8
Point Totals
Coaching Comments:
Inquiry Resolution (25 points available)
Point Value | Yes | No | N/A | Points Available | Points Achieved
Accurately understood the nature of the caller’s inquiry 5
Effectively/accurately resolved inquiry/issue in accordance with established procedure 5
Completed fulfillment/referral/follow-up as promised/required 5
Provided alternatives, as appropriate 3
Provided time frames, as appropriate 2
Ensured caller fully understood explanation, process, and/or next steps 5
Point Totals
Coaching Comments:
System Knowledge/Usage (10 points available)
Criterion | Point Value | Yes | No | N/A | Points Available | Points Achieved
Effectively accessed and utilized all appropriate systems, screens and fields to obtain information to resolve inquiry | 8
Accurately utilized wrap-up | 2
Point Totals
Coaching Comments:
Hold/Mute/Transfer (6 points available)
Criterion | Point Value | Yes | No | N/A | Points Available | Points Achieved
Utilized hold/mute as defined in Hold/Mute Policy | 2
Performed transfer as defined in Warm Transfer Policy | 2
Transferred call to the correct area | 2
Point Totals
Coaching Comments:
Communication Skills (14 points available)
Criterion | Point Value | Yes | No | N/A | Points Available | Points Achieved
Maintained a courteous, pleasant, and respectful tone throughout the call | 3
Conveyed information clearly and confidently and in a manner that was easily understood | 3
Demonstrated effective listening skills | 2
Expressed empathy and concern, as appropriate | 1
Efficiently managed time and call flow (call management) | 2
Demonstrated professionalism (call etiquette) | 3
Point Totals
Coaching Comments:
Closing (5 points available)
Criterion | Point Value | Yes | No | N/A | Points Available | Points Achieved
Used call closing as defined in Call Closing Policy | 2
Accurately documented information in accordance with established procedure | 3
Point Totals
Coaching Comments:
Total Points Available/Total Points Achieved
QM Evaluation Score
Date Reviewed with Representative:
Comments:
Recommendations:
Acknowledged by Representative
Acknowledged by Reviewer
Source: DMG Consulting LLC, April 2009
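Although the arithmetic is left blank on the printed form, the Point Totals and QM Evaluation Score fields are derived from the Yes/No/N/A answers. A minimal sketch of that calculation, assuming (as is common practice, not stated on the form itself) that N/A items are excluded from the points available so agents are not penalized for criteria that did not apply:

```python
def qm_score(items):
    """Compute a QM evaluation score from (point_value, answer) pairs,
    where answer is "yes", "no", or "na". N/A items are excluded from
    the points available; "yes" answers earn their full point value."""
    available = sum(points for points, answer in items if answer != "na")
    achieved = sum(points for points, answer in items if answer == "yes")
    return round(100.0 * achieved / available, 1) if available else 0.0
```

For example, a call on which a 3-point item was marked N/A would be scored against the remaining points only.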
Figure 7: Technical Support QA Evaluation Form
Description: This form is for use by an inbound customer service organization that provides technical
support for manufactured products. The contact center handles inquiries from customers and field
service technicians covering a wide range of issues, such as parts, service, general product
information, authorized service repair shops, warranties, detailed technical support and
troubleshooting.
Evaluation Details
Agent Name: ________________________
Date of Call: ________________________
Evaluator Name: ________________________
Date of Evaluation: ________________________
Call Details
Caller Category: __________________________
Call Category: __________________________
Call Type: __________________________
Product Type: __________________________
Serial Number: __________________________
Greeting/Opening (3 points available)
Use of Established Call Opening | Point Value | Yes | No | Points Available | Points Achieved
Provided name, company name, inquired: "How may I help you?" | 3
Point Totals
Coaching Comments:
Verification (6 points available)
Verification of Owner/Registration Information | Point Value | Yes | No | N/A | Points Available | Points Achieved
Owner name/address/phone number | 3
Product serial number | 2
Date of installation | 1
Point Totals
Coaching Comments:
System Knowledge/Usage (15 points available)
Demonstration of System Usage and Navigation Skills | Point Value | Yes | No | N/A | Points Available | Points Achieved
Effectively accessed information on system | 5
Utilized all appropriate systems to obtain information | 5
Documented/updated information according to established procedure, as needed or required | 3
Accurately utilized wrap-up | 1
Point Totals
Coaching Comments:
Product Knowledge and Information (25 points available)
Demonstration of Product Knowledge/Information Skills | Point Value | Yes | No | N/A | Points Available | Points Achieved
Accurately identified product/part/service/warranty issue | 6
Demonstrated thorough knowledge of product/part/services/warranty, etc. | 5
Provided complete information/instructions | 7
Provided accurate information/instructions | 7
Point Totals
Coaching Comments:
Resolution Skills (25 points available)
Demonstration of Problem Resolution Skills | Point Value | Yes | No | N/A | Points Available | Points Achieved
Accurately diagnosed problem | 5
Effectively resolved issue in accordance with established procedure | 5
Used effective negotiation skills | 3
Completed fulfillment/follow-up as promised/required | 5
Provided alternatives and timeframes, as appropriate | 2
Ensured caller fully understood explanation/process, next steps | 5
Point Totals
Coaching Comments:
Hold/Transfer Procedure (4 points available)
Use of Established Hold/Transfer Procedures | Point Value | Yes | No | N/A | Points Available | Points Achieved
Used hold effectively/transferred the caller to the correct area, and only as necessary | 2
Asked permission to place the caller on hold/thanked the caller for holding/checked back at established intervals | 2
Point Totals
Coaching Comments:
Communication Skills (18 points available)
Demonstration of Communication Skills | Point Value | Yes | No | N/A | Points Available | Points Achieved
Maintained a courteous, pleasant, and respectful tone throughout the call | 4
Spoke clearly and confidently; explained information to the caller in a manner that was easily understood | 4
Demonstrated effective listening skills; restated the caller’s inquiry to ensure understanding | 3
Expressed empathy and concern, as appropriate | 2
Established confidence and reinforced brand | 2
Efficiently managed time and call flow (call management) | 3
Point Totals
Coaching Comments:
Closing (4 points available)
Use of Established Call Closing | Point Value | Yes | No | Points Available | Points Achieved
Thanked the customer for calling or purchasing brand products | 2
Asked: "Is there anything else I can help you with?" | 2
Point Totals
Coaching Comments:
Total Points Available/ Total Points Achieved:
QM Evaluation Score:
First Contact Resolution
Demonstration of First Contact Resolution Skills | Yes | No | N/A | Points Available | Points Achieved
Did the agent/technician do everything possible to ensure that a callback was not necessary?
Coaching Comments:
FCR Score:
Note: First Contact Resolution (FCR) is intended to be a separate quality monitoring metric for performance measurement. This category of the evaluation form should be based on the number of calls monitored. For example, if 5 calls per month are monitored, each call is weighted at 20%. A technician who achieved First Contact Resolution on 3 calls would have an FCR score of 60% for the month.
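The weighting described in the note reduces to a simple ratio. A sketch of that calculation (the function name is illustrative, not part of the form):

```python
def fcr_score(call_results):
    """Monthly FCR score: each monitored call carries equal weight, so
    5 monitored calls weigh 20% each, and 3 resolved on first contact
    yield an FCR score of 60% for the month."""
    if not call_results:
        return 0.0
    # call_results is a list of booleans: True = resolved on first contact
    return 100.0 * sum(call_results) / len(call_results)
```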
Business First Contact Resolution
Yes No N/A
Did the caller indicate this was a follow-up call?
Is there a policy/procedure/service issue that prevented First Contact Resolution?
Comments:
Note: Business First Contact Resolution is not intended to be a quality metric for technicians. It is a data collection category that should be used for business process optimization analysis.
Source: DMG Consulting LLC, April 2009
Figure 8: VPI Sample Customer Service Quality Monitoring Form
Description: This form is for use by an inbound customer service department that also up-sells or
cross-sells. This QA form measures agent skills and processes.
Sample Customer Service Quality Monitoring Form
Agent: Agent Name
Group: Group Number
Evaluator: Evaluator Name
Evaluation Start Time: Date and Time
Evaluation End Time: Finalized Date or “In Progress”
Interaction Date: Date and Time
Quality Skill Scores:
Overall Score: 115/115 (100%)
Greeting 5/5 (100%)
Procedural Requirements 10/10 (100%)
Communication 35/35 (100%)
Listening 10/10 (100%)
Empathy 10/10 (100%)
Upsell 10/10 (100%)
Problem Assessment & Resolution 30/30 (100%)
Call Handling 5/5 (100%)
Evaluation Comments:
Enter Evaluation Comments Here
Call Segment: Introduction, Score: 10/30 (33.33%)
Question: Did the call center agent greet the caller and clearly identify themselves and the company?
Answer: Yes No
Skill: Communication
Score: 5/5 (100%)
Question: Did the call center agent verify and update customer information?
Answer: Yes No
Skill: Procedural Requirements
Score: 5/5 (100%)
Question: How effectively did the agent incorporate the script into their style, following the guidelines, but also sounding natural?
Answer: Choose One Excellent Very Good Good Fair Poor
Skill: Communication
Score: 10/10 (100%)
Question: How attentive was the agent in listening to the customer? (0 = Poor, 10 = Excellent)
Answer: 0 1 2 3 4 5 6 7 8 9 10
Skill: Communication
Score: 10/10 (100%)
Call Segment: Customer Service Criteria, Score: 45/45 (100%)
Question: Did the agent refer to the customer as "Mr...", "Miss...", "Sir" or "Ma'am"?
Answer: Yes No
Skill: Communication
Score: 5/5 (100%)
Question: How attentive was the agent in listening to the customer?
Answer: Choose One Very Good Good Fair Poor
Skill: Listening
Score: 10/10 (100%)
Question: How well did the agent speak at an understandable rate and sound positive and upbeat?
Answer: 0 1 2 3 4 5 6 7 8 9 10
Skill: Communication
Score: 10/10 (100%)
Question: How well did the agent seem to empathize with the customer?
Answer: Choose One Excellent Very Good Good Fair Poor
Skill: Empathy
Score: 10/10 (100%)
Question: Was the agent able to get an order for the new promotion?
Answer: Choose One Yes No
Skill: Upsell
Score: 10/10 (100%)
Call Segment: Problem Identification, Score: 15/25 (60%)
Question: How effectively did the agent use probing questions to identify the customer's problems?
Answer: Choose One Excellent Good Fair Poor
Skill: Problem Assessment & Resolution
Score: 10/10 (100%)
Question: If the agent had to put the customer on hold, did they come back after 45 seconds, thank them for being patient, and ask whether they could continue holding?
Answer: Yes No
Skill: Call Handling
Score: 5/5 (100%)
Question: Did the agent offer all possible solutions before scheduling an escalation call back?
Answer: Yes No
Skill: Problem Assessment & Resolution
Score: 10/10 (100%)
Call Segment: Closing, Score: 15/15 (100%)
Question: Did the agent thank the customer and ask if there was anything else they could help them with?
Answer: Yes No
Skill: Problem Assessment & Resolution
Score: 5/5 (100%)
Question: Did the agent review the call and get the customer's approval of the call's resolution?
Answer: Yes No
Skill: Problem Assessment & Resolution
Score: 5/5 (100%)
Question: Did the agent thank the customer for their business?
Answer: Yes No
Skill: Procedural Requirements
Score: 5/5 (100%)
Source: VPI, April 2009
Figure 9: VPI New Order Precision Quality Monitoring Form
Description: This form is for use by any inbound sales organization that focuses on improving the
quality and outcome of its “new order” calls, which are automatically identified via desktop analytics.
New Order Precision Quality Monitoring Form
Agent: Agent Name
Group: Group Number
Evaluator: Evaluator Name
Evaluation Start Time: Date and Time
Evaluation End Time: Finalized Date or “In Progress”
Interaction Date: Date and Time
Quality Skill Scores:
Overall Score: 70/70 (100%)
Greeting 10/10 (100%)
Communication 25/25 (100%)
Sales 10/10 (100%)
Upselling 10/10 (100.00%)
Closing 10/10 (100.00%)
After Call Work 5/5 (100.00%)
Evaluation Comments:
Enter Evaluation Comments Here
Call Segment: Opening, Score: 20/20 (100%)
Question: Did the agent use the appropriate greeting and properly identify themselves to the caller?
Answer: Yes No
Skill: Greeting
Question: Did the call center agent verify and update customer information, if needed?
Answer: Yes No
Skill: Communication
Question: Was the agent's tone and word choice in line with the caller's?
Answer:
Yes - the agent was extremely sensitive to the caller's needs
Mostly - the agent listened to the caller, but did not try to actively identify the problems for the caller
No - the agent did not attend to the caller's tone and urgency or anticipate their issues
Skill: Communication
Call Segment: Order Entry, Score: 20/20 (100%)
Question: Did the agent go through all available order options prior to taking the order information?
Answer: Yes No
Skill: Sales
Question: Was the customer offered the monthly promotion?
Answer: Yes No
Skill: Upselling
Question: Did the agent upsell the customer to the Gold level, if not already ordered?
Answer: Yes No
Skill: Upselling
Call Segment: Closing, Score: 25/25 (100%)
Question: Did the agent confirm shipping method and location before summarizing the order?
Answer: Yes No
Skill: Procedural Requirements
Question: Was an order summary, including total estimated cost and delivery date, provided to the customer?
Answer: Yes No
Skill: Procedural Requirements
Question: Did the agent offer additional assistance to the caller and thank them for the order?
Answer: Yes No
Skill: Communication
Question: Did the agent use the correct closing for a new order call?
Answer: Yes No
Skill: Closing
Call Segment: After Call Work, Score: 5/5 (100%)
Question: Did the agent enter the correct notes into the summary field for a new order?
Answer: Yes No
Skill: After Call Work
Source: VPI, April 2009
7. Quality Assurance Monitoring Criteria and Guidelines
Quality assurance monitoring guidelines specify the criteria to be used by QA specialists and
reviewers in determining if an agent properly demonstrated a skill during a transaction. They also
indicate which section of the form to use when scoring the various components of the interaction.
7.1 Quality Monitoring Criteria and Guidelines
The quality monitoring criteria define sections, questions and skills that are captured in the
quality evaluation form. They describe how a skill should be demonstrated so that QA reviewers
know how to objectively evaluate and score an interaction. For example, if a question on the
evaluation form is intended to determine if an agent established rapport with a customer, the
criteria must define what building customer rapport means, e.g., addressing the customer by name
throughout the call or acknowledging an event that the customer mentioned in the course of the call
(anniversary, birthday, vacation plans, etc.). Keep in mind that what constitutes building rapport
at one company may be the opposite at another: some companies want agents to address customers by
their first name, while others insist that only a title and last name be used.
The criteria should also reflect which section and questions in the QA evaluation form should be
used to score the various call elements. While this may seem obvious, it is often quite difficult.
Consider this example: A quality monitoring form has the following three questions in the
communication skills section: Demonstrated effective listening skills? (3 points), Expressed empathy
and concern as appropriate? (1 point), and Established rapport with the customer? (2 points). If
during the course of a call a customer mentions that they were recently hospitalized and the agent
does not acknowledge that, where should this issue be captured and scored in the prior three
questions? The answer is that as long as it is consistently addressed in the same question, it doesn’t
matter. The quality monitoring criteria must specify where the issue needs to be addressed.
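One way to keep the criteria explicit and maintainable is to record, for each question, the form section that owns it, its point value, and the observable behaviors that satisfy it. A minimal sketch; the entries, wording and structure are illustrative, not a prescribed format:

```python
# Illustrative criteria entries; section names and definitions are examples only.
MONITORING_CRITERIA = {
    "Established rapport with the customer": {
        "section": "Communication Skills",
        "points": 2,
        "definition": ("Addressed the customer by name during the call, or "
                       "acknowledged an event the customer mentioned, such as "
                       "an anniversary, birthday or vacation plans."),
    },
    "Expressed empathy and concern, as appropriate": {
        "section": "Communication Skills",
        "points": 1,
        "definition": ("Acknowledged personal circumstances the caller raised, "
                       "such as a recent hospitalization."),
    },
}
```

Because company norms differ (first name versus title and last name, for example), the definition text, not the question label, is what reviewers score against.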
Action Item: Assign a member of the quality assurance review team to be responsible for overseeing the quality monitoring criteria and maintaining them, as required.
7.2 Why are Quality Monitoring Criteria and Guidelines Necessary?
Quality monitoring criteria and guidelines take the “guesswork” out of figuring out the right
way to evaluate and score a transaction. The guidelines standardize the use of QA evaluation forms,
allowing QA reviewers to handle each transaction objectively and consistently. They also safeguard
against “double‐dipping” – a term used by agents to refer to a situation where they have lost points
for more than one question on a QA evaluation for the same issue. These issues are flushed out and
resolved during the testing and calibration process.
Action Item: Invest time up front to develop clear and concise quality monitoring criteria and guidelines. This yields a standardized approach and helps minimize misunderstandings about how to conduct QA evaluations.
8. Calibration
Calibration, a method for building consensus and delivering a standardized evaluation tool,
is key to the success of all QA programs. It is a process intended to ensure that QA results are valid
and based on reliable measurement tools. For a QA program to be credible and produce accurate
and dependable results, it’s essential that all reviewers evaluate transactions on a consistent basis.
Calibration is not a one‐time event. To keep all evaluators synchronized, calibration must be done on
an ongoing basis and include all people involved in conducting evaluations. It’s also a great way to
build camaraderie and support for a QA initiative. Calibration should be conducted on a monthly
basis. If a contact center is a multi‐site environment, all sites should participate in group‐wide
calibration sessions to ensure that transactions are being evaluated consistently regardless of the
site where the transaction is handled.
8.1 What is Calibration?
Calibration is a process where all QA reviewers discuss how to score various types of
transactions. The QA reviewers meet and review agent transactions. Every individual scores the
same transactions and then scoring differences are identified. The reviewers then discuss the
reasons for the differences and reach consensus. DMG recommends that agents be invited to
participate in calibration sessions so that they can gain an appreciation of the effort and rigor
applied to the QA process. When conducted properly, calibration sessions foster collaboration and
establish consensus on how quality monitoring criteria should be applied to each question and
scored on the evaluation form.
Action Item: Update quality monitoring criteria based on the consensus reached in calibration sessions.
8.2 Benefits of Calibration
When performed consistently, generally monthly, calibration is a proven approach for maintaining
program integrity and equity, and ensuring that agents are treated fairly. Among its many benefits,
calibration:
• Helps develop effective QA evaluation forms
• Teaches QA reviewers how to apply evaluation criteria and perform QA evaluations on a
consistent basis
• Builds consensus among all QA reviewers
• Helps maintain an open dialogue between the QA team and management
• Keeps staff updated about changes to the program, scoring criteria and the evaluation form,
as the program evolves
• Enhances agent perception of the program’s credibility and fairness
• Fosters collaboration and camaraderie among QA reviewers
• Keeps supervisors well versed in how agents are being evaluated and facilitates more
effective coaching
8.3 The Calibration Process
Figure 10 provides a general overview of how calibration works in most contact centers,
although this process varies among companies. A set of recorded calls (and/or emails and chats in a
multi‐channel environment) are tagged for calibration and sent to all QA reviewers. Each reviewer
evaluates and scores the transactions. The QA manager generates a report that reflects scoring
variances between the reviewers for each transaction; this must be done at the question level to be
useful. (In some organizations, the scoring variances are calculated against a “master evaluation”
that is designated as the department standard.) A calibration meeting is held to review scoring
results and to discuss the variances. If a scoring discrepancy is uncovered, each reviewer presents a
justification of their scoring. In most cases, this generates a lively discussion that requires the group
to replay the transaction, or verify information in the procedure or training guide. Once all opinions
are presented, the group reaches a consensus on the best way to score each question. This is likely
to address how the question was rated and/or which form category should be used to capture a
particular issue. This is an important part of the calibration process, as point values vary depending
on where call elements are scored. The quality monitoring criteria guidelines (as well as procedure or
training manuals, if applicable) should be modified or updated based on the consensus outcome.
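Since the variance report is only useful at the question level, it can be reduced to a per-question spread across reviewers. A minimal sketch, assuming every reviewer scored the same set of questions:

```python
from statistics import pstdev

def question_variances(scores_by_reviewer):
    """scores_by_reviewer maps reviewer -> {question_id: points awarded}.
    Returns the population standard deviation per question, so the
    calibration meeting can focus on where reviewers disagree most."""
    questions = next(iter(scores_by_reviewer.values()))
    return {
        q: pstdev(scores[q] for scores in scores_by_reviewer.values())
        for q in questions
    }
```

A spread of zero means the reviewers are already calibrated on that question; the largest spreads are the ones worth replaying the transaction for.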
Figure 10: The Calibration Process
Source: DMG Consulting LLC, April 2009
Action Item: Include agents in calibration sessions. This helps them appreciate the effort management puts into accurately assessing calls and emails and fairly evaluating agent performance.
9. Quality Assurance Program Mechanics and Processes
An important part of every QA program is determining the number of evaluations that must
be completed to come up with either operationally or scientifically valid results. Once this number is
determined, management must decide if it can afford the resources required to conduct the optimal
number of evaluations. If the answer is “no,” as it is in many contact centers, then the challenge is to
come up with a lower number of evaluations to perform that will still provide insight into agent
performance and transaction trends. Determining the number and frequency of evaluations is called
defining the program mechanics.
9.1 Transaction Selection Criteria
Identifying transactions for QA is an important part of the process. There are four primary methods
for selecting transactions to be reviewed. The first approach is to set up a schedule to capture
transactions at pre‐defined intervals. The second approach is to randomly capture transactions. The
third approach is to use business rules to identify calls that require attention. The fourth method is to
use automation, such as speech or screen analytics, to identify transactions that are either really
good or really bad and require management attention. Regardless of the approach used, it’s
essential that the method capture transactions fairly and equitably. The capture method should be
communicated to agents so that they know how transactions are selected for review.
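The rule-based and random methods are often combined. A sketch of a mixed selector; the rule predicates and field names are hypothetical, not part of any specific product:

```python
import random

def select_for_qa(transactions, n_random=3, rules=()):
    """Return rule-flagged transactions plus a random sample of the rest.
    `transactions` is a list of dicts; `rules` are predicates that flag
    calls needing management attention (e.g. unusually long calls)."""
    flagged = [t for t in transactions if any(rule(t) for rule in rules)]
    remaining = [t for t in transactions if t not in flagged]
    return flagged + random.sample(remaining, min(n_random, len(remaining)))
```

Whatever mix is used, the selection logic should be documented and shared with agents so the capture method is seen as fair.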
Once transactions are captured, they can be further qualified based on any number of factors,
including call direction, call duration, call type (based on wrap‐up, disposition or other interaction
classification mechanism), product type, etc. Date of interaction is another key factor as, depending
on the total number of evaluations being conducted per agent/per month, it’s a best practice to
spread out the evaluations over the course of the month. This helps to ensure that agents receive
timely feedback and gives them an opportunity to make adjustments to their performance.
Action Item: Set up a process to capture a fair and representative set of transactions for the QA team to review.
9.2 Determining the Number and Frequency of Evaluations
The goal of a quality assurance program is to provide a statistically significant analysis of
service delivery and the quality of customer interactions. To accurately measure service quality and
establish credibility and reliability for the quality assurance process, randomly captured calls should
be evaluated consistently for all agents, on a regular basis. Unfortunately, most contact centers do
not have the resources to conduct QA on a statistically valid sample of transactions. Instead,
management generally specifies a number of calls/emails/chat sessions to evaluate on a weekly
and/or monthly basis. This number is based on QA resources. (To determine the number of QA
sessions that can be performed in a day, calculate the amount of time it takes to do an evaluation
and deliver a coaching session. Then divide the working hours available per day, generally 6.5, by
that amount of time.)
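The capacity arithmetic, dividing the working hours available per day by the time one evaluation plus its coaching session consumes, can be sketched as:

```python
def sessions_per_day(eval_minutes, coaching_minutes, working_hours=6.5):
    """Daily QA capacity: available working minutes divided by the time
    one evaluation plus its coaching session takes."""
    minutes_per_session = eval_minutes + coaching_minutes
    return int(working_hours * 60 // minutes_per_session)
```

For example, a 30-minute evaluation plus a 20-minute coaching session yields 7 sessions in a 6.5-hour working day.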
There are no industry guidelines for determining the ratio or number of calls that should be
monitored per contact center on a daily, weekly or monthly basis. This number varies based on total
number of agents, type of contact center (multi‐channel or multi‐skill) and transaction volume.
Here is a strategy that some contact centers use to determine the number of evaluations to monitor
per agent/month. Start by evaluating 10 calls per agent for a month to obtain a baseline figure. Each
subsequent month, reduce the number of evaluations for each agent by 1 and compare the
results/findings to the prior month. Continue this process until the variance between the results is
significant. At that point, the number of calls evaluated is as low as it can go. (The most common
number of calls evaluated for agents on a monthly basis is 3 to 5.) Another way contact centers
address QA resource limitations is to split the number of agent evaluations that need to be
completed among supervisors and quality reviewers. While it's important to have dedicated QA
resources, it's also essential for line supervisors or managers to keep informed about their agents'
performance. Evaluating agents is a great way to stay apprised of what’s happening with agents and
the contact center.
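The step-down strategy above can be expressed as a simple monthly rule. This sketch assumes a 5% change in results counts as "significant" and that, once results diverge, the center reverts to the last stable count; both are illustrative interpretations, not figures from this guide:

```python
def next_sample_size(current_n, prior_avg, current_avg, threshold=0.05):
    """Step-down rule for evaluations per agent per month: keep reducing
    by one until results diverge noticeably from the prior month, then
    step back up to the last count that produced stable results."""
    if prior_avg and abs(current_avg - prior_avg) / prior_avg > threshold:
        return current_n + 1  # variance became significant: step back up
    return max(current_n - 1, 1)
```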
Action Item: Determine an appropriate number of transactions to evaluate for each agent on a weekly or monthly basis to give management insight into what's happening in the contact center and to give agents a good idea of their performance.
9.3 Coaching
Coaching is another vital component of a successful QA program. Coaching is the process of
delivering feedback on a frequent, consistent and timely basis to agents. Effective coaching sessions
recognize agent strengths and assets in addition to identifying the areas where agents have
improvement opportunities. Coaching sessions that are instructive, supportive and collaborative
demonstrate management’s commitment to agents’ success. As depicted in Figure 11, a variety of
coaching methods can be employed. The methods include personalized coaching, best practice
clips, broadcast messages and real‐time feedback. Regardless of the method used, what is
important is that the required number of coaching sessions is, in fact, completed. Lack of coaching
or poor training is one of the primary reasons QA programs fail or do not achieve their anticipated
results. It also leads to agent attrition.
Figure 11: Coaching Methods (Personalized Coaching, Best Practice Clips, Broadcast Messaging, Real-Time Feedback)
Source: DMG Consulting LLC, April 2009
Coaching Best Practices
• Always deliver coaching in a private setting to maintain confidentiality and avoid outside distractions and interruptions
• Begin coaching sessions on a positive note, highlighting agent strengths and recognizing incremental improvements
• When discussing performance opportunities, focus on specific examples and detail why the agent did not meet the performance standard
• Provide the agent with specific examples of how the situation could have been handled better, i.e., “Maybe a better way to explain this to the caller is to say…” or “This information can be found on this screen…”
• Encourage open dialogue by asking the agent to talk about areas where he/she is having difficulties, and discuss what kind of support would be most helpful
• Establish incremental goals for the agent to achieve by the next coaching session
• Follow up as planned on any deliverables
• Provide informal support between coaching sessions: drop by, listen for a few minutes and offer encouragement; a few kind words can have a significant impact
• End the session on a positive note
Action Item: Coach agents frequently using the method that is most effective for each individual.
9.4 Evaluation Feedback and Escalation Process
The most effective QA programs facilitate two‐way communication between reviewers and
agents. Part of program development involves setting up the process for agents to receive their
completed evaluations, review them along with any recorded transactions, and channel any
questions or feedback back to the reviewer. Particularly in the early stages of the roll‐out, agents
who are not accustomed to being monitored are likely to challenge low scores. It’s important to have
a process that allows agents to voice their objections and escalate their concerns without fear
of repercussions. The following process is recommended:
• Give agents their completed QA evaluation(s) and underlying transaction(s) prior to them
sitting down with the QA reviewer
• If a coaching session is not already scheduled, allow agents to request a meeting with the
QA reviewer to discuss their evaluation(s)
• QA reviewers must be able to explain and substantiate how the transaction was evaluated
based on the recording and system information, citing written procedure and policy, or, in
case of an error, adjusting the score
• If the agent still disagrees with the score and how the form was evaluated, forward the
evaluation to a designated “QA arbitrator” for final disposition
Challenges to evaluation scores, as a rule, do not happen frequently. When they do arise, it's
important for agents to feel comfortable raising concerns and voicing dissent.
Action Item: Invite agent participation in the QA feedback process, so that they feel empowered by the process.
9.5 Ongoing Training
Ongoing training is important for agents and QA specialists. Beyond basic introductory
training, agents require up‐training on a consistent basis to keep abreast of changes in policies,
procedures, processes, products, services and systems. Besides being critical for optimizing agent
performance, training and coaching contribute significantly to agent job satisfaction, morale and
motivation.
Effective QA programs have a closed‐loop process with training so that issues identified during
evaluations are shared with trainers on a timely basis. An important part of the quality reviewer’s
responsibility is to identify trends and training opportunities to address performance gaps. The QA
program should include a process for funneling recommendations to the training department and
ensuring that all training, reference materials, policies and procedures are accurate and up‐to‐date.
It is also recommended that the quality manager and training department hold monthly meetings to
review training effectiveness and develop action plans to address any new
trends/issues/opportunities that are uncovered in the quality monitoring process.
Action Item: Use QA to identify agent training needs. QA specialists and trainers should work closely together to ensure that agents receive the training they need to consistently deliver an outstanding customer experience.
9.6 Addressing Agent Performance Issues
For a QA program to be taken seriously by agents and be effective in changing their
behavior, its results must be reflected in each agent’s mid‐year and annual employee performance
review. Quality assurance evaluation results should be an important component of the process for
determining agents’ raises. However, for this to happen, contact center management must work
with the human resources group, which generally sets the format for employee reviews. Agents
should never be surprised by the results of their performance reviews because they should be
receiving monthly, or at least periodic coaching sessions throughout the year as part of the QA
program. The following five steps will help quality reviewers and managers assist agents in meeting
contact center performance goals.
• Diagnose performance issues
Managers must be able to clearly articulate performance issues so that agents know what
they need to change and how to fix it. When preparing an agent’s performance appraisal,
review their quality assurance evaluation forms to determine where performance
opportunities exist and identify any trends. For example, determine if the agent consistently
has low or failing scores in a particular category or for a particular call type. Are there
multiple performance issues, e.g., deficiencies in communication skills or problems adhering
to procedures, accessing information or processing transactions? Valuable insights can be
gained by conducting a side‐by‐side QA session with the agent to view first‐hand where and
when challenges arise or if work habits are contributing to their performance issues. Based
on the results of the trend analysis and/or observations from the side‐by‐side sessions,
document the underlying causes of performance problems.
• Create an action plan
Once the area(s) where the agent needs help have been identified, create an action plan to
address them. (Most performance reviews include a section for next steps or performance
improvements.) The contact center manager should discuss the findings with the agent’s
supervisor, and then work with the supervisor and training department to identify the
resources that can be used to support the agent's development. This may include enrolling
the agent in additional training or up‐training sessions, eLearning or coaching sessions,
providing reference materials or job aids, offering additional system or communication skills
training, or regularly assigning time for the agent to sit with an outstanding colleague to
learn the correct way to handle interactions.
• Communicate with the agent
Depending on the structure of the organization, the QA manager and/or the
supervisor should discuss performance issues with the agent. Begin by making sure that the
agent understands the quality evaluation criteria and how calls are scored. Be sure that the
assessment is constructive and recognizes the agent's strengths as well as performance
opportunities. Review the action plan that has been developed with the agent and
emphasize that it is intended to help them succeed. (While there are exceptions, most
people want to succeed but don’t always know what they need to do. Effective
communication may convert weak agents into top performers.)
• Provide consistent feedback and reinforcement
The most effective improvement programs provide continuous and encouraging feedback to
motivate contact center agents. When an agent is struggling to perform, it's a good idea to
offer him/her some extra attention. Conduct routine evaluations of the agent's calls and
provide timely feedback. When needed, provide additional targeted coaching that addresses
the agent's specific needs and reinforce what he/she is doing right. This feedback can be
formal or informal. For example, when a supervisor is walking by, he/she could take a few
minutes to stand by and listen to the agent and provide immediate feedback. Or, if the
agent is remote, the supervisor could dial in and listen to some calls to provide additional
feedback. Agents who know their management cares almost always perform better.
• Monitor, track, and recognize improvements
Monitor the agent's performance to track progress and improvement. Motivate the agent by
recognizing and praising incremental improvements and continue to provide coaching and
support for performance opportunities. Reward employees for good performance. Provide
monthly recognition to employees with the best quality. Money is a nice reward, but public
recognition can keep employees focused on what is important: quality and customer
satisfaction.
Action Item: Integrate the results of the QA program into the department’s semi‐annual and yearly review process. Address agent performance issues identified during the QA process on a timely basis with directed, constructive and positive feedback.
9.7 Rewards and Recognition
It’s important for agents to consider QA a departmental initiative that contributes to their
success, instead of a “gotcha” program. This can be achieved by using QA to readily identify and
reward outstanding performers. Establish a process for identifying and recognizing agents who
achieve quality monitoring excellence. This motivates the right behaviors and performance
throughout the department. QA reviewers, in cooperation with managers and trainers, should
identify and distribute best practice clips of outstanding agent interactions to all agents. Agents
learn best from their peers, so in addition to providing recognition, best practice clips provide
excellent examples of effective techniques for all agents to emulate.
There are many ways to acknowledge top performers, but it is essential to do so on a consistent
basis, and to make sure that the same people are not the ones always recognized. (Avoiding this
common pitfall will help the QA program succeed.) Ways to recognize different agents on a
monthly basis include identifying the top three monthly performers, the most improved agent from
the prior month, the top three performers for the quarter, etc. Incentives are ideal, but if prohibited,
then some other vehicle such as a lunch voucher, picture and write‐up in the contact center or
company newsletter, inclusion in a “walk of fame,” special parking spot, lunch with the CEO, etc.,
can be used. Invite top performing agents to take an active role in departmental activities, such as
coaching new hires, delivering an up‐training session, becoming a subject matter expert on a new
initiative, or cross‐training on a new function. This motivates the right performance behaviors and
keeps people engaged.
Action Item: Identify and reward outstanding performers, but build a process that rewards different people on a monthly basis to broaden the appeal and acceptance of the QA program.
9.8 Updating Procedures/Training
Quality assurance is a continuous cycle of improvement and measurement that must be
adapted as products, processes, policies, customer expectations, systems, or business requirements
change. It’s important to develop a process for ensuring that guidelines, reference and training
materials are updated on a timely basis so that agents always have the right information. The QA
manager or specialists should be involved in keeping procedures up‐to‐date, as should the training
team. The appendix includes a sample format that managers can use to create policies and
procedures.
Action Item: Set up a process that keeps departmental policies, procedures and training materials up‐to‐date, based on feedback received from the QA program.
9.9 Monitoring Quality Assurance Reviewers
Just as agents need to be evaluated on a consistent basis to monitor and measure their
performance, the same holds true for quality assurance reviewers. Managers must oversee the
performance of QA reviewers to make sure that they are meeting their productivity goals for the
number of QA reviews and coaching sessions each month and that they are doing so at the highest
level of quality. Exit interviews have found that one of the main reasons agents leave is that they
do not receive timely feedback (and in many cases, any feedback at all) from management and
don’t think the organization cares about them.
Calibration, which is discussed in section 8, is important for maintaining the integrity and
consistency of the QA program. If managers attend some of the calibration sessions – which is a best
practice – they can also identify QA specialists who are struggling with the program. Management
can then address the performance issues and get the QA specialist re‐aligned with the goals of the
QA program.
It is also a good idea for the QA manager or the contact center manager (if they are overseeing the
program) to conduct monthly or periodic audits of all reviewers. Here is how the audit process
works:
• The QA manager sets an accuracy threshold that QA reviewers must maintain in order to pass
the monthly audit.
• The QA manager selects a certain number of interactions per channel to include in the audit (at
least the same number of interactions that are evaluated for each agent per month).
• The QA manager evaluates the interactions selected for the audit to create a baseline/master
evaluation against which every QA evaluator’s performance is compared.
• The QA manager sends the group of transactions to be used for the audit to all of the QA
reviewers to evaluate independently. The completed evaluations are returned to the QA
manager; each evaluation is compared against the baseline evaluation for that particular
interaction to identify scoring variances.
• The QA manager produces a report that reflects scoring variances at the question, section and
overall evaluation level; based on the established accuracy threshold, the QA manager tracks
the performance of each reviewer and whether they passed or failed the audit; the manager is
looking to make sure that each QA reviewer is evaluating transactions consistently.
• Based on the performance opportunities uncovered in the audit process, the QA manager
should reach out to various QA reviewers and provide feedback and coaching.
• Any QA reviewer who fails the monthly audit for more than three months and has been coached
should be removed from performing QA evaluations until they are fully retrained.
• After retraining a QA reviewer, audit them again to determine if they are re‐aligned with the rest
of the team; if they are, give them positive feedback and, if not, remove them from doing QA
reviews.
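The variance comparison at the heart of this audit process can be sketched in a few lines of code. This is an illustrative example only: the accuracy formula, the 90% threshold, and all interaction names are assumptions for the sketch, not values prescribed by this Guide.

```python
# Sketch of the monthly QA reviewer audit: compare a reviewer's scores
# against the QA manager's baseline evaluations and decide pass/fail.
# Threshold and accuracy formula are hypothetical examples.

def audit_reviewer(baseline, reviewer_scores, threshold=0.90):
    """baseline / reviewer_scores: dicts mapping interaction id -> score (0-100).

    Accuracy for one interaction = 1 - (absolute variance / 100).
    The reviewer passes if average accuracy meets the threshold.
    """
    accuracies = []
    variances = {}
    for interaction_id, base_score in baseline.items():
        variance = abs(reviewer_scores[interaction_id] - base_score)
        variances[interaction_id] = variance
        accuracies.append(1 - variance / 100)
    avg_accuracy = sum(accuracies) / len(accuracies)
    return {"passed": avg_accuracy >= threshold,
            "average_accuracy": avg_accuracy,
            "variances": variances}

baseline = {"call-01": 85, "call-02": 72, "call-03": 91}
reviewer = {"call-01": 83, "call-02": 70, "call-03": 60}
result = audit_reviewer(baseline, reviewer)
```

A large variance on a single interaction (call-03 above) can pull a reviewer below the threshold, which is exactly the kind of inconsistency the calibration and coaching steps are meant to surface.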
Action Item: Management should evaluate QA evaluators on a periodic basis to keep the QA program aligned with the needs of the business. It will also make it clear to agents that management is invested in this program and taking it very seriously. This will help reduce complaints from agents about being treated unfairly.
9.10 Reporting
Accurate, timely and effective reporting is a requirement for any major initiative. An
important part of setting up a QA program is to define the organization’s reporting needs. If QA is
being done manually, it’s necessary to design and create the report templates. If using a QA
application, work with the vendor to enhance or create custom reports if the solution does not have
all the reports needed. However, regardless of the approach used, it’s a good idea to define the
organization’s reporting requirements early on in the process in order to know what information
needs to be collected.
Below are a few reports that many organizations have found useful.
Contact Center Quality Management Report for: December 2008
Average Contact Center QA Score: This report displays the average QA score for the contact center
for the current month, compared to the prior two months. It is useful for determining the contact
center’s overall quality performance.
Figure 12: Average Contact Center QA Score: December 2008

  Month            Average QA Score
  December 2008    80.5%
  November 2008    74.6%
  October 2008     81.3%
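The monthly roll-up behind a report like Figure 12 can be sketched as follows. The record layout (a list of evaluations with a month and a score) is an assumption for illustration; in practice the data would come from the QA database or application.

```python
# Hypothetical sketch: computing the monthly average QA score from
# completed evaluations, as in the Figure 12 report.

from collections import defaultdict

evaluations = [
    {"month": "2008-12", "score": 82.0},
    {"month": "2008-12", "score": 79.0},
    {"month": "2008-11", "score": 74.6},
]

# Group scores by month, then average each group.
by_month = defaultdict(list)
for ev in evaluations:
    by_month[ev["month"]].append(ev["score"])

monthly_average = {month: round(sum(scores) / len(scores), 1)
                   for month, scores in by_month.items()}
```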
Average Scores by Evaluation Section: This report displays the monthly average scores for the
contact center at the evaluation section level, and as compared to section averages over the past
two months.
Figure 13: Contact Center Average Scores by Evaluation Section: December 2008

  Section                           December   November   October
  Greeting (5 pts)                  3.4        2.3        3.9
  Verification (10 pts)             8.6        7.6        6.8
  Plan/Benefit Knowledge (25 pts)   22.7       21.8       22.3
  Inquiry Resolution (25 pts)       19.6       21.4       18.9
  System Knowledge/Usage (10 pts)   6.1        7.8        5.9
  Hold/Mute/Transfer (6 pts)        4.3        4.7        3.9
  Communication Skills (14 pts)     9.7        8.1        7.5
  Closing (5 pts)                   4.7        5.1        3.2
Average Scores by Evaluation Question: This report displays the monthly average scores for the
contact center at the evaluation question level, and as compared to specific question averages over
the past two months. The purpose is to surface new trends or areas where training opportunities
exist, and to determine if the prior month’s training was effective.
Figure 14: Average Scores by Evaluation Question: December 2008
(Score columns: December, November, October)

Greeting (5 pts)
• Used call greeting as defined in Greeting Policy (2 pts)
• Documented caller name and phone number (3 pts)

Verification (10 pts)
• Verified caller as defined in Verification Policy (5 pts)
• Verified eligibility as defined in Eligibility Policy (3 pts)
• Asked caller for Member and/or Patient’s Date of Birth (1 pt)
• Asked caller for address on file (1 pt)

Plan/Benefit Knowledge (25 pts)
• Accurately identified plan type, benefit, or claim (5 pts)
• Demonstrated thorough knowledge of plan type, benefit or claim information (5 pts)
• Provided complete and accurate information or instructions in accordance with established procedure (15 pts)

Inquiry Resolution (25 pts)
• Accurately understood the nature of the caller’s inquiry (5 pts)
• Effectively/accurately resolved inquiry/issue in accordance with established procedure (5 pts)
• Completed fulfillment/referral/follow-up as promised/required (5 pts)
• Provided alternatives, as appropriate (3 pts)
• Provided time frames, as appropriate (2 pts)
• Ensured caller fully understood explanation, process, and/or next steps (5 pts)
System Knowledge/Usage (10 pts)
• Effectively accessed and utilized all appropriate systems, screens and fields to obtain information to resolve inquiry (8 pts)
• Accurately utilized wrap-up (2 pts)

Hold/Mute/Transfer (6 pts)
• Utilized hold/mute as defined in Hold/Mute Policy (2 pts)
• Performed transfer as defined in the Warm Transfer Policy (2 pts)
• Transferred the call to the correct area (2 pts)

Communication Skills (14 pts)
• Maintained a courteous, pleasant, and respectful tone throughout the call (3 pts)
• Conveyed information clearly and confidently and in a manner that was easily understood (3 pts)
• Demonstrated effective listening skills (2 pts)
• Expressed empathy and concern as appropriate (1 pt)
• Efficiently managed time and call flow (call management) (2 pts)
• Demonstrated professionalism (call etiquette) (3 pts)

Closing (5 pts)
• Used call closing as defined in Call Closing Policy (2 pts)
• Accurately documented information in accordance with established procedure (3 pts)
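An evaluation form like the one in Figure 14 rolls up in a straightforward way: question scores sum to section scores, and section scores sum to an overall score out of 100 points. The sketch below illustrates this with a small, hypothetical subset of sections and sample earned points; the data structure is an assumption, not part of the Guide.

```python
# Illustrative roll-up of a completed QA evaluation form:
# questions -> sections -> overall percentage score.
# Sample earned points are hypothetical.

form = {
    "Greeting":     {"max": 5,  "questions": [2, 1.5]},   # earned points per question
    "Verification": {"max": 10, "questions": [5, 3, 1, 1]},
    "Closing":      {"max": 5,  "questions": [2, 2]},
}

section_scores = {name: sum(sec["questions"]) for name, sec in form.items()}
total_earned = sum(section_scores.values())
total_possible = sum(sec["max"] for sec in form.values())
overall_pct = 100 * total_earned / total_possible
```

Reporting at the question, section and overall level, as described in the audit and reporting sections, is then just a matter of aggregating at each of these three tiers.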
Average QA Scores by Agent: This report displays the average QA scores for each agent in the
contact center, for the current month and prior two months. It is useful in identifying agents who are
not meeting goal and individual agent performance trends.
Figure 15: Average QA Scores by Agent: December 2008

  Agent        December   November   October
  Agent Name
  Agent Name
  Agent Name
  Agent Name
  Agent Name
  Agent Name
  Agent Name
  Agent Name
QA Scores by Agent by Evaluation: This report provides a breakdown of the scores that each agent
achieved for their evaluations in the current month. It is helpful in identifying very good or very poor
performance or interactions that may require attention or follow‐up by the supervisor.
Figure 16: QA Scores by Agent by Evaluation: December 2008

  Agent        Evaluation 1   Evaluation 2   Evaluation 3   Evaluation 4
  Agent Name
  Agent Name
  Agent Name
  Agent Name
  Agent Name
  Agent Name
  Agent Name
  Agent Name
Average QA Scores by Agent by Category: This report provides a breakdown of the average scores
by category for each agent. It helps identify individual needs and coaching opportunities
specific to each agent or team.
Figure 17: Average QA Scores by Agent by Category: December 2008
  Agent   Greeting   Verification   Plan/Benefit Knowledge   Inquiry Resolution   System Knowledge/Usage   Hold/Mute/Transfer   Closing
Action Item: Invest the time up front to define the reports needed to support your QA initiative. Ask all relevant constituents – contact center and QA managers, supervisors, QA specialists, trainers and possibly, marketing – what information they would like to see from the QA program, and design reports to meet these needs.
9.11 QA Database
If an automated application is not being used for QA, it is necessary to build a database for
managing QA data. Most organizations build this in either Access or Excel. The goal is to make it
easy to enter and retrieve data and to run reports.
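For organizations comfortable with a small amount of scripting, a lightweight alternative to Access or Excel is a simple relational table. The sketch below uses Python's built-in sqlite3 module; the table and column names are assumptions for illustration, not a schema prescribed by this Guide.

```python
# Minimal sketch of a home-grown QA database using sqlite3.
# Table/column names are hypothetical examples.

import sqlite3

conn = sqlite3.connect(":memory:")  # use a file path for a persistent database
conn.execute("""
    CREATE TABLE qa_evaluations (
        id        INTEGER PRIMARY KEY,
        agent     TEXT NOT NULL,
        eval_date TEXT NOT NULL,   -- ISO date of the evaluation
        section   TEXT NOT NULL,   -- e.g. Greeting, Verification
        score     REAL NOT NULL
    )
""")
conn.execute("INSERT INTO qa_evaluations (agent, eval_date, section, score) "
             "VALUES (?, ?, ?, ?)", ("A. Agent", "2008-12-05", "Greeting", 4.0))
conn.commit()

# Reporting then becomes a simple query, e.g. average score by agent:
rows = conn.execute("SELECT agent, AVG(score) FROM qa_evaluations "
                    "GROUP BY agent").fetchall()
```

Because the data is in one normalized table, the trend and variance reports described earlier reduce to GROUP BY queries rather than manual spreadsheet work.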
Action Item: If you are not using a QA/recording system, build a database for tracking agent QA scores so that this data can be used to identify trends and create reports.
10. Kicking Off the QA Program
Once the QA program is built, it’s time to kick it off. The entire contact center staff must be
trained before the pilot begins. This is very important because it sets the tone for the
program. It is also where an organization can introduce its own “personality” into the program.
DMG suggests coming up with a name for the QA program so that it isn’t just called “the QA
program.” Some organizations run a name contest, which is another way of getting buy‐in and
support from contact center staff.
The kick‐off presentation should address the following topics:
• Introduction of the QA leaders
• Explanation of QA
• Description of its benefits
• What QA means to agents, including how it can help their careers
• Discussion of program mechanics
• Review of the QA evaluation form
• Upcoming training
• Rewards and recognition
Action Item: Make the QA kick‐off a celebratory and fun affair that explains the benefits of QA for agents, customers and the greater enterprise. Use this kick‐off to get staff buy‐in.
10.1 Agent Training
Training is necessary to prepare agents for the QA program and the changes that it will bring
to the department. Training is also a great way to alleviate apprehension about the program; the
more information people have about the QA program, the more comfortable and welcoming they
will be. The outline in Figure 18 below presents the topics that should be covered in an agent QA
training program. However, every contact center should customize this basic training program to
meet their needs. Additionally, going forward, a QA module should be added to the new‐hire agent
training program.
Figure 18: QA Agent Training Outline

  Agenda Item/Activity    Topics covered    Time
QA and its Benefits
• What is QA? • How will QA benefit you and your
customers? • How was the program developed? • What are the program components? • What are the program goals? • Who are the people that will be involved?
1.5 hrs
Procedures • What are procedures? • Why are they important? • How should you use them? • How does QA use them? • How are they modified and updated?
If the procedure guide is new, the trainer will have to review the document and explain how it is organized and used.
• Role play to get accustomed to using the procedures.
3.5 hrs
Policies • What are policies? • How do policies differ from procedures? • How should you use them? • How does QA use them? • How are they modified and updated?
If the policy guide is new, the trainer will have to review the document and explain how it is organized and used.
• Role play to get accustomed to using the policies.
2.5 hrs
Agenda Item/Activity Topics covered Time
QA Form and Criteria
• How was the form developed? • What is the form’s purpose? • How is the form used? • How is the form scored? • What are the criteria used to score each
question/section? • How will the scores be used?
2 hrs
Program Mechanics
• How will the program work? • How many calls will be monitored per
week/month? • Who will be doing the evaluations and
coaching sessions? • How will calls/emails/chats be selected for
evaluation? • How will I receive my evaluations? • When can I review my evaluations? • How can I respond to my evaluations? • What if I disagree with my score?
2.5 hrs
Calibration • What is calibration? • Why is calibration necessary? • How often is calibration conducted? • Who participates in calibration?
Conduct a calibration session with agents using mock calls to demonstrate how calls will be evaluated and scored. (Agents will see that this is a complicated process that requires a great deal of skill.)
2.5 hrs
Evaluator Audits • Who will be audited? • Why are audits important? • How does the audit process work?
.5 hr
Recognition Program
• What are the recognition criteria? • How does the rewards and incentive
program work?
.5 hr
Program Pilot • What is a program pilot? • How does the pilot work? • How long will the pilot be conducted? • Will scores during the pilot count?
1 hr
Action Item: Create a QA training program to teach agents about all aspects of the QA program and its benefits. Use this program to kick off QA. Also incorporate a QA module in your new agent training program.
10.2 QA Program Pilot
After training all contact center staff and other interested parties, it’s time to conduct the
QA program pilot. There are many opinions about pilots and their purposes, and this is certainly true
for QA. If this is the organization’s initiation into QA, the pilot is necessary to give management, QA
reviewers and agents a hands‐on opportunity to get acquainted with QA in a non‐confrontational
environment where the QA scores are not being counted. If the organization is updating its QA
program, the pilot gives everyone an opportunity to get used to the new QA criteria and evaluation
forms. The timing of the pilot also varies based on the needs of the organization. If QA is new to the
organization, we suggest beginning with a 2‐ to 3‐month pilot. Then, once the QA scores begin to be
counted, we suggest setting a lenient QA goal so that more people will rate well and feel good about
the program. Then, after a 3‐month period, during which there is a great deal of training and
coaching, it’s time to begin in earnest and to set up QA goals that reflect customer expectations for
great service. While this means that it will take 4 to 6 months before the QA program is in full
operation, it will not delay the benefits, since the goal of QA is to improve agent performance and
the customer experience (and, by doing so, agent productivity). It will, however, allow
positive momentum to build and help the program succeed.
The pilot is a great opportunity to make sure that the program achieves its goals. Throughout the
pilot, QA leaders should welcome all observations and feedback. Use this input to refine the QA
form and evaluation criteria and to address training issues.
Action Item: Communicate and share pilot results on a regular basis to keep contact center staff engaged and alleviate apprehensions about the program.
11. Advanced Quality Assurance Initiatives
This Guide describes the process for building a contact center quality assurance program.
Once the program is in place and working well, some organizations want to go further and extend
the benefits and uses of their QA program. QA programs can be enhanced by adding surveying,
using QA to determine the first call resolution (FCR) rate, or performing customer experience
monitoring. Below is an explanation of these three business activities.
11.1 Surveying
The best way to know if customers are satisfied with the quality of an organization’s
products, services, processes and agents is to ask them. QA measures how well agents adhere to
internal policies and procedures; it provides an internal view. Surveying captures the customer
perspective: the external view. When survey feedback is combined with QA results, the company
learns what customers consider good service and, specifically, which agents provide it. They also find
out what processes and policies need to be changed. When done right, sharing customer survey
information about agent performance can help improve quality. Agents find it helpful when they see
survey results and read or hear customer feedback first‐hand. It helps them appreciate how their
performance impacts customer satisfaction and the customer’s perception of their company.
11.2 Customer Experience Monitoring
Customer experience monitoring is intended to provide insights into the total customer
experience. Typically, contact center managers use this process to track and review interactions that
have multiple segments, are placed on hold, are transferred, require conferences, or are repeat calls.
Customer experience monitoring is done to evaluate the overall customer experience, not just call
segments.
11.3 First Call Resolution (FCR)
First Call Resolution is a unique key performance indicator that measures the effectiveness,
efficiency and customer satisfaction level of a contact center. It is the only single metric that
provides a balanced view of the contact center’s overall performance. The challenge in using FCR is
that it is difficult to determine which calls are fully resolved during the first contact. There are many
ways to identify or calculate FCR, including asking the QA staff to determine if calls are resolved
during the first contact based on the interactions that they evaluate.
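One simple approach, sketched below, is to have QA reviewers flag each evaluated interaction as resolved or not resolved on first contact, and compute the rate from those judgments. The record structure is an assumption for illustration, and this measures FCR only over the sampled interactions, not every call.

```python
# Hedged sketch: estimating FCR from QA evaluations, where each evaluated
# interaction carries the reviewer's resolved-on-first-contact judgment.

def first_call_resolution_rate(evaluations):
    """Return the share of evaluated interactions resolved on first contact."""
    if not evaluations:
        return 0.0
    resolved = sum(1 for ev in evaluations if ev["resolved_first_contact"])
    return resolved / len(evaluations)

sample = [
    {"call_id": 1, "resolved_first_contact": True},
    {"call_id": 2, "resolved_first_contact": False},
    {"call_id": 3, "resolved_first_contact": True},
    {"call_id": 4, "resolved_first_contact": True},
]
fcr = first_call_resolution_rate(sample)
```

Because the QA sample is small relative to total call volume, treat an FCR figure derived this way as an estimate and track its trend over time rather than its absolute value.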
12. Quality Management/Liability Recording Suites
Quality Management/Liability Recording suites, also known as workforce optimization (WFO)
solutions, deliver actionable insights and findings in addition to incorporating built‐in automated
efficiencies to increase the effectiveness of the QA process and reviewer productivity. WFO suites
are complex solutions intended for contact centers, and complete suites include eight functional
modules (see Figure 19). The core modules are recording (either time division multiplexing (TDM)
or Internet Protocol (IP)‐based) and
quality assurance. Recording systems log calls for regulatory compliance, protection against
lawsuits, and quality review. Quality assurance applications are used to determine how well agents
adhere to internal policies and procedures. The other six modules are:
1. Workforce Management – forecasts and schedules agent staffing needs
2. Agent Coaching – tools to communicate with agents to assist them in improving their
performance
3. eLearning – learning management capabilities that allow training courses to be created,
issued and tracked to ensure effectiveness
4. Surveying – Web and IVR‐based solutions for creating, issuing, tracking and analyzing
customer feedback
5. Performance Management – solutions that help align contact center activities with
enterprise goals; also provide scorecards and dashboards to improve the performance of the
contact center
6. Speech Analytics – captures, structures and analyzes customer phone calls to identify the
reasons for customer calls and to gather insights
The value and benefits of these suites increase for the enterprise as additional modules are added.
WFO Suites
In the last couple of years, use of workforce management, recording and, to a lesser degree,
quality assurance modules has expanded into enterprise back offices and branches. This trend will
continue because these solutions offer quantifiable benefits for many operating areas in an
enterprise.
Figure 19: Workforce Optimization Suites
Source: DMG Consulting LLC, April 2009
[Figure 19 depicts the QM/recording suite as a stack of modules: Recording and Quality Assurance at the base, then Workforce Mgmt., Agent Coaching, eLearning, Surveying, Back Office, Performance Mgmt. and Speech Analytics. Scope widens from call center to contact center to enterprise, and the value proposition rises as modules are added.]
In the last few years, the QA vendors have delivered many innovations to the market to make these
applications more helpful and actionable for contact centers. These enhancements include:
• Automated call scoring – every recorded interaction is automatically scored, allowing rapid
identification of, and action on, calls that do not meet pre-defined quality requirements. Instead
of evaluating a random sample of calls, organizations can focus limited QA resources on
monitoring high-impact calls.
• Flexible form development environments – the ability to easily create a wide variety of
evaluation forms and associate the most appropriate form based on call type, category or
disposition. Built‐in efficiencies include wizards for form creation or the ability to clone and
modify an existing form, a wide variety of scoring methodologies, the ability to create forms
based on skills or call components, the ability to add evaluator hints for scoring criteria,
comment boxes to capture coaching tips, spell check, form preview and test capabilities, and
the ability to create specialized forms for different types of evaluations such as agent self‐
evaluations, calibration or customer experience monitoring.
• Calibration – the ability to designate calls for calibration, automatically deploy calibration
sessions to supervisors and reviewers, track completion, and report on scoring variances at
the question, section or form level.
• Call categorization – the ability to automatically or manually categorize calls so that QA
efforts can encompass all call types received by the contact center and evaluate how agents
perform across all functions. This also allows reviewers to categorize calls and designate
them for review by the training department or a cross‐functional team if a business process
optimization opportunity exists.
• Search/retrieval/replay – the ability to search for and retrieve transactions based on a wide
variety of criteria and metadata. Automatically delivers the transactions that a reviewer is
responsible for evaluating to the desktop and tracks completion status.
During desktop replay, evaluators can bookmark call segments and annotate them with
coaching comments that are tied to specific sections, questions or skills in the agent’s
evaluation.
• Coaching/eLearning – allows supervisors to easily create, deploy and track personalized
coaching sessions that address agent‐specific performance opportunities. Includes the
ability to share best practice clips with agents and the training department. eLearning
capabilities allow reviewers to assign learning sessions to agents while they are performing
an evaluation, or automatically deploy the courses based on performance thresholds.
• Reports and dashboards – automated summary and detailed reports that can be scheduled
to run or designated as a dashboard report for tracking and trending QA results.
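To illustrate how the automated call scoring described above might feed the rest of the QA workflow, here is a simplified sketch. The score values, the 80-point threshold and the triage logic are invented for the example and do not reflect any particular vendor's product.

```python
# Simplified sketch of automated call scoring with threshold-driven triage.
# The threshold and all score values are illustrative assumptions.

QUALITY_THRESHOLD = 80  # pre-defined minimum acceptable quality score

def triage_scored_calls(scored_calls, threshold=QUALITY_THRESHOLD):
    """Split automatically scored calls into flagged (needs review) and passed."""
    flagged = [c for c in scored_calls if c["score"] < threshold]
    passed = [c for c in scored_calls if c["score"] >= threshold]
    return flagged, passed

calls = [
    {"call_id": "A-101", "agent": "Dana", "score": 92},
    {"call_id": "A-102", "agent": "Lee",  "score": 71},
    {"call_id": "A-103", "agent": "Dana", "score": 85},
]
flagged, passed = triage_scored_calls(calls)
for call in flagged:
    # In a real WFO suite, this step might auto-assign an eLearning course
    # or route the call to a reviewer's evaluation queue.
    print(f"{call['call_id']} ({call['agent']}): score {call['score']} -> flag for QA review")
```

The point of the sketch is the shift in workflow: because every call carries a score, reviewers start from the flagged subset rather than a random sample.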
The Market Place
The QM/recording (WFO) technology sector is a global market with more than 45
competitors. There are many strong offerings available to satisfy the needs of companies large and
small. The solutions, pricing and vendor specialties vary widely. This market gives prospects a wide
variety of opportunities to acquire a solution that meets the specific needs of their company at a
price point they can afford. Users also have many options for acquiring these solutions: they can
purchase them outright, use a hosted offering, or opt for a managed service.
Action Item: Determine if a QA/Recording or WFO suite will add value to your organization. If it will, find the right one for your company.
Appendix A: Procedure Format Sample
Situation: Caller is requesting a change of address.
Important: Address changes can only be requested by the primary/joint card member.
Greeting (as defined in Greeting Policy)
Complete Verification Procedure:
• Caller name
• Last 4 digits of SSN
• Mother’s maiden name
• Date of birth
• IF THE CALLER IS NOT THE PRIMARY/JOINT CARD MEMBER:
Access: Memo screen
Check: [ Special Instructions ] field to determine if the caller has authorization to make changes on the account (power of attorney)
If the caller is not listed as someone with authorization and/or power of attorney (POA) in the special instructions field:
Advise: Only the primary/joint card member is authorized to request an address change
Options: The primary/joint card member will need to call to request the address change.
– or –
The primary/joint card member can:
Write to:
Fax to:
Process online at:
• IF THE CALLER IS THE PRIMARY/JOINT CARD MEMBER:
Verify:
• Current address
Check: [ Address ] field on the Customer screen
If the new address is already on the system:
Advise: Tell the account holder that we have the correct address on file.
If not:
Access: Account Maintenance screen. Obtain the new address from the account holder and enter it in the appropriate fields.
Read back the new address to the customer including spelling of the street address and town to confirm accuracy.
Access: Billing screen
Check: [ Last bill date ] field to determine if the next statement has already been mailed.
If next statement has already been mailed:
Advise: The new statement has already been sent to the old address. A copy of the statement will be sent to the new address. The card member should expect to receive the copy within 5 to 7 business days.
Access: Media Request screen
Click on Statement.
Enter the month/year of the statement being requested in the [ month and year ] fields.
Press Enter to complete the transaction.
If next statement has not been generated/mailed:
Advise: The next statement will be mailed to the new address on file.
Ask: Is there anything else I can help you with?
If yes: Assist the card member with the additional request(s).
If no: Perform the call closing as defined in the Call Closing Policy.
Source: DMG Consulting LLC, April 2009
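The branching in the sample procedure above can be sketched as a simple decision flow. The screen names follow the sample; the account dict, its field names and the returned action strings are illustrative assumptions for the sketch, not part of the procedure itself.

```python
# Illustrative decision-flow sketch of the address-change procedure above.
# Account data is a plain dict; field names and return strings are hypothetical.

def handle_address_change(caller_name, account):
    """Return the action an agent should take for an address-change request."""
    is_card_member = caller_name in account["card_members"]
    has_poa = caller_name in account.get("special_instructions_authorized", [])
    if not is_card_member and not has_poa:
        # Unauthorized caller: advise and offer the card member's options.
        return "advise: only the primary/joint card member may request an address change"
    if account["statement_mailed"]:
        # Statement already went to the old address: also order a copy.
        return "update address; request statement copy via Media Request screen"
    return "update address; next statement will go to the new address"

account = {
    "card_members": ["Pat Smith"],
    "special_instructions_authorized": [],
    "statement_mailed": True,
}
print(handle_address_change("Pat Smith", account))
print(handle_address_change("Chris Jones", account))
```

Encoding a procedure this way is also a useful authoring check: every branch in the written procedure should map to exactly one path through the flow.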
Appendix B: Policy Format Sample
Warm Transfer Policy
All transfers initiated from the contact center should be warm transferred by performing the following steps:
• Representative advises the caller that the call needs to be transferred to another area that can assist the caller with their request
• Representative asks the caller for permission to transfer the call
• Before initiating the transfer, the representative should ask the caller if there is anything else they can help with
• If yes, the representative should assist the caller with the request (if applicable)
• Representative asks permission from the caller to place them on hold while they
connect to the appropriate department to transfer the call
• Representative clicks the phone icon, which launches a separate window
• Representative clicks “Start conference,” enters the extension number of the
department that is receiving the call, and clicks “OK”
• When the receiving department answers the call, the representative should
provide:
• Account number
• Name of caller
• Nature of the call
• Representative clicks “Complete conference” and introduces the caller to the
representative in the department who is accepting the call, thanks the customer for calling and clicks the “Hang-up” button to complete the transfer
Source: DMG Consulting LLC, April 2009
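The warm-transfer steps above are order-sensitive, which makes them easy to express as an ordered checklist that a QA reviewer (or an automated scoring rule) can verify. The sketch below is an illustration only; the step wording is condensed from the policy and does not reflect any specific telephony API.

```python
# Illustrative sketch: the warm-transfer policy above as an ordered checklist.
# Step wording is condensed from the policy; no real softphone API is implied.

WARM_TRANSFER_STEPS = [
    "advise caller that the call will be transferred",
    "ask caller for permission to transfer",
    "ask caller if there is anything else they need",
    "ask permission to place caller on hold",
    "click phone icon, then 'Start conference' with the receiving extension",
    "give receiving department the account number, caller name and call reason",
    "click 'Complete conference', introduce the caller, thank them, then hang up",
]

def verify_warm_transfer(observed_steps):
    """Return True if the agent performed every policy step, in order."""
    return list(observed_steps) == WARM_TRANSFER_STEPS

# A transfer that skips asking permission is not a warm transfer under this policy.
cold_transfer = [s for s in WARM_TRANSFER_STEPS if "permission to transfer" not in s]
print(verify_warm_transfer(WARM_TRANSFER_STEPS))
print(verify_warm_transfer(cold_transfer))
```

Checklists like this pair naturally with the evaluation forms discussed earlier: each step becomes a yes/no question on the QA form.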
About VPI
VPI (Voice Print International, Inc.) is the premier provider of integrated interaction
recording and workforce optimization solutions for enterprises, small‐ to medium‐size
businesses, trading floors, government agencies, and first responders. Through its award‐
winning suite of solutions, VPI empowers organizations to proactively improve the
customer experience, increase workforce performance, ensure compliance, and align
tactical and strategic objectives across the enterprise. With the power to be proactive,
organizations are equipped to actively identify and maximize opportunities and minimize
risk. For more than a decade, VPI has been providing proven technology and superior
service to more than 1,000 customers in over 35 countries. For more information, visit
http://www.VPI‐corp.com.