In the world of computing and communications technology, the last twenty years have brought one revolution after another. Faster processors, niftier appliances, smarter phones, and virtualized services are everywhere. Contact centers have grown from hunt groups and operator consoles to full-blown CRM solutions riding on top of evolving, complex technologies.
Many of us are excited to use the new technologies because of three potentially significant benefits:
- generate cost savings
- increase revenues
- provide better customer service
Many businesses are really good at tracking the cost savings and revenue generation benefits associated with technology deployment and upgrades. But a recent IQ Services poll revealed that 68 percent of respondents did not have a proactive way to track the customer service quality offered by their IVR and other customer-facing technologies. Some of these businesses use internal monitoring to see if all the individual technologies are working. But they do not have an effective way to test or monitor from the top of the stack or the critical end-to-end caller perspective.
How to Get the Caller Perspective
So how do you get to the “top of the stack” – the caller’s perspective on IVR and end-to-end contact center solution performance and service quality?
Although the answer to this question is “it depends on your business requirements, practices and objectives,” there are four general best practices almost any business can quickly use to get and maintain a holistic view of service quality delivered by customer-facing technologies:
- secret shoppers
- surveys
- social media
- IVR testing and monitoring
The first three best practices – secret shoppers, surveys and social media – tend to receive a lot of buzz in CRM and technology circles. There is no lack of information about these critical outside-in techniques for assessing a business’ customer-facing technologies and customer service practices.
IVR Application Testing and Monitoring
Let’s briefly explore the fourth and least discussed item – IVR testing and monitoring. To ensure IVR and customer-facing technologies perform and deliver the expected service quality, testing and monitoring techniques must be used before and after deployment. Before initial deployment, or before any significant change in capacity, infrastructure, or application functionality is finalized, the following testing must be completed to ensure optimal service quality:
- Application feature testing: helps ensure the whole application – every logical twist and turn – works as specified
- Application load testing: confirms the integrated customer-facing technologies perform as specified under full load
- Application usability testing: delivers crucial insight as to whether prospective users/customers are befuddled by the application or can quickly and efficiently navigate the menu options
When? Before Deployment
Rigorous application feature testing, load testing and usability testing are required to ensure IVR applications are easy to use and implemented according to business rules. These techniques answer questions such as:
- Does the application respond as expected?
- Are your premium customers treated in the manner you expected?
- Are unexpected results associated with a particular dialog state or call flow?
- Does the application “sound” right?
- Do early smoke test results conducted throughout development indicate discrepancies between the development effort and the design?
- Do critical flows through the application work as designed?
- Do global commands work at every state?
- How does the solution respond to an out-of-grammar or invalid input?
- What happens after consecutive invalid inputs?
- How does the system respond to a valid input after invalid input?
- What happens when an input barges into a playing prompt?
- What happens if no inputs are made?
- Does the application perform under expected load? Under peak conditions?
- Can callers complete the self-service transactions you’d most like to keep in the IVR?
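Many of the checklist items above – out-of-grammar handling, consecutive invalid inputs, recovery after an error, global commands – lend themselves to scripted test cases. The sketch below is a hypothetical, simplified illustration: it models an IVR menu as a small state machine and asserts the behaviors the checklist describes. The menu map, prompts, and three-strikes limit are invented for the example; real feature testing would place calls against the live platform.

```python
# Hypothetical sketch: scripted feature tests against a simulated IVR
# dialog. Each state maps DTMF digits to (next_state, prompt); "*" is
# a global command that returns to the main menu from any state.
MENU = {
    "main":    {"1": ("balance", "Your balance is $42."),
                "2": ("agent",   "Transferring you to an agent.")},
    "balance": {"9": ("main",    "Returning to the main menu.")},
}
MAX_INVALID = 3  # consecutive invalid inputs before the call ends


def ivr_session(inputs):
    """Walk the menu for a list of DTMF digits; return prompts heard."""
    state, invalid, heard = "main", 0, []
    for digit in inputs:
        if digit == "*":                       # global command works at every state
            state, invalid = "main", 0
            heard.append("Main menu.")
        elif digit in MENU.get(state, {}):     # valid input for this state
            state, prompt = MENU[state][digit]
            invalid = 0                        # valid input resets the error count
            heard.append(prompt)
        else:                                  # out-of-grammar / invalid input
            invalid += 1
            if invalid >= MAX_INVALID:
                heard.append("Goodbye.")       # bail out after repeated misses
                break
            heard.append("Sorry, I didn't get that.")
    return heard


# Feature tests mirroring the checklist:
assert ivr_session(["1"]) == ["Your balance is $42."]          # critical flow
assert ivr_session(["7"])[0].startswith("Sorry")               # invalid input
assert ivr_session(["7", "1"])[1] == "Your balance is $42."    # recovery after error
assert ivr_session(["7", "7", "7"])[-1] == "Goodbye."          # consecutive invalids
assert ivr_session(["1", "*"])[-1] == "Main menu."             # global command
```

The value of structuring tests this way is that every dialog state and error path is exercised deterministically – something that is hard to guarantee when coworkers make ad hoc calls and email notes.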
For each best practice and service quality technique, there are a range of approaches that vary in terms of cost, resource commitment, value of insight delivered and more. Each business must evaluate the insight and value against the cost and risks of each approach to determine the right combination.
There’s the manual approach. Since you already have a captive staff to carry it off, you ask your coworkers and employees to use the IVR. They report whether the application flows as specified in the documentation you sent them. You assign each person a handful of paths selected from hundreds of possibilities and ask them to email the results.
This approach can help minimize front-end costs. But it is not always efficient or effective since your co-workers may be busy and many of the things you need to learn from application testing are hard to uncover without a more structured and automated approach to testing. In the end, you can incur costs you did not plan for with this approach.
You can hire an army of testers – people who are trained to make test calls. With this approach, you can avoid over-burdening your existing staff. However, you may be dealing with the imprecision of people making phone calls, interpreting responses and taking notes. It may also mean you can only test a subset of the application’s functionality.
You might consider buying test equipment. Automation adds precision, but someone must be trained or hired who knows how to program the test cases, connect the device to the network and run it. The results must still be evaluated and reported.
You could outsource the project. By using a company with the facilities and expertise to comprehensively exercise all features of an IVR application, you avoid burdening your overworked team. You also know the most up-to-date techniques and tools are leveraged to test and report results.
Whatever option you choose, application testing MUST take place. There is too much at risk to put a system into production without having validated its functionality, usability and robustness.
When? After Deployment
Once an IVR or customer-facing solution is in production, a business must know it keeps working around the clock to take care of its customers.
There is no need to run through all the test cases every hour just to be sure. You need a quick snapshot of how the solution is performing from the caller perspective. Most likely there are a handful of paths through the IVR application that will signal a problem. By running through these paths in the application every few minutes – or even once an hour – you can be confident the overall system is delivering quality service from the caller perspective. As with pre-deployment testing, there are a variety of approaches to monitoring service quality of production technologies: manual calls, purchased equipment or outsourced services.
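A “handful of paths, checked on a schedule” monitor can be sketched very simply. The example below is hypothetical: `place_test_call` is a stand-in for whatever actually dials the production system (a test device or an outsourced service), and the path names, digits, and keywords are invented for illustration.

```python
# Hypothetical sketch of a lightweight production monitor: exercise a
# few critical IVR paths and flag any that fail, so an alert can go
# out immediately rather than waiting for customer complaints.

# Path name -> (digits to send, keyword expected in the final prompt)
CRITICAL_PATHS = {
    "check balance":  (["1"], "balance"),
    "reach an agent": (["2"], "agent"),
}


def place_test_call(digits):
    """Stand-in for dialing the real IVR; returns the final prompt heard."""
    canned = {("1",): "Your balance is $42.",
              ("2",): "Transferring you to an agent."}
    return canned.get(tuple(digits), "error")


def run_monitor_cycle():
    """Run each critical path once; return the names of failing paths."""
    failures = []
    for name, (digits, keyword) in CRITICAL_PATHS.items():
        response = place_test_call(digits)
        if keyword not in response.lower():
            failures.append(name)  # in practice: page on-call, log metrics
    return failures


# In production this cycle would run on a schedule, e.g. every 5 minutes:
#   while True:
#       alert_on(run_monitor_cycle())
#       time.sleep(300)
```

Because each cycle touches only a few paths, it adds negligible load to the production system while still giving the around-the-clock, caller-perspective visibility described above.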
Whatever approach is right for your business, it is critical that you not simply trust all is well. Immediate warning about issues, daily metrics and other monitoring reports give you the oversight you need to deliver top-notch customer service through IVR and self-service solutions.
Stay on top of your IVR and customer-facing technology performance by testing at the top of the stack. You owe it to your customers and to the business stakeholders who expect as much value from the technology as possible.
Thanks to IQ Services for the article.