How Testing Solutions Reduce Risk & Improve Customer Satisfaction

Imagine you’re trying to book a flight. You call the toll-free number and use the interactive voice response (IVR) to get through to bookings, but instead you are put through to the baggage area. You hang up and try again, but this time you wind up speaking to the airline lounge. Do you try a third time or call a competitor? I know what I would do.

The IVR is now a key component to delivering a great customer experience, so what steps should a business take to ensure these systems are working optimally? Do they take proactive measures, or just wait until a customer lets them know that something is broken? And, by the time it gets to this stage, how many customers may have been lost?

There are some businesses out there taking unnecessary risks when it comes to testing the reliability of their communications systems. Instead of performing extensive tests, they’re leaving it up to their customers to find any problems. Put bluntly, they’re rolling the dice by deciding to deploy systems that haven’t been properly tested. This is the primary line of communication with their customers and, in many cases, it’s also how they generate significant revenue. Why would they put both customer satisfaction and revenue in jeopardy?

Businesses have quite a few useful options when it comes to proactive testing. We recently acquired IQ Services, a company that tests these environments on a scheduled basis to make sure they’re working properly. It’s an automated process that tests how long it takes to answer, makes sure that the correct responses are given, and even performs a massive stress test with up to 80,000 concurrent calls. (It’s very useful for scenarios such as a large healthcare provider going through open enrollment.) These testing solutions are the way that businesses can ensure that their systems are working reliably under heavy load without leaving anything to chance.
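The kind of scheduled check described above can be pictured as a small script that places a test call, measures answer time, and verifies the opening prompt. This is only a sketch: `place_test_call` is a hypothetical stand-in for a real outside-in test call, and the number, threshold, and prompt text are illustrative.

```python
import random

# Hypothetical stand-in for placing a real test call through the PSTN;
# an actual testing service would dial the toll-free number and capture
# the IVR's real answer time and spoken prompt.
def place_test_call(number):
    answer_delay = random.uniform(0.5, 2.0)   # simulated answer time (seconds)
    prompt = "Thank you for calling. For bookings, press 1."
    return answer_delay, prompt

def run_health_check(number, max_answer_seconds=3.0, expected_phrase="bookings"):
    """One scheduled check: did the call answer in time, with the right prompt?"""
    delay, prompt = place_test_call(number)
    return {
        "answered_in_time": delay <= max_answer_seconds,
        "correct_prompt": expected_phrase in prompt.lower(),
    }

result = run_health_check("+1-800-555-0100")
print(result)
```

Run on a schedule, a check like this catches a misrouted or silent IVR before a customer does.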

In a world where we think of people as risk-averse, it’s interesting to observe anyone who chooses not to perform these tests. It’s not necessarily a conscious decision. If the situation were actually framed in a way where someone knew exactly what they were putting at risk, they’d probably make a better choice. You wouldn’t buy car insurance after you already had an accident. It simply wouldn’t do you much good at that point. The same thing applies to your communications systems. It only makes sense to take a proactive approach to make sure things are working as expected.

Now that you’re aware of what’s at risk if you don’t perform these important tests, don’t make the conscious decision to wait until something has already gone wrong. We’re talking about the potential loss of millions of dollars per hour (or even per minute in certain cases). Some strategic planning can give you the peace of mind that you’ll avoid catastrophic loss of revenue in the future. Whenever you do go live with a new feature, you can do so with confidence.

We’ve brought these new Testing Solutions into the Prognosis family. Above and beyond that, we want to make sure people understand these capabilities are available. You don’t have to be reactionary; there are proactive solutions to stop you from rolling the dice when it comes to your business and customers. Don’t leave the livelihood of your organization to chance. Of course, if you’re in the mood to gamble your money, there’s always Vegas.

Thanks to IR Prognosis for the article.


Be Confident Your Contact Center Technology Delivers The Brand You Promise

Today, most B2C interactions involve some form of contact center technology. With the exception of in-store purchases, contact center technology is responsible for providing the vast majority of brand impressions that customers have through experiences with toll-free numbers, self-service IVR, and CTI screen pop—and that’s just one of the channels. There are also self-service websites, apps, social media scrapers, blended queuing processes, and more.

SERVICE DELIVERED VS. SERVICE INTENDED

VOC programs ask customers for feedback on the experience as they remember it – which is extremely important when determining if the experience delivered was pleasing, useful, efficient, or memorable. But it’s also critical to monitor the experience delivered by the organization’s technology and compare it to what was intended. Customers don’t know whether or not the technology with which they’re interacting is actually doing what it’s intended to do; they just know whether or not it’s available when they want to use it and lets them get what they need quickly and efficiently.

CUSTOMER VS. TECHNOLOGY PERSPECTIVE

Data that’s easy to collect with regularity is what’s frequently relied upon for decision making and tuning—the data collected inside the contact center related to congestion, CPU consumed, call arrival rate, etc. Accurate, unemotional, precise data about the experience actually delivered has to come from the outside in, the way customers really access and interact with technology, and be collected in a controlled fashion on a regular basis.

IS THERE A BETTER, MORE STRATEGIC WAY TO DO THIS?

There are ways to gain a true assessment of the customer service experience as delivered. It starts with documented expectations for the experience as it’s supposed to be delivered. Defined expectations establish functionality, performance, and availability specs as benchmarks for testing.

It’s in every company’s best interest to know that the technology it put in place is, first of all, capable of making all those connections it has to make, and then actually delivering the customer service experience that it intended to deliver. To do that, you need reliable data gathered from automated, outside-in, scripted test interactions that can be used to assess the functionality that’s been put in place, as well as the technology’s ability to deliver both at peak load and then continuously once in production.

Think of automated testing as using an army of secret shoppers to access and exercise contact center technology exactly as it’s intended to be used: one at a time, then hundreds, thousands, tens of thousands of virtual customer secret shoppers. Think also about the technology lifecycle: development – cutover – production – evolution.

Start with automated feature/function testing of your self-service applications— voice and web—to ensure that what was specified actually got developed. Precisely verify every twist and turn to ensure you are delivering what you intended. Do that before you go live and before you do unit testing or load testing. Using an automated, scripted process to test your self-service applications ensures you have reliable discrepancy documentation—recordings and transcripts—that clearly document functionality issues as they are identified.

Next step, conduct load testing prior to cutover. Use automated, scripted, virtual customer test traffic, from the outside-in, through the PSTN and the web—to ensure your contact center technology really can perform at the absolute capacity you’ve designed and also at the call arrival and disconnect rates that are realistic. Plan failure into the process—it’s never one and done. Leave enough time to start small, to identify and address issues along the way.
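The start-small, ramp-up approach can be sketched as a loop that raises concurrency in steps and stops to investigate if the success rate drops. Everything here is a simulation: `virtual_customer_call` is a hypothetical stand-in for one real outside-in test call through the PSTN or the web, and the step sizes and pass threshold are illustrative.

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical stand-in for one outside-in virtual-customer call; a real
# load test would dial in through the PSTN or drive the web front end.
def virtual_customer_call(_):
    # ... dial, navigate the application, disconnect at a realistic point ...
    return True   # simulated success; real calls can and do fail under load

def ramped_load_test(steps=(10, 100, 1000), pass_threshold=0.95):
    """Raise concurrency in steps; stop to investigate if a step fails."""
    results = {}
    for concurrency in steps:
        with ThreadPoolExecutor(max_workers=min(concurrency, 64)) as pool:
            outcomes = list(pool.map(virtual_customer_call, range(concurrency)))
        success_rate = sum(outcomes) / concurrency
        results[concurrency] = success_rate
        if success_rate < pass_threshold:
            break   # never one and done: fix issues before ramping further
    return results

print(ramped_load_test())   # → {10: 1.0, 100: 1.0, 1000: 1.0}
```

The early exit is the point of the exercise: a failed step at low concurrency is far cheaper to diagnose than a meltdown at full design capacity.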

Once in production, continuously access and exercise contact center technology just like real customers, through the PSTN and through the web—to ensure it’s available, assess its functionality, and measure its performance so you can be confident that technology is delivering the intended experience, 24×7.

If you have self-service applications that undergo periodic tweaks or enhancements, automated regression tests using those initial test cases will ensure all functionality is still on task after tweaks to the software or underlying infrastructure.
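One way to picture such regression tests: keep the original feature-test cases as (input, expected prompt) pairs and replay them after every tweak. The menu map below is a hypothetical stand-in for a deployed touchtone application, not any particular vendor's tooling.

```python
# Hypothetical stand-in for a deployed self-service menu.
MENU = {
    "1": "Your balance is $42.00.",
    "2": "Transferring you to an agent.",
}

# (DTMF input, expected prompt) pairs captured when the app first
# passed feature/function testing — the initial test cases.
TEST_CASES = [
    ("1", "Your balance is $42.00."),
    ("2", "Transferring you to an agent."),
]

def ivr_response(digit):
    return MENU.get(digit, "Sorry, I didn't understand that.")

def run_regression(cases):
    """Return a discrepancy report; empty means functionality is still on task."""
    return [
        {"input": digit, "expected": expected, "actual": ivr_response(digit)}
        for digit, expected in cases
        if ivr_response(digit) != expected
    ]

print(run_regression(TEST_CASES))   # → [] when nothing has regressed
```

A non-empty report is exactly the discrepancy documentation described above: each entry records what was expected versus what the application actually delivered.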

WHAT’S THE NET RESULT OF TESTING THIS WAY?

VOC feedback tells you what customers think about your efforts and how they feel about dealing with your brand and technology. Automated testing and monitoring allow you to determine if the interfaces you’ve provided are, in fact, providing the easy-to-use, low-effort experience you intend.

It’s pretty straightforward—quality and efficiency of experience drive loyalty, and loyalty drives spend. You HAVE to have both perspectives as you tweak the technology, and that’s the message. Listen to your customers as you decide what you want your technology to do, and then make sure your technology is doing what you intend it to do.

Originally Published in Speech Technology Magazine.

Elevating IVR: Stop the Hatred for Automation

IVR.

In many customers’ minds, this three-letter acronym is a four-letter word. It’s not uncommon for callers to mutter a diverse range of other forbidden words whenever interacting – or trying to interact – with a contact center’s IVR system.

But IVR is not deserving of such hatred. IVR systems are not inherently flawed or evil, nor are the companies that use an IVR to front-end their contact centers. The reason why the general public’s perception of IVR is so negative is that so few of the systems that the public has encountered have been designed properly by the humans behind the scene. The technology itself has tons of potential; it’s what’s dumped into it by organizations overly eager to enjoy the cost-saving benefits of phone-based self-service that makes the machines such monsters.

Not all IVR systems in existence today are so beastly. Some, in fact, not only play nice with customers, they delight them and keep them coming back for more. So how have the owners of these much-maligned systems succeeded in getting callers to drop their pitchforks and torches and embrace IVR?

By adopting the following key practices, all of which I stole from a host of IVR experts and now pass off as my own:

Adhere to the fundamentals of IVR menu design. Most of what irritates and confounds customers with regard to IVR can be easily avoided. Callers often opt out of the system or hang up due to too many menu choices, confusing phrasing/commands, and fear of dying alone in IVR hell.

Here are a handful of essential menu features and functions common to the best-designed IVR applications:

  • No more than four or five menu options
  • The ability to easily skip ahead to desired menu choices (e.g., having the system recognize that the customer pressed “3” or said what they wanted before the system presented such options)
  • Use of the same clear, professional recorded voice throughout the IVR
  • (For touchtone systems specifically) Giving a description of an action/option prior to telling the caller what key to press for that action/option (e.g., “To check your balance without bothering one of our expensive agents, press ‘1’”; NOT “Press ‘1’ to check your balance without bothering one of our expensive agents.”)
  • The ability to opt out to and curse directly at a live agent at any time

Invest in advanced speech recognition. In leading contact centers, traditional touchtone IVR systems are being replaced by sleeker and sexier speech-enabled solutions. While you may not want to listen to a writer who thinks that IVR can be sleek or sexy, you should, as today’s advanced speech recognition (ASR) solutions have helped many customer care organizations vastly improve self-service, and, consequently, reduce the number of death threats their IVR system receives each day.

Powered by natural language processing, ASR systems provide a much more personalized and human experience than traditional touchtone ever could. Traditional touchtone is like interacting with Dan Rather, while ASR is like talking to Oprah.

Even more importantly, ASR-driven IVR systems enable contact centers to vastly reduce the number of steps callers must take to get what they need. Customers can cut through unnecessary menu options by saying exactly what they want (e.g., “I would like the address of your call center so that I can punch the last agent I spoke to in the face”).

Use CTI to ensure smooth, smart transfers. Even if your IVR system is perfectly designed and features the universally appealing voice of James Earl Jones, many callers will still want to – or need to – speak to a live agent featuring the universally less-appealing voice of a live agent. And when this happens, what’s universally aggravating to callers is – after providing the IVR with their name, account number, social security number, height, weight and blood type – having to repeat the very same information to the agent to whom their call is transferred.

To avoid such enraging redundancy – and to shorten call lengths/reduce costs – leading contact centers incorporate CTI (computer telephony integration) technology into their IVR system. These applications integrate the voice and data portions of the call, then, with the help of magic fairies, deliver that information directly to the desktop of the agent handling the call. With today’s technologies, it’s really quite simple (though, granted, not always cheap), and the impact on the customer experience is immense. Rather than the caller starting off their live-agent interaction with a loud sigh or groan, they start off with the feeling that the company might actually have a soul.

Regularly test and monitor the system. Top contact centers keep a close eye on IVR function and callers’ interactions with the system to ensure optimum functionality and customer experiences.

One essential practice is load-testing any new IVR system prior to making it “open for business”. This involves duplicating actual call volumes and pinpointing any system snags, glitches or outright errors that could jam up the system and drive callers nuts.

Once the IVR system is up and running, leading contact centers frequently test it by “playing customer” – calling the center just as a customer would, then evaluating things like menu logic and speech recognition performance, as well as hold times and call-routing precision after opting out of the IVR.

Some contact centers have invested in solutions that automate the IVR-testing process. These potent diagnostic tools are able to dial in and navigate through an interactive voice transaction just as a real caller would – except with far less swearing – and can track and report on key quality and efficiency issues. Many other centers gain the same IVR-testing power by contracting with a third-party vendor that specializes in testing self-service systems.

Internal IVR testing alone is insufficient to ensure optimal customer experiences with the IVR. The best contact centers extend their call monitoring process to the self-service side. Quality specialists listen to live or recorded customer-IVR interactions and evaluate how easy it is for customers to navigate the system and complete transactions without agent assistance, as well as how effectively the IVR routes each call when a live agent is requested or required.

Today’s advanced quality monitoring systems can be programmed to alert QA staff whenever a caller gets entangled in the IVR or seems to get confused during the transaction. Such alerts enable the specialist – after having a laugh with his peers over the customer’s audible expletives – to fix any system glitches and perhaps contact the customer directly to repair the damaged relationship.

Thanks to Call Center IQ for the article. 

Quality Automation: 3 Ways to Make Self-Service Work for Customers

In the eyes of many customers, self-service is not a compound word but rather a four-letter one. It’s not that there’s anything inherently bad about IVR or web self-service applications – it’s that there’s something bad about most contact centers’ efforts to make such apps good.

Relatively few contact centers extend their quality assurance (QA) practices to self-service applications. Most centers tend to monitor and evaluate only those contacts that involve an interaction with a live agent – i.e., customer contacts in the form of live phone calls or email, chat or social media interactions. Meanwhile, no small percentage of customers try to complete transactions on their own via the IVR or online (or, more recently, via mobile apps) and end up tearing their hair out in the process. In fact, poorly designed and poorly looked-after self-service apps account for roughly 10% of all adult baldness, according to research I might one day conduct.

When contact center pros hear or read “QA”, they need to think not only “Quality Assurance” but also “Quality Automation.” The latter is very much part of the former.

To ensure that customers who go the self-service route have a positive experience and maintain their hair, the best contact centers frequently conduct comprehensive internal testing of IVR systems and online applications, regularly monitor customers’ actual self-service interactions, and gather customer feedback on their experiences. Let’s take a closer look at each of these critical practices.

Testing Self-Service Performance

Testing the IVR involves calling the contact center and interacting with the IVR system just as a customer would, only with much less groaning and swearing. Evaluate such things as menu logic, awkward silences, speech recognition performance and – to gauge the experience of callers that choose to opt out of the IVR – hold times and call-routing precision.

Testing of web self-service apps is similar, but takes place online rather than via calls. Carefully check site and account security, the accuracy and relevance of FAQ responses, the performance of search engines, knowledge bases and automated agent bots. Resist the urge to try to see if you can get the automated bot to say dirty words. There’s no time for such shenanigans. Testing should also include evaluating how easy it is for customers to access personal accounts online and complete transactions.

Some of the richest and laziest contact centers have invested in products that automate the testing process. Today’s powerful end-to-end IVR monitoring and diagnostic tools are able to dial in and navigate through an interactive voice transaction just as a real caller would, and can track and report on key quality and efficiency issues. Other centers achieve testing success by contracting with a third-party vendor that specializes in testing voice and web self-service systems and taking your money.

Monitoring Customers’ Self-Service Interactions

Advancements in quality monitoring technologies are making things easier for contact centers looking to spy on actual customers who attempt self-service transactions. All the major quality monitoring vendors provide customer interaction recording applications that capture how easy it is for callers to navigate the IVR and complete transactions without agent assistance, as well as how effectively such front-end systems route each call after the caller opts out to speak to an actual human being.

As for monitoring the online customer experience, top contact centers have taken advantage of multichannel customer interaction-recording solutions. Such solutions enable contact centers to find out first-hand such things as: how well customers navigate the website; what information they are looking for and how easy it is to find; what actions or issues lead most online customers to abandon their shopping carts; and what causes customers to call, email or request a chat session with an agent rather than continue to cry while attempting to serve themselves.

As with internal testing of self-service apps, some centers – rather than deploying advanced monitoring systems in-house – have contracted with a third-party specialist to conduct comprehensive monitoring of the customers’ IVR and/or web self-service experiences.

Capturing the Customer Experience

In the end, the customer is the real judge of quality. As important as self-service testing and monitoring is, even more vital is asking customers directly just how bad their recent self-service experience was.

The best centers have a post-contact C-Sat survey process in place for self-service, just as they do for traditional phone, email and chat contacts. Typically, these centers conduct said surveys via the same channel the customer used to interact with the company. That is, customers who complete (or at least attempt to complete) a transaction via the center’s IVR system are invited to complete a concise automated survey via the IVR (immediately following their interaction). Those who served themselves via the company’s website are soon sent a web-based survey form via email. Customers, you see, like it when you pay attention to their channel preferences, and thus are more likely to complete surveys that show you’ve done just that. Calling a web self-service customer and asking them to complete a survey over the phone is akin to finding out somebody is vegetarian and then offering them a steak.

It’s your call

Whether you decide to do self-service QA manually, invest in special technology, or contract with third-party specialists is entirely up to you and your organization. But if you don’t do any of these things and continue to ignore quality and the customer experience on the self-service side, don’t act surprised if your customers eventually start ignoring you – and start imploring others to do the same.

Thanks to Call Centre IQ for the article. 

IQ Services: Standing Out with Collaboration and Customer Service

Business is a fluid thing; the past 30 years have seen innovation after innovation and, with it, a shift in culture. The 1980s saw a focus on quality products. The 1990s illustrated enterprises centered on branding. In the 2000s, what differentiates companies? Customer service. Customer service has climbed the list of priorities, and today one can see companies putting their money where their mouths are by investing in a quality customer experience.

Russ Zilles, CEO of IQ Services, took some time to talk with TMC at a recent industry event to discuss IVR, customer service and carpet stores. Yes, carpet stores.

Zilles views customer service as the key determining factor in where private citizens and enterprises choose to take their business. A company culture is a major factor in the quality of service provided; IQ takes more of a collaborative approach with customers. As Zilles says, “we all have similar tools,” but the level of service makes all the difference.

He likened the choice to going to a mechanic. Why do I go to the same mechanic? Because I know that when I drop my car off, whatever the issue is, it will be fixed. I won’t be charged for unnecessary services, nor will I be ignored by inattentive employees. I trust my mechanic, as I should, so I keep going back.

In 1996, the team at IQ Services thought there had to be a better way to test IVR. They had this ‘crazy’ idea of automating the testing process by detecting voice. Long story short, mission accomplished, and this totally self-funded firm began its journey out of a 400 sq. ft. office in the back of a carpet store.

Today, its collaborative approach has IQ personalizing customer experiences. By learning who the customer is, the best possible solution is delivered. Its collaborative approach engages all involved in the “customer journey,” through any means necessary. The focus is on testing for what the customer would like tested, to best serve the client’s requests.

IQ is a pioneer in the field of IVR testing. The team knows the ins-and-outs of how carriers and contact centers operate, and all the work is verifiable. The average ‘Joe’ doesn’t realize how complex an IVR system is, and understanding how all the pieces work is integral.

Today, IQ has the ability to route calls around the United States and Canada, allowing test calls to originate from various points before reaching the contact center. Testing schedules are made to be productive; customers do not pay a dime until they see the results for themselves.

Some fixes are very simple, others can be quite complicated. Carrier issues, high volume and location can all play a major role in functioning—with most fixes having to do with configuration. If an issue arises, the IQ solution sends a notification and remote adjustments can be made to the system.

IQ Services has embraced the WebRTC movement with the creation of a tool to monitor and generate traffic that fits virtually any implementation. This was introduced at a recent industry event and is browser-based, meaning it requires no download. Zilles says, “it’s about sending that data back and forth”—it truly is that simple. Customers are encouraged to test with a third party to ensure the quality of their solution.

A key pillar for exceptional customer service is trust—no more, no less. Zilles proclaims, “We do what we say we do,” and there is certainly something to be said for that.

Thanks to TMCnet.com for the article.

Using Virtual Customers® to Optimize Customer Service Experience

Providing Contact Centers with Reliable, End-to-End Performance Metrics

Many people believe they are best served by real people, not by voice robots. That’s the rationale behind GetHuman.com. But the economics and utility of self-service as an alternative to live agent interactions are so compelling that self-service solutions are here to stay.

Providing multiple touch-points is a huge technology investment. Technology is great, but you can’t just diligently manage the implementation process and then assume all is well with the customer service experience. Because nothing is static in this world, it is extremely important to confirm from the customer perspective that your contact center technology really is capable of delivering the experience you intend, one that defends your brand promise and delivers on it every day in production.

In 18 years of supporting clients through the installation phase and into the production phase of the contact center lifecycle, we’ve learned many lessons about how to best evaluate and optimize the Customer Service Experience (CSE) that is the foundation of delivering your brand promise. This article introduces the process we’ve built based on our experience. It’s a process that ensures the contact center technologies you’ve spent hundreds of thousands of dollars on, and count on to take care of your customers the way they want to be taken care of, are in fact offering up the customer service experience you intend: an experience that delivers on your brand promise and doesn’t push customers away.

How?

Introducing the Virtual Customer® Process

The Virtual Customer Process positions the customer service perspective as a key element of your technology management toolkit. It’s a proven process that ensures the customer service experience delivered is aligned with the intentions of the Customer Experience & Brand Management teams, because its first step is identifying key customer types and defining how they will interact with the contact center technology you put in place. By doing so, the Virtual Customer Process goes beyond using only internal metrics that confirm everything is Working As Designed (WAD); it monitors and measures the actual customer service experience as it’s delivered.

Once you have actual Customer Service Experience data, you can create a feedback loop by tweaking your systems and observing the impact on the actual CSE delivered, not just on internal metrics such as CPU time or QoS.

When you know the service experience delivered by your contact center technologies defends your brand standards, you can also be confident the experience delivered increases satisfaction, builds loyalty and creates advocates.

What is the Virtual Customer Process?

The Virtual Customer Process is a multistep approach that first defines and then deploys Virtual Customers (VCs) to perform real, end-to-end transactions for the purpose of evaluating application and technology performance variables that impact Customer Service Experience.

What are “Virtual Customers?”

Virtual Customers are automated processes that follow test case scripts to interact with the Contact Center just like real customers, performing real transactions.

How Does the Virtual Customer Process Work?

The first step in the Virtual Customer process is a communications assessment. During this step, key customer activities are identified, the associated Virtual Customer interactions are defined and scripted, and the plan for deploying the VCs so they can collect actionable data is mapped out. The challenge is selecting the best type(s) of VCs to perform the activity required to evaluate contact center performance. Actionable data is CSE information that can be used to evaluate business solution performance relative to defined CSE objectives and metrics.
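As a sketch, a scripted Virtual Customer interaction might be expressed as data like the following. The field names and the sample balance-check flow are illustrative assumptions, not IQ Services’ actual format; the point is that each step pairs what the VC sends with what the system is expected to respond.

```python
from dataclasses import dataclass, field

@dataclass
class Step:
    send: str                    # what the VC says or presses
    expect: str                  # phrase that must appear in the response
    timeout_seconds: float = 5.0 # how long to wait before flagging a failure

@dataclass
class VirtualCustomerScript:
    name: str
    channel: str                 # e.g. "pstn" or "web"
    steps: list = field(default_factory=list)

# A hypothetical key customer activity, scripted end to end.
balance_check = VirtualCustomerScript(
    name="premium-customer-balance-check",
    channel="pstn",
    steps=[
        Step(send="<dial>", expect="thank you for calling"),
        Step(send="1", expect="account number"),
        Step(send="123456#", expect="your balance is"),
    ],
)

print(len(balance_check.steps))   # → 3
```

Expressing interactions as data like this is what makes the results actionable: each failed `expect` maps directly back to a defined CSE objective.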

Deploying Virtual Customers

Once the VCs are defined and the ramp-up and rollout plans are established, the VCs are deployed as test traffic to access contact center technology from the outside-in, providing your company with reliable, end-to-end performance metrics from the customer perspective.

Key considerations in deploying VCs include:

  • Risk analysis and consequences
  • Selection of the right VC interactions
  • Clearly defined availability and performance objectives and metrics
  • Benchmark assessment
  • Reporting and notification criteria

What is CSE Optimization?

  • A process for deploying VCs to collect data that can be used to evaluate and improve business solution performance, relative to defined objectives and metrics
  • An iterative process that tunes CSE delivered

Conclusion

Properly implemented, the Virtual Customer Process® is a critical element of an integrated continuous improvement process. It hones and perfects a customer service experience that defends your brand promise, thereby positively impacting key metrics such as customer effort, customer loyalty, and net promoter score. Experiences that mirror your brand promise ultimately have bottom line impact. Optimizing customer service experience is a direct path to enhancing ROI.

Thanks to IQ Services for the article.

Testing IVR at the Top of the Stack: How and When to Stay on Top of the Caller Perspective

In the world of computing technology and communications solutions, the last twenty years have been one revolution after another. Faster processes, niftier appliances, smarter phones, and virtualized services are everywhere. Contact centers have grown from hunt groups and operator consoles to full-blown CRM solutions riding on top of evolving, complex technologies.

Many of us are excited to use the new technologies because of three potentially significant benefits:

  • generate cost savings
  • facilitate increased revenues
  • provide better customer service

Many businesses are really good at tracking the cost savings and revenue generation benefits associated with technology deployment and upgrades. But a recent IQ Services poll revealed that 68 percent of respondents did not have a proactive way to track the customer service quality offered by their IVR and other customer-facing technologies. Some of these businesses use internal monitoring to see if all the individual technologies are working. But they do not have an effective way to test or monitor from the top of the stack or the critical end-to-end caller perspective.

How to Get the Caller Perspective

So how do you get to the “top of the stack” of caller perspective of IVR and end-to-end contact center solution performance and service quality?

Although the answer to this question is “it depends on your business requirements, practices and objectives,” there are four general best practices almost any business can quickly use to get and maintain a holistic view of service quality delivered by customer-facing technologies:

  • Secret shoppers
  • Surveys
  • Social media
  • IVR application testing and monitoring

The first three best practices – secret shoppers, surveys and social media – tend to receive a lot of buzz in CRM and technology circles. There is no lack of information about these critical outside-in techniques for assessing a business’ customer-facing technologies and customer service practices.

IVR Application Testing and Monitoring

Let’s briefly explore the fourth, less explored practice – IVR testing and monitoring. To ensure IVR and customer-facing technologies perform and deliver the expected service quality, testing and monitoring techniques must be used before and after deployment. Before initial deployment, or before any significant change in capacity, infrastructure, or application functionality is finalized, the following testing must be completed to ensure optimal service quality:

Application feature testing: helps ensure the whole application – every logical twist and turn – works as specified.

Application load testing: confirms the integrated customer-facing technologies perform as specified under full load.

Application usability testing: delivers crucial insight into whether prospective users/customers are befuddled by the application or can quickly and efficiently navigate the menu options.
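Of the three, load testing lends itself most naturally to automation. Below is a minimal, illustrative sketch of the idea; `place_call` is a hypothetical function standing in for whatever actually dials the IVR and returns the seconds elapsed until the call is answered.

```python
# Hedged sketch of an application load test: place many calls
# concurrently and count how many are answered within a threshold.
# place_call is a hypothetical hook; a real harness would drive
# live calls into the IVR under test.
from concurrent.futures import ThreadPoolExecutor

def load_test(place_call, concurrent_calls=100, threshold_s=2.0):
    """Place calls concurrently; report how many answered in time."""
    with ThreadPoolExecutor(max_workers=concurrent_calls) as pool:
        # Each worker places one call and records its answer time.
        times = list(pool.map(lambda _: place_call(), range(concurrent_calls)))
    within = sum(1 for t in times if t <= threshold_s)
    return {"calls": len(times), "answered_within_threshold": within}
```

A production-grade tool would also vary the call arrival rate and verify prompt content under load, but even this simple shape makes the pass/fail criterion explicit before peak season arrives.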

When? Before Deployment

Rigorous application feature testing, load testing and usability testing are required to ensure IVR applications are easy to use and implemented according to business rules. These techniques answer questions such as:

  • Does the application respond as expected?
  • Are your premium customers treated in the manner you expected?
  • Are unexpected results associated with a particular dialog state or call flow?
  • Does the application “sound” right?
  • Do early smoke test results conducted throughout development indicate discrepancies between the development effort and the design?
  • Do critical flows through the application work as designed?
  • Do global commands work at every state?
  • How does the solution respond to an out-of-grammar or invalid input?
  • What happens after consecutive invalid inputs?
  • How does the system respond to a valid input after invalid input?
  • What happens when an input barges into a playing prompt?
  • What happens if no inputs are made?
  • Does the application perform under expected load? Under peak conditions?
  • Can callers complete the self-service transactions you’d most like to keep in the IVR?
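Several of the questions above – invalid inputs, consecutive failures, recovery after an error – are exactly the kind of thing automated feature tests catch. The sketch below is a toy illustration: `SimulatedIvr` and its three-option menu are hypothetical stand-ins for a real harness that places live calls and verifies each prompt.

```python
# Illustrative feature tests for invalid-input handling, assuming a
# hypothetical IVR with menu 1=bookings, 2=baggage, 0=agent and a
# policy of escalating to an agent after three invalid inputs.

class SimulatedIvr:
    MENU = {"1": "bookings", "2": "baggage", "0": "agent"}
    MAX_INVALID = 3

    def __init__(self):
        self.invalid_count = 0

    def press(self, digit):
        """Return the prompt the caller would hear after pressing digit."""
        if digit in self.MENU:
            self.invalid_count = 0          # valid input resets the counter
            return self.MENU[digit]
        self.invalid_count += 1
        if self.invalid_count >= self.MAX_INVALID:
            return "agent"                  # escalate after repeated failures
        return "invalid_retry"              # re-prompt the caller

def test_invalid_input_handling():
    ivr = SimulatedIvr()
    assert ivr.press("9") == "invalid_retry"   # out-of-grammar input re-prompts
    assert ivr.press("9") == "invalid_retry"
    assert ivr.press("9") == "agent"           # third failure escalates

def test_valid_after_invalid():
    ivr = SimulatedIvr()
    ivr.press("9")
    assert ivr.press("1") == "bookings"        # valid input recovers cleanly

test_invalid_input_handling()
test_valid_after_invalid()
```

The point is not the toy menu but the shape: each behavioral question in the checklist becomes one scripted test case with an expected prompt, which is precisely what makes the testing repeatable.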

For each best practice and service quality technique, there are a range of approaches that vary in terms of cost, resource commitment, value of insight delivered and more. Each business must evaluate the insight and value against the cost and risks of each approach to determine the right combination.

There’s the manual approach. Since you already have a captive staff to carry it off, you ask your coworkers and employees to use the IVR. They let you know whether the application flows the way the documentation you sent them says it should. You assign each person a handful of paths selected from hundreds of possibilities and ask them to email the results.

This approach can help minimize front-end costs. But it is not always efficient or effective since your co-workers may be busy and many of the things you need to learn from application testing are hard to uncover without a more structured and automated approach to testing. In the end, you can incur costs you did not plan for with this approach.

You can hire an army of testers – people who are trained to make test calls. With this approach, you can avoid over-burdening your existing staff. However, you may be dealing with the imprecision of people making phone calls, interpreting responses and taking notes. It may also mean you can only test a subset of the application’s functionality.

You might consider buying test equipment. Automation adds precision, but someone must be trained or hired who knows how to program the test cases, connect the device to the network, and run it. The results must still be evaluated and reported.

You could outsource the project. By using a company with the facilities and expertise to comprehensively exercise all features of an IVR application, you avoid burdening your overworked team. You also know the most up-to-date techniques and tools are leveraged to test and report results.

Whatever option you choose, application testing MUST take place. There is too much at risk to put a system into production without having validated its functionality, usability and robustness.

When? After Deployment

Once an IVR or customer-facing solution is in production, a business must know it keeps working around the clock to take care of customers.

There is no need to run through all the test cases every hour just to be sure. You need a quick snapshot of how the solution is performing from the caller perspective. Most likely there are a handful of paths through the IVR application that will signal a problem. By running through these paths every few minutes – or even once an hour – you can be confident the overall system is delivering quality service from the caller perspective. As with pre-deployment testing, there are a variety of approaches to monitoring the service quality of production technologies: manual calls, purchased equipment, or outsourced services.
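The snapshot idea can be sketched in a few lines. Everything here is hypothetical scaffolding: `check_path` stands in for whatever places a call and walks one critical path, and `alert` for whatever raises the immediate warning.

```python
# Hedged sketch of production monitoring: exercise a handful of
# critical IVR paths on a schedule and alert on the first failure.
# The path names and digit sequences below are invented examples.

CRITICAL_PATHS = {
    "reach_bookings": ["1"],   # caller should land in bookings
    "reach_agent": ["0"],      # caller should reach a live agent
}

def run_snapshot(check_path, alert):
    """Run each critical path once; return per-path pass/fail results."""
    results = {}
    for name, digits in CRITICAL_PATHS.items():
        ok = check_path(digits)        # place the call, verify the prompts
        results[name] = ok
        if not ok:
            alert(f"IVR path '{name}' failed")   # immediate warning
    return results
```

Scheduling `run_snapshot` every few minutes, and feeding the results into daily metrics, gives exactly the kind of continuous caller-perspective oversight described above without re-running the full pre-deployment test suite.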

Whatever approach is right for your business, it is critical that you not simply trust that all is well. Immediate warnings about issues, daily metrics, and other monitoring reports give you the oversight you need to deliver top-notch customer service through IVR and self-service solutions.

Stay on top of your IVR and customer-facing technology performance by testing at the top of the stack. You owe it to your customers and to the business stakeholders who expect as much value from the technology as possible.

Thanks to IQ Services for the article.