How Testing Solutions Reduce Risk & Improve Customer Satisfaction

Imagine you’re trying to book a flight. You call the toll-free number and use the interactive voice response (IVR) to get through to bookings, but instead you are put through to the baggage area. You hang up and try again, but this time you wind up speaking to the airline lounge. Do you try a third time or call a competitor? I know what I would do.

The IVR is now a key component in delivering a great customer experience, so what steps should a business take to ensure these systems are working optimally? Do they take proactive measures, or just wait until a customer lets them know that something is broken? And, by the time it gets to this stage, how many customers may have been lost?

There are some businesses out there taking unnecessary risks when it comes to testing the reliability of their communications systems. Instead of performing extensive tests, they’re leaving it up to their customers to find any problems. Put bluntly, they’re rolling the dice by deciding to deploy systems that haven’t been properly tested. This is the primary line of communication with their customers and, in many cases, it’s also how they generate significant revenue. Why would they put both customer satisfaction and revenue in jeopardy?

Businesses have quite a few useful options when it comes to proactive testing. We recently acquired IQ Services, a company that tests these environments on a scheduled basis to make sure they’re working properly. It’s an automated process that tests how long it takes to answer, makes sure that the correct responses are given, and even performs a massive stress test with up to 80,000 concurrent calls. (It’s very useful for scenarios such as a large healthcare provider going through open enrollment.) These testing solutions are the way that businesses can ensure that their systems are working reliably under heavy load without leaving anything to chance.
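
For a concrete flavor of what such an automated check does, here is a minimal Python sketch. The function names, the five-second threshold, and the simulated call are illustrative assumptions, not IQ Services' actual implementation:

```python
# A minimal sketch of a scheduled IVR health check. The names, threshold,
# and simulated call below are illustrative, not any vendor's real product.

def check_ivr(place_call, expected_prompt, max_answer_seconds=5.0):
    """Place one test call; verify answer time and the opening prompt.

    `place_call` is any callable that dials the system and returns
    (seconds_to_answer, prompt_text); in practice it would wrap telephony.
    """
    answered_in, prompt = place_call()
    issues = []
    if answered_in > max_answer_seconds:
        issues.append(f"slow answer: {answered_in:.1f}s > {max_answer_seconds}s")
    if expected_prompt.lower() not in prompt.lower():
        issues.append(f"unexpected prompt: {prompt!r}")
    return issues  # an empty list means the check passed


# Simulated call, for demonstration only.
def fake_call():
    return 2.3, "Thank you for calling. For bookings, press 1."

print(check_ivr(fake_call, "for bookings, press 1"))  # → []
```

Run on a schedule, a check like this surfaces a misrouted or silent IVR before customers do.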

In a world where we think of people as risk-averse, it’s interesting to observe anyone who chooses not to perform these tests. It’s not necessarily a conscious decision. If the situation were framed in a way where someone knew exactly what they were putting at risk, they’d probably make a better choice. You wouldn’t buy car insurance after you already had an accident. It simply wouldn’t do you much good at that point. The same thing applies to your communications systems. It only makes sense to take a proactive approach to make sure things are working as expected.

Now that you’re aware of what’s at risk if you don’t perform these important tests, don’t make the conscious decision to wait until something has already gone wrong. We’re talking about the potential loss of millions of dollars per hour (or even per minute in certain cases). Some strategic planning can give you the peace of mind that you’ll avoid catastrophic loss of revenue in the future. Whenever you do go live with a new feature, you can do so with confidence.

We’ve brought these new Testing Solutions into the Prognosis family. Above and beyond that, we want to make sure people understand these capabilities are available. You don’t have to be reactionary; there are proactive solutions to stop you from rolling the dice when it comes to your business and customers. Don’t leave the livelihood of your organization to chance. Of course, if you’re in the mood to gamble your money, there’s always Vegas.

Thanks to IR Prognosis for the article.

5 Ways to Get Agents to Embrace Quality Monitoring

Few processes in the contact center are as contentious as quality monitoring. When not carefully explained and carried out with tact and sensitivity, monitoring smacks of spying. Cries of “Big Brother” and micromanagement are not uncommon in such environments, resulting in agent burnout, attrition, and poisonous darts being shot at QA staff.

Several studies have revealed that call monitoring can cause significant stress and dissatisfaction among agents. In one study – conducted by Management Sciences Consulting, Bell Canada – 55% of all employees responded that some form of telephone monitoring added to the amount of stress they experienced with their job to a large or very large extent.

In order to achieve the level of agent engagement and customer advocacy that today’s contact centers seek, managers need to aim for agents to not only accept and tolerate quality monitoring, but to embrace it. You may ask, “What kind of freak actually looks forward to having their every word recorded and keystroke captured while on the job?”

Well, I’m not saying that agents need to be so excited about monitoring that they beg for it or do a jig when they find out that they will have 10 calls a month evaluated. However, in the best contact centers I have seen, agents do look forward to being monitored and coached occasionally – because they recognize the positive impact it can have on their performance, the customer’s experience and the organization’s success.

So how do these contact centers get their staff to embrace quality monitoring rather than run in fear from it? Let me count the ways:

They educate new-hires on the reasons for – and value of – quality monitoring. In leading contact centers, managers don’t just tell new agents that they’ll be monitored on a regular basis; they tell them why. In fact, many centers do this with agents even before they become agents – taking time to explain monitoring policies and practices (and the reasons behind them) during the hiring process so that applicants know exactly what to expect before they take the drug test.

When describing the center’s monitoring program, it’s a good idea to lie just a little to make it sound more rewarding than it actually is. Tell agents that monitoring is not used to “catch them” doing things wrong, even though you know that it usually works out that way. Explain that having calls evaluated enhances professional development, builds integrity and helps to ensure customer loyalty, even though you know that such things are true only if your agents care about themselves, the company, and the future. But hey, it’s worth a shot.

They incorporate post-contact customer ratings and feedback into monitoring scores. For some reason, agents would rather have a customer than a supervisor tell them that they aren’t good at providing service. That’s why the best contact centers have incorporated a “voice of the customer” (VOC) component into their quality monitoring programs – tying direct customer feedback from post-contact surveys into agents’ overall monitoring scores.

Adhering to the VOC-based quality monitoring model, the contact center’s internal QA staff rate agents only on the most objective call criteria and requirements – like whether or not the agent used the correct greeting, provided accurate product information, and didn’t call the customer a putz. That internal score typically accounts for anywhere from 40%-60% of the agent’s quality score, with the remaining points based on how badly the customer said they wanted to kiss or punch the agent following the interaction.

They provide positive coaching. While incorporating direct customer feedback into monitoring scores is key, it won’t do much to get agents to embrace monitoring if the coaching that agents receive following an evaluated call is delivered in a highly negative manner.

During coaching sessions, the best coaches strive to point out as many positives about the interaction as they do areas needing improvement. This provides a nice balance to the evaluation and makes agents less likely to strike the coach with a blunt instrument. Even if the call was handled dreadfully, good coaches always find something positive to comment on, such as the agent’s consistent breathing throughout the interaction, or how well they were able to make words come out of their mouth.

They empower agents to self-evaluate their customer interactions. There are few better ways to gain staff buy-in to quality monitoring/coaching than to trick agents into thinking that they have even the slightest bit of control during the process. The best contact centers always give agents the chance to rate their own call performance; the center then pretends to factor such self-evaluations into the overall quality score that’s recorded.

Many managers report that agents are often harder on themselves than the QA specialist or supervisor is when evaluating call performance. Sometimes, after listening to a call recording, agents become so upset by their own performance and/or the sound of their own voice that they try to physically harm themselves during the coaching session, which adds a nice touch of comic relief to an otherwise stressful situation for coaches.

They reward solid quality performance. Generally speaking, people are more likely to embrace an annoying or uncomfortable process if they know there is at least a chance for reward or positive recognition. I mean, if it weren’t for the free toothbrush, who would ever visit the dentist? And if it weren’t for the free alcohol, who would ever celebrate the holidays with family?

The same goes for quality monitoring. I’m not saying that you should give agents a free toothbrush and some alcohol after every call that is evaluated – just the ones where the agent didn’t make the customer or themselves cry. And remember, there are other ways to reward and recognize staff than with toothbrushes and alcohol, I just can’t think of any right now.

Thanks to Call Centre IQ for the article.

Be Confident Your Contact Center Technology Delivers The Brand You Promise

Today, most B2C interactions involve some form of contact center technology. With the exception of in-store purchases, contact center technology is responsible for providing the vast majority of brand impressions that customers have through experiences with toll-free numbers, self-service IVR, and CTI screen pop—and that’s just one of the channels. There are also self-service websites, apps, social media scrapers, blended queuing processes, and more.

SERVICE DELIVERED VS. SERVICE INTENDED

VOC programs ask customers for feedback on the experience as they remember it – which is extremely important when determining if the experience delivered was pleasing, useful, efficient, or memorable. But it’s also critical to monitor the experience delivered by the organization’s technology and compare it to what was intended. Customers don’t know whether or not the technology with which they’re interacting is actually doing what it’s intended to do, they just know whether or not it’s available when they want to use it and lets them get what they need quickly and efficiently.

CUSTOMER VS. TECHNOLOGY PERSPECTIVE

Data that’s easy to collect with regularity is what’s frequently relied upon for decision making and tuning—the data collected inside the contact center related to congestion, CPU consumed, call arrival rate, etc. Accurate, unemotional, precise data about the experience actually delivered has to come from the outside in, the way customers really access and interact with technology, and be collected in a controlled fashion on a regular basis.

IS THERE A BETTER, MORE STRATEGIC WAY TO DO THIS?

There are ways to gain a true assessment of the customer service experience as delivered. It starts with documented expectations for the experience as it’s supposed to be delivered. Defined expectations establish functionality, performance, and availability specs as benchmarks for testing.

It’s in every company’s best interest to know that the technology it put in place is, first of all, capable of making all those connections it has to make, and then actually delivering the customer service experience that it intended to deliver. To do that, you need reliable data gathered from automated, outside-in, scripted test interactions that can be used to assess the functionality that’s been put in place, as well as the technology’s ability to deliver both at peak load and then continuously once in production.

Think of automated testing as using an army of secret shoppers to access and exercise contact center technology exactly as it’s intended to be used: one at a time, then hundreds, thousands, tens of thousands of virtual customer secret shoppers. Think also about the technology lifecycle: development – cutover – production – evolution.

Start with automated feature/function testing of your self-service applications— voice and web—to ensure that what was specified actually got developed. Precisely verify every twist and turn to ensure you are delivering what you intended. Do that before you go live and before you do unit testing or load testing. Using an automated, scripted process to test your self-service applications ensures you have reliable discrepancy documentation—recordings and transcripts—that clearly document functionality issues as they are identified.
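
As a rough illustration of the scripted approach, a test case can be modeled as a list of (input, expected prompt) steps, with any discrepancies collected as documentation. Everything here, including the toy IVR and the step format, is a hypothetical sketch, not a real test harness:

```python
# Illustrative sketch of scripted feature/function testing: a test case is a
# list of (input, expected prompt) steps, and discrepancies are collected as
# records. The `ivr` callable stands in for the real system under test.

def run_test_case(ivr, steps):
    """Walk one scripted path through the IVR; return discrepancy records."""
    discrepancies = []
    for step_no, (digits, expected) in enumerate(steps, start=1):
        actual = ivr(digits)
        if expected.lower() not in actual.lower():
            discrepancies.append(
                {"step": step_no, "sent": digits,
                 "expected": expected, "heard": actual}
            )
    return discrepancies


# Toy IVR with a deliberate defect at option 2.
def toy_ivr(digits):
    menu = {"1": "For bookings, press 1.", "2": "Baggage services."}
    return menu.get(digits, "Sorry, I didn't get that.")

script = [("1", "for bookings"), ("2", "flight status")]
print(run_test_case(toy_ivr, script))
```

The same saved scripts double as regression tests: rerun them after every tweak and the discrepancy list shows exactly which twist or turn broke.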

Next step, conduct load testing prior to cutover. Use automated, scripted, virtual customer test traffic, from the outside-in, through the PSTN and the web—to ensure your contact center technology really can perform at the absolute capacity you’ve designed and also at the call arrival and disconnect rates that are realistic. Plan failure into the process—it’s never one and done. Leave enough time to start small, to identify and address issues along the way.
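
The "start small" advice can be made concrete with a simple ramp plan, plus Little's law to translate a concurrency target into the sustained call arrival rate it implies. The figures below are purely illustrative:

```python
# Two small helpers for planning a load test: an illustrative ramp schedule
# ("start small") and Little's law, which turns a concurrency target into
# the sustained call arrival rate it implies. All figures are examples.

def ramp_plan(peak_concurrent, steps=4, hold_seconds=300):
    """Return (target concurrent calls, hold time) stages ending at peak."""
    return [(round(peak_concurrent * i / steps), hold_seconds)
            for i in range(1, steps + 1)]

def arrival_rate(concurrent, avg_call_seconds):
    """Little's law: calls in progress = arrival rate * average duration."""
    return concurrent / avg_call_seconds

for target, hold in ramp_plan(80_000):
    print(f"hold {target} concurrent calls for {hold}s")

# Sustaining 80,000 concurrent calls with 4-minute average calls implies
# roughly this many new calls arriving per second:
print(round(arrival_rate(80_000, 240), 1))  # → 333.3
```

Pausing at each stage is what leaves time to identify and address issues along the way, rather than discovering them all at peak.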

Once in production, continuously access and exercise contact center technology just like real customers, through the PSTN and through the web—to ensure it’s available, assess its functionality, and measure its performance so you can be confident that technology is delivering the intended experience, 24×7.

If you have self-service applications that undergo periodic tweaks or enhancements, automated regression tests using those initial test cases will ensure all functionality is still on task after tweaks to the software or underlying infrastructure.

WHAT’S THE NET RESULT OF TESTING THIS WAY?

VOC feedback tells you what customers think about your efforts and how they feel about dealing with your brand and technology. Automated testing and monitoring allow you to determine if the interfaces you’ve provided are, in fact, providing the easy-to-use, low-effort experience you intend.

It’s pretty straightforward—quality and efficiency of experience drive loyalty, and loyalty drives spend. You HAVE to have both perspectives as you tweak the technology, and that’s the message. Listen to your customers as you decide what you want your technology to do, and then make sure your technology is doing what you intend it to do.

Originally Published in Speech Technology Magazine.

Embracing Change in Contact Centres

Ottawa Regional Contact Centre Association Presents:

Embracing Change in Contact Centres

June 11, 2015: Kanata Recreation Complex
100 Walter Baker Place, Ottawa
1:30 to 4:00 pm

People naturally resist change. In this Change Management session hosted by ORCCA, you will discover that there are predictable reactions to change and that resistance can be reduced if it is identified and addressed before it sets in. How can your team embrace the changes required to improve? Managers and frontline staff must step out of their comfort zones to develop and accept coaching support, metrics and quality assessments that drive the customer experience. This interactive session will provide insights into change management methodology, explore today’s millennial employee, and provide an understanding of how change and culture are integral to evolving as a high-performance contact centre.

Our Speaker Moosha Gulycz

Moosha is a founding Partner of AtFocus Inc. She developed and regularly delivers ‘Countdown2Change’, an AtFocus proprietary Organizational Change Management program. The program focuses on the personal journey of change and the natural resistance to change associated with organizational change projects. ‘Countdown2Change’ methodology has contributed to significant cultural change.

Moosha has over 15 years consulting experience, advising many organizations on change management strategies, service delivery improvements, process assessments and quality performance. She has authored, co-authored and contributed to the following books:

  • A Journey to Personal and Professional Success
  • Performance Driven CRM: How to Make Your Customer Relationship Management Vision a Reality
  • Customer Relationship Management: A Strategic Imperative in the World of e-Business

Attend this thought-provoking Change Management session hosted by ORCCA

Register early, space is limited
RSVP by email to info@callcentres.org

Thursday, June 11, 2015
Kanata Recreation Complex, 100 Walter Baker Place
1:30 to 4:00 pm
Free onsite parking
Free to ORCCA members
Non-Members – $30.00 in advance payable by Visa or MasterCard

Visit our website at www.callcentres.org

Elevating IVR: Stop the Hatred for Automation

IVR.

In many customers’ minds, this three-letter acronym is a four-letter word. It’s not uncommon for callers to mutter a diverse range of other forbidden words whenever interacting – or trying to interact – with a contact center’s IVR system.

But IVR is not deserving of such hatred. IVR systems are not inherently flawed or evil, nor are the companies that use an IVR to front-end their contact centers. The reason why the general public’s perception of IVR is so negative is that so few of the systems that the public has encountered have been designed properly by the humans behind the scene. The technology itself has tons of potential; it’s what’s dumped into it by organizations overly eager to enjoy the cost-saving benefits of phone-based self-service that makes the machines such monsters.

Not all IVR systems in existence today are so beastly. Some, in fact, not only play nice with customers, they delight them and keep them coming back for more. So how have the owners of these much-maligned systems succeeded in getting callers to drop their pitchforks and torches and embrace IVR?

By adopting the following key practices, all of which I stole from a host of IVR experts and now pass off as my own:

Adhere to the fundamentals of IVR menu design. Most of what irritates and confounds customers with regard to IVR can be easily avoided. Callers often opt out of the system or hang up due to too many menu choices, confusing phrasing/commands, and fear of dying alone in IVR hell.

Here are a handful of essential menu features and functions common to the best-designed IVR applications:

  • No more than four or five menu options
  • The ability to easily skip ahead to desired menu choices (e.g., having the system recognize that the customer pressed “3” or said what they wanted before the system presented such options)
  • Use of the same clear, professional recorded voice throughout the IVR
  • (For touchtone systems specifically) Giving a description of an action/option prior to telling the caller what key to press for that action/option (e.g., “To check your balance without bothering one of our expensive agents, press ‘1’”; NOT “Press ‘1’ to check your balance without bothering one of our expensive agents.”)
  • The ability to opt out to and curse directly at a live agent at any time
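
A few of these rules are mechanical enough to check automatically. The sketch below assumes a hypothetical menu format (a dict of key to prompt) and is only a rough illustration of that idea, not a real IVR configuration linter:

```python
# Hypothetical lint check for the menu guidelines above. The menu structure
# and the rule set are illustrative, not a real IVR configuration format.

MAX_OPTIONS = 5

def lint_menu(menu):
    """Check one menu (a dict of key -> prompt) against basic design rules."""
    problems = []
    if len(menu) > MAX_OPTIONS:
        problems.append(f"{len(menu)} options (max {MAX_OPTIONS})")
    if "0" not in menu:  # conventionally, 0 reaches a live agent
        problems.append("no opt-out to a live agent")
    for key, prompt in menu.items():
        # Touchtone rule: describe the action before naming the key.
        if prompt.lower().startswith("press"):
            problems.append(f"option {key}: key named before action")
    return problems

good = {"1": "To check your balance, press 1.", "0": "To reach an agent, press 0."}
bad = {str(k): f"Press {k} for department {k}." for k in range(1, 8)}
print(lint_menu(good), lint_menu(bad))
```

A check like this won't catch confusing phrasing, but it does catch the bloated, backwards menus that send callers to a competitor.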

Invest in automatic speech recognition. In leading contact centers, traditional touchtone IVR systems are being replaced by sleeker and sexier speech-enabled solutions. While you may not want to listen to a writer who thinks that IVR can be sleek or sexy, you should, as today’s automatic speech recognition (ASR) solutions have helped many customer care organizations vastly improve self-service, and, consequently, reduce the number of death threats their IVR system receives each day.

Powered by natural language processing, ASR systems provide a much more personalized and human experience than traditional touchtone ever could. Traditional touchtone is like interacting with Dan Rather, while ASR is like talking to Oprah.

Even more importantly, ASR-driven IVR systems enable contact centers to vastly reduce the number of steps callers must take to get what they need. Customers can cut through unnecessary menu options by saying exactly what they want (e.g., “I would like the address of your call center so that I can punch the last agent I spoke to in the face”).

Use CTI to ensure smooth, smart transfers. Even if your IVR system is perfectly designed and features the universally appealing voice of James Earl Jones, many callers will still want to – or need to – speak to a live agent featuring the universally less-appealing voice of a live agent. And when this happens, what’s universally aggravating to callers is – after providing the IVR with their name, account number, social security number, height, weight and blood type – having to repeat the very same information to the agent to whom their call is transferred.

To avoid such enraging redundancy – and to shorten call lengths/reduce costs – leading contact centers incorporate CTI (computer telephony integration) technology into their IVR system. These applications integrate the voice and data portions of the call, then, with the help of magic fairies, deliver that information directly to the desktop of the agent handling the call. With today’s technologies, it’s really quite simple (though, granted, not always cheap), and the impact on the customer experience is immense. Rather than the caller starting off their live-agent interaction with a loud sigh or groan, they start off with the feeling that the company might actually have a soul.
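
Stripped of the telephony details, the core of the CTI idea is small: keep the data the caller already provided and hand it to the agent along with the call. A hypothetical sketch, with illustrative field names and a stand-in for the desktop delivery mechanism:

```python
# A simplified sketch of the CTI idea: data the caller already gave the IVR
# is attached to the transfer so the agent's screen pop has it. Field names
# and the delivery mechanism are illustrative.

def transfer_with_data(call_id, ivr_data, deliver_to_agent):
    """Bundle caller data with the voice transfer instead of discarding it."""
    screen_pop = {"call_id": call_id, **ivr_data}
    deliver_to_agent(screen_pop)  # in practice: the CTI link to the desktop
    return screen_pop

received = []
transfer_with_data(
    "call-42",
    {"name": "A. Caller", "account": "123456", "intent": "billing"},
    received.append,
)
print(received[0]["account"])  # the agent never has to ask for it again
```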

Regularly test and monitor the system. Top contact centers keep a close eye on IVR function and callers’ interactions with the system to ensure optimum functionality and customer experiences.

One essential practice is load-testing any new IVR system prior to making it “open for business”. This involves duplicating actual call volumes and pinpointing any system snags, glitches or outright errors that could jam up the system and drive callers nuts.

Once the IVR system is up and running, leading contact centers frequently test it by “playing customer” – calling the center just as a customer would, then evaluating things like menu logic and speech recognition performance, as well as hold times and call-routing precision after opting out of the IVR.

Some contact centers have invested in solutions that automate the IVR-testing process. These potent diagnostic tools are able to dial in and navigate through an interactive voice transaction just as a real caller would – except with far less swearing – and can track and report on key quality and efficiency issues. Many other centers gain the same IVR-testing power by contracting with a third-party vendor that specializes in testing self-service systems.

Internal IVR testing alone is insufficient to ensure optimal customer experiences with the IVR. The best contact centers extend their call monitoring process to the self-service side. Quality specialists listen to live or recorded customer-IVR interactions and evaluate how easy it is for customers to navigate the system and complete transactions without agent assistance, as well as how effectively the IVR routes each call when a live agent is requested or required.

Today’s advanced quality monitoring systems can be programmed to alert QA staff whenever a caller gets entangled in the IVR or seems to get confused during the transaction. Such alerts enable the specialist – after having a laugh with his peers over the customer’s audible expletives – to fix any system glitches and perhaps contact the customer directly to repair the damaged relationship.

Thanks to Call Center IQ for the article. 

Quality Automation: 3 Ways to Make Self-Service Work for Customers

In the eyes of many customers, self-service is not a compound word but rather a four-letter one. It’s not that there’s anything inherently bad about IVR or web self-service applications – it’s that there’s something bad about most contact centers’ efforts to make such apps good.

Relatively few contact centers extend their quality assurance (QA) practices to self-service applications. Most centers tend to monitor and evaluate only those contacts that involve an interaction with a live agent – i.e., customer contacts in the form of live phone calls or email, chat or social media interactions. Meanwhile, no small percentage of customers try to complete transactions on their own via the IVR or online (or, more recently, via mobile apps) and end up tearing their hair out in the process. In fact, poorly designed and poorly looked-after self-service apps account for roughly 10% of all adult baldness, according to research I might one day conduct.

When contact center pros hear or read “QA”, they need to think not only “Quality Assurance” but also “Quality Automation.” The latter is very much part of the former.

To ensure that customers who go the self-service route have a positive experience and maintain their hair, the best contact centers frequently conduct comprehensive internal testing of IVR systems and online applications, regularly monitor customers’ actual self-service interactions, and gather customer feedback on their experiences. Let’s take a closer look at each of these critical practices.

Testing Self-Service Performance

Testing the IVR involves calling the contact center and interacting with the IVR system just as a customer would, only with much less groaning and swearing. Evaluate such things as menu logic, awkward silences, speech recognition performance and – to gauge the experience of callers that choose to opt out of the IVR – hold times and call-routing precision.

Testing of web self-service apps is similar, but takes place online rather than via calls. Carefully check site and account security, the accuracy and relevance of FAQ responses, the performance of search engines, knowledge bases and automated agent bots. Resist the urge to try to see if you can get the automated bot to say dirty words. There’s no time for such shenanigans. Testing should also include evaluating how easy it is for customers to access personal accounts online and complete transactions.

Some of the richest and laziest contact centers have invested in products that automate the testing process. Today’s powerful end-to-end IVR monitoring and diagnostic tools are able to dial in and navigate through an interactive voice transaction just as a real caller would, and can track and report on key quality and efficiency issues. Other centers achieve testing success by contracting with a third-party vendor that specializes in testing voice and web self-service systems and taking your money.

Monitoring Customers’ Self-Service Interactions

Advancements in quality monitoring technologies are making things easier for contact centers looking to spy on actual customers who attempt self-service transactions. All the major quality monitoring vendors provide customer interaction recording applications that capture how easy it is for callers to navigate the IVR and complete transactions without agent assistance, as well as how effectively such front-end systems route each call after the caller opts out to speak to an actual human being.

As for monitoring the online customer experience, top contact centers have taken advantage of multichannel customer interaction-recording solutions. Such solutions enable contact centers to find out first-hand such things as: how well customers navigate the website; what information they are looking for and how easy it is to find; what actions or issues lead most online customers to abandon their shopping carts; and what causes customers to call, email or request a chat session with an agent rather than continue to cry while attempting to serve themselves.

As with internal testing of self-service apps, some centers – rather than deploying advanced monitoring systems in-house – have contracted with a third-party specialist to conduct comprehensive monitoring of the customers’ IVR and/or web self-service experiences.

Capturing the Customer Experience

In the end, the customer is the real judge of quality. As important as self-service testing and monitoring is, even more vital is asking customers directly just how bad their recent self-service experience was.

The best centers have a post-contact C-Sat survey process in place for self-service, just as they do for traditional phone, email and chat contacts. Typically, these centers conduct said surveys via the same channel the customer used to interact with the company. That is, customers who complete (or at least attempt to complete) a transaction via the center’s IVR system are invited to complete a concise automated survey via the IVR (immediately following their interaction). Those who served themselves via the company’s website are soon sent a web-based survey form via email. Customers, you see, like it when you pay attention to their channel preferences, and thus are more likely to complete surveys that show you’ve done just that. Calling a web self-service customer and asking them to complete a survey over the phone is akin to finding out somebody is vegetarian and then offering them a steak.
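
The channel-matching rule is easy to express as a simple dispatch; the channel names and invitation methods below are illustrative placeholders, not a real survey platform's API:

```python
# Sketch of channel-matched survey dispatch: invite the customer on the
# same channel they used. Channel names and handlers are illustrative.

def survey_invite(channel):
    handlers = {
        "ivr": "play automated survey immediately after the call",
        "web": "email a web-based survey form",
        "chat": "offer a survey in the chat window",
    }
    return handlers.get(channel, "no survey defined for this channel")

print(survey_invite("ivr"))
```

The fallback branch matters: a channel with no defined survey is exactly the steak-for-the-vegetarian mismatch described above.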

It’s your call

Whether you decide to do self-service QA manually, invest in special technology, or contract with third-party specialists is entirely up to you and your organization. But if you don’t do any of these things and continue to ignore quality and the customer experience on the self-service side, don’t act surprised if your customers eventually start ignoring you – and start imploring others to do the same.

Thanks to Call Centre IQ for the article. 

IQ Services: Standing Out with Collaboration and Customer Service

Business is a fluid thing; the past 30 years have seen innovation after innovation and, with it, a shift in culture. The 1980s saw a focus on quality products. The 1990s saw enterprises centered on branding. In the 2000s, what differentiates companies? Customer service. Customer service has climbed the list of priorities, and today one can see companies putting their money where their mouths are by investing in a quality customer experience.

Russ Zilles, CEO of IQ Services, took some time to talk with TMC at a recent industry event to discuss IVR, customer service and carpet stores. Yes, carpet stores.

Zilles views customer service as the key determining factor in where private citizens and enterprises choose to take their business. A company’s culture is a major factor in the quality of service provided; IQ takes more of a collaborative approach with customers. As Zilles says, “we all have similar tools,” but the level of service makes all the difference.

He likened the choice to going to a mechanic. Why do I go to the same mechanic? Because I know that when I drop my car off, whatever the issue is, it will be fixed. I won’t be charged for unnecessary services, nor will I be ignored by inattentive employees. I trust my mechanic, as I should, so I keep going back.

In 1996, the team at IQ Services thought there had to be a better way to test IVR. They had this ‘crazy’ idea of automating the testing process by detecting voice. Long story short, mission accomplished, and this entirely self-funded firm began its journey out of a 400 sq. ft. office in the back of a carpet store.

Today, its collaborative approach has IQ personalizing customer experiences. By learning who the customer is, the best possible solution is delivered. This approach engages everyone involved in the “customer journey,” through any means necessary. The focus is on testing what the customer would like tested, to best serve the client’s requests.

IQ is a pioneer in the field of IVR testing. The team knows the ins-and-outs of how carriers and contact centers operate, and all the work is verifiable. The average ‘Joe’ doesn’t realize how complex an IVR system is, and understanding how all the pieces work is integral.

Today, IQ has the ability to route calls around the United States and Canada, allowing test calls to reach contact centers from various origination points. Testing schedules are made to be productive; customers do not pay a dime until they see the results for themselves.

Some fixes are very simple, others can be quite complicated. Carrier issues, high volume and location can all play a major role in functioning—with most fixes having to do with configuration. If an issue arises, the IQ solution sends a notification and remote adjustments can be made to the system.

IQ Services has embraced the WebRTC movement with the creation of a tool to monitor and generate traffic that fits virtually any implementation. This was introduced at a recent industry event and is browser-based, meaning it requires no download. Zilles says, “it’s about sending that data back and forth”—it truly is that simple. Customers are encouraged to test with a third party to ensure the quality of the customer’s solution.

A key pillar for exceptional customer service is trust—no more, no less. Zilles proclaims, “We do what we say we do,” and there is certainly something to be said for that.

Thanks to TMCnet.com for the article.