Voice of Customer (VoC) is the in-depth process of capturing a customer’s expectations, preferences and aversions. It plays a huge role in ensuring an optimum customer experience. As Steve Cannon, CEO of Mercedes-Benz USA, said recently, “Customer experience is the new marketing.” So if that’s the case, it’s important that companies get as much out of their data as they can.
It is accepted, if not obvious, that customer experience has a direct impact on financial performance. Customers who had the best past experiences spend 140 percent more than those who had the poorest. And when it comes to churn, a customer who reports the poorest experience has only a 43 percent chance of still being a customer a year later, whereas a satisfied customer has a 74 percent chance of remaining for another year.
In the UK, a relatively sophisticated market, annual churn rates in telecoms/media range up to around 15 percent (Virgin at 14.9 percent, Sky at 10.1 percent), while utilities run higher, with an industry average of 20 percent (though British Gas managed 11 percent). If Virgin, for example, could simply match Sky's churn rate, that would add another £200m of sales annually.
For a $10 billion company, even a modest shift in customer experience could result in the following annual revenue change: buying more products, $64 million; reduction in churn, $116 million; word of mouth, $103 million.
Net Promoter Score (NPS) is one popular framework involving VoC that seeks to quantify the dramatic effect customer experience has on revenue and profits. American Express sees a 10-15 percent increase in spending and four to five times higher retention from each promoter (customers scoring 9-10 in satisfaction surveys), both of which drive shareholder value. iiNet said that a 1-point increase in its NPS equaled a $1.6 million increase in net profit after tax. And Heineken promoters spend 2.5 times more than detractors (those scoring their customer experience from 0-6 out of 10).
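For readers new to the metric, NPS is simply the percentage of promoters (scores of 9-10) minus the percentage of detractors (scores of 0-6); passives (7-8) are ignored. A minimal sketch:

```python
def nps(scores):
    """Net Promoter Score: % promoters (9-10) minus % detractors (0-6)."""
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return round(100 * (promoters - detractors) / len(scores))

# 5 promoters and 2 detractors out of 10 responses -> (50% - 20%) = 30
survey = [10, 9, 8, 7, 6, 9, 10, 3, 8, 9]
print(nps(survey))  # 30
```

Because the score subtracts detractors from promoters, it ranges from -100 (all detractors) to +100 (all promoters), which is why even a 1-point movement, as in the iiNet example, reflects a meaningful shift in the customer base.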
So with all this impact, it’s unsurprising that Gartner forecast VoC programs to be one of the most significant strategic investments over the next five years. “VoC is now being viewed as a must-have strategy. What businesses would really like is a nice out-of-the-box mature market that they can go and pick the technology from. But it is not there,” said Gartner Research Director Jim Davies.
But, invariably, organizations are not performing their VoC analyses effectively. Several specific challenges are behind this problem.
For one thing, VoC data, such as social media, product reviews, and complaints, can be sporadic and incidental. Even in systematically collected data, such as CRM records and surveys, "unknown unknown" signals are hard to interpret because, by definition, they have not been predefined.
There are also big differences among business sectors. In B2B, and in B2C services such as utilities, you might have detailed CRM records. But in some markets with minimal customer touchpoints, like consumer products, it might be difficult to link the different sources to a particular customer.
In addition, data can be dirty, and this doesn't just mean duplicates or typos. Based on a recent analysis of social media data by Networked Insights, nearly 10 percent of the social media posts that brands analyze do not come from real consumers. These posts come from sources such as social bots (scripts or programs that behave like people posting on social media), celebrities, brand handles, and inactive accounts. Spam is a major concern on forums, where up to 28 percent of all posts are from non-consumers. Social spam is a complicated problem when listening in on brand conversations; social media spamming grew by 658 percent between 2013 and 2014, and some brands have reported that more than 90 percent of their recorded social media posts can be classified as spam.
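A first-pass screen for the non-consumer sources named above can be sketched as simple rule-based filtering. The field names, example records, and thresholds below are illustrative assumptions, not any real social media API:

```python
# Hypothetical post records; in practice these fields would come from a
# social listening tool's export or an enrichment step.
posts = [
    {"author": "jane_doe", "is_bot": False, "last_active_days": 2},
    {"author": "MegaBrandOfficial", "is_bot": False, "last_active_days": 1},
    {"author": "spambot_77", "is_bot": True, "last_active_days": 0},
    {"author": "dormant_user", "is_bot": False, "last_active_days": 400},
]

BRAND_HANDLES = {"MegaBrandOfficial"}  # the brand's own accounts

def looks_like_consumer(post):
    """Drop bots, brand handles, and accounts inactive for a year or more."""
    if post["is_bot"] or post["author"] in BRAND_HANDLES:
        return False
    if post["last_active_days"] >= 365:  # treat as an inactive account
        return False
    return True

consumer_posts = [p for p in posts if looks_like_consumer(p)]
print([p["author"] for p in consumer_posts])  # ['jane_doe']
```

Rules like these catch only the obvious cases (declared bots, the brand's own handles, dormant accounts); detecting well-disguised social bots generally requires behavioral models beyond this sketch.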
As a result, according to the New York Times, 50-80 percent of a data scientist's time now goes to cleansing data. This not only drags out the process, but also leaves little time for any meaningful customer insight to be gleaned.
Finally, unstructured data such as text, phone logs, and web logs is challenging to work with, both in joining it to other sources and in structuring and mining it to extract insight.
So, in most cases, obtaining a true 360-degree view of relevant customer data is not achievable. And merely being able to see the data is very different from being able to predict outcomes and generate recommended actions.
How then do you get actionable insight from what you have?
Well-known vendors offer highly sophisticated analytics packages that bundle predefined tools for handling text, such as text analytics and sentiment analysis. But if you want to look for emergent patterns of "unknown unknowns" without hypotheses, a specific bottom-up approach is required to cluster or model appropriately. In practice this means a data scientist builds a custom model, using state-of-the-art data science algorithms, text analysis, and knowledge of the specific business process, to join and cleanse the datasets and construct the right transformations to extract the right insights.
Of course, there are also specific VoC platforms that seek to gather all data to try to obtain the 360-degree, “holy grail” view. These are aggregators of survey data that display the customers’ view. They do not seek to mine the existing data to obtain insight.
However, a new breed of analytics for VoC data is transforming not only the amount of customer insight that can be made available, but also the speed at which it is delivered.
Newly developed algorithms are able to work with unstructured, disparate, and dirty data sets and extract the relevant information required for meaningful insight. Rather than a bottom-up approach, it is possible to analyze all the data in a top-down way. This means that exhaustive naïve transformations are rapidly analyzed in real-time to look for correlating patterns no matter the dataset type or data quality. By using Automated Information Retrieval in this way, the new techniques automatically structure all free text information from multiple channels without supervision and can cluster and classify what is being said in real-time without pre-definition. This can then be analyzed alongside structured data in a machine-learning algorithm for predictive analytics to predict customer behavior from the signals. Dictionaries of specific terms can help sharpen the resolution, and these are suggested (and validated by an end-user) rather than having to be pre-defined.
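To illustrate the final predictive step in miniature: once free text has been structured into signal counts, those counts can sit alongside structured CRM fields in a scoring model such as logistic regression. The feature names and weights below are purely hypothetical, standing in for parameters a real system would learn from historical churn data:

```python
from math import exp

# Illustrative learned weights: text-derived signals (negative_mentions)
# combined with structured fields (complaints_90d, tenure_years).
WEIGHTS = {"negative_mentions": 0.8, "complaints_90d": 0.6, "tenure_years": -0.3}
BIAS = -1.0

def churn_probability(customer):
    """Logistic score: sigmoid of a weighted sum of the customer's signals."""
    z = BIAS + sum(WEIGHTS[k] * customer[k] for k in WEIGHTS)
    return 1 / (1 + exp(-z))

at_risk = {"negative_mentions": 3, "complaints_90d": 2, "tenure_years": 1}
loyal = {"negative_mentions": 0, "complaints_90d": 0, "tenure_years": 6}
print(round(churn_probability(at_risk), 2))  # high risk, ~0.91
print(round(churn_probability(loyal), 2))    # low risk, ~0.06
```

The point of the sketch is the pipeline shape, not the model: unstructured text becomes numeric signals, which a standard supervised learner then turns into a per-customer prediction that can drive a retention action.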
This immediately saves the 50-80 percent of time typically spent cleansing and transforming data for VoC-type analyses. What's more, the process is automated, and the outputs are not just more data, but relevant recommendations that can be implemented right away. Beyond the time saving, the need for a data scientist is removed in the basic case, while in more sophisticated use cases the work a skilled data scientist can do is greatly enhanced. (I am not suggesting for one moment that the role of the data scientist will be automated; quite the opposite!) This new breed of analytics can be implemented by the end user, if necessary, to gain actionable insight.
More than just sentiment analysis, users can understand the customer journey and how events impact the journey both positively and negatively – offering predictive analysis for customer service and marketing professionals and a real-time monitoring solution to flag impactful events before they happen. This means companies can minimize churn, maximize loyalty and NPS, and maximize up-sell and cross-sell opportunities during a conversation with a customer.
Utilities and media companies are using this technology to generate recommendations and prioritize the actions they need to take to improve customer satisfaction and NPS. One large transport provider recently used the new technology to create insight and actionable solutions from unstructured and textual data from its call centers, surveys, social media, and review websites. Through automated analysis, the company was able to generate predictors and actions for specific customers to limit the churn of profitable customers and prioritize the most effective outbound actions. The company also identified changing customer requirements so it could optimize the adoption of up-sell and cross-sell opportunities, all in real-time.
So, for companies using VoC to improve the customer experience, the options for getting the most out of their data have just opened up significantly. These options not only eliminate the huge issue of analyzing dirty social media data but also place the power and capability directly with the end user and/or marketing department, while empowering analysts to do a lot more, more quickly.
Nigel Howlett is Director at Big Data analytics firm Warwick Analytics. Warwick Analytics spun-out from The University of Warwick following more than a decade of academic and later commercial research. With a provenance in manufacturing, its proprietary algorithms have been used to provide predictive analytics and big data analysis for companies such as Rolls-Royce, Airbus, Pfizer, and Motorola.