Can volumes of customer data translate into meaningful trends?
With so many of our interactions today being digital, organisations can capture massive volumes of unstructured data from customer reviews, web enquiries, social media comments and feedback. This creates a fantastic opportunity to really understand customer needs, but the sheer volume of data collected often poses problems for anyone trying to extract insight of real value.
Text Analytics uses Natural Language Processing to extract meaning from large volumes of text-based data. Yet automated text analysis often comes under scrutiny, and rightly so: crude semantic analysis regularly produces misleading results. So how can businesses overcome this problem?
We know that just leaving it to the software can produce ill-fitting results – qualitative data still needs that human touch. The initial automated coding needs to be refined over time, with the analyst manually shifting comments or components into more suitable categories, or combining categories into more overarching themes. This combination of automation with manual supervision and refinement leads to a powerful text analytics process that is quick, consistent, replicable and meaningful.
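The workflow described above – an automated first pass, then analyst overrides and category merging – can be sketched in a few lines of Python. The category names, keywords and overrides here are purely illustrative assumptions, not any client's actual coding scheme:

```python
# First-pass automated coding rules: illustrative categories and keywords only.
AUTO_RULES = {
    "teaching": ["lecture", "seminar", "tutor"],
    "facilities": ["library", "lab", "building"],
}

# Analyst refinements, built up over time:
# RECODE shifts specific miscoded comments into better-fitting categories;
# MERGE folds narrow categories into more overarching themes.
RECODE = {"The library opening hours are too short": {"facilities"}}
MERGE = {"teaching": "academic experience", "facilities": "campus"}

def auto_code(comment):
    """Assign every category whose keywords appear in the comment."""
    text = comment.lower()
    return {cat for cat, words in AUTO_RULES.items()
            if any(w in text for w in words)}

def refined_code(comment):
    """Apply manual overrides first, then map categories onto themes."""
    cats = RECODE.get(comment, auto_code(comment))
    return {MERGE.get(c, c) for c in cats}
```

In practice the override and merge tables grow as the analyst reviews batches of coded comments, which is what makes the process replicable: the same refined rules can be re-run on next year's data.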
The bright minds at Arkenford have helped their clients achieve just this, combining rigorous automated analysis of the data set with human analysis to provide systematic insight into its contents and surface findings that were previously invisible.
Client 1 (University) gathered a wide range of data from students at varying points in their university career, including the verbatim comments that accompanied the NSS surveys. The volume of comments made detailed analysis impossible, yet the client wanted to understand them more comprehensively and identify any trend changes over time. Arkenford designed a bespoke application that automated the coding and combined it with human expertise to quantify the data. The results were then cross-compared with other data, including both the NSS scores and demographic information, and an interactive reporting tool was created that let the client analyse across faculty, year group and annual trends, giving far greater visualisation and understanding of those trends.
Client 2 (Charity) had 11,000 text-based web enquiries of varying nature and wanted to understand the complexities of the requests so it could respond to its clients sooner. Arkenford's analysis identified 50,000 individual issues within them, creating a massively rich data set that had previously been invisible to the client. From this we created an interactive tool that allowed staff to examine the structure of the requests, identify touchpoints, understand the characteristics linked to each touchpoint and access individual requests, giving a much greater understanding of the people affected.
Client 3 (Sporting body) collected 13,000 visitor experience comments about its events over a year. The client had attempted to read through these comments to get a sense of what was being said, but the exercise was time-consuming and imprecise. Arkenford created a keyword/phrase categorisation using Natural Language Processing, including positive and negative semantics, and cross-referenced the results against event, demographics and customer experience. One output of the study was a lexicon that the client now uses for analysis every year, increasing speed, reducing cost and adding clear strategy to the analysis.
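A reusable lexicon with positive and negative semantics, of the kind this study produced, can be sketched very simply. The word lists below are assumptions for illustration, not the lexicon Arkenford built for the client:

```python
# Illustrative sentiment lexicon: the client's real lexicon would be larger
# and tailored to its events, but the scoring mechanism is the same idea.
POSITIVE = {"great", "friendly", "excellent", "smooth"}
NEGATIVE = {"queue", "crowded", "poor", "late"}

def score_comment(comment):
    """Count positive and negative keyword hits and return a net label."""
    words = [w.strip(".,!?") for w in comment.lower().split()]
    pos = sum(w in POSITIVE for w in words)
    neg = sum(w in NEGATIVE for w in words)
    if pos > neg:
        return "positive"
    if neg > pos:
        return "negative"
    return "neutral"
```

Because the lexicon is just data, it can be re-applied to each year's comments unchanged, which is what makes the annual analysis faster and cheaper than re-reading 13,000 comments by hand.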
Through the text analytics process, all of these clients were able to understand the overall picture and access the detail quickly and efficiently. Combining it with human analysts delivered powerful, robust results that saved time and money and drove marketing strategy to new heights.