Quality assurance and big data analytics

Agama has been one of the companies at the forefront of client monitoring.

Applying user behaviour analytics to ensure quality of delivery and shape service offerings has been a hot topic in the quality assurance world for some time. But can big data be too much data? Stuart Thomson reports.

Quality assurance has until relatively recently been a second-tier priority for many service providers, who have relied on customer complaints to deal with problems as they occur or, in the case of OTT service providers, have delivered best-effort services without any guarantee of quality.

However, IP delivery now gives service providers the opportunity to delve much more deeply into patterns of user behaviour and into how users respond to different experiences on different platforms.

The adoption of OTT or semi-managed delivery platforms by service providers means there is an ever-greater array of variables determining how well video is delivered over different networks to different devices.

Client monitoring

Quality assurance specialist Agama has been a strong supporter of the client monitoring approach to quality assurance. According to CEO Mikael Dahlgren, client device monitoring emerged as particularly important for OTT providers that relied on multiple CDNs to deliver their content to end users.

“The market has begun to understand how to create value from data but not for the whole chain,” says Dahlgren. “The quality of the data is one thing but drawing the right conclusions is another.” For Dahlgren, the use of data to proactively prevent problems emerging, rather than the reactive use of data to analyse behaviour after the fact, promises the greatest benefit to service providers.

“Proactive work holds out the larger possibility of real gains. If you can reduce trouble you will get more satisfied customers. If you can eliminate the problem before the customer calls in, it is better,” he says.

Dahlgren says that correlating user behaviour with quality metrics is a useful approach, giving operators data that can be used at senior management level to take informed decisions. More and more operators, he says, are giving executives at senior level direct responsibility for customer relations management, bringing quality to the fore.

This approach, suggests Dahlgren, allows operators to make incremental but significant improvements in quality over time and enables them, for example, to manage changes and new software releases better. “If you roll out a configuration change on the network our customers can see in minutes how it affects the whole population of clients.” This, he says, enables them to take steps to “lower the amount of customer frustration straight away”.
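
Agama does not publish the mechanics behind this, but the underlying comparison is easy to picture: split the client population into those already on the new configuration and those still on the old one, then compare a quality metric between the two cohorts minutes after the rollout begins. The sketch below illustrates the idea; the report shape, field names and metric are hypothetical, not Agama’s actual product.

```typescript
// Minimal sketch of rollout impact monitoring, assuming a hypothetical
// stream of client-side quality reports. All names are illustrative.
interface ClientReport {
  deviceId: string;
  timestamp: number;       // Unix epoch, milliseconds
  bufferingRatio: number;  // fraction of the session spent buffering
  clientVersion: string;   // configuration/software version on the client
}

// Compare buffering for clients on the new configuration against those
// still on the old one, minutes after a staged rollout begins.
function rolloutImpact(reports: ClientReport[], newVersion: string) {
  const updated: number[] = [];
  const baseline: number[] = [];
  for (const r of reports) {
    (r.clientVersion === newVersion ? updated : baseline).push(r.bufferingRatio);
  }
  const mean = (xs: number[]) =>
    xs.length ? xs.reduce((a, b) => a + b, 0) / xs.length : 0;
  return {
    updatedMeanBuffering: mean(updated),
    baselineMeanBuffering: mean(baseline),
    sampleSizes: { updated: updated.length, baseline: baseline.length },
  };
}
```

A jump in the updated cohort’s buffering relative to the baseline, visible within minutes, is the sort of signal that would let an operator roll a change back before call volumes rise.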

Having data available will not help unless all parts of the organisation are pulling in the same direction, however. “At the top level it is important to have a parameter that both technical departments and top management can agree on,” says Dahlgren. He suggests that organisations benefit from starting out looking at top-level data before drilling down into areas such as analysing the impact of their specific distribution infrastructure choices.

For Dahlgren, the ‘reactive’ side of data use – responding to customer calls and complaints – is a challenging area. It is crucial for front-line support staff to make the right decisions straight away if the service provider is to further its goals. Too many frustrated customers will at the very least have a negative impact on the organisation’s ability to upsell them to new products.

“Actionable insights are key but those are different for different parts of the organisation,” he says. “The first line support staff have to be quick and efficient in understanding what the problem is. At management level they need to have a broader understanding of whether they need to swap something out or expand network capacity in some area or other. The quality solutions must be able to give each type of user the right type of information.” Agama, he says, tries to “understand this and deliver pre-packaged solutions for different user groups”.

While the use of quality assurance technology to work out, for example, whether problems are confined to a specific geographic area or linked to a fault in a specific server is well established, Dahlgren says that deeper analysis of user behaviour is now attracting interest.

“What is new now is that discussions are going into various things such as when customers leave a service and how much of that is related to quality issues or to other causes. One way of doing this is to ask the customers themselves, but customers may blame quality when in fact they want to leave for other reasons, so it is good to test this,” he says.

For Dahlgren, the use of data analytics to inform decision-making still varies hugely between service providers, but interest is being driven by the OTT operators in particular: “There is a wide spread in the market. Some do almost no mapping of data against behaviour and others do a lot. The large new OTT players like Netflix have a deep interest in this and have made big gains in quality. That is pushing other operators to take action to stay at the same level.”

Global data

Ian Franklyn, managing director, EMEA, at quality assurance and data specialist Conviva, says that there is growing interest in client-level monitoring, despite earlier scepticism about its value. “I think it used to be the case that you would have to convince service providers of the benefits of user-centric real-time monitoring, but service providers recognise they need to deliver a high quality of service. Their job is to make sure customers are happy and they can use data to predict whether customers will be happy and form a strategy to prevent them going elsewhere,” he says.

Operators are also showing a lot of interest in using data gathered from other implementations to plan ahead of their own service launches, he says. “We have global data, and customers that are thinking about launching a service in a particular territory or launching a new service are interested in what the expectations of quality are and how to satisfy them. [Quality expectations] do vary from country to country and service to service. This data gives the service providers targets and benchmarks of quality to work to – how much bandwidth they need, what ISPs, CDNs to work with, what codecs – before they even launch the service.”

Franklyn says that Conviva’s global reach means that it has been able to note significant differences, for example in tolerance of poor quality video, between different markets.

This trend is feeding into changes in who within an organisation uses the available data, with a shift evident from technical operations staff towards product development teams and content strategy executives.

At the customer service level, there is also a move towards a more proactive approach, as identified by Agama’s Dahlgren. While customer relations staff are still using data to react to problems called in by customers themselves, Franklyn says that there is a growing trend towards teams “looking at trends” and using analysis to pre-empt calls by, for example, sending out emails offering to compensate customers for problems they have suffered even before those problems are reported.

In addition to providing real-time dashboards, Conviva is increasingly therefore being called on to deliver reports identifying trends. “We have seen a huge surge in demand for this level of data analysis and consultancy – this is not for technical people who are used to dealing with data but for product people who are familiar with reports and conclusions,” says Franklyn.

As well as allowing service providers to tailor video quality to the needs of the market, user data can also help them choose which CDNs to use and gives them better control over vendor agreements.

In addition to real-time dashboards and reports, Conviva can also supply data via APIs, allowing big service providers with large data warehouses to aggregate Conviva-supplied data with data from other sources, where it can then be pored over by business intelligence teams. The company can, says Franklyn, provide a monitoring store that can be customised for different user groups. “If you are in the delivery department and you want to see the performance of CDNs or see issues related to DRM then you can access the bits relevant to you,” he says.

The key metrics are ultimately those related to user engagement and satisfaction. “We use engagement metrics that we can tie to quality metrics to show links between the two, such as tolerance of low-resolution delivery of live content versus long-form content,” says Franklyn.
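
The kind of engagement-to-quality linkage Franklyn describes can be illustrated with a simple correlation computed separately per content type. The session shape and metric names below are assumptions for illustration, not Conviva’s data model.

```typescript
// Sketch: correlate a quality metric (rebuffering ratio) with an engagement
// metric (minutes watched), per content type, to expose differences in
// viewers' tolerance of poor quality. Data shapes are illustrative.
interface Session {
  contentType: "live" | "longForm";
  rebufferingRatio: number; // quality metric
  minutesWatched: number;   // engagement metric
}

function correlationByContentType(sessions: Session[]): Record<string, number> {
  const groups = new Map<string, Session[]>();
  for (const s of sessions) {
    const g = groups.get(s.contentType) ?? [];
    g.push(s);
    groups.set(s.contentType, g);
  }
  // Pearson correlation coefficient; returns 0 for degenerate inputs.
  const pearson = (xs: number[], ys: number[]): number => {
    const n = xs.length;
    if (n < 2) return 0;
    const mx = xs.reduce((a, b) => a + b, 0) / n;
    const my = ys.reduce((a, b) => a + b, 0) / n;
    let cov = 0, vx = 0, vy = 0;
    for (let i = 0; i < n; i++) {
      cov += (xs[i] - mx) * (ys[i] - my);
      vx += (xs[i] - mx) ** 2;
      vy += (ys[i] - my) ** 2;
    }
    return vx && vy ? cov / Math.sqrt(vx * vy) : 0;
  };
  const result: Record<string, number> = {};
  for (const [type, g] of groups) {
    result[type] = pearson(
      g.map((s) => s.rebufferingRatio),
      g.map((s) => s.minutesWatched)
    );
  }
  return result;
}
```

A strongly negative coefficient for live content but a weaker one for long-form would quantify exactly the kind of tolerance difference Franklyn mentions.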

Franklyn says that the barriers to entry to client monitoring are in fact relatively low, with Conviva providing a cloud-based system that requires only “a bit of code in the player” at the client side. “Increasingly service providers recognise that every single customer is worth a certain amount of money to them. That is why they invest in premium content and it stands to reason that they want to make sure viewers are happy with the content,” he says. “We are seeing an increase in investment in monitoring and maintaining Quality of Service. If you don’t do it you are wasting the money you have invested in content in the first place. Can you afford to ignore the experience of even one of your customers, or do you reach out and proactively try to keep them as a customer? Launching a service, marketing it and winning back customers from competitors are all significant costs, and some service providers really get that.”
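
That “bit of code in the player” is, in essence, a set of event listeners that report playback state to a cloud collector. The sketch below shows the general shape for a browser-based player; the endpoint and event format are hypothetical, not Conviva’s actual API.

```typescript
// Sketch of client-side player instrumentation. The collector URL and
// event schema are invented for illustration.
function instrumentPlayer(video: HTMLVideoElement, sessionId: string): void {
  const report = (event: string, detail: Record<string, unknown> = {}) => {
    // sendBeacon queues the payload even if the page is unloading,
    // which is why it is a common choice for telemetry.
    navigator.sendBeacon(
      "https://telemetry.example.com/v1/events", // hypothetical collector
      JSON.stringify({ sessionId, event, timestamp: Date.now(), ...detail })
    );
  };
  video.addEventListener("waiting", () => report("bufferingStart"));
  video.addEventListener("playing", () => report("playing"));
  video.addEventListener("error", () =>
    report("playbackError", { code: video.error?.code })
  );
}
```

Aggregated server-side across every session, events as simple as these are enough to reconstruct buffering ratios, start-up times and failure rates per device, CDN or ISP.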

Franklyn says Conviva is also innovating in the way it delivers its platform to customers, for example by launching a software development kit that allows them to adapt more quickly to a constantly changing environment. The results can in turn be fed back into the core product. “Our SDK project involves providing source code and documentation to customers to allow them to build the libraries [necessary to support new devices] themselves. You might have [the introduction of] a particular flavour of player and DRM that previously created a bottleneck before we could support it,” he says. “Now we say ‘here is the code – go build your own libraries’. It is a revolutionary approach that allows customers to know how to support devices and get consistent data. But it means we also get more data across more devices which can feed into our group data tool.”

Franklyn says that data gathered across the 1.5 billion devices globally that are supported by Conviva can be aggregated and anonymised to benefit the company’s customers collectively. “The more customers we have with the more devices, the more data we have,” he says.

Big problem

Not all industry participants are convinced that user-generated data will have a significant impact, at least in the near future.

For Simen Frostad, CEO of quality assurance specialist Bridge Technologies, the big problem with end user device monitoring is “understanding what is significant and what is not”.

“If you have huge amounts of data…but you don’t know how to correlate and process that data, it is useless,” says Frostad. User behaviour is so varied and unpredictable that there is a huge challenge in filtering out ‘false positives’ such as those caused by people fast-forwarding content at the same point in programmes. “For the first time, with multiscreen we can get some simple data from devices, and by correlating data from different types of phone and OS, and also location, you can get some interesting insights. But it still has to be done with great care,” says Frostad.
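
One plausible way to filter out the false positive Frostad describes is to check whether a burst of trick-play events clusters at the same point in the programme: if it does, viewers are reacting to the content itself rather than suffering a delivery fault. The event shape and thresholds below are illustrative assumptions, not Bridge Technologies’ method.

```typescript
// Sketch: distinguish synchronised viewer behaviour (e.g. everyone
// skipping the same segment) from a genuine delivery problem.
interface TrickPlayEvent {
  deviceId: string;
  mediaPosition: number; // seconds into the programme
}

function isLikelyBehavioural(
  events: TrickPlayEvent[],
  bucketSeconds = 10,      // width of each position bucket
  clusterThreshold = 0.5   // share of events in one bucket to call it behaviour
): boolean {
  if (events.length === 0) return false;
  // Count events by position in the programme.
  const buckets = new Map<number, number>();
  for (const e of events) {
    const b = Math.floor(e.mediaPosition / bucketSeconds);
    buckets.set(b, (buckets.get(b) ?? 0) + 1);
  }
  // If most events land in one bucket, users are reacting to the content
  // at the same point: behaviour, not a network or server fault.
  const largest = Math.max(...buckets.values());
  return largest / events.length >= clusterThreshold;
}
```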

Getting meaningful user data means getting data back from possibly millions of devices in the field. If households have multiple playback devices, not only is there a lot of data to correlate, but getting it back to the headend can itself put a strain on the network, says Frostad. Operators, he says, have to “only focus on the data that could be significant”.

Frostad also says that it is difficult to work out a commercial model that makes sense around user behaviour monitoring both for the system vendor and for the operator.

He also says there is an issue around the readiness of operators to cope with massive amounts of user data. “Just knowing certain things is not good enough if it just raises questions rather than answers them. If you have a million users on multiscreen devices, then individual user behaviour means nothing – you need group data. It is still challenging,” he says.

Users respond differently to buffering and other quality parameters in different households, and it is very difficult to draw clear conclusions, suggests Frostad.

“There are so many devices in the delivery chain that monitoring everything can’t be done,” he says.

Frostad says that quality assurance in the multiscreen world still suffers from a chicken-and-egg dilemma: there is little will to prioritise quality for services that are still, for the most part, free. But paid-for models are unlikely to evolve without quality assurance in place.

For pure OTT providers such as Netflix, the story is different, as they rely on OTT multiscreen delivery as their principal business activity. Netflix has itself invested heavily in its own quality assurance solutions. But free-to-air broadcasters and others that may have limited resources are not yet in a position to make a significant investment in end-point monitoring technology, says Frostad.

Frostad also suggests there is still a lack of strategic direction within organisations when it comes to quality. While senior management are interested in one or two Quality of Experience metrics that can show relationships between, for example, fluctuations in quality and the length of time customers stay with the service or sign up for new contracts, approaches vary between organisations. More traditional cable operators, for example, often stick to the view that waiting for customers to call before sending out technicians is the best approach to quality. “The market is still quite young. Some operators have invested a lot and spend a lot of time in trying to eliminate problems and have had good success. In the early days IPTV operators found it difficult to achieve a high quality of service, but now they often have higher quality than cable operators,” he says.

Also somewhat sceptical about the value of user behaviour data is John Maguire, chief strategy officer at quality assurance specialist S3 Group. “We don’t think it is sufficient,” he says. However, says Maguire, the data is there and should not be ignored. “It is available data. You would be mad not to gather it because it gives good insight into customer behaviour and quality,” he says. One problem is that data gathered from end devices may provide an incomplete picture, he says, simply because it relies on monitoring from within the device rather than what the customer is actually seeing. The second issue is that service providers may not want to wait until customers are using the functionality they deploy before they test that functionality. “With an innovative app, you can’t afford for people to start Tweeting that it doesn’t work,” he says. Operators ultimately need to test delivery in the real world rather than in the lab, but not to rely on end users, he says.

Another problem Maguire identifies with end-user monitoring is that it requires “users to do interesting things” that fully exercise the functionality the operator wants to deploy.

More generally, says Maguire, operators are still held back to some extent by silos within organisations and by management issues as much as by technology. Using data and QoS monitoring effectively requires organisation and resources. Maguire cites the example of an OTT operator that tackles one identified problem at a time rather than trying to solve everything flagged by the software in one go. “They gather a book of results at the end of each week and send it out to collate data and build a picture of what to improve each week,” he says. “It is not very scientific. For example, how do you know you are tackling your biggest problem? If it is really big someone will shout about it.”

Mobile video

For IneoQuest’s VP of corporate strategy, Stuart Newton, much of the current activity in the quality assurance space is being driven by the growing use of mobile networks to view video. At the recent Mobile World Congress in Barcelona the company highlighted its partnership with deep packet inspection specialist Qosmos, integrating deep packet inspection with its probes to perform advanced analytics in the mobile core network. The company also demonstrated user behaviour analytics for iOS and Android devices.

“People are realising it is harder to do multiscreen video than they thought,” he says. “If you stream video to phones it goes across multiple CDNs to the phone network or roaming network and then to thousands of types of phones. If you compare that to TVs with a controlled edge network, it is a completely different world.”

In this environment, operators are typically looking for basic quality assurance before they take the plunge into the bottomless depths of user behaviour analysis. Similar conditions apply, he says, in the fixed world. “Delivering to multiscreens is just really complex,” he says.

Newton says that end user analytics ultimately has to be combined with network monitoring via probes. “If you look at the client side and correlate that with the network you can see you have a packaging issue at the headend, for example,” he says. The combination of technologies can give video providers “visibility and power” to manage their relationship with customers in real time.
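
Reduced to its simplest form, the correlation Newton describes is a decision over where errors cluster: failures seen across every CDN and ISP at once point upstream to the headend, while failures confined to a single CDN point to that CDN. A minimal sketch, with invented data shapes rather than IneoQuest’s actual logic:

```typescript
// Sketch: localise a fault by combining client-side error reports with
// knowledge of the delivery paths. Shapes and messages are illustrative.
interface ErrorSample {
  cdn: string; // CDN that served the failing session
  isp: string; // access network of the affected client
}

function localiseFault(errors: ErrorSample[], knownCdns: string[]): string {
  const affectedCdns = new Set(errors.map((e) => e.cdn));
  if (affectedCdns.size === knownCdns.length) {
    // Every CDN affected: the common element is upstream of the CDNs.
    return "check headend packaging/encoding";
  }
  if (affectedCdns.size === 1) {
    return `check CDN: ${[...affectedCdns][0]}`;
  }
  return "ambiguous: correlate further with network probe data";
}
```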

Newton says the addition of client monitoring to more traditional quality assurance techniques has delivered “a huge amount of data”, including behavioural data that can, managed properly, be used to provide tools for “proactive channel management”.

“We look at it from two aspects – the customer journey as a whole and adding KPIs on top of that, and the journey of a single asset,” he says.

The wider use of data to inform business decisions and create revenue-earning services is still at an early stage. “Our objective with this two years ago was real-time analysis for the converged video landscape. You have a lot of convergence going on at the moment in video delivery, but there are still lots of different silos and battles within organisations,” says Newton. Nevertheless, he says, operators have an ambition to go beyond creating “a holistic view for a baseline service assurance” to “demographic mapping for targeting”, for example of advertising. While user behaviour data can tell operators which ads were watched and when, the ultimate goal is to create a platform to enable real-time bidding for ad slots.
