Anyone responsible for data analytics or business intelligence in their organisation faces two challenges: dealing with an ever-growing volume of data and processing data more quickly.
The volume of data has grown exponentially over the past decade, with IDC estimating that 59ZB (zettabytes) of data was created in 2020 alone. But for decision-makers, the emphasis is often less on how much data is available – arguably, there is already too much – and more on how quickly that data can be turned into information they can act on.
For some, the goal is to act immediately – and even without human intervention. The need for faster decision-making, and the potential to tie data analysis into machine learning and autonomous systems, is prompting data analytics specialists and tools suppliers to move towards processing data in real time, or as close to real time as is practical. But so far, real-time analytics has largely been confined to specialist use cases where the results justify the – often substantial – investment.
According to a study carried out for data management and analytics provider InterSystems, just 11% of retail, consumer packaged goods and manufacturing firms have access to data that is less than an hour old. And although businesses have access to more data than ever before, the majority still rely on business analytics or intelligence systems that produce reports or, at best, an online dashboard.
Inevitably, this slows decision-making and reduces the options for automating business processes.
Defining real time
Although faster data analysis is always useful, there is no single industry definition of real-time analytics. Industry watcher Gartner, for example, distinguishes between “on-demand real-time analytics”, triggered by a user or system request, and “continuous real-time analytics”, which provides alerts or other intelligence as events happen.
Moreover, software providers use other terms, such as streaming analytics, event stream processing or real-time stream processing, to describe technologies that can process and provide insights on live data.
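To make these distinctions concrete, the short Python sketch below contrasts the two modes in miniature: an on-demand query that only runs when something asks for it, and a continuous monitor that evaluates every event as it arrives. The event values, the query and the alert threshold are purely illustrative.

# Illustrative contrast between on-demand and continuous real-time analytics.
# The event feed, query and threshold are hypothetical examples.

events = []  # running log of events, appended to as they happen

def on_demand_average() -> float:
    """On-demand: compute an answer only when a user or system requests it."""
    return sum(events) / len(events) if events else 0.0

def continuous_monitor(value: float, threshold: float = 100.0) -> None:
    """Continuous: evaluate each event as it arrives and alert immediately."""
    events.append(value)
    if value > threshold:
        print(f"alert: value {value} exceeded threshold {threshold}")

for value in (42.0, 87.0, 120.0):  # simulated live feed
    continuous_monitor(value)       # raises an alert on 120.0

print(on_demand_average())          # answers only when explicitly called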
And how fast a system should analyse data will depend, in turn, on how the business needs to respond and how quickly it can act.
Deploying technology to process data more quickly is of little value if the organisation’s response time does not change too. A fraud prevention system will be more effective if it can block suspect card transactions in seconds, or even under a second. Real-time analytics of sales in fashion will make little difference to the business if new stock has a six- to eight-week lead time while it is shipped from manufacturers.
“For me, real-time analytics is anything that is sub-second and requires an instantaneous response,” says Simon Doyle, an associate professor at University College London, who specialises in business analytics.
“From the point of observation or collection, the data has to go through some feedback cycle where the analysis occurs and a decision is made. The best common example of this would be in Formula 1. Telemetry from the vehicle is projected to the pit crew and either a human being makes a decision, or some form of algorithm or heuristic performs a semi-automated or fully automated change or decision.”
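As a rough illustration of the feedback cycle Doyle describes, the Python sketch below consumes telemetry readings as they arrive, applies a simple heuristic to each one and reports how long the decision took. The telemetry fields, threshold and advice are hypothetical, not real Formula 1 values.

# Sketch of an observe -> analyse -> decide loop over streaming telemetry.
# Field names and thresholds are invented for illustration.
from dataclasses import dataclass
import time

@dataclass
class Telemetry:
    timestamp: float      # when the reading was taken (seconds since epoch)
    tyre_temp_c: float    # hypothetical sensor value

TYRE_TEMP_LIMIT_C = 110.0  # hypothetical operating limit

def decide(reading: Telemetry) -> str:
    """Apply a simple heuristic and return an action for the pit crew."""
    if reading.tyre_temp_c > TYRE_TEMP_LIMIT_C:
        return "ALERT: advise driver to manage tyre temperature"
    return "OK: no action needed"

def feedback_loop(stream):
    """Act on each reading as soon as it arrives, tracking decision latency."""
    for reading in stream:
        action = decide(reading)
        latency_ms = (time.time() - reading.timestamp) * 1000
        print(f"{action} (decision latency: {latency_ms:.0f} ms)")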
As Doyle suggests, the usefulness of the real-time analysis largely depends on how quickly it results in action.
“Real time is an often misused moniker,” says Nigel Robinson, UK analytics lead at PA Consulting. “There are clearly applications for it in business and the public sector. We see it in law enforcement, or in retail through recommendation engines. That is the public perception of real time. But often clients see real time as reporting twice a day, or as twice-daily updates to a model.”
Nonetheless, the majority of business analysis suppliers are adding real-time capabilities to their tools, making them easier to roll out and use.
“Real time has been seen as the next big thing, but organisations have tried to implement these solutions [only to find] that the organisation, or its technology, has not been set up to deploy it,” cautions Robinson.
Barriers include poor or variable data quality, siloed data or business processes, and even a lack of clear understanding from senior management of what a real-time data project can achieve.
But there are examples where the technology is being deployed and works well.
Solving clear problems
As the term suggests, real-time analytics is most useful when a problem needs to be solved quickly. In practice, it comes into its own when it is deployed to solve a known problem with a clearly defined outcome or action. If that outcome can be automated, so much the better.
“Today, real-time analytics is moving away from dashboards, and is becoming more about how applications react to events,” says Yiftach Shoolman, co-founder and chief technology officer at Redis Labs. “A lot of this is being used for pricing in financial services, with the internet of things [IoT] for monitoring infrastructure and dynamic content management. It is also being used in AI itself, to monitor drift.”
Scanning online or card transactions for fraud is one use of real-time systems, and scanning biometric identity documents at a border is another. In both cases, the system only needs to allow or block the transaction, or open the airport gate, or alert a border officer.
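A minimal sketch of that allow-or-block pattern, assuming a simple rule-based risk score standing in for whatever model a real fraud system would use:

# Minimal allow/block/alert decision on a card transaction.
# The scoring rules and thresholds are placeholders for illustration;
# a production system would call a trained fraud model instead.

def risk_score(txn: dict) -> float:
    """Placeholder heuristic: large or out-of-country transactions look riskier."""
    score = 0.0
    if txn.get("amount", 0) > 5000:
        score += 0.6
    if txn.get("country") != txn.get("home_country"):
        score += 0.3
    return score

def handle(txn: dict) -> str:
    """Return a single action a downstream system can execute automatically."""
    score = risk_score(txn)
    if score >= 0.7:
        return "BLOCK"   # stop the payment immediately
    if score >= 0.4:
        return "ALERT"   # route to a human for review
    return "ALLOW"

print(handle({"amount": 7200, "country": "US", "home_country": "UK"}))  # BLOCK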
PA Consulting’s Robinson points to systems that can check for bus lane infringements, or congestion in airports, stations and other passenger hubs. The firm worked with Dutch rail operator NS to add real-time analysis of carriage congestion to its smartphone app. The “seat finder” function uses IoT data and Lidar technology, and currently covers trains operating between Arnhem, Nijmegen and Den Bosch.
Transport and logistics are popular applications for real-time analytics because the decisions involved are relatively simple, which lends them to automation. Even if a human needs to intervene – such as to check a passport manually, or open up another ticket desk – a simple alert will be effective.
In other situations, including retail and, again, transport, real-time analytics can be used to improve the customer experience. This can be by alerting supervisors before a small problem turns into a much larger one, or by helping customers to find resources online.
Using analytics in this way creates a “white glove experience”, says Maxie Schmidt-Subramanian, a principal analyst at Forrester.
One example is a power company using analytics on its website to detect which customers wanted to delay bill payments during the pandemic. Automated services directed customers to the relevant resources without the time-consuming, and potentially embarrassing, need to talk to a customer service adviser.
An airline might use real-time analytics to spot delayed customers and send a rep to help them through security to meet a connection, or even offer a drink in the lounge. This, says Schmidt, is a way of using real-time data to offer the “next best experience”.
“Even if someone at the gate says, ‘Ms Schmidt, I can see you’re having a tough time,’ it rescues the situation,” she says. “But you have to be able to engage in the moment.”
A conventional BI report or dashboard, though useful, does nothing for that individual customer on that day.
Data streams, streaming analytics
Using real-time data to improve the customer experience provides organisations with a quick return on investment: fewer helpdesk and customer service calls will save money.
But businesses can drive further savings by tying real-time analytics into their operational infrastructure, and by making use of sensors and connectivity.
Aircraft engine makers are using real-time data, coupled with analysis and machine learning, to predict when parts need servicing. If there is a fault, they can even send instructions back to the aircraft, to preserve the engine and cut the risk of it failing mid-flight. Clearly, these decisions need to be quick.
Other examples include supply chain management, monitoring financial transactions and cyber security. According to Gartner, in a paper for analytics company Snowplow, stream processing is being driven by “the need for continuous intelligence”. It works by detecting patterns in the data and, increasingly, by handing the results off to a machine learning system. But it is not limited to industrial or engineering applications.
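As a sketch of that pattern, assuming a cheap statistical check on a sliding window and a placeholder classifier standing in for the machine learning hand-off:

# Detect a pattern in a live stream, then hand the suspect window to a model.
# The anomaly rule and the classifier are stand-ins for illustration only.
from collections import deque
from statistics import mean

WINDOW_SIZE = 60  # keep the most recent 60 readings

def looks_anomalous(window) -> bool:
    """Cheap pattern check: latest reading far above the window average."""
    return window[-1] > 2 * mean(window)

def classify(window) -> str:
    """Placeholder for a trained ML model, e.g. one served behind an API."""
    return "incident" if max(window) > 100 else "noise"

def process(stream):
    window = deque(maxlen=WINDOW_SIZE)
    for value in stream:
        window.append(value)
        if len(window) == WINDOW_SIZE and looks_anomalous(window):
            verdict = classify(list(window))
            print(f"pattern detected, model verdict: {verdict}")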
“It is about what is happening right now,” says Geoff Clark, general manager for Europe, the Middle East and Africa at Aerospike, a real-time data platform. “PayPal is one of our customers and it processes 500 payment requests per second. There is no manual intervention. The core system is based on machine learning, and each request spawns about 200 lookups, against 100TB [terabytes] of data. There is a vast quantity of data being captured in real time.”
Clark expects to see businesses using real-time analytics to tackle increasingly complex problems as the technology develops and becomes more accessible, especially through the cloud.
However, businesses will only benefit from real-time analytics if they can act on the signals it provides. If business processes themselves are not agile, or managers prefer to rely on gut instinct, the investment will not pay its way.
This is why successful real-time analytics projects have clearly defined goals and parameters, clearly measurable outcomes and management support.
“Just because you can get information to someone sooner, doesn’t mean they will act on it,” says PA Consulting’s Robinson. “That is not a CIO problem, it’s one of broader culture.”
Unless that is addressed, it is a “battle of the truths”, he says. “You have to be led by the data.”