AI more likely to complement than replace predictive analytics


Source is ComputerWeekly.com

While tools for data analytics activities such as data extraction and exploration are mature and widely adopted, the situation for predictive and prescriptive analytics is quite another story.

Predictive capabilities make it possible to forecast future events based on past and present performance, while prescriptive, or instructive, analytics offerings examine data to enable organisations to answer questions such as “what should we do?”.
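As a toy illustration of the distinction (with invented figures, not drawn from the article), a predictive model forecasts what will happen, while a prescriptive layer turns that forecast into a recommended action:

```python
# Toy sketch: predictive vs prescriptive analytics (hypothetical data).

def predict_demand(history):
    """Predictive: forecast the next period as a three-period moving average."""
    return sum(history[-3:]) / 3

def prescribe_order(forecast, stock_on_hand, safety_margin=0.1):
    """Prescriptive: recommend how many units to order to meet the forecast."""
    target = forecast * (1 + safety_margin)  # forecast plus a safety buffer
    return max(0, round(target - stock_on_hand))

monthly_sales = [120, 135, 128, 140, 150, 146]
forecast = predict_demand(monthly_sales)             # "what will happen?"
order = prescribe_order(forecast, stock_on_hand=60)  # "what should we do?"
print(f"Forecast: {forecast:.0f} units, recommended order: {order} units")
```

Real prescriptive platforms apply far richer optimisation, but the division of labour is the same: one step produces the forecast, the next converts it into a decision.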

But as David Semach, partner and head of artificial intelligence (AI) and automation for Infosys Consulting in Europe, the Middle East and Africa, points out: “The adoption of predictive analytics is still relatively low and the technology is maturing, while the take-up of prescriptive tools, which is the next stage on, is almost non-existent.”

Semach believes this situation results from three key factors. Firstly, he says, in a predictive analytics context, there is “no one silver bullet” tools-wise. Instead, implementing such technology is a complex and expensive undertaking, requiring a “multi-tool solution”, large amounts of data and a solid business case.

Secondly, it takes time and effort not only to build the predictive models themselves, but also to aggregate the necessary external and internal data from across the business to feed into the system. Thirdly, there is often resistance from business leaders who are used to undertaking forecasting themselves and do not necessarily trust the findings of machines.

“A survey we did in early 2020 found that 91% of business decisions are made without supporting data and are based instead on human experience and gut feeling,” says Semach. “Nonetheless, data quality is becoming less of an argument now as the digitisation brought about by Covid means people are getting their data into a better state and are moving it to the cloud.”

Understanding the predictive analytics market

Early adopters include industries such as fast-moving consumer goods and retail, life sciences and pharmaceuticals, and energy, oil and gas, with uptake having increased considerably over the past 18 months.

Business functions that are particularly keen on the technology include finance, sales, HR to a certain extent and supply chain in a demand forecasting context. Popular use cases, meanwhile, range from forecasting late payments to predicting customer purchasing activity and prioritising the most likely sales prospects.

Mike Gualtieri, a vice-president and principal analyst at Forrester Research, says: “A major corporation may be working with six to 12 use cases today, but there could be hundreds, so we’re just at the beginning of this. Sometimes organisations do their own custom work, but most software vendors are starting to put predictive models in their applications, so as you upgrade, there’ll probably be a few use cases in there, too.”

Interestingly, though, it appears that in many instances, the statistics-based predictive analytics tools of the past are now being subsumed into broader machine learning-based platforms.

“We’ll probably drop ‘predictive analytics’ as a term next year as you’re not going to find any of the 50 or so vendors that market products using it – they’re increasingly calling it a ‘data science’ or ‘machine learning’ platform,” says Gualtieri. “The reality is that these platforms often have lots of older statistical methods in them as well as machine learning, but machine learning is seen as the hot, leading-edge technology.”

In the case of those suppliers that sell traditional extraction and exploration products and include predictive capabilities as part of their portfolio, Semach believes that unless they “drastically evolve” and adopt a machine learning approach sooner rather than later, they will “die eventually”.

The same is not true of the overall market for predictive capabilities, however, which Semach is confident will continue its steady growth. “It won’t happen tomorrow – it’ll be more like three to five years – but it will happen,” he says.

In the meantime, here are two organisations that are ahead of the curve and already deploying such technology to great effect.

Case study: University Hospitals of Morecambe Bay NHS Foundation Trust

“If you take the view that there needs to be transformative change in healthcare to optimise services, make them more affordable and deliver better outcomes, then embedding AI and predictive analytics is the way to do it,” says Rob O’Neill, head of analytics at the University Hospitals of Morecambe Bay NHS Foundation Trust.

O’Neill first implemented the organisation’s analytics and data science strategy about three years ago after being selected to become a member of the NHS Digital Academy’s first cohort, which was set up to train digital leaders.

“The big driver was that healthcare isn’t sustainable in the way it’s delivered now,” he says. “There are significant population-wide health challenges, delivering services is expensive and there’s a gap in funding, so we’re always financially challenged.”

Another key issue is that the specific population the Trust serves is dispersed over a wide geographical area with varying levels of deprivation. Because of the scale of the challenge involved, the organisation began undertaking a significant change programme to “redesign how care is delivered”.

The aim is to replace the traditional hospital-with-GP-led-care approach with a more holistic “system” to ensure services are provided in the safest and most cost-effective way possible while enhancing the patient experience at the same time. Predictive analytics and data science are considered vital tools in enabling this shift.

“I’d argue that the only way to solve these challenges is by using data, and data science, machine learning and other predictive techniques in particular, to shake up how we model and understand what’s happening,” says O’Neill. “It’s not about predicting the future, but changing it through well-thought-through and clinically bought-into strategies – that’s the ambition.”

To this end, the Trust has implemented Snowflake’s cloud-based data warehouse, DataRobot’s tools to create predictive analytics models and Qlik’s data analytics tools to support clinicians’ decision-making in a range of care settings.

The power of prediction

For instance, the system is currently used to predict accurately the number of patients attending A&E on any given day and to assess how acute their requirements are likely to be, to ensure enough resources are available. It is also employed in unscheduled demand forecasting, predicting the likely length of a patient’s stay in hospital and their risk of readmission following discharge.
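A minimal sketch of the kind of day-of-week demand model such a system rests on (illustrative data and method only, not the Trust’s actual DataRobot models, which will be far more sophisticated):

```python
# Sketch: forecasting daily A&E attendances from a day-of-week pattern.
# All figures are invented for illustration.
from collections import defaultdict
from statistics import mean

# (weekday, attendances) pairs from a hypothetical history; 0 = Monday
history = [
    (0, 210), (1, 195), (2, 188), (3, 190), (4, 205), (5, 240), (6, 230),
    (0, 220), (1, 190), (2, 185), (3, 198), (4, 210), (5, 250), (6, 235),
]

def forecast_attendance(history, weekday):
    """Predict attendances for a weekday as the mean of past observations."""
    by_day = defaultdict(list)
    for day, count in history:
        by_day[day].append(count)
    return mean(by_day[weekday])

saturday = forecast_attendance(history, weekday=5)
print(f"Expected Saturday attendances: {saturday:.0f}")  # (240 + 250) / 2 = 245
```

A forecast like this is what lets a hospital align staffing and bed capacity with expected demand rather than reacting after the fact.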

A third predictive model uses volatile pandemic-related data to forecast patients’ personal risk levels when waiting for elective operations that have had to be deferred because of the pandemic. A fourth, which is currently going through clinical review, is focusing on identifying hypertensive patients before they are even admitted to hospital.

“There’s an ageing population in the West as people are living longer with an increasingly complex range of chronic conditions,” says O’Neill. “So if we can intervene earlier, great – and if we can understand the patterns of demand associated with that, we can also better align our resources internally.”

As part of the process of making the Trust a more data-driven organisation, it has now embedded members of its analytics team in each of the transformational work streams and into its clinical business units.

“Having clinical engagement and being clinically led is very important to ensure the technology is accepted and embedded,” says O’Neill. “But it’s such a busy, complex environment that clinicians work in, and they’re so focused on delivering safe and appropriate patient care that any solution which can help in supporting decision-making tends to be really welcomed.”

While he acknowledges that the Trust is one of only a “fairly small handful” in the UK that have progressed this far with predictive analytics, he believes this situation is likely to change quite rapidly with the work of NHSX, a UK government unit responsible for setting national policy and developing best practice for NHS technology, digital tech and data.

“If we’re looking at a national and global strategy for health, it really needs AI and predictive analytics to deliver,” says O’Neill. “Different organisations are currently at different points on the maturity curve, but it is the future.”

Case study: National Express

During the UK’s various Covid-related lockdowns, predictive analytics models were key in enabling National Express West Midlands (NEWM) to ensure that the supply of vehicles on its bus network aligned with socially distanced customer demand.

The company had started its predictive analytics journey pre-pandemic when it implemented CitySwift’s cloud-based platform, which was developed specifically to forecast journey times and passenger demand for urban bus networks. The use of big data and machine learning techniques made it possible to optimise timetables by taking traffic levels and other external events into account.

The decision to adopt this approach was taken when it had become clear that ageing demand analytics software had “maxed out” in terms of boosting operational efficiency and optimising profitability, says Andy Foster, deputy commercial director at National Express.

“The big challenge we face is congestion, which is a double whammy for us – if services are slowed down by congestion, we need to put more buses on to offer the same service, but each bus costs £100,000-£150,000 to operate, so it’s not a decision you take lightly,” he says.

A key conundrum here is that if services are slow, they become less attractive to passengers, who therefore start using them less. This situation “pushes costs up and revenues down, so you end up being caught in a horrible pincer movement”, says Foster.

Another problem for NEWM was that it had no way of analysing the impact of changing schedules, adding extra buses or altering the frequency of a service.

“So we could go to the bosses and say ‘route 97 has a problem’, but we couldn’t quantify how successful any particular action would be or not,” says Foster. “We could only say it may improve it, but we couldn’t quantify the benefits.”

As a result, in November 2019, route 97, which provides services to 80,000 passengers a week, became the first to benefit from SwiftSchedule scheduling technology. The software was used to analyse the time it took for buses to get from one stop to another, the number of stops made and how many passengers got on and off. External historical data, such as weather and traffic patterns, was also added to better understand real-world impacts.

This information, in the shape of about 750,000 data points each day, was then used to make changes to bus times, which increased their punctuality from an average of 89% to 92.5% and saved 2.4% in operational costs.
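The punctuality figure above can be computed straightforwardly once scheduled and actual stop times are paired up. A sketch, assuming the common UK definition of “on time” as no more than one minute early and no more than five minutes late (the exact tolerance CitySwift uses is an assumption here):

```python
# Sketch: measuring bus punctuality from scheduled vs actual departures.
# Times are in minutes past midnight; figures are hypothetical.
def punctuality(departures, early_limit=1, late_limit=5):
    """Return the percentage of departures that were on time."""
    on_time = sum(1 for scheduled, actual in departures
                  if -early_limit <= actual - scheduled <= late_limit)
    return 100 * on_time / len(departures)

# Hypothetical (scheduled, actual) departure times for one route
observed = [(600, 601), (615, 622), (630, 629.5), (645, 652), (700, 703)]
print(f"Punctuality: {punctuality(observed):.1f}%")  # 3 of 5 on time -> 60.0%
```

Aggregated over hundreds of thousands of daily data points, the same calculation is what turns raw stop-level timings into the route-level averages the schedulers act on.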

Tackling the pandemic

But when the pandemic struck in March 2020, the impact was profound. Overnight, congestion issues ceased to be a problem, while continually changing lockdown-related demand and social distancing took over.

To address these challenges, NEWM started using CitySwift’s SwiftMetric software to better understand the situation’s impact on bus reliability, efficiency and demand, and so inform its decision-making.

“We could see how demand and the speed of operations were changing and we could do so very quickly as we were getting fully analysed data in less than 24 hours,” says Foster. “By Tuesday afternoon, we could analyse what had happened on Monday – this enabled us to get buses into the right place and ensure there was enough capacity to take essential workers to work without them becoming overcrowded.”

The company also created a website page for anxious travellers to check how busy any given journey was likely to be, so they could adjust their plans accordingly.

Now that lockdown is over and congestion has returned, the use of the technology has returned to route optimisation, identifying savings and reinvesting them in improved bus frequency. On route 16, for example, time-keeping has been improved by 4%, which has led to a 2% increase in passenger usage and a 4.6% rise in the number of journeys undertaken.

The technology will also be rolled out in September to cover 40% of NEWM’s operations. A further aim is to share congestion information with the local highway authority in the hope that it will reallocate road space to reduce the problem.

Data will also be provided to Transport for West Midlands, which is responsible for coordinating services in the region, to make the case for increased funding under the government’s “Bus Back Better” national bus strategy.

“We were always fairly good at data use, but predictive analytics is taking it to a higher level and enabling us to make better decisions on where and how we move resources around,” says Foster. “It’s a powerful tool.”
