Oracle’s Steve Miranda: Customers nearing inflection point with Fusion


Source: ComputerWeekly.com

Steve Miranda, executive vice-president of Oracle Applications product development, spoke to Computer Weekly at Oracle Cloud World 2023 in Las Vegas about what he sees as an inflection point among the firm’s customers – a tipping point from project-based to business-driven conversations with the supplier. He also spoke about how generative AI is a big deal, and how it will play out for Oracle and its customers.

What follows is an edited and compressed version of that interview, which has become a regular catch-up about Oracle’s strategy.

At the very end of your keynote, you were talking about an “inflection point” in how customers are using your Fusion cloud applications suite. I just wonder what you meant by that. And why now?

“Now” is harder to answer. If I look back – and I think that’s one of my frustrations with these events, where people focus on “what are the three new features?” – we announced that we were going to build a brand-new SaaS [software-as-a-service] application. Well, we built that, and you start to get early adopters, and it continues.

Now, I probably do three dozen-ish customer sponsorships, where I’m talking to a customer weekly, bi-weekly, sometimes monthly, or sometimes quarterly on their implementation of the cloud applications. And overwhelmingly, in the last three to six months, I’ve seen a shift. The conversations are either that the company is complete with their roll-out, including many phases or expansions, or that the roll-out isn’t their number one worry – they’ve rolled out enough of the business that they have other things to think about.

So, the conversation has shifted towards: “We’re on the SaaS platform, and we’re getting these quarterly updates. We have received some business benefits. But what’s next? I’m asking you to take my DSO [days sales outstanding] from 45 days to 42 days. And how can you help reduce my financial close from the 18th to the 16th?”

It’s becoming a much more business-driven conversation, much less project-oriented. And so, I think the inflection point is one where the tone has changed.

We had a discussion in 2019 about why Oracle didn’t use fancy names for its machine learning (ML) and artificial intelligence (AI), like Leonardo, Einstein, Coleman or Watson. About how the ML was more baked in, under the covers. And so, I wouldn’t have been shocked if Larry Ellison said in his keynote this year, regarding generative AI: “We’ve been doing AI for years, we don’t make a big deal of it.” But he said the opposite of that. He said, in effect, this is revolutionary, this is a completely new paradigm, it’s like Sputnik in the late 1950s.

Well, I think what he said was: “We’ve been doing AI for a long time, but generative AI is very different.” But I would also say that AI is going to be fundamental in what we do, and then ground-changing of what we do.

I got a lot of questions, now that we’ve announced these AI features [in the Fusion applications suite]: “Are you going to charge me for it?” Let’s play that out a little bit. Let’s suppose we charge for the AI. Are we saying to prospects, “Would you like AI financials? Or non-AI, human intelligence financials?” Of course, you have AI financials, but it’s not an extra thing you pay for.

But given generative AI’s well-known flaws – its hallucinations, its certainty that it is right when it isn’t – what safeguards is Oracle putting in place to ensure that these 50-plus generative AI tools don’t mislead users or otherwise cause serious errors? How are you going to protect your users?

Across our 50 use cases, there are some key guiding principles. Number one, we never use customer data to train the LLM [large language model]. The reason for that is we have certain service-level agreements with our customers to protect their data, and we believe it’s their data. Until and unless we have the capability of allowing customers to opt in, should they choose, and so provide that functionality, we don’t do that. That’s part one. Second, we never pass PII [personally identifiable information] data to the LLM, because we don’t want PII data exposed. Third, all the AI output that’s generated is human reviewable today.

“[The use of cloud applications is] becoming a much more business-driven conversation, much less project-oriented. I think the inflection point is one where the tone has changed”

Steve Miranda, Oracle

We do not take an AI-generated sales proposal and automatically send it to your client. Because you’re correct – there’s still some hallucination, and some things you may not want to send to a client. It is human reviewable. We generate the AI output, present it to a human who can read it and inspect it, and then they press the send button.

We are very confident that over time – it might be three months, it might be three years, [although] I think it’s probably closer to three months – the LLM will get better and better. And [once that happens we] will start modifying those policies. If you told me six months from now that a substantial part of the hallucination problem would be solved, that wouldn’t surprise me.
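To make the safeguard pattern Miranda describes more concrete, here is a minimal, hypothetical Python sketch – not Oracle’s code, and the redaction rules, function names and parameters are invented for illustration – showing PII being stripped before a prompt reaches an LLM, and the generated draft being gated behind an explicit human approval step before anything is sent to a client.

```python
import re

# Hypothetical sketch of the safeguards described above (not Oracle's code):
# redact PII before any text reaches the LLM, and gate the generated draft
# behind an explicit human review step before it can be sent to a client.

PII_PATTERNS = {
    "email": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "phone": re.compile(r"\+?\d[\d\s().-]{7,}\d"),
}

def redact_pii(text: str) -> str:
    """Replace recognisable PII with placeholder tokens before the prompt leaves the application."""
    for label, pattern in PII_PATTERNS.items():
        text = pattern.sub(f"[{label.upper()}_REDACTED]", text)
    return text

def draft_sales_proposal(prompt: str, llm_call) -> str:
    """`llm_call` stands in for whichever LLM client is in use; only redacted text is passed to it."""
    return llm_call(redact_pii(prompt))

def send_if_approved(draft: str, human_approves, send) -> bool:
    """Nothing goes out automatically: a human reviews the draft and presses send."""
    if human_approves(draft):
        send(draft)
        return True
    return False
```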

In terms of enterprise use cases for generative AI, have you got a scale in your mind of what it is really useful for and what it is less useful for, in terms of the Fusion applications suite?

I think where you’re generating text, it’s very useful. Humans, sadly, don’t write very well. They sometimes don’t like to write, and LLMs like ChatGPT do a very good job of that. I would also say summarising. The LLMs do a very good job of summarising data today, whether it’s a PowerPoint deck, a book or a movie. [If] you apply that to businesses and business reporting, there are a lot of things you could do – a well-written management summary would be great functionality for the enterprise.

Steve Miranda delivers his keynote at Oracle Cloud World 2023

Or think of any screen that’s a table, or anything you would download to a spreadsheet within our applications. The generative AIs do a great job of summing and adding that up. Take a public LLM, like ChatGPT, then take a spreadsheet with countries, a job code and some salaries, and you can ask: “What is the average salary for a product manager in France?” So, all of a sudden, for what we used to call ad hoc reporting, or building a report, you just need a text bar.

But you could also ask: “What’s the average salary of a product manager in Europe?” And that column doesn’t have to have “Europe” in it – it just has countries – because the LLM knows what Europe is.
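As a rough illustration of the kind of aggregation such a text-bar question resolves to, here is a minimal Python sketch with made-up data. In the scenario Miranda describes, an LLM would translate the question and supply the “which countries are in Europe” knowledge; here a hard-coded set stands in for it.

```python
import pandas as pd

# Made-up data standing in for a spreadsheet exported from an application screen.
df = pd.DataFrame({
    "country": ["France", "Germany", "France", "US"],
    "job_code": ["Product Manager", "Product Manager", "Engineer", "Product Manager"],
    "salary": [82000, 90000, 75000, 130000],
})

# "What is the average salary for a product manager in France?"
france_avg = df[(df["country"] == "France") & (df["job_code"] == "Product Manager")]["salary"].mean()

# "What's the average salary of a product manager in Europe?" The column only
# holds countries; the LLM supplies the knowledge of which ones are European.
# A hard-coded set stands in for that knowledge here.
EUROPE = {"France", "Germany", "Spain", "Italy", "Netherlands"}
europe_avg = df[df["country"].isin(EUROPE) & (df["job_code"] == "Product Manager")]["salary"].mean()

print(f"France: {france_avg:.0f}, Europe: {europe_avg:.0f}")
```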

Talking to various executives over the year so far, coming to events like this, what I’ve heard quite a few times is: “It’s the children.” The children of C-level executives have been coming home from school saying, “You really need to check this out.” So projects they’d put in a “let’s get to that in five years’ time” box, they are getting to now, straight away, with a sense of urgency.

That is the value of naiveté, right? Sometimes it’s better when you don’t know any better. I would definitely agree with that.

I would talk to our head of AI, really pressing her: “I need use cases.” And it was really difficult. But then you get this simple, charismatic, popular innovation like the LLM – I think those are the three words you’d use to describe it. It is very simple – anybody could use it. It is very popular. And it’s very charismatic – you can interact with it at whatever level you want; even a child could interact with it. It wasn’t a mathematical AI. It wasn’t machine learning derived from the AI we have for IoT [internet of things]. It wasn’t the AI we have for auditing transactions.

I think that’s what made it explode. No offence to the technology, but the fact that it was charismatic, simple and popular really made it take off.

