Podcast: Storage at the edge – impact and opportunities


We talk to Tobias Flitsch, head of product at Nebulon, about the rise of the edge as a site for compute and data services and the impact this will have on data storage.

In this podcast, we look at how the rise of edge processing is affecting topologies from datacentres out to remote locations, the constraints the edge imposes and the growth of data services in these locations.

Flitsch talks about how topologies are evolving to get around the challenges of latency and bandwidth, and how that means storage must be resilient, secure and centrally manageable.

Adshead: What are the implications for storage of the rise of edge networks?

Flitsch: What’s happening right now is that a lot of organisations are re-architecting their IT infrastructure topology, because they are either in the middle of their digital transformation journey or already through most of it.

IT has always been about data and information processing, and cloud was and still is a key enabler for digital transformation, because services can be instantiated and scaled quickly throughout the transformation journey.

So, many organisations, as part of their digital transformation, have leveraged public cloud services and spun up new services there. Now that businesses are becoming more digital, more data-driven, more data-centric, and understand how best to use their digital assets, their reasons and requirements for more data access and data processing change or become more refined.

So, where and how they process data and for what purpose are now key decision criteria for them, specifically for IT architecture and the topology. It’s not just cloud or the datacentre any more. Now edge plays a key role.

I understand edge can be a tricky word because you can get a different definition depending on who you ask.

Edge to me means putting servers, storage and other devices outside of the core datacentre or public cloud and closer to the data source and to the users of the data, which could be people or machines. And how close? That depends on the needs of the specific application.

We’re seeing an increase in the number of data producers, but also demand for faster and continuous access to data, and so there is a need to provide more capacity and data services locally at edge sites.

There are a couple of reasons for that. Low-latency applications that you often find in industrial settings cannot tolerate the round-trip latency between an edge site and a core datacentre or cloud when accessing a database, for example.
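
To put rough numbers on that argument, here is a minimal sketch in Python; the control-loop budget and the latency figures are illustrative assumptions, not measurements from the discussion:

    # Illustrative latency-budget check. All numbers are assumptions chosen
    # to show the shape of the argument, not measured values.

    CYCLE_BUDGET_MS = 10.0      # hypothetical 100 Hz industrial control loop
    LOCAL_READ_MS = 0.5         # assumed flash/NVMe access at the edge site
    WAN_ROUND_TRIP_MS = 40.0    # assumed edge-to-core-datacentre round trip

    def fits_budget(access_ms: float, budget_ms: float = CYCLE_BUDGET_MS) -> bool:
        """Return True if one storage access fits inside the loop's time budget."""
        return access_ms <= budget_ms

    print("local read fits:", fits_budget(LOCAL_READ_MS))        # True
    print("remote read fits:", fits_budget(WAN_ROUND_TRIP_MS))   # False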

So, local data is required to support latency-sensitive applications. There are also remote office and branch office applications that don’t have the luxury of a high-bandwidth, low-latency network link to a corporate datacentre, but whose users still need to collaborate and exchange large amounts of data. Content distribution and collaboration networks therefore depend on local storage and caching to minimise bandwidth utilisation and so optimise costs.
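
As a concrete illustration of that caching pattern, here is a minimal read-through cache sketch in Python. The fetch callable and the LRU eviction policy are assumptions for illustration; they stand in for whatever remote call and eviction policy a real deployment would use:

    from collections import OrderedDict

    class EdgeCache:
        """Serve repeated reads locally; only cache misses cross the WAN."""

        def __init__(self, fetch, capacity: int = 1024):
            self.fetch = fetch            # callable that retrieves an object remotely
            self.capacity = capacity
            self._lru = OrderedDict()     # key -> value, oldest entries first

        def get(self, key):
            if key in self._lru:
                self._lru.move_to_end(key)     # hit: refresh recency, no WAN traffic
                return self._lru[key]
            value = self.fetch(key)            # miss: exactly one WAN round trip
            self._lru[key] = value
            if len(self._lru) > self.capacity:
                self._lru.popitem(last=False)  # evict the least recently used entry
            return value

    # Example: after the first read, repeated reads of the same object stay local.
    cache = EdgeCache(fetch=lambda key: f"contents of {key}")
    cache.get("design-spec.pdf")   # fetched over the WAN
    cache.get("design-spec.pdf")   # served from local storage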

Lastly, there is the driver of unreliable networks. We’re seeing significant growth in data analytics, but not all data sources and locations have a reliable, high-bandwidth network to ensure a continuous flow of data to the analytics service, which often runs in the cloud.

So, local caching and data optimisation – at the extreme, running the analytics directly at the edge site – require reliable, dense and versatile storage to support those needs. What this means for storage is that there is increasing demand for dense, highly available and low-maintenance storage systems at the edge.
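
A common way to bridge an unreliable link is store-and-forward: buffer records on durable local storage and drain them when connectivity returns. Here is a minimal Python sketch, where send_to_cloud and link_is_up are hypothetical stand-ins for a real uplink client and health check:

    import json
    import os

    SPOOL = "edge-analytics.queue"   # assumed local spool file on edge storage

    def enqueue(record: dict) -> None:
        """Append a record to the local spool so it survives outages and restarts."""
        with open(SPOOL, "a") as f:
            f.write(json.dumps(record) + "\n")

    def drain(send_to_cloud, link_is_up) -> None:
        """Forward spooled records once connectivity returns, then clear the spool."""
        if not link_is_up() or not os.path.exists(SPOOL):
            return
        with open(SPOOL) as f:
            for line in f:
                send_to_cloud(json.loads(line))
        os.remove(SPOOL)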

Adshead: What are the challenges and opportunities for storage with the rise of edge computing?

Flitsch: If you look at storage specifically from an edge perspective, it certainly needs to adjust to the demands of the specific application at the edge. In the past, we’ve always deployed storage systems in central datacentres with plenty of rack and floor space, power and cooling, access to auxiliary infrastructure services, management tools, skilled service personnel and, of course, strong security measures.

Most of this is not available at the typical edge site, which means storage solutions need to adjust and work around those restrictions, and that’s a real challenge.

Take the issue of security as an example. I recently spoke with a manager in the transportation business who is responsible for their organisation’s 140 edge sites, set up in a hub-and-spoke topology around redundant core datacentres.

They cannot rely on skilled personnel at these edge sites and it’s not easy to secure these facilities, so key infrastructure might easily be tampered with and it would be really hard to tell.

Because these edge sites are connected to the core datacentre, this puts their entire infrastructure at risk, let alone the problem of data exfiltration or perpetrators stealing storage devices, for example.

I think this is the main challenge right now: securing infrastructure and data at the edge, especially with the rise of ransomware attacks and other cyber security-related threats.

But I believe that a solution offering reliable data protection and speedy recovery can address this problem.

I also believe that modern infrastructure and storage can address the other challenges I mentioned if it is centrally and remotely manageable, dense and highly redundant, affordable, and if it features the right data services.

Finally, I believe the need for local storage at the edge will continue to grow and become more and more important for customers, and I think the benefits of having data accessible at low latency and with resiliency far outweigh those challenges for storage.
