Where now for storage? Dell EMC, NetApp and HPE


Source is ComputerWeekly.com

In this first article of two, we look at Dell, HPE and NetApp. All three are making a big play towards the cloud – with Dell EMC and HPE’s consumption models very prominent – while NetApp, the only pure-play storage supplier of the three, is noticeably vocal about containers.

IDC forecasts that more than 50% of core datacentre infrastructure and 75% of edge infrastructure will be sold as a service by 2024.

That trend is driven by the cloud and its as-a-service model, and it potentially hits storage suppliers hard, given their historic dependence on the sale of hardware products.

So, we have seen the big players in storage adapt to the new world by offering consumption models of purchase – via the cloud, on-premises, and via hybrid modes that straddle the two.

But that’s not the only trend. We also see a shift towards the edge and towards analytics-based IT activities, sometimes combined. There is also the rise of containers as a rapidly scalable method of application deployment.

And of course the storage array is not dead yet. But it is – for primary storage use cases at least – almost always flash-based and nearly always available with NVMe for very high performance. Elsewhere, even in secondary use cases, flash is making inroads, in particular via the latest, bulk storage-focussed flash generation, QLC NAND.

The biggest storage players manifest these trends, and further characteristics beyond, according to their history, size and reach in IT and beyond. Here we look at Dell, HPE and NetApp.

Dell EMC

Dell’s big push is towards making everything in IT infrastructure available as a service.

That’s not to say there have been no storage hardware developments. There have. But if you had to characterise Dell Technologies’ main thrust, it’s summed up by Project Apex.

Project Apex was launched last summer at the virtual version of its annual shindig. It offers customers an Opex model of consumption for Dell Power-branded products via local datacentre, edge and cloud.

Project Apex services are coming online this year, starting with storage-as-a-service and Dell EMC storage. Further Project Apex rollouts will include hyper-converged infrastructure, Dell PowerEdge servers and PowerOne networking, then eventually workstations and laptops.

Having said all that, last year Dell EMC did launch its new PowerStore midrange array, PowerScale NAS, PowerFlex software-defined storage, and rugged versions of its VxRail hyper-converged infrastructure and PowerEdge XE2420 server. Later the XE7100 storage server came along, targeted at real-time analytics in the hybrid cloud.

Via Dell’s web portal – Cloud Console – customers can order IT resources for delivery on premises as a service. Customers specify type of storage, capacity, performance, SLAs and pricing requirements in the Cloud Console.

Dell EMC also still has the SC and PS series storage arrays – formerly Compellent and EqualLogic – on its books.

NetApp

Like others, NetApp has struggled with customers migrating from on-premises storage to the cloud. NetApp’s strategy therefore centres on offering its storage software as cloud-native services. These subscription services include NetApp Cloud Volumes on AWS and Google Cloud Platform, and Azure NetApp Files.

NetApp also sells Cloud Volumes OnTap through AWS, which is an Amazon Machine Image (AMI) that uses Amazon Elastic Block Store (EBS) to serve as the equivalent of an on-premises OnTap storage node.

Meanwhile, NetApp launched Project Astra last April. This centres on a containerised version of OnTap and is a data management service that manages, protects, and moves Kubernetes containerised workloads in public cloud and on-prem.

NetApp made other moves into containerisation in 2020. NetApp Spot Storage and Spot Ocean abstract the compute and cloud storage needed to run a Kubernetes farm, while Spot’s continuous optimisation platform combines analytics and automation, brokering cloud pricing to help organisations control costs.

Earlier container-focussed work included NetApp’s Trident open-source driver for provisioning container storage. Last year, NetApp also bought Talon Storage, which brought global file caching and data sync capabilities, and CloudJumper, which provides delivery of virtual desktops for customers.

NetApp introduced Keystone as the consumption model for its hardware storage products in 2019. Customers commit to a minimum storage capacity and timeframe and select from three performance levels and service offerings, such as file, block or object. NetApp installs and supports the equipment.

NetApp’s FAS array line-up expanded in 2020 with the FAS500f high-capacity model outfitted with quad-level cell (QLC) NAND solid-state drives (SSDs). Compared to previous NAND generations, QLC flash has a limited endurance and performance profile, with the trade-off being a lower cost per gigabyte. 

HPE

HPE – with its GreenLake consumption model – has said it wants everything to be available in the cloud by 2022.

It also places importance on activities at the edge, as well as in containers and analytics.

In his keynote at the company’s 2020 Discover event, CEO Antonio Neri said new HPE services would address customers’ needs to adapt edge and on-prem workloads to work with the cloud.

New elements in the portfolio include the Ezmeral Container Platform and Ezmeral ML Ops which will be delivered as cloud services through GreenLake.

At the event HPE also unveiled HPE Cloud Volumes Backup, which converts proprietary backup data sets to a common data format. That allows multiple data sets drawn from backup to be available in one format to secondary workloads – including analytics – running from the public cloud.

Prior to the 2020 event HPE had also launched the Primera storage platform, which will possibly replace its 3PAR range.

Primera is an all-flash enterprise storage array sold as Tier 0 enterprise storage. Primera uses HPE InfoSight to provide an intelligent storage platform that incorporates AI and machine learning to predict and prevent storage disruptions.

HPE Primera uses custom chips to enable massively parallel transport of data across dedicated PCI Express lanes. It is equipped to support NVMe flash and persistent memory, used to train massive AI data sets.

Elsewhere HPE made its InfoSight predictive analytics resource management capabilities available on its HPE SimpliVity hyper-converged infrastructure platform.

On the container front HPE has its Container Platform, which combines the supplier’s BlueData and MapR acquisitions with an open source Kubernetes layer. BlueData provides persistent data stores that can support stateful legacy applications. MapR is a distributed file system.

