Benefits of edge computing over large data centers

Source is ComputerWeekly.com

Edge computing is growing with the increasing demand for real-time processing and reduced latency in today's digital landscape. The rise of distributed IT systems, cloud computing and virtual networks has fundamentally changed the role of today's data center.

For successful edge adoption, IT leaders must consider the requirements for data collection, process visibility, security frameworks and the physical design of edge deployments.

Evolution of edge computing

Traditional data centers centralize every aspect of data processing and storage. Today, however, organizations also rely on a range of geographically dispersed technologies, such as IoT, 5G, robotics, machine learning (ML) and AI. The edge supports this data-intensive processing by placing storage, management and analysis near the remote device or end user.

Reliance on this distributed computing approach continues to expand. According to Gartner, 75% of enterprise-generated data will be created and processed outside a traditional data center by 2025. The edge functions primarily as one part of a hybrid strategy for distributed processing, typically connected to a larger data center. However, an increasing number of organizations are deploying modular micro centers as replacements for traditional data centers.

Organizations are also choosing edge deployments because traditional data centers have high maintenance costs, under-utilized resources, poor energy efficiency and limited scalability. Environmental impacts increase in proportion to the physical data center footprint.

Scalability and flexibility with containers

In terms of physical design, edge data centers come in several form factors. Both modular and scalable, these micro centers are uniformly small and self-contained for localized data processing and storage. Appliances vary in size, ranging from 10-foot square base units to larger systems. These larger systems are often assembled from prefabricated components, along with power and cooling, to meet the custom needs of workloads.

Organizations can connect multiple small units to right-size their deployments and achieve processing goals. Easily deployed modular edge units improve data transfer rates. Organizations can expand their edge initiatives as IT resource needs change and remote devices and IoT sensors grow in number.

The use of containers offers another means for standardizing application and device deployments, ensuring consistency at the edge. Containerization combines an application and its dependencies, like libraries, binaries and configuration files, into a compact system that operates across multiple environments. Containers ensure portability, efficient application processing and minimal system resource overhead — all critical for edge functionality.

Manage data, gain visibility and ensure security at the edge

Local processing at the edge is designed to be near-instantaneous to support diverse IT operations, from monitoring remote asset and equipment performance in heavy industry and manufacturing, to fraud detection in finance, supply chain management in retail and autonomous vehicles. Organizations can achieve considerable cost savings and lower latencies using real-time data aggregation and analytics at the edge.

Data management

Only a small fraction of data must be sent to a cloud service or primary data center for in-depth analysis. The goal there is to uncover long-term trends, large-scale patterns or operational indicators, tasks that often require extensive computational resources.
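This split can be sketched in a few lines of Python. The example below is illustrative only, with invented names (`summarize`, `anomaly_threshold`): an edge node reduces a window of raw sensor readings to a compact summary, and only that summary, plus any exceptional values, travels upstream.

```python
# Illustrative sketch: aggregate raw readings at the edge and forward only a
# compact summary to the central data center. Names are hypothetical.
from statistics import mean

def summarize(readings, anomaly_threshold=90.0):
    """Reduce a window of raw readings to one small summary record."""
    anomalies = [r for r in readings if r > anomaly_threshold]
    return {
        "count": len(readings),
        "mean": round(mean(readings), 2),
        "max": max(readings),
        "anomalies": anomalies,  # only exceptional values travel upstream
    }

# Five raw readings become a single record bound for the cloud.
window = [71.2, 70.8, 72.5, 95.1, 71.9]
summary = summarize(window)
```

The raw stream never leaves the site; the central platform sees only the summaries it needs for long-term trend analysis.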

When managing data generated at the edge, businesses employ metadata systems to achieve three primary goals: directing traffic between the edge and the network core, understanding overall context and making fast data routing decisions. Utilities in the energy industry, for example, can better manage their smart grids, industrial companies can boost equipment and asset monitoring, and healthcare facilities can process metadata in real time to improve patient outcomes.
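A metadata-driven routing decision of the kind described above can be sketched as follows. The field names (`latency_sensitive`, `analysis`) are invented for illustration, not taken from any specific product.

```python
# Hypothetical sketch of metadata-driven routing: each record carries a small
# metadata envelope that decides whether it is processed locally or forwarded.
def route(record):
    meta = record["meta"]
    if meta.get("latency_sensitive"):
        return "edge"   # latency-sensitive: process at the edge site
    if meta.get("analysis") == "historical":
        return "core"   # long-term trend work goes to the primary data center
    return "edge"       # default: keep traffic local to reduce backhaul

# A smart-grid fault reading stays local; a billing archive goes to the core.
fault = {"meta": {"latency_sensitive": True}}
archive = {"meta": {"analysis": "historical"}}
```

The point of the sketch is that the routing decision reads only the metadata envelope, so it stays fast regardless of payload size.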

Visibility

The need for visibility remains a persistent goal. IT administrators employ data center infrastructure management tools to manage deployments at a granular level, from optimizing power distribution and monitoring environmental conditions to tracking assets and inventory. These features also include visualization tools that provide health metrics, schematics and auto-generated rack diagrams accessed through unified dashboards.

Enterprises also use AIOps tools that correlate events across geographically dispersed locations, detect anomalies and pinpoint issues. For example, edge computing and ML can significantly improve quality and consistency in manufacturing; offer real-time operational insights and customer service in retail; and maintain smart grids that are responsive to dynamic usage changes in the energy industry.
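A toy version of the cross-site anomaly detection mentioned above might look like this. It is a generic z-score check, not any vendor's AIOps algorithm, and the site names and metric values are invented.

```python
# Illustrative AIOps-style check: flag sites whose latest metric deviates
# sharply from the fleet-wide baseline. Not a specific vendor tool.
from statistics import mean, stdev

def anomalous_sites(site_metrics, z_threshold=2.0):
    values = list(site_metrics.values())
    mu, sigma = mean(values), stdev(values)
    if sigma == 0:
        return []  # all sites identical: nothing to flag
    return [site for site, v in site_metrics.items()
            if abs(v - mu) / sigma > z_threshold]

# Hypothetical latency metrics (ms) from dispersed edge sites.
metrics = {"plant-a": 12.1, "plant-b": 11.8, "store-c": 12.4,
           "plant-d": 12.0, "store-e": 11.9, "grid-f": 48.0}
```

Real AIOps platforms correlate many signals over time rather than a single snapshot, but the shape of the decision, comparing each site against a fleet baseline, is the same.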

Security

Ensuring airtight cybersecurity at the edge is crucial because devices are prime targets for vulnerability exploitation. A zero-trust architecture adapted for distributed environments is ideal for providing continuous authentication and authorization across the edge. Using a “never trust, always verify” approach, zero trust effectively protects the broad attack surface and geographically dispersed devices targeted by hackers. Since every access request is treated as a potential threat, zero-trust architecture verifies identities, validates access privileges and looks for behavior anomalies to detect intrusions.
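The "never trust, always verify" loop can be reduced to a toy sketch: every request is checked for identity, privilege and behavioral anomalies before it touches a device. All names here (`SESSIONS`, `ACL`, `RECENT_FAILURES`) are invented for illustration.

```python
# Toy zero-trust check: no request is trusted by default; identity,
# privilege and a crude behavioural signal are verified every time.
SESSIONS = {"tok-123": "alice"}           # token -> verified identity
ACL = {("alice", "sensor-7"): {"read"}}   # (identity, device) -> allowed actions
RECENT_FAILURES = {"alice": 0}            # simple anomaly signal per identity

def authorize(token, device, action):
    user = SESSIONS.get(token)
    if user is None:
        return False  # unverified identity
    if action not in ACL.get((user, device), set()):
        return False  # insufficient privilege for this device
    if RECENT_FAILURES.get(user, 0) > 3:
        return False  # anomalous behaviour: force re-verification
    return True
```

A production zero-trust architecture layers continuous authentication, device posture and policy engines on top, but each access request still passes through a gate like this one.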

Encryption is also important as data is regularly in transit across edge networks. Organizations can protect data during transmission by using secure protocols, such as TLS or HTTPS, and the Advanced Encryption Standard for data at rest. Encryption protocols also ensure companies adhere to compliance and privacy requirements, like HIPAA.
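For transport encryption, Python's standard library shows the shape of a minimal TLS setup on an edge node. The hostname in the commented usage is illustrative.

```python
# Sketch: enforce TLS for data in transit from an edge node, using only the
# Python standard library.
import ssl

# create_default_context() enables certificate and hostname verification.
context = ssl.create_default_context()
# Refuse legacy protocol versions.
context.minimum_version = ssl.TLSVersion.TLSv1_2

# An edge agent would then wrap its sockets, e.g. (endpoint is hypothetical):
# with socket.create_connection(("core.example.net", 443)) as sock:
#     with context.wrap_socket(sock, server_hostname="core.example.net") as tls:
#         tls.sendall(payload)
```

Data at rest would be handled separately, typically with AES via a dedicated cryptography library or full-disk encryption on the appliance.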

Enterprises must maintain edge-specific data governance to ensure correct storage practices, sound policies and the accuracy of accrued data, and to comply with sovereignty laws in regional jurisdictions. Since information can be generated from various sources, administrators and IT leaders must be vigilant about processing edge data within the required geographic regions.

Considerations before adopting edge computing

The edge offers significant benefits that result from increased operational efficiency. However, edge sites need to be carefully planned and designed to achieve desired goals. As organizations consider edge adoption, an essential first step is identifying high-priority data for immediate processing versus data that can be transmitted to the cloud or central data center for deeper analysis.

Advanced operations often lead to further complexity. IT leaders must consider the best way to implement additional strategies, such as developing data synchronization mechanisms, edge caching approaches and data replication protocols to ensure consistency across their distributed environments.
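As a concrete example of one of those strategies, an edge caching layer often amounts to a small time-to-live cache in front of the core. The class below is a minimal sketch with invented names, not a production cache.

```python
# Minimal TTL-based edge cache sketch: serve recent values locally and fall
# back to the core only when an entry is missing or stale.
import time

class EdgeCache:
    def __init__(self, ttl_seconds=30.0):
        self.ttl = ttl_seconds
        self._store = {}  # key -> (value, expiry timestamp)

    def put(self, key, value):
        self._store[key] = (value, time.monotonic() + self.ttl)

    def get(self, key):
        entry = self._store.get(key)
        if entry is None:
            return None
        value, expires = entry
        if time.monotonic() > expires:
            del self._store[key]  # stale: caller refreshes from the core
            return None
        return value
```

A real deployment would add eviction limits and write-back synchronization, but the TTL check is the mechanism that keeps edge reads fast while bounding staleness.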

Kerry Doyle writes about technology for a variety of publications and platforms. His current focus is on issues relevant to IT and enterprise leaders across a range of topics, from nanotech and cloud to distributed services and AI.

