
Edge virtualization


Edge virtualization is the practice of using software versions of physical computing resources at the edge of a network, closest to the devices that produce data. In virtualization, the entire software stack -- including operating systems (OSes) and everything that runs on them -- is separated from the underlying hardware. Instances can then be copied and distributed to many different types of hardware. This is valuable at the edge because edge hardware is varied and geographically dispersed, often contends with limited bandwidth and high latency, and needs to be managed independently of the geographically distant core data center.

Edge virtualization is important because it extends the software-defined concept of the cloud universally. This software-defined approach enables the remote provisioning, management and monitoring of edge devices across large geographical footprints, providing a more secure and cost-effective alternative to managing each device on site.
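As a concrete illustration of that remote provisioning model, the following minimal Python sketch uses the libvirt bindings to define and start one identical VM template on several remote edge hosts over SSH. The host names, credentials, template file and VM contents are hypothetical placeholders, and the sketch assumes KVM/libvirt edge servers reachable via qemu+ssh; it is not a prescribed tooling choice.

# Minimal sketch of remote edge provisioning, assuming KVM/libvirt edge
# servers reachable over SSH. Host names, the template path and the VM
# definition are hypothetical placeholders.
import libvirt

# Hypothetical edge sites, each running a small libvirt/KVM server.
EDGE_HOSTS = ["store-001.example.com", "store-002.example.com"]

with open("pos-vm-template.xml") as f:   # one VM definition, reused everywhere
    vm_xml = f.read()

for host in EDGE_HOSTS:
    uri = f"qemu+ssh://admin@{host}/system"
    conn = libvirt.open(uri)             # manage the remote hypervisor
    try:
        dom = conn.defineXML(vm_xml)     # register the identical VM definition
        dom.create()                     # boot it on the edge host
        print(f"{host}: {dom.name()} active={bool(dom.isActive())}")
    finally:
        conn.close()

The point of the pattern is that the same software-defined instance definition travels to every site, while day-to-day control stays at the center.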

Edge virtualization, by its very nature, is a combination of software-defined compute, storage and networking -- much as you would find in the cloud, but where these resources are remote and, in most cases, of modest scale.

Where the cloud has massive compute, storage and networking resources that are tightly geographically associated, usually in one or two very large data centers, edge virtualization has a very large number of locations, each with a modest complement of resources.

Edge virtualization extends the fabric of compute, storage and networking to many remote locations, enabling the processing of a wide range of workloads in places where they need to execute.

In retail and hospitality, edge virtualization is rapidly becoming a key requirement, as richer customer experiences drive the need for local processing. Applications such as point of sale (POS), loyalty programs, kitchen management systems and RFID (radio frequency identification) require fast, local processing.

Virtualization's role in edge computing

Edge computing is the term that describes the duties of devices at the edge of the network. Every network has a core and an edge, a trunk and branches. The core (trunk) is where the network starts, and the edge (branch) is where it ends. The edge is the point at which external users interact with the network. These end users may be actual people or other external devices known as peripherals.
As endpoints at the edge grow in number and become more varied and dynamic, it becomes less efficient for each endpoint to reach back to the core for resources on every external request. This is where virtualization comes in.

Virtualization puts resources at the edge. It allocates virtual resources to the endpoints -- or branches -- so that instead of constantly requesting resources from the core, endpoints have their own instance or portion of those resources, locally. The distance that a resource request must travel is much shorter in a virtualized infrastructure.
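The following is a purely illustrative Python sketch of that pattern: the endpoint answers every request from a local, virtualized copy of a resource and synchronizes with the distant core only occasionally, rather than crossing the WAN on each request. All names and the sync interval are hypothetical.

# Illustrative sketch only: an endpoint answering from a local instance
# instead of calling the distant core on every request. Names are hypothetical.
import time

class LocalResource:
    """A virtualized, locally hosted copy of a core resource (e.g., a price list)."""

    def __init__(self, sync_interval_s=300):
        self.data = {}
        self.sync_interval_s = sync_interval_s
        self.last_sync = 0.0

    def fetch_from_core(self):
        # Placeholder for a slow, high-latency WAN call back to the core.
        return {"sku-123": 9.99}

    def lookup(self, key):
        now = time.time()
        if now - self.last_sync > self.sync_interval_s:
            self.data = self.fetch_from_core()   # occasional sync, not per request
            self.last_sync = now
        return self.data.get(key)                # every lookup is answered locally

prices = LocalResource()
print(prices.lookup("sku-123"))   # first call syncs once; later calls stay local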

Basically, the vaster and more varied an organization's edge "ecosystem" is, the less feasible it is to service that ecosystem from a few core locations. Disparate endpoints need to function more independently. To extend the tree metaphor: the farther a branch is from the root and trunk, the more water it must be able to hold, and the longer it must hold it.

Edge virtualization vs. core virtualization

Conceptually, edge virtualization contrasts with core data center virtualization in the following ways. Individual "instances" of edge virtualization:
  • are on a much smaller scale;
  • are located far from their central control point, often without local support;
  • may have limited communications bandwidth, which may also suffer significant latency; and
  • are required to support a wide range of workload types that interface with real-world peripherals using a wide range of technologies, many of which may be legacy or nonstandard.

Technically, the "virtualization" part of edge virtualization is not that different from traditional core virtualization in the data center.

The main difference comes from the unique compute challenges at the edge, which are often different from those of the data center. Edge virtualization requires integration with internet of things (IoT) systems, retail systems, specialized client devices and peripherals.

Edge virtualization integrates old and new peripherals regardless of heritage. Peripherals are separate devices (scanner, printer, etc.) that remain connected to the network endpoints where they are needed. Their workloads connect securely to the virtualized software running on a local server; in effect, the workload of each endpoint is moved onto that virtualized server.

Use cases

Any massively distributed business or organization may benefit from virtualization at the edge. It can be used in a variety of industries, such as:
  • Retail
  • Hospitality
  • Healthcare
  • Banking, specifically retail banks that still maintain a branch network
  • Infrastructure companies with many remote outstations, such as water, electricity and telecom utilities

For example, in a retail environment, a virtualized POS requires seamless integration with peripherals such as the barcode scanner, magnetic stripe reader, cash drawer or receipt printer. A virtualized software stack unifies these and allows end users to integrate them seamlessly regardless of any variation or incongruence in their underlying hardware.

In this example, POS software is hosted on a small virtualized server positioned in a business's back office, out of sight from customers. Essentially, the software is decoupled from the hardware. The user could theoretically remove the physical disk from the POS hardware (the most common point of failure) and run the POS as a virtual machine (VM) on the server.
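To illustrate, here is a hedged Python sketch of how a virtualized POS stack might present disparate peripherals behind one uniform interface, so the POS logic running in the VM never deals with device-specific hardware. The class and method names are hypothetical and stand in for real driver or USB/serial redirection layers.

# Sketch of a peripheral abstraction layer for a virtualized POS.
# Class names and devices are hypothetical; a real deployment would sit on
# actual driver or USB/serial redirection layers.
from abc import ABC, abstractmethod

class Peripheral(ABC):
    @abstractmethod
    def handle(self, event: str) -> str: ...

class BarcodeScanner(Peripheral):
    def handle(self, event: str) -> str:
        return f"scanned barcode {event}"

class ReceiptPrinter(Peripheral):
    def handle(self, event: str) -> str:
        return f"printed receipt: {event}"

class PosSession:
    """POS logic running in the VM; it sees uniform peripherals, not hardware."""
    def __init__(self, peripherals: dict[str, Peripheral]):
        self.peripherals = peripherals

    def dispatch(self, name: str, event: str) -> str:
        return self.peripherals[name].handle(event)

pos = PosSession({"scanner": BarcodeScanner(), "printer": ReceiptPrinter()})
print(pos.dispatch("scanner", "0123456789"))
print(pos.dispatch("printer", "1 x coffee  $3.50"))

Swapping a scanner model or printer vendor then changes only the concrete class, not the POS workload running in the VM.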

How edge virtualization ties into 5G

5G requires a massively distributed infrastructure to function. The edge nodes in 5G will interface with a variety of peripherals, IoT and networked devices of various heritages. Because a 5G network needs to be highly distributed, each network node or edge device needs to have local virtualized resources. A 5G network would be impossible to implement if each node needed to access resources from a centralized location for updates or maintenance, because of the sheer number of edge nodes and their complex reliance on each other to function.
