Back to the Future: The Rise of Edge Computing

We are in the midst of a digital transformation era. More than ever, enterprises of all sizes are investing in data-driven artificial intelligence/machine learning (AI/ML) processes that help create efficient business operations, discover new revenue opportunities, and gain competitive advantages. Initially, organizations turned to the public cloud to make sense of their data. The public cloud provides high availability and rapid scaling of compute resources for big data AI/ML applications while reducing infrastructure complexity. However, with the explosion of data generated at the edge, a new problem emerged: getting that data to the cloud.

For many organizations, most of their data comes from the edge: remote or branch offices (ROBO), manufacturing facilities, retail stores, restaurants, oil rigs, vehicles, weapon systems, and humanitarian outposts. The data itself is produced by the ever-increasing number of sensors and Internet of Things (IoT) devices deployed at these locations. The challenge is that many use cases demand immediate, actionable insights from this data, and sending vast amounts of sensor data to a remote data center or public cloud for processing is not always efficient or feasible. This is where edge computing comes in.
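The trade-off above can be sketched in a few lines: rather than streaming every raw sensor reading to the cloud, an edge node can aggregate locally and transmit only compact summaries. The readings, window size, and `summarize` helper below are illustrative assumptions for the sketch, not part of any specific edge platform or product.

```python
# Illustrative sketch of edge-side aggregation: reduce windows of raw
# sensor readings to small summary records before sending them upstream.
# All names and values here are hypothetical.
from statistics import mean

def summarize(window):
    """Reduce one window of raw readings to a small summary record."""
    return {
        "count": len(window),
        "min": min(window),
        "max": max(window),
        "mean": round(mean(window), 2),
    }

def aggregate(readings, window_size=60):
    """Yield one summary per window instead of forwarding every reading."""
    for i in range(0, len(readings), window_size):
        window = readings[i:i + window_size]
        if window:
            yield summarize(window)

# Example: 600 raw temperature samples shrink to 10 summary records,
# cutting the volume of data that must travel to the cloud.
raw = [20.0 + (i % 7) * 0.1 for i in range(600)]
summaries = list(aggregate(raw))
print(len(raw), "->", len(summaries))
```

The same pattern scales from simple statistics to on-device ML inference: the heavier the local processing, the less raw data has to leave the edge site.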

With edge computing, we are going back to the future, placing compute close to where data is created, as we did in the era before the cloud. According to the 451 Research report "The Edge to Cloud Continuum", edge computing will become an increasingly important part of enterprise digital strategy and a critical capability for organizations that:

Some of the key verticals and use cases for edge computing include:

Beyond the AI/ML data science use cases above, there is a variety of mission-critical workloads that an organization may also want to run at the edge. These include automation control systems, hyper-converged infrastructure (HCI) software such as VMware vSAN, virtual desktop infrastructure (VDI), point of sale (PoS), digital signage, video surveillance, data aggregation/compression, and many more. An organization may need to run a legacy operating system on a bare-metal server, a virtualized environment with a variety of virtual machines (VMs), and a compute node with graphics processing unit (GPU) support for data analytics, all at the same time. The challenge is running all these workloads at the edge, where there are space constraints and, often, exposure to extreme temperatures, dust, humidity, and vibration. Several ruggedized server systems are on the market; however, none of them offer the compute density and form factor required to accommodate the demands of today's edge.

What we see in the market right now is a need for dense, modular, small-form-factor systems that can easily be configured to run a variety of workloads. Dell has answered the call by expanding its popular XR line of ruggedized servers with a unique product designed to address these challenges. The new product will be unveiled later this year. Dell is promising to deliver a high-performance, multi-node edge server, purpose-built with ultra-short depth, low power consumption, and modular sleds. Dell has also hinted that the new XR line will let customers create a self-contained two-node VMware vSAN cluster in a footprint not much larger than a shoebox. This all sounds very exciting, and we can't wait for the official announcement!
