How to explain edge computing in plain English

What is edge computing? How does it relate to cloud computing? We cut through the confusion and help you explain this term, even to non-technical audiences.


Edge computing is already in use all around us – from the wearable on your wrist to the computers parsing intersection traffic flow. Other examples include smart utility grid analysis, safety monitoring of oil rigs, streaming video optimization, and drone-enabled crop management.

And those applications appear poised to expand. Today, less than 10 percent of enterprise-generated data is created and processed at the edge, according to Gartner; but by 2025, that will grow to 75 percent, Gartner predicts.

Yet, explaining edge computing to non-technical audiences can be tough – in part, because this type of data processing can take place in any number of ways and in such a variety of settings. At its simplest, edge computing is the practice of capturing, processing, and analyzing data near where it is created.

[ Why does edge computing matter to IT leaders – and what's next? Learn more about Red Hat's point of view. ]

What is edge computing?

"For edge devices to be smart, they need to process the data they collect, share timely insights, and, if applicable, take appropriate action. Edge computing is the science of having the edge devices do this without the need for the data to be transported to another server environment," says Red Hat chief technology strategist E.G. Nadhan. "Put another way, edge computing brings the data and the compute closest to the point of interaction."

“If we think about a hub-and-spoke model, the cloud is the hub and everything on the outside of the spokes is the edge.”

It’s “everything not in the cloud,” says Ryan Martin, principal analyst with ABI Research. “If we think about a hub-and-spoke model, the cloud is the hub and everything on the outside of the spokes is the edge.” This decentralized approach enables organizations to move processes like analytics and decision making closer to where the actual data is produced.

“Put simply, edge computing is data analysis that takes place on a device in real-time,” says Nima Negahban, CTO of Kinetica. “Edge computing is about processing data locally, and cloud computing is about processing data in a data center or public cloud.”
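The local-versus-central distinction can be made concrete with a small sketch. The following hypothetical Python example (the function name `edge_filter` and the threshold value are illustrative, not drawn from any of the experts quoted here) shows an edge device analyzing sensor readings in place and forwarding only the events that matter, rather than streaming every raw value to a data center:

```python
# Hypothetical sketch: an edge device processes sensor readings locally
# and forwards only noteworthy events, instead of streaming all raw data.

THRESHOLD = 75.0  # illustrative alert threshold for this sensor

def edge_filter(readings):
    """Analyze (timestamp, value) readings on-device; return only events worth sending."""
    alerts = []
    for timestamp, value in readings:
        if value > THRESHOLD:  # the decision is made locally, in real time
            alerts.append({"time": timestamp, "value": value, "event": "over_threshold"})
    return alerts

# Simulated stream from a local sensor: six readings, one anomaly
raw = [(0, 70.1), (1, 71.3), (2, 69.8), (3, 82.4), (4, 70.0), (5, 70.6)]
to_cloud = edge_filter(raw)

print(f"{len(raw)} readings captured at the edge, {len(to_cloud)} sent to the cloud")
```

In a cloud-centric design, all six readings would cross the network before any analysis happened; here, the device itself decides what is worth transmitting.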

Edge computing definitions

That eases us into the edge conversation. We also asked other experts to chime in with their particular definitions of edge computing in clear terms that may prove useful for IT leaders in various discussions, including those with non-technical people.

Rosa Guntrip, senior principal marketing manager, cloud platforms at Red Hat: "Edge computing refers to the concept of bringing computing services closer to service consumers or data sources. Fueled by emerging use cases like IoT, AR/VR, robotics, machine learning, and telco network functions that require service provisioning closer to users, edge computing helps solve the key challenges of bandwidth, latency, resiliency, and data sovereignty. It complements the hybrid computing model where centralized computing can be used for compute intensive workloads while edge computing helps address the requirements of workloads that require processing in near real time." 

Dr. James Stanger, chief technology evangelist at CompTIA: “As the Internet of Things (IoT) connects more and more devices, networks are transitioning from being primarily highways to and from a central location to something akin to a spider’s web of interconnected, intermediate storage and processing devices. Edge computing is the practice of capturing, storing, processing and analyzing data near the client, where the data is generated, instead of in a centralized data-processing warehouse. Hence, the data is stored at intermediate points at the ‘edge’ of the network, rather than always at the central server or data center.”

Richard Villars, Vice President, Data Center and Cloud at IDC: “IT at the edge is about the consumption of IT resources in increasingly ‘smart’ edge locations. These are the urban cores, hospitals, factories, transportation hubs, and a wide range of spaces where we all work, play, and live as well as where we all want to use ‘smart’ things to deliver optimal digital experience.”

Jason Mann, Vice President of IoT at SAS: “Edge computing refers to the engagement and analysis of data at its point of origin with available computing and storage capabilities.”

Nima Negahban, CTO at Kinetica: “Technically, edge computing is a distributed paradigm that enables data to be processed locally across smart objects, mobile phones, and local networks. Instead of sending massive amounts of data generated by devices on the Internet of Things back up to the central cloud for processing – which takes more time, requires more bandwidth, and tends to cost more – data analysis can now take place on a user’s device at the edge of the network.”

[ How are media companies using edge computing? Watch the MWC panel discussion: Tips and tricks for 5G and the network edge. ]

Ryan Martin, principal analyst at ABI Research: “Edge computing is the idea that processing should happen closer to where data is created, when it makes sense. Edge intelligence refers to the ability to intelligently distribute computing between edge and cloud resources.”

Todd Loeppke, Lead CTO Architect at Sungard Availability Services: “Prior to edge computing, data was collected from distributed locations outside the traditional data center. The data was then sent to the data center where it would be processed, meaning either a decision was made based on the data or the value of the data was determined. With the advent of edge computing, decisions can be made at the collection point or at a location physically close to the collection point. This significantly improves the time required to make a decision based on the data, which is critical for many use cases that utilize real-time decisions, such as autonomous cars communicating with each other.”

Bonus tip: The edge computing pizza place analogy

Sometimes, nothing works like a food analogy. Try this one out:

"A pie baked at the main location would get cold on its way to a distant customer.”

Michael Clegg, Vice President and General Manager of IoT and Embedded at Supermicro: “By processing incoming data at the edge, less information needs to be sent to the cloud and back. This also significantly reduces processing latency. A good analogy would be a popular pizza restaurant that opens smaller branches in more neighborhoods, since a pie baked at the main location would get cold on its way to a distant customer.”
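The bandwidth-saving point behind Clegg's analogy can also be sketched in a few lines. In this hypothetical example (the `summarize` function and window size are illustrative assumptions), an edge node condenses a thousand raw readings into one small summary record before anything crosses the network:

```python
# Hypothetical sketch of edge aggregation: instead of shipping every raw
# reading to the cloud, the edge node sends a compact periodic summary.

def summarize(window):
    """Reduce a window of raw readings to one summary record on-device."""
    return {
        "count": len(window),
        "min": min(window),
        "max": max(window),
        "mean": sum(window) / len(window),
    }

window = [20.0 + (i % 10) * 0.1 for i in range(1000)]  # 1,000 raw sensor readings
summary = summarize(window)

# One small record crosses the network instead of 1,000 raw values.
print(summary["count"], round(summary["mean"], 2))
```

Less traffic to the hub, faster answers at the spoke: the same trade-off the neighborhood pizza branches make.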

So how do you know when you need to build those neighborhood pizza joints, er, use edge computing? Let's look at some examples:


Stephanie Overby is an award-winning reporter and editor with more than twenty years of professional journalism experience. For the last decade, her work has focused on the intersection of business and technology. She lives in Boston, Mass.
