The basic concepts of edge computing are relatively simple to understand. Like a centralized data center, edge infrastructure delivers the compute and other resources that applications need – but it brings those resources into much closer physical proximity to the apps and their data.
So while the term “edge server” might sound, well, edgy – it really just refers to moving the functions of a traditional server closer to their optimal location, whether for performance, security, cost, or other reasons.
Here’s how we defined it in one of our previous explainers: “Edge servers exist outside of a traditional data center to perform compute, networking, storage, and security functions close to where users need them – for example, where data is created in a healthcare setting or manufacturing site.”
Another way to think about it: Edge computing expands the meaning of the traditional IT term “on-premises.” Instead of referring to a physical data center or other centralized location, with edge, the “premises” could be virtually anywhere your applications and data reside.
[ For a deeper dive on this topic, check out our related article: How to explain edge computing in plain English. ]
Once you grasp the concept, the natural next question is: What can I do with edge computing?
This is a very busy area, and some use cases are forward-looking. There’s much ado about the relationship between edge, IoT, and 5G, for example – even if the last piece of that trio is not quite ready for primetime. But these categories feed off of one another in ways that generate further interest and excitement.
“Edge is driving [enterprise] 5G adoption and is an enabling platform to run a variety of use cases,” says Shamik Mishra, CTO for connectivity, Capgemini Engineering. “These use cases can benefit from having a massively distributed cloud environment that’s connected to 5G low latency networks. This enables automation, new innovation, and new business models that leverage data and cloud.”
There’s sort of an enterprise power trio of edge-cloud-data emerging here, even if it’s still relatively early days for real-world applications of edge computing.
According to Dean Bogdanovic, CTO of Alef, most enterprises are still in a learning phase with edge computing. The network operators (aka telcos) themselves are the real early adopters with the most deployed use cases, such as virtualized Radio Access Networks (vRANs) and content delivery. Red Hat technology evangelist Gordon Haff notes that vRANs will become an increasingly important use case in 2022.
As operator-driven use cases mature, they’ll pave the way for wider edge adoption and innovation.
[ Related read: 6 edge computing trends to watch in 2022. ]
“This is a good step forward for enterprises, too, as the edge computing ecosystem is growing, improving, and hardening, making it easier for adoption by enterprises,” Bogdanovic says.
3 ways to use edge computing now
With that in mind, let’s take a big-picture look at three examples of how some enterprises are implementing and/or using edge architectures, hardware, and applications today (or will soon). We’ll also note the key takeaways most likely to translate into additional use cases in other industries or organizational contexts. Each should help connect the basic concepts to real-world usage.
[ Learn more about edge computing in modern architecture design: 5 processor architectures making machine learning a reality for edge computing ]
1. Enabling remote monitoring and predictive maintenance
Edge and IoT fit naturally together. As a result, there’s a lot of focus on edge applications in manufacturing, warehousing, and supply chain contexts.
Edge infrastructure is what enables a “smart” factory floor, for example, armed with sensors and other connected devices that generate endless streams of data.
“The manufacturing and warehousing sectors have been early adopters, with use cases like preventive maintenance and augmented reality/virtual reality (AR/VR) remote assistance applications powered by on-prem edge compute,” Mishra says. “Warehouse automation through robotics, location-based solutions, and supply chain optimization are also viewed as key use cases for edge.”
A specific technology to watch for here is computer vision: the artificial intelligence (AI) discipline focused on computer-based recognition of images and/or video.
“Manufacturing is doing really interesting work in the smart factory floor with quality control using computer vision to identify a slip in production quality before it becomes detectable to humans,” says Paul Legato, VP of platform engineering at Wallaroo.
Experts expect that computer vision applications, powered by edge infrastructure, will be a hotbed of new use cases going forward.
“Video cameras can be the ultimate sensor and are either already present – like security cameras – or can easily be added to an edge computing environment,” says Saurabh Mishra, senior manager of IoT at SAS, adding that computer vision is behind the most interesting use cases he’s seeing today. Moreover, there’s a high ceiling for additional applications.
“Employing computer vision at the edge can support a variety of use cases because it’s flexible,” Mishra says. “It’s especially useful in cases involving quality inspections in discrete manufacturing, in safety-related use cases like detecting intruders or objects in restricted spaces, or in detecting the proximity, count, and flow of people in a space to help with social distancing or understanding traffic patterns.”
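To make that concrete, here’s a minimal sketch of what an edge-side visual quality check might look like, assuming a camera reachable through OpenCV and a defect-detection model already deployed on the edge node. The load_defect_model helper and the model.predict interface are hypothetical placeholders for whatever inference runtime a given site actually uses:

```python
import cv2

def load_defect_model():
    # Placeholder: load the site's trained model from local storage on the edge node.
    raise NotImplementedError("load your trained defect model here")

def inspect_line(camera_index: int = 0, threshold: float = 0.8) -> None:
    model = load_defect_model()
    cap = cv2.VideoCapture(camera_index)  # camera on the factory floor
    try:
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            score = model.predict(frame)  # hypothetical: probability the frame shows a defect
            if score >= threshold:
                # Act locally - flag the item or alert an operator - without a
                # round trip to a central data center.
                print(f"possible defect detected (score={score:.2f})")
    finally:
        cap.release()
```

The point of keeping this loop on the edge node is that every frame can be inspected as it is captured, rather than streaming raw video back to a central location first.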
If you’re up for some math, Legato offers a hypothetical illustration of the upside of wider edge adoption in such scenarios: “If you were able to increase the yield on each step of a 200-step manufacturing process from 99.5% to 99.9%, that increases overall yield to 82%, versus 37% previously.”
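A quick back-of-the-envelope check, compounding the per-step yield across all 200 steps, confirms those numbers:

```python
# Per-step yield compounds across all 200 steps of the process.
steps = 200
print(f"99.5% per step -> overall yield {0.995 ** steps:.0%}")  # ~37%
print(f"99.9% per step -> overall yield {0.999 ** steps:.0%}")  # ~82%
```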
Take out the word “manufacturing” and think in terms of process optimization more broadly, and the possibilities continue to grow. Even something as “basic” as managing an organization’s energy consumption can be improved in this manner. It can also be predictive, not just reactive.
“Predictive maintenance is another use case where you have an industrial PC on equipment [that takes] in raw temperature, vibration, humidity data streams, and predicts when maintenance should be performed in advance of actual equipment failure,” Legato says.
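As a rough illustration of that pattern, here’s a minimal sketch of on-device logic that watches a single sensor stream for drift against a rolling baseline. The sensor interface is assumed – check_reading would be fed by whatever the industrial PC actually exposes – and a real deployment would typically use a trained model rather than a simple statistical threshold:

```python
from collections import deque
from statistics import mean, stdev

WINDOW = 500        # recent readings kept in memory on the edge device
DRIFT_SIGMAS = 3.0  # how far a reading may stray from the recent baseline

readings = deque(maxlen=WINDOW)

def check_reading(value: float) -> bool:
    """Return True if a new sensor reading looks anomalous vs. the rolling baseline."""
    anomalous = False
    if len(readings) >= 30:  # wait for enough history before judging
        baseline, spread = mean(readings), stdev(readings)
        if spread > 0 and abs(value - baseline) > DRIFT_SIGMAS * spread:
            anomalous = True  # flag it: schedule maintenance before the equipment fails
    readings.append(value)
    return anomalous
```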
Bogdanovic, the Alef CTO, notes that this has potential uses in many industries, especially those where the stakes of downtime or failure are high. “For example, airplane jet engine manufacturers already applied predictive maintenance, where upon airplane arrival, the maintenance team is ready to correct and replace the needed jet engine parts,” Bogdanovic says. “Healthcare enterprises deploy it for patient monitoring, improving the service, as well as patient data protection.”
Key takeaways: This is all about automation (fueled by AI/ML, in the case of computer-vision use cases) and optimization to improve outcomes. While the focus on manufacturing/warehousing/supply chain contexts makes sense, matters of quality, safety, productivity, and efficiency translate across industries and business requirements.
2. Improving application performance and user experience
Content delivery networks (CDNs) are one of the better present-day examples of edge concepts at work: They improve many consumer web experiences by bringing web content closer to the person consuming (i.e., reading, listening, watching) it.
This applies widely to external and internal customers: If your users demand speed (as they so often do), then latency is a killer. Bringing traditional infrastructure resources as close as possible to where applications are running and/or data is processed can help reduce latency.
[ Related read: Edge computing strategy: 5 potential gaps to watch for. ]
“Companies can deploy applications at the edge, resulting in low latency – the top demand from consumers when using applications,” says Shahed Mazumder, global director, telecom solutions at Aerospike. “Latency depends on network distance traversed, computation load, and database transactions volume/processing time required.”
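Treating those three factors as a simple sum gives a back-of-envelope latency budget; the millisecond figures below are purely illustrative, not measurements:

```python
def latency_ms(network_rtt: float, compute: float, db: float) -> float:
    # Total request latency: network distance + computation + database work.
    return network_rtt + compute + db

central = latency_ms(network_rtt=80.0, compute=10.0, db=15.0)  # distant cloud region
edge = latency_ms(network_rtt=5.0, compute=10.0, db=15.0)      # nearby edge site
print(f"central: {central:.0f} ms vs. edge: {edge:.0f} ms")
```

Moving the workload closer mainly attacks the network term – compute and database time still have to be engineered separately.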
Consumer expectations – how long a website takes to load or a video to begin playing – may seem most prominent, but they readily translate into everyday business settings. In fact, the shift to remote and hybrid workplaces is underscoring the requirement for high-performance applications.
“With remote work here to stay, videoconferencing and productivity/collaboration tools are some of the most ubiquitous and essential edge use cases,” Mazumder says.
When time is of the essence – especially time measured in its smallest increments – edge computing may be beneficial. Fraud detection, for another example, will be a critical area for edge computing, Mazumder says, because it “typically requires determination in sub to low milliseconds.”
Like other experts, Mazumder expects an increasing intersection between AI/ML and edge computing use cases, too: “Object/video recognition use cases benefit from low latency and are increasingly gaining traction in telecom.”
Key takeaways: Edge computing is useful in many scenarios where low latency is a significant requirement. And edge should rarely be looked at as a standalone technology or strategy – it will increasingly go hand-in-hand with other major components of an IT leader’s portfolio, including hybrid cloud/multi-cloud, AI/ML, and IoT.
3. Adopting new security approaches that reflect modern environments
Another core principle of edge computing is that it allows a more distributed approach to application logic and/or business logic. The logic moves out to the edge where it’s needed rather than being tethered to a centralized location, such as a traditional data center or cloud.
You’re likely to see and hear a lot of hand-wringing about edge security, similar to the early days of cloud. While security should always be a priority, looking at edge purely from a risk standpoint misses out on a significant opportunity: Edge architectures can actually bolster security.
“By moving authorization and access control logic to the edge, enterprises are able to standardize authorization logic across multiple applications while offloading this logic from their individual applications,” explains Josh Johnson, enterprise architect at Akamai.
Johnson says that this is particularly important for API traffic.
“In most enterprises, various APIs are managed by many separate teams and business units, hosted in disparate data centers, with API consumers located around the world,” Johnson explains. “To get around this issue, enterprises can use edge computing to standardize access control logic regardless of where their users and applications reside, meaning there will no longer be any performance sacrifice.”
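As a simplified illustration of the idea (not Akamai’s implementation), the sketch below shows access-control logic that an edge node could apply in front of many backend APIs, so individual applications no longer re-implement it. The HMAC-signed token format, the shared secret, and the forward_to_origin helper are all hypothetical:

```python
import hashlib
import hmac

SHARED_SECRET = b"replace-with-a-real-secret"  # illustrative only

def is_authorized(token: str) -> bool:
    """Validate an HMAC-signed 'user.signature' token at the edge."""
    try:
        user, signature = token.rsplit(".", 1)
    except ValueError:
        return False
    expected = hmac.new(SHARED_SECRET, user.encode(), hashlib.sha256).hexdigest()
    return hmac.compare_digest(signature, expected)

def handle_api_request(token: str, path: str) -> int:
    # Reject unauthorized calls at the edge; only valid traffic is forwarded
    # to the origin API, wherever that API happens to be hosted.
    if not is_authorized(token):
        return 401
    return forward_to_origin(path)

def forward_to_origin(path: str) -> int:
    return 200  # placeholder for the real proxy step to the backend API
```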
A security-focused edge architecture can also limit the potential “network effect” of an incident by containing problematic traffic at a particular point (rather than allowing a breach to spread throughout a cloud environment or data center).
The potential security advantages can also power additional performance-minded architectural decisions.
“Once application logic is at the edge, the next logical step is to distribute data to the edge,” Johnson says. “Moving application logic and data close to users supports development of fast, low-latency microservices APIs. Whether the APIs are then consumed from web pages, server-to-server traffic, or IoT devices, reducing latency improves the user experience and efficiency when making API requests.”
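One simplified way to picture that “data follows the logic” step is a read-through cache at the edge node: reads are answered locally when possible and only fall back to the origin on a miss. The fetch_from_origin stand-in below is hypothetical:

```python
local_store: dict[str, bytes] = {}  # in practice a local database or cache on the edge node

def fetch_from_origin(key: str) -> bytes:
    # Placeholder: slow, long-haul request to the central data source.
    raise NotImplementedError

def read(key: str) -> bytes:
    if key in local_store:          # served locally: the lowest-latency path
        return local_store[key]
    value = fetch_from_origin(key)  # cold path: fetch once, then serve locally
    local_store[key] = value
    return value
```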
Key takeaways: Edge security will be a hot topic, and rightly so. Just don’t get misled into thinking that edge is more risk than reward; it can actually enhance security in a distributed environment. That intertwines with the potential performance advantages of edge computing, too.
[ Want to learn more about edge and data-intensive applications? Get the details on how to build and manage data-intensive intelligent applications in a hybrid cloud blueprint. ]