Edge server architecture: How edge and cloud fit together

What does edge computing do for your hybrid cloud strategy, and what does edge server architecture look like in action? Experts weigh in on the latency, consistency, security, and cost considerations – as well as some real-world edge use cases.

Edge computing and cloud computing have a natural relationship with each other: Think of the edge as an extension of cloud, and especially hybrid cloud infrastructure.

[ What's the latest in edge? See Red Hat's news roundup from Mobile World Congress 2022. ]

You can also think of the relationship as bi-directional: Core cloud resources can move out to the edge when and where needed, and the edge can flow back to your core infrastructure when that makes sense.

[ Get a shareable primer: How to explain edge computing in plain English. ]

Flexibility and consistency are two of the primary advantages of this symbiotic relationship.

“Businesses need flexibility in terms of where they place their workloads, and if their strategy changes, [they need] consistency of operations – for both ITOps and developers, so as to enable them to react quickly and with minimal disruption,” says Rosa Guntrip, senior principal marketing manager, cloud platforms at Red Hat.

Edge server architecture in action

A general principle to consider here: As you move out toward the edge, infrastructure (and the applications and services running on it) tends to become more specialized; as you move back in toward the cloud, things tend to become more general or commoditized.

"At a high level, edge environments feature more and smaller devices, which are more specialized."

“At a high level, edge environments feature more and smaller devices, which are more specialized,” says Jacob Smith, VP of strategy at Equinix Metal. “For instance, in an IoT or industrial context, there might be thousands or even millions of sensors and devices at the edge doing and capturing very specific things, but as you work towards the core cloud, the type of infrastructure becomes more generic.”

Smith notes that while the actual architectural implementations of the cloud-edge relationship are still emerging and evolving, there is most definitely a complementary relationship. One basic pattern is that there is an “infrastructure edge” that sits between your hybrid cloud or core cloud and the outermost edge of your environment – those IoT sensors, for example, or many other possible devices and applications. That infrastructure edge can process data, route traffic, or perform latency-sensitive tasks while sending the rest back to the cloud, Smith explains.
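
To make that pattern a bit more concrete, here is a minimal sketch of an infrastructure-edge gateway that acts on latency-sensitive readings locally and batches everything else back to the core cloud. The names used here (read_sensor, trigger_local_action, CLOUD_INGEST_URL) and the thresholds are purely hypothetical placeholders, not any vendor's API:

```python
# Sketch of the "infrastructure edge" pattern: act on latency-sensitive data
# locally, forward the rest to the core cloud. All names and values below are
# illustrative assumptions.

import json
import time
import urllib.request

CLOUD_INGEST_URL = "https://example.com/ingest"   # assumed core-cloud endpoint
TEMP_ALARM_THRESHOLD = 90.0                       # act locally above this value
BATCH_SIZE = 100                                  # backhaul in batches

def read_sensor():
    """Stand-in for reading one IoT sensor sample at the device edge."""
    return {"sensor_id": "line-3-temp", "value": 72.4, "ts": time.time()}

def trigger_local_action(sample):
    """Latency-sensitive path: react at the edge without a cloud round trip."""
    print(f"local alarm: {sample['sensor_id']} at {sample['value']}")

def forward_to_cloud(batch):
    """Everything else is sent back to the core cloud for analytics and storage."""
    req = urllib.request.Request(
        CLOUD_INGEST_URL,
        data=json.dumps(batch).encode(),
        headers={"Content-Type": "application/json"},
    )
    urllib.request.urlopen(req, timeout=10)

def run_gateway():
    batch = []
    while True:
        sample = read_sensor()
        if sample["value"] > TEMP_ALARM_THRESHOLD:
            trigger_local_action(sample)   # handled at the edge
        batch.append(sample)
        if len(batch) >= BATCH_SIZE:
            forward_to_cloud(batch)        # the rest goes to the cloud
            batch = []
```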

"In the cloud, we expect a lot of consistency and not a lot of specialization."

“Edge architectures vary widely, but the relationship between cloud and edge can broadly be described by latency, specialization, and consistency,” Smith says. “In the cloud, we expect a lot of consistency and not a lot of specialization. Cloud enables a ‘just get more’ mindset that often isn’t possible in constrained edge environments. Additionally, latency and workload requirements, along with constrained space and power, often mean that edge infrastructure is more specialized, honing in on doing specific tasks. Otherwise, send it back to the cloud, where the variety, scale, and cost is much cheaper.”

[ Want to learn more about implementing edge computing? Read the blog: How to implement edge infrastructure in a maintainable and scalable way. ]

Edge server architecture benefits: Consistency and operational efficiency

The cloud-edge connection is not just a nice coincidence. It’s necessary for operational efficiency, especially in any organization with a growing edge footprint.

“When we look at the challenges of scale and operational consistency, the edge cannot be seen as a point solution that then needs to be managed separately or differently across hundreds of sites – this would be incredibly complex,” Guntrip says. “In order to be successful, you need to manage your edge sites in the same way you would the rest of your places in the network – from the core to the edge. This helps minimize complexity and deliver on the operational excellence that organizations are striving for.”

Workload fit (matching the right workload to the right infrastructure environment) is one advantage of treating the edge as an extension of hybrid cloud. Compute-intensive workloads, for instance, may be better served in a public cloud or on-premises data center, whereas specialized IoT requirements are likely best supported at the edge.

"Lightweight edge computing is like party-sized Lego sets. Each edge capability allows you to do specific things very quickly."

“To use an analogy, a heavy compute deployed in the cloud is like a huge box of Legos – with enough time, you can build anything you want,” says Ari Weil, VP of product marketing at Akamai. “Lightweight edge computing is like party-sized Lego sets. Each edge capability allows you to do specific things very quickly.”

[ Want to learn more about edge and data-intensive applications? Get the details on how to build and manage data-intensive intelligent applications in a hybrid cloud blueprint. ]

Edge's cost and security connections

Cost optimization is another benefit.

“As you distribute processing, your core data center is better able to scale as not all data needs to be backhauled back to the data center or cloud,” says Guntrip from Red Hat. This can mean savings on data center hardware, data warehousing, and other costs, Guntrip explains – especially as IoT sensors and devices generate ever more data.
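
As a rough illustration of that cost dynamic – with made-up numbers and hypothetical field names – pre-aggregating raw sensor readings at the edge can turn thousands of samples into a single summary record before anything is backhauled:

```python
# Hypothetical illustration only: collapse raw edge readings into a summary
# so the core data center or cloud stores far less data.

from statistics import mean

def summarize_window(samples, sensor_id, window_start, window_end):
    """Collapse one window of raw readings into a single summary record."""
    values = [s["value"] for s in samples]
    return {
        "sensor_id": sensor_id,
        "window_start": window_start,
        "window_end": window_end,
        "count": len(values),
        "min": min(values),
        "max": max(values),
        "mean": round(mean(values), 2),
    }

# e.g., 3,600 one-second readings per hour become one record to backhaul:
raw = [{"value": 70 + (i % 5)} for i in range(3600)]
summary = summarize_window(raw, "line-3-temp", 0, 3600)
print(summary)   # one small record instead of 3,600 raw samples
```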

Weil also points to security as another area where the edge can deliver value. It can help build a more unified security posture that better protects your core cloud and/or data center assets – including limiting your company’s internet-exposed backend to only what is necessary.

[ Check out our primer: How edge servers work. ]

“Controls are applied closer to where attackers launch their campaigns,” Weil says. “Security deployed at the edge also benefits from visibility into all requests and responses to the apps and infrastructure it protects. This means that the edge solutions can apply insights into malicious IPs, behavior, and actions across the entirety of a company’s apps and infrastructure.”
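
A simplified sketch of the kind of control Weil describes might look like the following: an edge layer screening requests against shared threat intelligence before they ever reach the core cloud or data center. The IP list, limits, and function names here are illustrative assumptions, not any vendor's actual API:

```python
# Illustrative edge-side request screening: drop traffic from known-bad
# sources and rate-limit abusive clients close to where attacks originate.

BLOCKED_IPS = {"198.51.100.7", "203.0.113.42"}   # hypothetical shared threat intel
MAX_REQUESTS_PER_MINUTE = 300

request_counts = {}   # per-client counters (window reset omitted for brevity)

def screen_request(client_ip, path):
    """Return True if the request may be forwarded to the origin."""
    if client_ip in BLOCKED_IPS:
        return False                          # known-bad source, dropped at the edge
    count = request_counts.get(client_ip, 0) + 1
    request_counts[client_ip] = count
    if count > MAX_REQUESTS_PER_MINUTE:
        return False                          # crude rate limit applied near the attacker
    return True
```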

Edge computing and cloud: What are the use cases?

So how is this relationship between edge and cloud infrastructure actually playing out? It’s a mix of present-day and emerging examples.

“The biggest use cases now involve a lot of boring things: IoT is a good example. All those smart home devices you have? That’s the device edge,” Smith says. “Same thing for industrial and smart city use cases, even retail like the McDonald’s checkout or Circle K stores. These are the use cases currently flourishing due to obvious demand.”

As 5G spreads, the infrastructure edge – that middle layer Smith described above – is likely going to become critical connective tissue between hybrid cloud infrastructure and the outermost points of the edge, for reasons of latency, speed, data sovereignty, cost optimization, and more.

"As 5G rolls out, we're starting to see the infrastructure edge come alive, serving use cases like gaming and entertainment."

“As 5G rolls out, we’re starting to see the infrastructure edge come alive, serving use cases like gaming and entertainment,” Smith says. “This is where hybrid cloud and edge cloud intersect, as the high-powered needs of use cases drive access to more resources that are nearby if not on-premises.”

Ryan Murphy, VP and head of Capgemini’s North America cloud center of excellence, sees growing interest in the cloud-edge relationship and architecture in both the manufacturing and travel and leisure industries. In both categories, the interest is driven in part by a desire to push a growing portfolio of cloud-native services out to the edge.

“What we have seen [there] is hyper-converged environments capable of running modern workloads in containers,” Murphy says. “This can be a software-defined solution or a hyperscale edge solution, allowing customers to extend their cloud-native services to edge environments.”

Speaking of cloud-native, Weil from Akamai notes that microservices architecture is another reason for the budding relationship between cloud and edge, in part because it allows for an even more granular approach to “what should run where” decisions. Weil points to a car dealer client that deployed a geolocation microservice in the cloud to deliver customized content and user experiences to customers. But round trips to the cloud added an untenable amount of latency, leading to a less-than-stellar user experience. Moving the microservice to the edge cut latency to under 20 ms, improving the customer experience while also reducing the dealer’s cloud costs.
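
For illustration only – the header name, regions, and content mapping below are hypothetical, not the dealer's actual service – a geolocation-style microservice can be small enough to run comfortably at an edge location:

```python
# Minimal sketch of a geolocation-style microservice suited to the edge.
# Assumes an upstream edge proxy injects the client's region as a header.

from http.server import BaseHTTPRequestHandler, HTTPServer
import json

REGIONAL_CONTENT = {
    "us-northeast": {"promo": "winter-tires", "currency": "USD"},
    "us-southwest": {"promo": "summer-service", "currency": "USD"},
}

class GeoHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        region = self.headers.get("X-Client-Region", "us-northeast")
        body = json.dumps(REGIONAL_CONTENT.get(region, {})).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    # Served from an edge location, this lookup avoids a cloud round trip.
    HTTPServer(("0.0.0.0", 8080), GeoHandler).serve_forever()
```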

Meanwhile, plenty of other related services are still better fits for the cloud – such as storing inventory and customer data, generating user behavior insights, and handling certain transaction processing – so that’s where they remain.

“These use cases do not require real-time actions and require CPU and memory resources that do not make sense at the edge,” Weil says.

[ How can automation free up more staff time for innovation? Get the free eBook: Managing IT with Automation. ] 

Kevin Casey writes about technology and business for a variety of publications. He won an Azbee Award, given by the American Society of Business Publication Editors, for his InformationWeek.com story, "Are You Too Old For IT?" He's a former community choice honoree in the Small Business Influencer Awards.