Edge computing strategy: 5 potential gaps to watch for

Edge technology is expected to gain momentum in the coming year – but before you build your strategy, consider this expert advice

Is 2022 the year that edge computing gets down to business?

In financial terms, edge has already arrived: IDC predicts that companies will spend $176 billion on edge computing worldwide this year, roughly a 15% increase from 2021.

Ultimately, that’s just a (big) number. There may be more qualitative signs of edge computing’s maturation in terms of architectural approaches, technical capabilities, enterprise use cases, security tactics, and more.

“Even if we see echoes of older architectures in certain edge computing deployments, we also see developing edge trends that are genuinely new, or at least quite different from what existed previously,” Gordon Haff, technology evangelist at Red Hat, wrote recently in his analysis of edge trends to watch in 2022. “And they’re helping IT and business leaders solve problems in industries ranging from telco to automotive, for example, as both sensor data and machine learning (ML) data proliferate.”

IT leaders don’t often tackle business problems without a plan, which is why edge strategies – and related categories like IoT and machine learning – figure prominently on their roadmaps. In Red Hat’s 2022 Global Tech Outlook report, for example, 61% of respondents reported plans to run IoT or edge workloads (or both) in the next 12 months.

[ Related read: 5 Kubernetes trends to watch in 2022. ]

5 edge computing pitfalls to avoid

With that in mind, we asked a group of IT leaders and edge computing experts to shine a light on some of the gaps – if not full-blown ROI-wrecking pitfalls – that they see occurring in enterprise edge strategies. Here are five areas of concern to make sure you’re addressing in your edge plans.

1. Don't lean too much on universal definitions of "the edge"

As with other big tech terms, there’s an industry tendency toward dogmatic definitions that don’t account for the day-to-day reality of a specific team or organization. But a one-size-fits-all definition implies a one-size-fits-all strategy.

There is no one-size-fits-all solution, and that’s the first gap to mind in your strategy, says Shamik Mishra, CTO for connectivity at Capgemini Engineering: Don’t try to force your goals into an edge strategy (or technology platform) that doesn’t fit.

“The edge has varying interpretations,” Mishra says. “The mobile device can be an edge as much as an on-prem micro data center.”

One company’s “edge server” might mean specialized hardware, while another’s could mean a conventional server in an unconventional location.

The same holds for use cases. While repeatable use cases built on industry or other contextual lines will continue to emerge, enterprise strategies need to be enterprise-specific.

“The applications of edge computing vary from industry to industry, and from region to region,” Mishra says. “While drone-based inspection may be of interest in one geography, that same use-case may not be of interest in another.”

This is not to say that there are no universal concerns. Security is a good example: An edge strategy that ignores security is incomplete.

Automation is another common denominator. “A lack of automation may also result in higher maintenance costs that can nullify the business benefits of edge, so adequate automation strategies need to be considered upfront,” Mishra says.
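
To make that concrete, here is a minimal sketch of what "automation upfront" can look like: a script that fans a configuration change out to every site in an inventory and records which sites succeeded or failed, so routine maintenance doesn't require someone touching each location by hand. The inventory and the push_update helper are hypothetical placeholders, not any particular vendor's API.

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical site inventory; in practice this would come from a CMDB or Git repository.
EDGE_SITES = ["store-001", "store-002", "factory-edge-07", "clinic-edge-12"]


def push_update(site: str) -> bool:
    """Placeholder for the real rollout step (e.g., calling a config management tool).

    Returns True on success so failures can be retried or escalated instead of
    silently piling up as manual maintenance work.
    """
    print(f"applying standard configuration to {site}")
    return True  # assume success in this sketch


def rollout(sites: list[str]) -> dict[str, bool]:
    """Apply the same change everywhere in parallel and report per-site results."""
    with ThreadPoolExecutor(max_workers=8) as pool:
        results = dict(zip(sites, pool.map(push_update, sites)))
    failed = [site for site, ok in results.items() if not ok]
    if failed:
        print(f"retry or escalate: {failed}")
    return results


if __name__ == "__main__":
    rollout(EDGE_SITES)
```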

2. Discount change management at your own risk

For experienced IT leaders, this is more of a polite reminder than breaking news, but it’s still worth putting in print: Ignoring the impacts of a significant edge computing initiative on people’s day-to-day jobs isn’t a great idea.

“One of the biggest gaps in an edge strategy is a failure to involve all the necessary stakeholders,” says Josh Johnson, enterprise architect at Akamai. “Migrating workloads to the edge is not a ‘lift-and-shift’ exercise, but a project that involves changes across a number of teams.”

Within IT itself, virtually every broad function will require some learning and/or adaptation, especially if you’re not already running many workloads in an edge architecture and can’t draw on past experience. Examples include:

Developers: The folks primarily responsible for writing your code may need to learn best practices for edge development and deployment.

"Moving away from an environment with a relatively small number of servers in just a few locations to an environment with thousands of individual smaller locations requires completely different design and architectural considerations," Johnson says.

[ Read also: 3 things CIOs should know about developers in the cloud era ]

Operations/DevOps/SRE: People responsible for operational needs like instrumentation, monitoring, and configuration management may need to rethink their practices and tools for edge compute.

“Without visibility into the code executing at the edge, it is difficult to validate that the application is running as expected,” Johnson says.

Security: As more workloads move to the edge (no matter an organization’s particular definition of the term), security will naturally become a significant area of focus. That will require changes to traditional security playbooks, just as the broader shift to distributed IT environments (think hybrid cloud and multi-cloud) required similar evolution.

“Security teams need to evolve their practices to ensure applications at the edge are protected,” Johnson says. “Code and data live at the edge, outside of the protection of traditional firewalls within the data center.”
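As a small illustration of that last point, the sketch below has an edge node verify that a workload bundle pushed to it was signed by the central platform before anything is executed, using an HMAC from Python's standard library. The shared-secret provisioning and the bundle format are assumptions made for the example, not a prescribed design.

```python
import hashlib
import hmac

# Assumed to be provisioned securely on the device (e.g., via a TPM or secrets manager);
# how the key gets there is outside the scope of this sketch.
SHARED_KEY = b"replace-with-provisioned-key"


def sign_bundle(bundle: bytes, key: bytes = SHARED_KEY) -> str:
    """What the central platform would attach to a workload or config bundle."""
    return hmac.new(key, bundle, hashlib.sha256).hexdigest()


def verify_before_run(bundle: bytes, signature: str, key: bytes = SHARED_KEY) -> bool:
    """Edge-side check: refuse to run anything whose signature doesn't match."""
    expected = hmac.new(key, bundle, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, signature)


if __name__ == "__main__":
    payload = b'{"app": "anomaly-detector", "version": "1.4.2"}'
    good_sig = sign_bundle(payload)
    print(verify_before_run(payload, good_sig))         # True: safe to deploy
    print(verify_before_run(payload + b" ", good_sig))  # False: tampered, reject
```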

3. Prioritize consistency, predictability, and repeatability

Edge strategies that depend on one-off “snowflake” patterns for their success will cause long-term headaches.

This is another area where experience with hybrid cloud architecture will likely benefit edge thinking: If you already understand the importance of automation and repeatability to, say, running hundreds of containers in production, then you’ll see a similar value in terms of edge computing.

“Follow a standardized architecture and avoid fragmentation – the nightmare of managing hundreds of different types of systems,” advises Shahed Mazumder, global director, telecom solutions at Aerospike. “Consistency and predictability will be key in edge deployments, just like they are key in cloud-based deployments.”
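One lightweight way to enforce that consistency is to treat a single site profile as the source of truth and flag any location that drifts from it. The profile and site reports below are invented for illustration; the point is the comparison, not the specific fields.

```python
# The one standardized profile every edge site is expected to follow.
STANDARD_PROFILE = {
    "os_image": "edge-os-1.8",
    "container_runtime": "cri-o",
    "log_shipping": True,
    "telemetry_interval_s": 30,
}

# What a fleet inventory might report back (hypothetical data).
SITE_REPORTS = {
    "store-001": {"os_image": "edge-os-1.8", "container_runtime": "cri-o",
                  "log_shipping": True, "telemetry_interval_s": 30},
    "store-002": {"os_image": "edge-os-1.6", "container_runtime": "docker",
                  "log_shipping": False, "telemetry_interval_s": 30},
}


def find_snowflakes(reports: dict, profile: dict) -> dict:
    """Return, per site, the settings that deviate from the standard profile."""
    drift = {}
    for site, actual in reports.items():
        diffs = {key: (actual.get(key), expected)
                 for key, expected in profile.items() if actual.get(key) != expected}
        if diffs:
            drift[site] = diffs
    return drift


if __name__ == "__main__":
    for site, diffs in find_snowflakes(SITE_REPORTS, STANDARD_PROFILE).items():
        print(f"{site} has drifted from the standard: {diffs}")
```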

Indeed, this is an area where the cloud-edge relationship deepens. Some of the same approaches that make hybrid cloud both beneficial and practical will carry forward to the edge, for example. In general, if you’ve already been solving some of the complexity involved in hybrid cloud or multi-cloud environments, then you’re on the right path.

“Edge environments are heterogeneous by nature, and organizations should prepare to tackle this,” says Saurabh Mishra, senior manager of IoT at SAS. “This is particularly relevant when attempting to create a level playing field at the edge by using containers and Kubernetes. It also helps with shifting workloads from the cloud to the edge as edge gains more prominence.”

4. Understand how you'll handle management at scale

Number three feeds directly into number four: You won’t want to figure out how you’re going to manage everything after you’ve started running in production. As with cloud management, a centralized platform is a good idea for any significant implementation.

“When investing in a platform, it’s important to focus on one that allows for the central management of edge infrastructure and workloads,” Mishra from SAS says. “While most edge use cases aspire to execute workloads without constant connectivity to the cloud, it’s key to have a management platform that allows for configuration changes and pushing new workloads from the cloud to the edge. Reporting status and alerts from the edge to the cloud is what drives enterprise scale and adoption.”
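
A stripped-down version of that loop might look like the following: an edge agent that periodically pulls its desired state from a central management endpoint and reports status back, so changes flow cloud-to-edge and visibility flows edge-to-cloud. The URLs and payload shape are placeholders, not a real platform’s API.

```python
import json
import time
import urllib.request

# Hypothetical central management endpoints.
DESIRED_STATE_URL = "https://mgmt.example.com/api/sites/store-001/desired-state"
STATUS_URL = "https://mgmt.example.com/api/sites/store-001/status"


def fetch_desired_state() -> dict:
    """Pull configuration and workload assignments from the central platform."""
    with urllib.request.urlopen(DESIRED_STATE_URL, timeout=10) as resp:
        return json.load(resp)


def report_status(status: dict) -> None:
    """Push health and alert information back to the cloud-side fleet view."""
    body = json.dumps(status).encode()
    req = urllib.request.Request(STATUS_URL, data=body,
                                 headers={"Content-Type": "application/json"})
    urllib.request.urlopen(req, timeout=10)


def reconcile(desired: dict) -> dict:
    """Placeholder: apply the desired workloads/config locally and summarize the result."""
    return {"applied_version": desired.get("version"), "healthy": True, "alerts": []}


if __name__ == "__main__":
    while True:
        try:
            status = reconcile(fetch_desired_state())
        except OSError as err:  # tolerate intermittent connectivity
            status = {"healthy": False, "alerts": [f"management plane unreachable: {err}"]}
        try:
            report_status(status)
        except OSError:
            pass  # a fuller agent would queue locally and retry on the next cycle
        time.sleep(60)
```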

[ Edge and cloud go hand-in-hand. Also read Hybrid cloud: 4 trends to watch in 2022. ]

The relationship between edge and cloud should be beneficial. For example, Mishra of SAS says there’s value in designing use cases that rely on both edge and cloud workloads, where local processing and alerting happens at the edge but a global “fleet-level” view is created in the cloud.
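
In code, that split of responsibilities can be as simple as alerting on raw readings locally while shipping only a compact summary upstream to feed the fleet-level view. The thresholds and sensor readings here are invented for the example.

```python
import statistics

VIBRATION_ALERT_THRESHOLD = 7.5  # hypothetical limit for this example

# Raw sensor readings stay local; only the summary leaves the site.
readings = [3.1, 2.9, 3.4, 8.2, 3.0, 3.2]

# Edge-side: immediate, low-latency alerting on the raw data.
local_alerts = [value for value in readings if value > VIBRATION_ALERT_THRESHOLD]
if local_alerts:
    print(f"local alert: {len(local_alerts)} reading(s) over threshold, act on-site now")

# Cloud-side payload: a compact roll-up that feeds the fleet-level dashboard.
summary = {
    "site": "factory-edge-07",
    "samples": len(readings),
    "mean": round(statistics.fmean(readings), 2),
    "max": max(readings),
    "alerts": len(local_alerts),
}
print(f"send to fleet dashboard: {summary}")
```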

5. A "build once, run anywhere" mindset won't work for all workloads

Just as cloud and edge computing have a natural relationship, so too do machine learning and edge/IoT use cases.

Some teams may find out the hard way, however, that the model that worked beautifully on-premises or in a hyperscale cloud starts to sputter in an edge environment.

“We see customers build and train amazing models but then they end up not being able to use AI/ML at the edge,” says Paul Legato, VP of platform engineering at Wallaroo. “Why? Because efficiency of execution is critical. You need to squeeze all the inference you can out of limited compute.”

As edge computing workloads become more sophisticated, IT leaders and teams will need to keep in mind that the “run anywhere” philosophy that applies elsewhere in modern software paradigms may be trickier in an edge architecture. ML workloads are a prime example of this concern.

“Machine learning at the edge is also about running models on highly limited hardware,” Legato says. “You can push a button and get the latest and greatest 128 CPU core machine in a cloud, but on the edge, you’re running on a tiny underpowered industrial PC or security camera with minimal CPU and RAM available.”
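
Before shipping a model to that kind of hardware, it helps to measure whether inference even fits the device’s latency and memory budget. The sketch below uses only the standard library, with a stub predict() standing in for the real model call and budgets that are assumptions for the example.

```python
import time
import tracemalloc

# Hypothetical budgets for a small industrial PC or smart camera.
LATENCY_BUDGET_MS = 50
MEMORY_BUDGET_MB = 256


def predict(frame: list[float]) -> float:
    """Stub inference step; replace with the real (ideally slimmed-down) model call."""
    return sum(x * 0.001 for x in frame)


def profile_inference(runs: int = 100) -> tuple[float, float]:
    """Return (approximate p95 latency in ms, peak memory in MB) over timed runs."""
    tracemalloc.start()
    frame = [0.5] * 100_000  # stand-in for one input sample
    timings = []
    for _ in range(runs):
        start = time.perf_counter()
        predict(frame)
        timings.append((time.perf_counter() - start) * 1000)
    _, peak = tracemalloc.get_traced_memory()
    tracemalloc.stop()
    timings.sort()
    return timings[int(0.95 * len(timings)) - 1], peak / 1_000_000


if __name__ == "__main__":
    p95_ms, peak_mb = profile_inference()
    ok = p95_ms <= LATENCY_BUDGET_MS and peak_mb <= MEMORY_BUDGET_MB
    print(f"p95 latency: {p95_ms:.1f} ms, peak memory: {peak_mb:.1f} MB, "
          f"{'within' if ok else 'over'} edge budget")
```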

[ Want to learn more about edge and data-intensive applications? Get the details on how to build and manage data-intensive intelligent applications in a hybrid cloud blueprint. ] 

Kevin Casey writes about technology and business for a variety of publications. He won an Azbee Award, given by the American Society of Business Publication Editors, for his InformationWeek.com story, "Are You Too Old For IT?" He's a former community choice honoree in the Small Business Influencer Awards.