Industry News

Are Edge Computing and Cloud Computing in Competition?

By Meredith Shubel | The New Stack | April 25, 2025

Cloud computing can complement edge applications, and vice versa. Here’s how.


Have we officially entered the era of edge computing?

That’s the word on the street. Global market intelligence firm IDC expects global edge computing spending to grow at a compound annual growth rate (CAGR) of 13.8%, nearing $380 billion by 2028. Meanwhile, leaders in industries from retail to energy are hyping edge computing as a true “game-changer” and a transformative force, with particular promise for remote industrial applications, from manufacturing to telecom.


What’s causing all the noise?

Edge computing is a distributed computing model in which data processing and storage happen close to the data source (i.e., at “the edge”) rather than on distant cloud servers.

With the low-latency, high-volume data processing demands of new industry darlings like the Internet of Things (IoT) and generative AI (GenAI), companies have been pouring investment into edge computing, hoping to capitalize on the architecture’s promise of faster response times and reduced bandwidth consumption.
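To make that concrete, here is a minimal sketch of edge-style preprocessing, assuming a hypothetical local temperature sensor and a hypothetical cloud endpoint: rather than streaming every raw reading across the network, the edge node summarizes the data locally and uploads only the compact result.

```python
# A minimal sketch of edge-style preprocessing. The sensor and cloud
# endpoint are hypothetical placeholders, not a real device API.

from statistics import mean

def read_sensor_window() -> list[float]:
    """Placeholder for reading a window of raw samples from a local sensor."""
    return [21.4, 21.5, 21.6] * 200  # stand-in data: 600 raw readings

def send_to_cloud(payload: dict) -> None:
    """Placeholder for an upload to a (hypothetical) cloud endpoint."""
    print(f"uploading {payload}")

def process_at_edge() -> None:
    samples = read_sensor_window()
    # Local processing: one small summary replaces 600 raw samples.
    summary = {
        "count": len(samples),
        "mean": round(mean(samples), 2),
        "max": max(samples),
        "min": min(samples),
    }
    send_to_cloud(summary)

process_at_edge()
```

The bandwidth savings fall out directly: one small summary crosses the network instead of hundreds of raw samples.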


Lately, edge AI is joining the conversation, too.

A subset of edge computing, edge AI refers to the deployment of AI models directly on local devices instead of the cloud. By moving this processing to the edge, the idea is to reduce both latency and dependency on internet connectivity. Edge AI also introduces interesting benefits for improved data privacy and security.
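As a rough illustration, here is what on-device inference can look like with TensorFlow Lite; the model file and input below are hypothetical stand-ins, but the essential point holds: the data is processed where it is produced, with no round trip to a cloud API.

```python
# A minimal sketch of on-device inference with TensorFlow Lite, assuming a
# hypothetical quantized classifier "model.tflite" already copied to the
# device. No network call is involved: input data never leaves the box.

import numpy as np
from tflite_runtime.interpreter import Interpreter  # lightweight, no full TF

interpreter = Interpreter(model_path="model.tflite")  # hypothetical model file
interpreter.allocate_tensors()

input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# Stand-in input; a real deployment would feed a camera frame here.
frame = np.zeros(input_details[0]["shape"], dtype=input_details[0]["dtype"])

interpreter.set_tensor(input_details[0]["index"], frame)
interpreter.invoke()  # runs entirely on local hardware

scores = interpreter.get_tensor(output_details[0]["index"])
print("top class:", int(np.argmax(scores)))
```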


Why Edge Computing Is Stealing the Spotlight

Edge computing and edge AI have been increasingly shaping industry discussions. Back in 2023, writing his predictions for the future of the edge in Forbes, Bruce Kornfeld, StorMagic’s chief marketing and product officer, went so far as to claim that “running all applications in the cloud is no longer an option due to cost, latency and uptime constraints.”

Across industries, from automotive to health care, teams are eagerly turning their attention to edge computing to benefit from real-time processing, reduced latency and lower bandwidth costs. Meanwhile, for AI developments, edge AI is gaining traction as an attractive option that can promise lower costs, faster operations and better data security.

Consider: With traditional, cloud-based data centers, data has to travel to remote servers for processing. Not only does this slow down operations and drive up latency, but it also creates opportunities for data theft and other cyberattacks along the way. By processing data locally, edge AI effectively reduces exposure to threats of unauthorized access.

Plus, you could even argue that edge AI is more affordable: local data processing means less data traffic, less data storage and, at the end of the day, lower energy costs.


Don’t Expect Cloud Computing To Go Away Yet

Edge computing and edge AI aren’t without challenges.

For one, to reap the benefits of edge AI, you must be prepared to accept the high costs of building and managing distributed infrastructure.

Plus, what edge computing offers in low latency and snappier response times, it loses in raw computing power. For this reason, for applications that need high performance and powerful data processing at scale, cloud computing still comes out ahead.

But some people are saying you shouldn’t have to choose between edge computing and cloud computing.


Edge and Cloud Computing: Better Together

Despite predictions that edge computing will more or less do away with cloud computing, the research indicates otherwise.

In fact, according to research from the Hong Kong University of Science and Technology and Microsoft Research Asia, rising demand for edge AI is actually driving an increase in cloud consumption.

As reported by VentureBeat, “Edge inference represents only the final step in a complex AI pipeline that depends heavily on cloud computing for data storage, processing and model training.”

In other words, despite its numerous benefits, for most applications, edge computing is only one part of the equation; in many cases, cloud computing is still required to handle the heavy lifting (e.g., large-scale data processing and long-term storage).

The research highlights the paradoxical nature of edge AI and cloud computing: “As these systems become more sophisticated, they actually increase rather than decrease dependency on cloud resources.”

Edge computing, it seems, doesn’t edge out cloud computing, but instead, calls it back for more.


What’s Fog Computing?

The Hong Kong University of Science and Technology and Microsoft Research Asia aren’t the only ones calling out the tension between edge computing and cloud computing; proponents of fog computing were way ahead of them.

Fog computing is a concept originally introduced by Cisco that “provides a layer of compute, storage and networking services between end devices ‘on the ground’ and cloud computing data centers,” according to a Cisco blog post written by Maciej Kranz, now general manager at Pure Storage and then a Cisco vice president.

The idea is for fog computing to serve as an extension of cloud computing, distributing data, storage, compute and applications between data sources and the cloud to get the snappy latency of edge computing without giving up cloud computing’s superior processing power.
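Here is a toy sketch of that layered idea (the class and method names are illustrative, not any real fog API): devices report to a nearby fog node, which handles time-sensitive decisions immediately and forwards only batched data onward to the cloud.

```python
# A toy sketch of a fog tier sitting between devices and the cloud.
# All names are hypothetical and for illustration only.

from collections import deque

class FogNode:
    """Intermediate compute/storage tier between devices and the cloud."""

    def __init__(self, batch_size: int = 100, threshold: float = 80.0):
        self.buffer: deque[dict] = deque()
        self.batch_size = batch_size
        self.threshold = threshold

    def ingest(self, event: dict) -> None:
        # Latency-critical decision handled at the fog tier, near the device.
        if event["value"] > self.threshold:
            self.actuate_locally(event)
        self.buffer.append(event)
        if len(self.buffer) >= self.batch_size:
            self.forward_batch_to_cloud()

    def actuate_locally(self, event: dict) -> None:
        print(f"local alert for device {event['device_id']}")

    def forward_batch_to_cloud(self) -> None:
        # Only aggregated batches cross the WAN; placeholder for an upload.
        print(f"forwarding {len(self.buffer)} events to cloud")
        self.buffer.clear()

node = FogNode(batch_size=3)
for i, value in enumerate([42.0, 91.5, 60.2]):
    node.ingest({"device_id": i, "value": value})
```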

However, fog computing remains in its early stages; its lack of standardization and overall complexity still make it ill-suited for widespread deployment.


The Way Forward: Edge Plus Cloud Computing

If edge computing beats cloud computing on latency and real-time responsiveness but can’t quite match all its computational capabilities, then the question isn’t which model is better — but how to use them together.

The mix may look something like this:

For applications where low latency and security are critical and continuous connectivity is uncertain, edge computing should take the lead. But for large workloads that require intense computation and large-scale data analysis, it’s better to lean on cloud computing. This way, you get speed and responsiveness without sacrificing raw processing power.
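In code, that division of labor might look like the sketch below; the task attributes and handler functions are hypothetical, but the routing rule mirrors the guideline above.

```python
# A minimal sketch of the edge-plus-cloud split described above, with
# hypothetical handlers. Latency-critical or connectivity-sensitive work
# runs locally; heavy analytics are offloaded when the cloud is reachable.

from dataclasses import dataclass

@dataclass
class Task:
    name: str
    latency_critical: bool   # must respond in real time?
    compute_heavy: bool      # needs large-scale processing?

def run_on_edge(task: Task) -> None:
    print(f"{task.name}: handled locally on the edge device")

def offload_to_cloud(task: Task) -> None:
    print(f"{task.name}: sent to the cloud for heavy processing")

def dispatch(task: Task, cloud_reachable: bool) -> None:
    # Edge takes the lead when latency/security dominate or the link is down.
    if task.latency_critical or not cloud_reachable:
        run_on_edge(task)
    elif task.compute_heavy:
        offload_to_cloud(task)
    else:
        run_on_edge(task)  # default to local when either tier would do

dispatch(Task("emergency-braking", True, False), cloud_reachable=True)
dispatch(Task("fleet-wide-analytics", False, True), cloud_reachable=True)
```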

For example, McKinsey posits that the future of autonomous vehicles will rely on both edge and cloud computing, with navigation and other latency-tolerant applications running in the cloud and mission-critical systems, such as emergency braking, processed locally.

Similarly, edge AI may take center stage in health care applications to support real-time monitoring and control of IoT devices, but cloud computing will step in for aggregated data analysis.

It may feel like we’re entering the era of edge computing, but that doesn’t mean the curtain is going down on cloud computing. We’re just stepping into the second act.

