The Cloud, The Edge: where to put the AI in the system

By Ravi Annavajjhala, CEO Kinara Inc.

Friday, 9th September 2022. Posted by Phil Alsop.

Artificial Intelligence (AI) is one of the most widely demanded technologies on the planet. But as companies begin the process of building and ultimately deploying their AI, they encounter several critical design decisions along the way.

One of the most crucial of these is deciding where the AI will process the data it consumes - in the cloud or at the network edge. While deploying at the edge offers considerable speed and cost benefits, most still choose the cloud.

That’s understandable - the cloud provides massive compute power and capacity to store, analyse, and process the vast amounts of data that AI will consume. However, it also comes with several problems around factors such as privacy, cost, data bandwidth, and latency.

Maintaining privacy

Storing sensitive and proprietary data in the cloud still scares people. Doing so can expose the data to network attacks, and cloud users often leak private information to the open internet accidentally - typically through simple but common mistakes, like misconfiguring security controls on cloud storage buckets. These seemingly small errors can quickly become huge problems.

While cloud technology itself is often not at fault, companies can nonetheless be wary of the potential complications that processing their AI data in the cloud can bring.

Controlling cost

Processing the vast amounts of data that sensors generate can add up significantly, and the cost of cloud computing time can quickly become steep. Training an AI model can take multiple days, and if you’re doing that regularly (which you most likely will be), your cloud usage costs could become untenable. These costs only multiply where AI must be used at scale.
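As a rough illustration of how those charges compound, consider a back-of-envelope estimate. All rates, GPU counts, and retraining cadences below are illustrative assumptions, not quoted prices:

```python
# Back-of-envelope estimate of recurring cloud training cost.
# Every figure here is an assumption for illustration only.

GPU_HOUR_USD = 3.00    # assumed hourly rate for one cloud GPU instance
GPUS = 8               # assumed GPUs per training run
TRAIN_DAYS = 3         # "multiple days to train", per the article
RUNS_PER_MONTH = 4     # assumed weekly retraining cadence

cost_per_run = GPU_HOUR_USD * GPUS * TRAIN_DAYS * 24
monthly_cost = cost_per_run * RUNS_PER_MONTH
print(f"per run: ${cost_per_run:,.0f}, per month: ${monthly_cost:,.0f}")
```

Even at these modest assumed rates, regular retraining alone reaches thousands of dollars a month before inference or storage costs are counted.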

Bandwidth for transferring data

Uploading the sensor-generated data requires substantial amounts of reliable network bandwidth. The network must be highly available: if an upload glitches, even momentarily, the accuracy of the results could be put at risk.
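To see why the bandwidth demand is substantial, a simple sketch helps. The sensor count and per-stream bitrate are illustrative assumptions:

```python
# Rough uplink requirement for streaming sensor data to the cloud.
# Sensor count and bitrate are assumptions for illustration.

SENSORS = 50             # assumed number of sensors/cameras on site
MBPS_PER_SENSOR = 4.0    # assumed compressed stream bitrate, Mbit/s

required_mbps = SENSORS * MBPS_PER_SENSOR
# Mbit/s -> MByte/s, times seconds in 30 days, MB -> TB
monthly_tb = required_mbps / 8 * 3600 * 24 * 30 / 1e6
print(f"{required_mbps:.0f} Mbit/s sustained, ~{monthly_tb:.1f} TB/month")
```

Even a mid-sized deployment therefore needs a sustained multi-hundred-megabit uplink, around the clock, with tens of terabytes transferred every month.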

The importance of reducing latency

Uploading all that data to the cloud also introduces an unwelcome amount of latency into the process. While data is in transit to and from the cloud, applications can experience latencies of over 100 milliseconds - which can seriously restrict their ability to operate effectively.
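A rough latency budget makes the contrast concrete. The component timings below are illustrative assumptions, chosen only to match the round-trip and edge figures discussed in this article:

```python
# Illustrative latency budget: cloud round trip vs. local inference.
# Component timings are assumptions, not measurements.

cloud_path_ms = {
    "encode_frame": 5,
    "uplink_transfer": 40,
    "cloud_inference": 20,
    "downlink_result": 40,
}
edge_path_ms = {
    "capture_to_memory": 1,
    "local_inference": 4,
}

print("cloud:", sum(cloud_path_ms.values()), "ms")
print("edge:", sum(edge_path_ms.values()), "ms")
```

The network hops dominate the cloud path; removing them is what lets edge deployments bring the total down to single-digit milliseconds.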

What about the Edge for AI application deployment?

Placing AI processing at the network’s edge gets around many of these hurdles. Principally, edge deployment is the best option for use cases that require speed, scalability, and privacy.

This is especially relevant when the AI application must make quick decisions. For example, once a camera has ‘seen something’, it may then need to decide what that thing is or what to do next in response. For those functions, uploading to the cloud will simply take too long. With the ability to process that data at the network edge, the AI application can make split-second decisions. And if a particular application doesn’t need a constant connection with the cloud, users who place their AI at the edge are also freed from the risk of disrupted internet connections.

Keeping sensitive and proprietary data private is also a major advantage of deploying AI at the edge, because data is kept strictly within its owner’s environment (i.e., on premises).

Edge AI also offers a potentially less expensive option. It will most likely carry higher upfront costs, because of the initial investment in edge hardware and software deployments, but set against the continuous cost of cloud time, it is often more cost effective in the long run.

Cashierless stores

Edge AI offers agility in use cases which require instantaneous response and greater cost-effectiveness when it comes to use cases that require scalability. As such, it can be a good fit in the burgeoning field of cashierless stores. These stores combine AI-powered technologies like sensor fusion and, crucially, computer vision to create a seamless retail experience where shoppers can grab what they need off the shelves and walk out the door. Amazon Go is the most famous example of this technology, but it has also been adopted by a variety of international retailers such as Aldi, Sainsbury’s, and Circle K.

Many cashierless store technologies rely on computer vision. It is this capability that allows the store to see activity as a human would: understanding who people are, what they’re buying, and where they’re going.

This capability requires a tremendous number of cameras (hundreds to thousands, depending on the store size) to track shoppers throughout the store. Cashierless stores could upload the video those cameras generate to the cloud, which would provide the processing power they need; however, they would then fall victim to the same problems discussed above.

Uploading that data to the cloud can be extremely expensive and energy intensive, especially given the number of cameras providing it. Cashierless stores would need vast amounts of reliable bandwidth, and if an upload were to suffer a glitch, accuracy could be put at risk. It would also introduce an unacceptable amount of latency for AI applications, slowing the overall operation of the store. On top of that, processing the data in the cloud could expose it to network attacks and risk data leakage.

Opting for Edge AI instead plugs many of those gaps: processing the data at the edge keeps sensitive data within secure confines. Furthermore, because that processing happens on the edge device, it won’t pile on cloud usage costs, rely heavily on bandwidth, or require an outside internet connection. It can reduce latency for AI applications to below five milliseconds.

By running its algorithms locally, Edge AI can process data and make quick decisions independently of an outside internet connection or cloud availability. These strengths make the cashierless store - which requires scalability and instantaneous response - an obvious fit for Edge AI.

The choice between deploying AI at the edge or in the cloud is not a zero-sum one. In fact, the two can be used in conjunction to take advantage of the strengths of both. For example, the cloud can process metadata and run more complex analytics, while edge devices handle the majority of the application’s workload locally. Decisions about where to place intelligence in the system must be weighed carefully against the individual needs of the use case.
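As a sketch of what such a split might look like in practice - the function and field names below are hypothetical, chosen only to illustrate the pattern of keeping inference local while forwarding compact metadata:

```python
# Sketch of a hybrid edge/cloud split: the edge device runs inference
# locally and forwards only lightweight metadata to the cloud for
# deeper analytics. Names and thresholds are hypothetical.
import json
import queue

cloud_outbox = queue.Queue()  # stands in for an uplink to the cloud


def handle_frame(frame_id, detections):
    """Runs on the edge device; ships only metadata, never raw video."""
    # Latency-critical decision made locally, with no round trip.
    alert = any(d["label"] == "person" and d["score"] > 0.9
                for d in detections)
    # The cloud receives compact metadata for trend analytics,
    # not the frame itself - minimal bandwidth, minimal exposure.
    cloud_outbox.put(json.dumps({"frame": frame_id, "n": len(detections)}))
    return alert


print(handle_frame(1, [{"label": "person", "score": 0.95}]))  # True
```

The design choice is that only the few bytes of JSON cross the network; the raw video, and the decision it drives, never leave the device.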
