Today, cloud-based computing allows massive resources to be brought to bear in a range of applications, often using artificial intelligence (AI) for data processing and analysis. At the same time, transporting data from where it’s collected and used—for instance, the edge—to the cloud and back introduces latency and raises privacy concerns. Engineers must also plan for remote resource outages, which can cripple a system that can’t function without its cloud overseer.
To help mitigate these challenges, especially in critical industrial and enterprise tasks, engineers may instead implement edge intelligence for time-critical processing. Cloud hardware can still be part of an AI implementation, but typically for non-time-critical tasks such as overall data processing and trend analysis.
New chips expand edge AI processing benefits
The explosion in deployed internet of things (IoT) devices over the past decade has brought a multitude of benefits to enterprise users and consumers alike. The basic concept of IoT is that a massive network of devices that are small—physically, and in terms of computing power—offloads processing to a central server, allowing each node to perform well above its individual capability.
The IoT paradigm has found fertile ground in the industrial setting, where connecting machines and systems—creating an industrial IoT (IIoT) ecosystem—allows admins to take advantage of the data that “silent” machines have been producing for years to enhance manufacturing and industrial processes.
Today, the trend in IIoT is toward ever-evolving, high-performance machinery equipped with advanced sensors and control electronics, increasingly capable of supporting business intelligence.