Artificial Intelligence Where You Need It

Delivering AI to the Cloud, Edge and Endpoint Devices

Table of Contents

  1. Summary
  2. AI Compute-capable System on Chips (SoCs)
  3. Bringing AI to the Network Edge and Device Endpoint with Arm
  4. Example: Natural Language and AI/ML in Cardiac Arrest
  5. Example: Keeping the Trains Running on Time
  6. Conclusion

1. Summary

Artificial intelligence (AI) is permeating every aspect of our personal and professional lives, but many companies have yet to grasp the power and revolutionary capabilities it brings. While public perceptions of AI can be skewed by science fiction’s (almost entirely theoretical) depiction of hyper-intelligent humanoid robots, the enterprise view can also be out of step with reality. Many still see AI as the exclusive domain of high-performance or cloud-based computing. In actuality, advances in hardware, software, and algorithm optimization mean that AI opportunities are now far broader, and often far more down to earth.

Take Audio Analytic, which has announced an experimental baby monitor that runs AI-powered sound recognition on an Arm Cortex-M0+ based processor. This class of processor, used in devices such as bank smartcards, has an ultra-low energy footprint. The monitor performs real-time, on-device analysis of ambient sound and drives a simple LED warning light that flips from green to red when it recognizes a baby’s cry. This alert system illustrates how to apply AI where it is most useful, taking into account the constraints of networking, power consumption, and processing.
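To make the pattern concrete, the sketch below shows the kind of sense-classify-alert loop such an endpoint device might run. It is a minimal illustration, not Audio Analytic's implementation; the functions read_audio_frame, cry_probability, and set_led_red are hypothetical hardware and model hooks, stubbed out here so the example compiles on its own.

    #include <stdbool.h>
    #include <stddef.h>
    #include <stdio.h>

    #define FRAME_SAMPLES 256          /* illustrative audio frame size */

    /* Hypothetical hooks -- stubbed so the sketch builds standalone. */
    static size_t read_audio_frame(short *buf, size_t max_samples) {
        (void)buf;
        return max_samples;            /* real code would fill buf from the microphone */
    }

    static float cry_probability(const short *frame, size_t n) {
        (void)frame; (void)n;
        return 0.0f;                   /* real code would run a small on-device classifier */
    }

    static void set_led_red(bool on) {
        printf(on ? "LED: red\n" : "LED: green\n");  /* real code would drive a GPIO pin */
    }

    int main(void) {
        short frame[FRAME_SAMPLES];
        const float threshold = 0.8f;  /* illustrative decision threshold */

        for (;;) {                     /* endless sense-classify-alert loop */
            size_t n = read_audio_frame(frame, FRAME_SAMPLES);
            set_led_red(cry_probability(frame, n) >= threshold);
        }
        return 0;
    }

The point of the design is that no audio ever leaves the device: the loop senses, classifies, and signals locally, which is what allows it to fit the power and networking budget of a Cortex-M-class part.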

Closer to home, utility companies are using smart meters for electricity and gas. These meters transmit usage data either to a handheld reader carried past the building or directly back to company headquarters. With built-in processing and storage, they can gather and hold readings locally and send the data packets when a passing collector comes into range, rather than requiring someone to approach each device, read it, and manually enter the figures into a log.
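The same store-and-forward pattern can be sketched in a few lines. Again this is only an illustration under assumed interfaces: sample_usage_kwh, collector_in_range, and transmit are hypothetical placeholders for the metering chip and radio, stubbed so the example stands alone.

    #include <stdbool.h>
    #include <stddef.h>
    #include <stdio.h>

    #define MAX_READINGS 96   /* e.g. one reading every 15 minutes for a day */

    /* Hypothetical hooks -- stubbed so the sketch builds standalone. */
    static double sample_usage_kwh(void)   { return 0.25; }   /* read the metering chip */
    static bool   collector_in_range(void) { return false; }  /* radio detects a passing reader */
    static void   transmit(const double *r, size_t n) {
        (void)r;
        printf("sent %zu readings\n", n);                     /* real code would use the meter's radio */
    }

    int main(void) {
        double readings[MAX_READINGS];
        size_t count = 0;

        for (int tick = 0; tick < 10; ++tick) {          /* a few cycles for illustration */
            if (count < MAX_READINGS)
                readings[count++] = sample_usage_kwh();  /* gather and store locally */

            if (count > 0 && collector_in_range()) {     /* send only when a reader passes */
                transmit(readings, count);
                count = 0;
            }
        }
        return 0;
    }

Buffering locally and transmitting opportunistically is what removes the manual meter-reading step, at the cost of the modest on-device processing and storage the paragraph above describes.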

AI-based devices such as these are more about serious function than science fiction, demonstrating how pervasive AI already is. They provide useful, and sometimes critical, capabilities for their users while producing significant efficiencies and savings for the companies that deploy them. Because different use cases require different architectural choices, decision-makers need to understand the capabilities, patterns, and tradeoffs of AI-capable processors running in the cloud, at the edge, and on endpoint devices.

So, how to make the right choices? In this paper, we review processor and platform options, then consider several use cases both to illustrate the potential of AI today and to show how to devise, define, and deploy the right AI for the job.
