Artificial intelligence (AI) has become part of daily life in cloud services such as social media, call centres, and chatbots, and is accelerating applications like genome sequencing, retail data analytics, and financial trading.
AI is now moving out of the cloud and into the edge. Frameworks like TensorFlow Lite, which support development on embedded computing platforms, make it possible to build lightweight inference engines that handle tasks such as object recognition, activity detection, gesture detection, and people counting at the point of presence. By eliminating data-intensive interactions with AI applications in the cloud, edge AI can deliver advantages such as lower latency, lower power consumption, and greater privacy, says Rhett Evans, business manager, embedded at Anders.
But that’s not all. These edge platforms will outperform, outclass, and ultimately replace conventional approaches currently implemented on scalar processors, in areas such as machine vision. It’s a tremendous opportunity for start-ups founded on skills in AI and machine learning to disrupt the old order and deliver solutions that are faster, more flexible, more sustainable, and more affordable. Equally, established players need to modernise using this new technology or risk being left behind.
To get an idea of the shake-up that’s coming, just consider the dramatic changes in our expectations of mobile phones over the last few years. People were once content with getting their emails, doing some web browsing, basic photography, video, the odd fuzzy selfie. Today, it’s hard to survive without a smartphone for banking, shopping, home automation, navigation, medical care, streaming entertainment, to name just a few. Any handset entering the market now must be able to handle these tasks, and more, to be accepted.
In the world of tomorrow, even the tiniest chips will come with a kick of AI. Just look at the latest MEMS inertial sensors with their own integrated machine-learning core. We will soon expect to find intelligence embedded in every “thing” we use, look at, touch, or wear: at work, at home, when travelling, shopping, or visiting entertainment venues. And we will be disappointed by those that don’t deliver.
What are the top applications for edge AI?
Many smart devices already rely on capabilities such as voice recognition, facial recognition, and motion detection. Edge AI will enable them to become more responsive, more adaptive, more accurate, more richly featured, more easily portable (or wearable), more affordable, and less power-hungry. Some exciting possibilities include:
AI embedded in mobile edge equipment (drones, AGVs) enhances situational awareness to improve safety and reduce transit times for parts and materials within the factory. On production lines, low-cost, high-speed/high-accuracy image comparison and anomaly detection enable 100% visual inspection of manufactured items at line-speed. Intelligent condition monitoring systems inside factory equipment detect and diagnose problems accurately and early, minimising false alarms, allowing repairs to be scheduled for minimum impact on productivity.
Industrial wearables and protective gear improve safety, productivity, and traceability.
Power tools and hand tools detect sub-optimal use and give corrective tips to improve longevity and accelerate employee training.
Digital signage and smart shelves use skills such as pose estimation, facial recognition, and natural language understanding to assess shoppers’ moods and responses, enhance the customer experience, reduce queues, and maximise the value of each visit.
Smart wearable medical devices deliver fast and accurate early detection of medical emergencies (such as stroke or cardiac problems) or onset of conditions needing treatment. In consumer wearables, activity recognition enhanced with AI improves performance measurement, fitness advice, therapeutic monitoring, and elderly care (e.g., fall detection).
Cost-effective and accurate facial recognition eases access to buildings and secure areas for recognised users and enhances prevention of unauthorised access. Human presence and activity detection, combined with pose estimation, can provide advance warning of malicious intent (carrying weapons, using tools to gain entry).
AI enables affordable smart appliances to use natural-language skills for richer interactions with users and to analyse sensor data to provide extra services: automatically ordering consumables, scheduling predictive maintenance, suggesting new recipes, and identifying or remedying incorrect use of the equipment.
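Several of the scenarios above, such as condition monitoring inside factory equipment, come down to spotting anomalies in streaming sensor data on the device itself. As a minimal sketch of the idea (not the method of any particular product), a rolling z-score over a vibration signal flags readings that deviate sharply from recent history; in a real edge-AI deployment a trained neural network would replace this simple statistical rule. The window size and threshold below are illustrative assumptions.

```python
import math
from collections import deque

def make_anomaly_detector(window=50, threshold=4.0):
    """Flag samples more than `threshold` standard deviations
    away from the rolling mean of the last `window` samples."""
    history = deque(maxlen=window)

    def check(sample):
        if len(history) >= window:
            mean = sum(history) / len(history)
            var = sum((x - mean) ** 2 for x in history) / len(history)
            std = math.sqrt(var)
            anomalous = std > 0 and abs(sample - mean) > threshold * std
        else:
            anomalous = False  # not enough history to judge yet
        history.append(sample)
        return anomalous

    return check

# Simulated vibration readings: a steady baseline, then a sudden spike
detector = make_anomaly_detector()
readings = [1.0 + 0.01 * math.sin(i) for i in range(100)] + [5.0]
flags = [detector(r) for r in readings]
```

Only the final spike is flagged; the small sinusoidal variation stays within four standard deviations of the rolling mean, which is what keeps false alarms down.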
How is AI implemented on edge platforms?
Edge applications usually face tight constraints including size, weight, and power (SWaP), thermal dissipation, and cost. Processor cycles and memory are often strictly limited. An efficient and lightweight solution is needed from both the hardware and software perspectives. To add to the challenge, responses often need to be deterministic and delivered in real time.
Hence lightweight AI frameworks are needed to build inference engines suitable for deployment on mobile and edge devices; TensorFlow Lite is one example. In addition, embedded processors architected to run AI applications within a limited power budget are becoming available. NXP is at the forefront of this trend with its latest i.MX 8M Plus applications processor, which is supported by the eIQ software development environment for machine learning at the edge.
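One reason frameworks like TensorFlow Lite fit within such tight memory budgets is quantisation: storing weights and activations as 8-bit integers plus a scale and zero-point, rather than 32-bit floats, cutting model size roughly fourfold. The sketch below shows the affine quantisation arithmetic behind this idea; the scale and zero-point values are illustrative assumptions, not taken from any particular model.

```python
def quantize(x, scale, zero_point):
    """Map a float to int8 via the affine scheme q = round(x / scale) + zp."""
    q = round(x / scale) + zero_point
    return max(-128, min(127, q))  # clamp to the int8 range

def dequantize(q, scale, zero_point):
    """Approximately recover the original float from its int8 code."""
    return scale * (q - zero_point)

# Illustrative parameters covering roughly the float range [-1.0, 1.0]
scale, zero_point = 1.0 / 127, 0

x = 0.42
q = quantize(x, scale, zero_point)        # compact int8 representation
x_hat = dequantize(q, scale, zero_point)  # reconstructed float

# Quantisation error is bounded by half a step (scale / 2)
assert abs(x - x_hat) <= scale / 2
```

The precision lost is bounded and predictable, which is why integer-quantised networks can run on small NPUs and MCUs with little accuracy penalty.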
The i.MX 8 family is aimed squarely at edge applications in terms of power consumption, size, processing performance, and peripheral integration. The i.MX 8M Plus adds an integrated Neural Processing Unit (NPU) that accelerates machine-learning inference. The NPU can run neural-network algorithms for tasks such as human pose and emotion detection, multi-object surveillance, and word/speech recognition, among many others.
So, the i.MX 8M Plus is AI-ready. Equally exciting is that the NPU comes at a minimal cost premium. It’s not prohibitive to start designing with the i.MX 8M Plus now, to lay the foundation for your company’s next generation of AI-enhanced edge products.
When will software be ready for developing edge-AI applications?
Now. NXP’s eIQ machine learning environment integrates neural network compilers, software libraries, and inference engines such as TensorFlow Lite, Arm NN, DeepViewRT, and ONNX. It’s perfect for working with the i.MX 8M Plus and can target the NPU as well as the on-chip GPU and DSP.
eIQ also supports TensorFlow Lite Micro for machine learning on microcontrollers, such as NXP’s Arm Cortex-M MCUs, which are well suited to endpoint devices: an approach known as TinyML.
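To give a feel for how small a TinyML workload can be, the sketch below runs the forward pass of a tiny fully connected network in plain Python. The weights are hand-picked to compute XOR, purely as an illustration; in practice a model trained in TensorFlow would be converted and executed by TensorFlow Lite Micro’s interpreter on the MCU, but the arithmetic per layer is essentially what is shown here.

```python
def relu(v):
    """Rectified linear unit: the activation used in many tiny models."""
    return max(0.0, v)

def dense(inputs, weights, biases, activation):
    """One fully connected layer. Each row of `weights` holds one
    neuron's input weights; output j = activation(inputs . W[j] + b[j])."""
    return [
        activation(sum(x * w for x, w in zip(inputs, row)) + b)
        for row, b in zip(weights, biases)
    ]

def xor_net(x1, x2):
    """Forward pass of a 2-2-1 network with hand-picked weights
    (illustrative only) that computes XOR of two binary inputs."""
    hidden = dense([x1, x2], [[1, 1], [1, 1]], [0, -1], relu)
    (out,) = dense(hidden, [[1, -2]], [0], lambda v: v)
    return out
```

A network this size needs a handful of multiply-accumulates and a few bytes of weights, which is why inference can fit comfortably alongside application code on a Cortex-M class device.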
The arrival of the i.MX 8M Plus, with its integrated NPU, is just the beginning. We expect more and more neural-network models to become available, simplifying application development, while software platforms like eIQ become increasingly powerful, richly featured, and efficient.
This technology can endow your products with features and performance that conventional approaches cannot match. People will soon come to expect these enhanced experiences everywhere, all the time. The market potential is explosive. So, if you haven’t begun to get to grips with embedded AI already, it’s time to make a start.
Our extensive experience with all i.MX 8 processors and the associated ecosystem has helped our customers create solutions for a wide range of industrial applications like those mentioned above. Let us help you discover what the next generation has to offer.
The author is Rhett Evans, business manager, embedded at Anders.