Embedded machine learning. Machine Learning on Microcontrollers. Deep Learning on Embedded Devices. Embedded Artificial Intelligence.

These terms sound overwhelmingly serious, I know, but by the end of this article you will understand the basic idea behind them and how to get started. As we progress, please note that we will use these terms interchangeably to mean the same thing: embedded AI on a microcontroller.

This article is inspired by my mentor, Mardiyyah Oduwole, an authority on Artificial Intelligence in South Western Nigeria.

Terminologies

Machine Learning and Artificial Intelligence

Artificial Intelligence is the ability of man-made machines, systems, models and computers to simulate and even enhance human intelligence. Machine Learning is the process of making this possible.

Most machine learning engineers think of Machine Learning and AI as training models on data and deploying them to the cloud, where they power a plethora of tasks like recommendation systems, email filters, and so on.

This article seeks to inform the ML engineer and roboticist alike that the cloud is not the only place to deploy ML models. Sometimes, robotics and AI researchers need to deploy these models on embedded devices. One author asked this question:

What happens if an autonomous, driverless car suddenly loses its internet connection?

Certainly, it needs its models to be available onboard, not in the cloud, to keep itself safely on the road.

Many other use cases exist outside autonomous vehicles, and they led to the birth of tinyML: the subset of ML that deals with the research and development of electronic systems that are made intelligent, such as smartphones and Amazon's Echo speaker.

Embedded Systems

In robotics generally, large or medium-sized systems often comprise internally embedded subsystems that are legitimately robots in their own right. For example, an autonomous, self-driving car is considered a robot, but the smart dashboard system embedded in it can also be considered a robot.


Embedded systems can also be as small as a simple Arduino microcontroller or Raspberry Pi single-board computer used to control or automate a larger system.

Embedded AI/ML

Stuart Russell defined AI as the designing and building of intelligent agents that receive percepts from the environment and take actions that affect that environment.

Embedded machine learning, or embedded AI, refers to making machine learning models and algorithms (or, more precisely, intelligence) locally available to the system consuming them, removing the need to reach the cloud to access them.

Modern cloud computing infrastructures provide on-demand access to large amounts of computing resources and are therefore the primary choice for running sophisticated machine learning and deep learning applications, like autonomous driving and large-scale fraud detection. In several cases AI/ML applications are also hosted on private clouds such as on-premise data centres. Nevertheless, executing AI in the cloud is not always the best option, especially for applications that require low-latency and must be run close to the end-users. Specifically, Cloud AI requires the transfer of large amounts of data from the place where they are produced to cloud data centres, which consumes significant network bandwidth and incurs high latency. Furthermore, such data transfers can be susceptible to privacy leaks, especially in applications that manage sensitive data. Moreover, the use of GPUs and TPUs in the cloud is associated with a significant carbon footprint, which raises environmental performance concerns. The benefits of running AI closer to the users are everyday perceived by millions of consumers that execute deep learning applications in their smart phones, including popular applications like Siri, OK Google, and Apple FaceID. These applications take advantage of ML/DL models that are (pre)trained in the cloud, yet they also manifest the merit of running applications closer to the user. Furthermore, they illustrate the possibility of training models in the cloud and executing them in devices with less powerful computing capabilities. - Wevolver