Artificial Intelligence – Cloud and Edge implementations takes an engineering-led approach to the deployment of AI on Edge devices within the framework of the cloud.
We often use the word ‘engineering’ in casual conversation, but in this context we attach a specific meaning to it. Engineering is the use of scientific principles to design and build machines, structures, and other items, including bridges, tunnels, roads, vehicles, and buildings. The American Engineers' Council for Professional Development defines engineering as (specific emphasis of interest highlighted):
The creative application of scientific principles to design or develop structures, machines, apparatus, or manufacturing processes, or works utilizing them, all as respects an intended function, economics of operation and safety to life and property.
Engineering has many disciplines, such as mechanical engineering and chemical engineering.
(definitions and descriptions of engineering adapted from Wikipedia)
But when we consider the deployment of AI to Edge devices, we consider an interdisciplinary engineering approach. AI on Edge devices could include many areas like Drones, Edge analytics, embedded FPGA etc.
Looking back at the definition of engineering above, we can infer some key themes that apply to the deployment of AI and Edge computing in the cloud.
Hence, we have a big-picture view as follows. We model real-world problems through ML/DL algorithms and implement the model through AI and the cloud. The model is deployed on the edge, and the edge device provides a feedback loop to improve the business process.
In this model, IoT/Edge extends beyond basic telemetry. The telemetry function captures data from edge devices and stores it in a database in the cloud. We can then perform analytics on that data and build models from it. The models can be trained in the cloud and deployed to edge devices. The architecture could also include streaming analytics and draw on microservices/serverless design principles. Finally, the CICD/DevOps philosophy is a key part of the process, as we explain below.
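The telemetry-to-analytics loop described above can be sketched in a few lines. This is a minimal, self-contained illustration, not a production pipeline: the device id, the JSON message shape, and the in-memory list standing in for a cloud database are all assumptions made for the example.

```python
import json
import statistics
from datetime import datetime, timezone

# Hypothetical edge side: package a sensor reading as a telemetry message.
def make_telemetry(device_id, temperature_c):
    return json.dumps({
        "device_id": device_id,
        "temperature_c": temperature_c,
        "timestamp": datetime.now(timezone.utc).isoformat(),
    })

# Hypothetical cloud side: an in-memory list stands in for the cloud database.
cloud_db = []

def ingest(message):
    cloud_db.append(json.loads(message))

# A trivial "analytics" step over the stored telemetry.
def mean_temperature(device_id):
    readings = [r["temperature_c"] for r in cloud_db
                if r["device_id"] == device_id]
    return statistics.mean(readings)

# Simulate three readings arriving from one edge device.
for t in (21.0, 22.5, 23.0):
    ingest(make_telemetry("edge-01", t))
```

In a real deployment the transport would be a protocol such as MQTT or AMQP and the store a managed cloud database, but the shape of the loop (capture, ingest, analyse, feed back) is the same.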
In this vision, containers are central to the whole process.
When deployed to edge devices, containers encapsulate the deployment environment across a diverse range of hardware.
CICD (continuous integration – continuous deployment) is a logical extension of containers on edge devices. Essentially, CICD and the DevOps philosophy streamline software development. Through collaboration and automated testing, the quality of the software improves (the CI, i.e. continuous integration, part). The CD (continuous deployment) part enables you to rapidly update edge devices, whether for patch/code updates or model updates.
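One common pattern behind the "model updates" part of CD is for the edge device to compare a digest of its local model against the digest published by a model registry and pull a new model only when they differ. The sketch below assumes this pattern; the registry, the byte strings standing in for model weights, and the function names are all hypothetical.

```python
import hashlib

# Hypothetical CD step: decide whether the edge device should pull a new model
# by comparing the local model's SHA-256 digest with the registry's digest.
def needs_update(local_model_bytes, registry_digest):
    local_digest = hashlib.sha256(local_model_bytes).hexdigest()
    return local_digest != registry_digest

# Byte strings stand in for serialized model weights.
model_v1 = b"weights-v1"
model_v2 = b"weights-v2"

# The registry advertises the digest of the latest model (v2 here).
registry_digest = hashlib.sha256(model_v2).hexdigest()

print(needs_update(model_v1, registry_digest))  # True  (device is behind)
print(needs_update(model_v2, registry_digest))  # False (device is current)
```

Comparing digests rather than version strings means a corrupted or partially written model on the device also triggers a re-pull, which matters on unreliable edge networks.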
Edge devices also need to cater for a wide range of execution environments, such as CPUs, GPUs and FPGAs. Containers lend themselves well to this philosophy. We can see this in the Facebook paper "Machine Learning at Facebook: Understanding Inference at the Edge".
The interplay between AI, Cloud and Edge is a rapidly evolving domain. Ultimately, we see this philosophy (AI + Cloud + Edge) deployed as containers in CICD mode to transform whole industries as industry-specific containers spanning the cloud and the edge.