
Free book: Containerize your Apps with Docker and Kubernetes, and the impact of containers for AI on Edge devices

Background

Containerize your Apps with Docker and Kubernetes is an excellent free book by Gabriel N. Schenker.

You can download the whole book by registering HERE

In this post, I explain the significance of deploying apps with Docker and Kubernetes, and also share some of my thinking from the University of Oxford artificial intelligence cloud and edge impleme... course.

At Oxford, I frame AI edge engineering as the “Ability to deploy AI and ML models in containers at the Edge”.

The book's table of contents is:

Chapter 1: What Are Containers and Why Should I Use Them?

Chapter 2: Setting up a Working Environment

Chapter 3: Working with Containers

Chapter 4: Creating and Managing Container Images

Chapter 5: Data Volumes and System Management

Chapter 6: Distributed Application Architecture

Chapter 7: Single-Host Networking

Chapter 8: Docker Compose

Chapter 9: Orchestrators

Chapter 10: Orchestrating Containerized Applications with Kubernetes

Chapter 11: Deploying, Updating, and Securing an Application with Kubernetes

Chapter 12: Running a Containerized App in the Cloud

 

Significance of Containerization

Containers are the best way to implement a DevOps architecture. This book explains the end-to-end deployment of containers in an Azure environment – including container orchestration through Kubernetes. The book describes the software supply chain and the friction within it, and then presents containers as a means to reduce this friction and add enterprise-grade security on top of it.
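As a concrete illustration of the kind of orchestration the book builds toward, here is a minimal Kubernetes Deployment manifest; it is a sketch, and the image name `myapp:1.0` and the replica count are hypothetical, not taken from the book:

```yaml
apiVersion: apps/v1
kind: Deployment
metadata:
  name: myapp
spec:
  replicas: 3                  # keep three identical containers running
  selector:
    matchLabels:
      app: myapp
  template:
    metadata:
      labels:
        app: myapp
    spec:
      containers:
        - name: myapp
          image: myapp:1.0     # an immutable, versioned container image
          ports:
            - containerPort: 8080
```

Applied with `kubectl apply -f deployment.yaml`, Kubernetes continuously reconciles the cluster toward three running replicas; this declarative style of deployment is the subject of Chapters 10 and 11.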

 

The analogy of a shipping container in the transportation industry is often used to describe software containers. A shipping container is a box with standardized dimensions (length, width, and height) that can be treated as a black box from the shipper's standpoint: the shipper is not concerned about the contents of the container.

 

Containers unify the worlds of development and deployment. Prior to DevOps and containers, developers would hand over the code to the operations engineers, who were then supposed to install it on the production servers and get it running. This process became complex and error-prone as many developers deployed different applications and different versions of those applications. The problem was further compounded by the need to manage many dependencies on very short release cycles. Containers overcome these problems by encapsulating the application and all its external dependencies – for example, in a Docker container.
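A minimal Dockerfile sketches this encapsulation: the runtime version, the dependencies, and the application code are all pinned in one artifact. The file names (`requirements.txt`, `app.py`) are hypothetical placeholders:

```dockerfile
# Base image pins the exact runtime version
FROM python:3.10-slim

WORKDIR /app

# Install dependencies first so Docker can cache this layer
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

# Copy the application code itself
COPY app.py .

# The same command runs identically on a laptop or a production server
CMD ["python", "app.py"]
```

Built with `docker build -t myapp:1.0 .` and started with `docker run myapp:1.0`, this gives the operations team one self-contained artifact instead of source code plus an installation procedure.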

 

Containers provide a number of benefits:

  1. Improving security: Container images are immutable and can be scanned for known vulnerabilities and exposures. We can also verify that the author of a container image is who they claim to be.
  2. Simulating production-like environments: Containers help simulate a production-like environment, even on a developer's laptop.
  3. Standardizing infrastructure: Every server becomes just another Docker host, and we do not need to install special libraries or frameworks to manage dependencies.

 

AI on Edge devices using containers

Note: this section relates to my personal work at the University of Oxford artificial intelligence cloud and edge impleme... course. It is not part of the above book.

 

I am exploring the benefits of containers on Edge devices. All the benefits of containers in reducing friction across the software supply chain also apply to the Edge. I see AI edge engineering as the “Ability to deploy AI and ML models in containers at the Edge”.

 

The philosophy of packaging into containers lets us run algorithms more effectively, because machine learning and deep learning projects must manage deployment across an increasingly complex software stack. By using pre-packaged containers that are optimised for performance and tested for compatibility, you can get started easily.
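To sketch the idea, an Edge inference image can extend a pre-optimised base image rather than assembling the ML stack by hand. The base image tag `vendor/ml-runtime:arm64-optimised` and the file names below are placeholders for illustration, not specific products:

```dockerfile
# A hypothetical vendor base image with a performance-tuned ML runtime pre-installed
FROM vendor/ml-runtime:arm64-optimised

WORKDIR /model

# Ship the trained model and the inference script together as one artifact
COPY model.onnx .
COPY serve.py .

CMD ["python", "serve.py", "--model", "model.onnx"]
```

The point is that the compatibility testing (runtime, drivers, accelerator libraries) has already been done in the base image, so the project only adds the model and the serving code on top.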

 

Some more notes:

  1. We can classify deep learning containers on the Edge by application type (e.g. for energy)
  2. We can also classify deep learning containers on the Edge by device type (e.g. specific processor types)

There are already a number of initiatives covering this trend:

 

Best Practices of Implementing Intelligent Edge Solutions with Azur... – see slides 10, 11, and 12 for the big picture, and slide 30 (“Model Management – Inferencing Target”) on how models vary depending on the target hardware, i.e. the model is the same as in the cloud but exploits the best features of the target processor.

Docker–ARM collaboration: a cloud-based development environment for writing and delivering applications to cloud, edge, and IoT devices.

Google Deep Learning Containers.

Azure ML with ONNX Runtime across a range of hardware.

 

I see containers on the Edge as the key feature of AI Edge Engineering, especially as Edge applications become more complex, from oil rigs to autonomous vehicles.

I welcome your comments.

 

