Application portability and legacy application support are two interesting use cases for containers running inside virtual machines.
Coming off DockerCon 2017, the story of containers is starting to morph beyond microservices-based applications. Docker helped the concept of Linux containers go viral with its open source project, now called Moby and formerly known simply as Docker.
Containers offer a different layer of abstraction than the virtual machine (VM). Until recently, many use cases for containers focused on deploying platforms such as Moby on bare metal. While companies such as VMware pushed the concept of containers within VMs, many industry observers questioned the use case for containers running in VMs, due to the performance overhead of the virtual machine itself.
SEE: How one e-commerce giant uses microservices and open source to scale like crazy (TechRepublic)
As adoption rises, so does the clarity surrounding container use cases within VM deployments. The obvious use case is development environments, but there are also production-ready examples of deploying containers within a VM. Here are two use cases.
1. Public cloud portability
Application portability is one of the original use cases for containers. Containers package the binaries and dependencies for applications. Containers allow the deployment of an application written and compiled on one distribution of Linux to run on another distribution, regardless of the packages installed on the target instance.
An application developed in an Ubuntu VM under VirtualBox runs on an Amazon Linux instance in AWS without modification, other than the installation of the container engine. The same container image runs on a Red Hat Enterprise Linux (RHEL) instance in Microsoft Azure. Operations teams gain the ability to select the best platform for each containerized application.
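As a rough sketch of why this works, a container image declares its own dependencies, so the host distribution stops mattering. The Dockerfile below is a hypothetical illustration (the base image, package, and file names are assumptions, not from the article):

```dockerfile
# Hypothetical sketch: package an app together with its dependencies
FROM ubuntu:16.04

# Dependencies are installed inside the image, not on the host
RUN apt-get update && apt-get install -y --no-install-recommends \
        python3 \
    && rm -rf /var/lib/apt/lists/*

# Copy the application into the image
COPY app.py /opt/app/app.py

# The resulting image runs unchanged on Amazon Linux, RHEL, or any
# other host distribution with a container engine installed
CMD ["python3", "/opt/app/app.py"]
```

Building once (`docker build -t myapp .`) and running the resulting image on each target host (`docker run myapp`) is what decouples the application from the packages installed on the target instance.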
2. Stateful app migration
One of the surprising announcements from DockerCon was the availability of Oracle software on Docker Hub. Oracle made its database software available to run in containers. Containers are typically synonymous with ephemeral workloads: if the host running a set of containers fails, then, in theory, they restart on separate hosts.
An obvious question is why run a stateful, or even monolithic, application in a container. Simply put, containers consume fewer resources than a VM. Because a container does not carry an entire guest operating system and kernel, some presenters at DockerCon claimed server utilization savings of 30% over comparable VM configurations.
To take advantage of containers on bare metal, an organization must invest in orchestration tools such as Docker Enterprise, Mesosphere, or Kubernetes. Organizations with a significant investment in virtual machine infrastructures can leverage the high availability features of their virtualization platforms for containers.
Stateful apps, like many of the commercial off-the-shelf (COTS) products running in enterprises, rely on redundant infrastructure, and VM platforms provide exactly that. Operations teams have the ability to spin up COTS-based workloads in containers running on VMs. If the hardware hosting a VM fails, the VM is brought up on new hardware by the VM management software and the stateful containers are restarted.
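The recovery flow described above can be sketched with Docker's restart policies, assuming Docker as the container engine; the container name, image, and paths below are hypothetical:

```dockerfile
# Hypothetical sketch: a stateful container that the engine restarts
# automatically once VM management software revives the VM on new
# hardware. The bind mount keeps database files on persistent storage
# so state survives the restart.
docker run -d \
    --name orders-db \
    --restart unless-stopped \
    -v /data/orders-db:/var/lib/db \
    example/cots-database:1.0
```

With `--restart unless-stopped`, the container engine brings the container back whenever the engine itself starts, so the VM platform's high availability features handle hardware failure and the restart policy handles the container.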
If the primary use case for containers is to squeeze out every bit of performance from the underlying infrastructure, it's better to run containers on bare-metal hosts. However, don't write off your investment in VM management. Obviously, not every workload is suitable for container platforms. Likewise, not every container requires the raw power of bare-metal performance. As such, the use cases for containers inside of VMs are compelling.
- VMware's five key cloud-native computing investments (TechRepublic)
- 5 tips for securing your Docker containers (TechRepublic)
- How to run NGINX as a Docker container (TechRepublic)
- New Docker turnkey program helps enterprises modernize legacy apps (ZDNet)
- Docker LinuxKit: Secure Linux containers for Windows, macOS, and clouds (ZDNet)