Containers help keep development agile and ensure continuous delivery—making them a great tool for DevOps shops.
While developers were modernizing legacy apps and creating new ones, containers came to the forefront. Containers provide a way to package everything needed to run an app: the code, the runtime, libraries, and system tools. This allows developers to build an application on a laptop and deploy it unchanged on servers.
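As a sketch of what that packaging looks like, here is a minimal Dockerfile; the base image, file names, and entry point are illustrative, not from the article:

```dockerfile
# Base image supplies the OS layer and the language runtime
FROM python:3.12-slim

WORKDIR /app

# Install dependencies first so this layer is cached between builds
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

# Copy the application code itself
COPY . .

# The resulting image runs the same way on a laptop or a server
CMD ["python", "app.py"]
```

Building this with `docker build -t myapp .` produces a single image that carries its code, runtime, and dependencies wherever it goes.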
Needless to say, there's been some controversy in the enterprise as to whether containers are shiny objects or useful tools. Some will argue that developers can use virtual machines just as easily as they can containers, but others say containers are what enables DevOps because applications can be deployed that much faster.
"Containers are not only changing the way we ship applications, but how we operate them as well," said Carl Caum, technical product marketing manager at Puppet. No matter where a company deploys a container, it will run the same way. Everything from figuring out where containers should run, how many of them should exist, and where traffic should be routed is handled by software, which lets developers focus on the application itself, Caum said.
Although Docker is undoubtedly the first name developers think of when they're looking at containers, other options and adjacent tools exist: orchestrators such as Kubernetes and Mesos, alternative runtimes such as CoreOS's rkt, and even virtual machines. "Docker is the standard choice when starting out with container technology," said Sai Gunturi, head of product and technology at NectarOM. "Instead of container technology, some organizations prefer to run virtual machines, which generally have a much larger footprint." Containers use a smaller footprint than a full virtual machine, and scaling the application means just adding more copies of the same container in most cases, he said.
Containers bring a lot to the development table
No matter what they choose, developers have found that containers provide a precise, controlled environment to build continuous integration and continuous delivery pipelines, according to Joel Chovanec, staff software engineer at kCura. The exact dependencies, along with the server software, can be packaged together neatly. "Since containers are immutable, the software you test and verify will be precisely the software you deploy - no mismatched versions or dependencies," he said.
The second reason to use containers—and a big one that fits in with the agile methodology of DevOps—is the speed of deployment, enabling the quick launch of new features and new applications. "Containers start up extremely quickly," Chovanec said. "This makes containerization convenient for iterating quickly during development, as well as scaling stateless services in production."
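To illustrate scaling a stateless service by adding copies of the same container, here is a minimal Docker Compose fragment; the service name and image tag are hypothetical:

```yaml
# docker-compose.yml -- a stateless web service
services:
  web:
    image: example/web:1.4.2   # hypothetical image tag
    ports:
      - "8080"                 # let Docker assign host ports so replicas don't collide
    deploy:
      replicas: 5              # scale out by running more copies of the same container
```

The same effect is available from the command line with `docker compose up -d --scale web=5`, since every replica is just another instance of one immutable image.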
Developers can also run production code on their local machines, which lets them replicate a full development environment without the need to deploy an application across the enterprise, Gunturi said. "Using custom-built container images really helps prevent the 'well, it worked on my machine' excuse," he said. "If it works locally, it should work when you deploy it, too."
Custom dependencies also become a non-issue. "If we need to run some special software that needs a bunch of configuration, it's minutes to install and run the Docker container that is already pre-configured," Gunturi said.
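As an example of that "minutes to install and run" claim, pulling a pre-configured service image takes a single command; the container name and password below are illustrative:

```shell
# Run a pre-configured PostgreSQL server in seconds --
# no manual install, no hand-edited config files
docker run -d \
  --name dev-db \
  -e POSTGRES_PASSWORD=devonly \
  -p 5432:5432 \
  postgres:16
```

Tearing it down is just as fast (`docker rm -f dev-db`), which is why containers suit throwaway local environments so well.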
The downside of containers
However, containers aren't the best choice for every development project. They tend to be superfluous when a company is already running virtual machines, and they have drawbacks of their own. For example, a container can be a "black box" that makes it difficult to understand what is in it or what was running in the past, Caum said.
In addition, existing applications often need rewriting to support being run from a container. "Large, monolithic applications are difficult to containerize and don't often add much advantage," Caum said, adding that containers tend to be more manageable when using microservices.
Preventing container breaches
While containers are known for being relatively secure, there is always a risk when running other people's code, Gunturi said. "You have to look carefully at the container source code and configuration."
This dovetails with advice from Caum, who said it's important to know what's in those containers. "Having the situational awareness of knowing what's actually in the hundreds or thousands of containers currently running across your infrastructure... is critical to knowing where you might be vulnerable to attack."
Container lifetimes also need to be kept as short as possible so attackers don't have a large window of time to work in, Caum said; running containers may need refreshing as often as every minute or so. Continuously rebuilding images as updates land, and redeploying the infrastructure from those fresh images, keeps containers up to date.
Whether to use containers comes down to how fast developers need to deploy their applications, whether virtual machines are already running, and whether the containers can be secured adequately (they usually can). Containers are a great way to keep development agile and ensure continuous delivery, which is why they're so popular among developers. Just look carefully at the types available and choose what will work with your environment and developer skill sets.