This article is courtesy of TechRepublic Premium.
Companies looking to standardize their DevOps environment need to do more than evaluate tools. They must also consider culture, goals, and processes—and decide where standardization makes sense.
There's no shortage of DevOps tools to try when working on projects. From Chef to Ansible, Puppet to Docker, developers have their pick. Choosing a standard toolset, however, is harder, given the variety of open source options available. Everyone has a favorite, which can cause friction in otherwise smoothly running departments. According to experts, achieving some semblance of DevOps standardization may require a shift in mindset as much as a careful weighing of each tool's benefits and drawbacks.
The biggest challenge in standardization is the culture change required, said Sarah Lahav, CEO of SysAid Technologies. "Even though DevOps is a very hot term, the day-to-day DevOps activities and goals mean different things to different people.... This is what makes standardization a huge challenge."
Tools like Puppet, Chef, and Ansible often seem like a shortcut to standardization, Lahav said. However, this isn't necessarily the answer to standardized DevOps because the tool may not fit the organization's needs. Rather, organizations need to remember that DevOps is about connecting development and operations teams for agile, rapid, customer-centric deployments.
Open source technology may help with standardization
Still, that doesn't mean choosing tools is a fruitless pursuit. Terri Schlosser, head of product and solution marketing at SUSE, said a software-defined infrastructure (SDI) and open source approach can help overcome DevOps challenges around visibility, automation, movement between environments, and adoption. The combination builds on the concept of DevOps' growth through collaboration.
Schlosser said the lack of visibility into application delivery often plagues DevOps, but open source is the most transparent and collaborative infrastructure. "As DevOps uses continuous innovation to stay refined, so do the open source projects that contribute to its infrastructure."
Stay transparent everywhere
Collaboration and transparency are keys to standardization, and they need to be part of the process when choosing any standard toolset, said Ian Buchanan, senior developer advocate at Atlassian. Tool standards set for one team might not fit the needs of other teams in the organization; yet too many tools make it harder to share knowledge and use them effectively.
"The challenge to standardization is striking the right balance between team autonomy and the ability to redirect teams to the highest priority business problems," Buchanan said.
Always know your needs
Once you're ready to choose technology, it's important to know your infrastructure and your needs, both present and future. David Birdsong, infrastructure lead at imgix, said that every tool being considered needs to be tested, from the tools themselves to the tools that work with those tools.
For deployment automation, Birdsong tested both Ansible and Chef. "I ruled out Chef because of the assumption it was making about my infrastructure. It handles its own dependencies by expecting to be able to download and compile on every single node. That assumes every node will always have internet access—very much not the case for many kinds of deployments," he said.
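The distinction Birdsong describes is between pull-based tools, where each node downloads its own dependencies, and push-based tools that ship everything over SSH. A team evaluating a pull-based tool like Chef might pre-flight that assumption with a connectivity check along these lines. This is a minimal sketch, not from the article; the host names and the test URL are illustrative:

```shell
# Hypothetical pre-flight check when evaluating a pull-based tool like Chef:
# verify each candidate node can actually reach the outside world, since
# chef-client expects to download and compile dependencies on every node.
check_outbound() {
  # Returns "ok" if the node can fetch the URL over HTTPS within 5 seconds,
  # otherwise "blocked" (including when curl itself is unavailable).
  if curl --silent --max-time 5 --head "$1" > /dev/null 2>&1; then
    echo "ok"
  else
    echo "blocked"
  fi
}

for node in app-01 app-02 db-01; do
  # In a real evaluation you would run the check on each node, e.g. via
  # ssh "$node" …; here it runs locally for illustration.
  echo "$node: $(check_outbound https://rubygems.org)"
done
```

If any node reports "blocked", a push-based, agentless tool such as Ansible avoids the assumption entirely, since the control node does the fetching.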
Stay away from trends
Once you know your needs and have tested the tools, make sure you're choosing for functionality, not trends. "You can't do something because it's new and shiny and you want to use it," Birdsong said. Often, simple is best, such as the service announce and discovery pattern he uses.
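The announce-and-discovery pattern can be as simple as it sounds: services register themselves somewhere on startup, and clients look them up there. The sketch below uses a plain file as the registry to show the shape of the idea; the article doesn't describe imgix's actual implementation, and production setups typically use DNS, etcd, or Consul. The service name and addresses are made up:

```shell
# Minimal announce-and-discover sketch using a plain file as the registry.
# Registry path, service name, and addresses are illustrative only.
REGISTRY="${TMPDIR:-/tmp}/services.txt"
: > "$REGISTRY"   # start with an empty registry

announce() {
  # announce <service-name> <host:port> — a service registers itself on start
  echo "$1 $2" >> "$REGISTRY"
}

discover() {
  # discover <service-name> — a client resolves the most recent announcement
  awk -v svc="$1" '$1 == svc { addr = $2 } END { print addr }' "$REGISTRY"
}

announce imageproc 10.0.0.5:8080
announce imageproc 10.0.0.6:8080   # a newer instance supersedes the old one
discover imageproc                 # prints 10.0.0.6:8080
```

The appeal is that both halves are trivial to reason about and debug, which is exactly the "simple is best" point.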
But at the same time, it's also worth it to look at new tools. "Change is painful, but it's necessary sometimes," he said. Originally, imgix had used a CentOS distro for its operating system because it was something Birdsong had used in the past. However, the company later moved to Ubuntu, which was difficult but paid off in terms of shortcuts in package authoring and productivity increases—more than enough to offset the time lost switching over.
Know that standardization may not happen
In some cases, it may not be practical to standardize. According to Buchanan, standardization can be a disruptive change for teams that are maintaining and enhancing old software. "For products near the end of their lifecycle, the benefits of standardization may not justify the disruption. That's not to say that legacy projects can't benefit from DevOps, but just that embracing tool standards may be a poor fit," he said.
For organizations looking to standardize their DevOps environment, it's not just a toolset they'll need to choose. The culture, goals, and processes will also need to be considered, and at times, it may not make sense to standardize if it requires letting go of functional legacy software and disrupting the organization.