Long gone are the days of waterfall software development. The agile movement has brought common-sense software development principles to nearly every corner of the world and changed the way we look at software.
This philosophy has left its mark on how we look at our infrastructure, too. With agile came DevOps and the idea of bringing infrastructure engineers and developers together, which has led to the broad use of infrastructure-as-code tools such as Chef, Puppet, and Ansible as key enablers to making DevOps a reality.
Despite the DevOps movement taking many key concepts from agile development, many DevOps engineers fail to test their automation code the way they test the software they deploy. It is crucial for software to have tests, and this should apply to infrastructure-as-code software too, if we plan to change and improve this code without fear of breaking automation that is key to our DevOps pipeline.
Far too often, a small, seemingly innocuous change in a Chef recipe that is improperly tested can introduce errors. A smart DevOps team should look to incorporate three simple solutions into their infrastructure-as-code development: unit testing, integration testing, and a DevOps pipeline.
The most basic and easiest way to start testing your DevOps code is through unit testing. The intent of unit testing is to confirm that, given a specific input, the script yields the expected output. The big trap beginning unit testers should avoid is testing that the infrastructure-as-code tool itself works, as opposed to testing the automation script you wrote.
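As a minimal sketch (not tied to any particular tool, and using a hypothetical helper name), imagine a small Python function an automation script uses to render a server configuration. A unit test feeds it a known input and asserts on the output, without touching any real system:

```python
def render_vhost(server_name: str, port: int) -> str:
    """Render a minimal, hypothetical virtual-host config from parameters."""
    if not server_name:
        raise ValueError("server_name is required")
    return (
        "server {\n"
        f"    listen {port};\n"
        f"    server_name {server_name};\n"
        "}\n"
    )

def test_render_vhost():
    # Given a specific input, assert the expected output.
    # Note we test our own script's logic, not the tool underneath it.
    conf = render_vhost("example.com", 8080)
    assert "listen 8080;" in conf
    assert "server_name example.com;" in conf

test_render_vhost()
```

The same pattern applies with tool-specific frameworks (for example, ChefSpec for Chef recipes): assert on what your code declares, not on whether the tool converges correctly.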
When you’ve mastered unit testing, the next type of testing to implement is integration testing. The intent of integration testing is to verify that the end state of the system is, in fact, what we expected. Integration tests give us a higher degree of confidence that our code is doing what we need.
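Continuing the hypothetical example above, an integration-style test runs the automation and then inspects the resulting state of the system, rather than the function's return value. Here the "system" is just a temporary directory, standing in for a real target server:

```python
import pathlib
import tempfile

def deploy_vhost(conf_dir: str, server_name: str, port: int) -> None:
    """Hypothetical automation step: write a rendered config into conf_dir."""
    path = pathlib.Path(conf_dir) / f"{server_name}.conf"
    path.write_text(
        f"server {{\n    listen {port};\n    server_name {server_name};\n}}\n"
    )

def test_end_state():
    # Integration test: run the automation, then verify the end state
    # of the (simulated) system is what we expected.
    with tempfile.TemporaryDirectory() as conf_dir:
        deploy_vhost(conf_dir, "example.com", 8080)
        conf_file = pathlib.Path(conf_dir) / "example.com.conf"
        assert conf_file.exists()
        assert "listen 8080;" in conf_file.read_text()

test_end_state()
```

Against real infrastructure, tools such as Test Kitchen with InSpec play this role: converge a disposable instance, then assert on ports, services, and files.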
Once you have some strong testing in place, it’s time to apply the same DevOps pipeline practices used for source code to your Chef, Puppet, or Ansible code. Start by building a basic pipeline. By using git-flow for branching and merging and applying the testing you’ve just built as a quality gate, you can ensure that DevOps engineers must have their code tested before it is merged to master. This prevents bugs from being introduced into existing systems and keeps buggy code from spreading across the team. Teams in cloud-based environments can also use this opportunity to run the automation scripts against a target server, allowing the engineer to visually inspect the system before signing off as done.
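Such a quality gate might look like the following sketch of a GitLab-CI-style configuration (the job names are illustrative, and the commands assume a Chef cookbook tested with ChefSpec and Test Kitchen): both stages must pass before a branch can be merged to master.

```yaml
# Hypothetical pipeline: tests gate every change before it reaches master.
stages:
  - unit
  - integration

unit_tests:
  stage: unit
  script:
    - chef exec rspec spec/unit    # e.g. ChefSpec unit tests

integration_tests:
  stage: integration
  script:
    - kitchen test                 # e.g. Test Kitchen converge + verify
```

The exact syntax varies by CI system; the design point is simply that merging is blocked until the unit and integration stages both succeed.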
By applying agile testing and DevOps practices to how we build DevOps automation, we can ensure the critical components of our pipelines are well tested and don’t introduce bugs or defects that impact development or operations.
This is a repost of a Techwell Insights article.