(Editor's note: This is the first article in a two-part series. In the second article, U.S. Department of Homeland Security CTO Michael Hermus describes how his team identified the right metrics for its agile transformation initiative.)
At the Department of Homeland Security, driving transformation reminds me of that old saying: Changing course in a large bureaucracy is like trying to turn a giant battleship. Institutional inertia makes it difficult to introduce new ideas, and individuals are often resistant to doing things differently. But part of any leader’s job is to understand the reasons for this resistance and find ways to address them. To be successful on the battleship, you have to get everyone on board.
That’s what we sought to do starting in 2016, when we decided to transform our IT acquisition oversight process. Almost everything we did to obtain new capability was considered an “acquisition,” including software we built or purchased, and was subject to oversight. These legacy processes descended from the DoD process originally designed for “heavy metal” acquisitions (battleships again!). DHS had been trying to migrate to more modern, agile software delivery practices for several years, and across the department many individual groups and programs were having some success. Larger programs, however, were burdened by that acquisition oversight process.
[ Read part two of this story: See “Agile success: Don't settle for metrics that tell half the story.” ]
After taking a hard look at the entire ecosystem across DHS, we found that this process wasn’t supporting modern, agile best practices for software development. In fact, in many cases it was actually inhibiting them. The oversight process effectively forced waterfall thinking, planning, and execution. Things like rigid management of scope and requirements made it nearly impossible to implement and deliver software in an agile way. The process needed a revamp.
As a critical first step, we gained buy-in from our senior leaders. They agreed that it made sense to learn through doing: We decided to pilot changes in five major programs at different phases of the acquisition lifecycle. This would help us understand what was really broken, versus what might be working okay, and thus figure out what actually needed to change. By choosing multiple programs at different stages, we aimed to learn about the whole process in parallel, which would be faster than going from start to finish in one or more programs.
Was our goal to pilot agile software delivery? That was part of the effort – we did indeed work with all five programs to guide them toward more modern software delivery, a big part of which was the adoption of agile methodologies. We also guided them toward more modern architectures and technology stacks, and toward an open-first, cloud-first posture wherever possible.
But the big idea was really trying to fix and transform the acquisition and oversight process itself, so that it would support modern software delivery. This heavyweight process was causing many programs to struggle throughout the agile journey. If we wanted DHS to become a world-class technology organization, we had to address this foundational issue.
We started by forming cross-functional teams to support each program, all of which rolled up to an HQ-level working group that reported to our senior leadership. For more than a year, these teams worked with the pilot programs not only to advance progress toward goals, but also to learn as much as possible. Along the way, we saw significant improvements in several measurable factors, such as faster approval timelines and streamlined document artifacts. Even though it was a small data set, by some measures the pilot programs took about half the time to get to the same place as more traditional programs.
The Action Plans
While the initial successes in the pilot programs were significant and hard-earned, the ultimate goal of the transformation initiative was more far-reaching – we wanted persistent improvements that would impact all of DHS. Toward this end, the team spent many weeks reviewing, studying, and analyzing all five program engagements, and conducting follow-up interviews with key participants and stakeholders. We synthesized what we learned into 18 prioritized recommendations, each outlining suggested changes to the software acquisition and oversight lifecycle, policy, and/or guidance. Every recommendation also included an initial action plan to implement the changes required to scale improvements across the entire department.
For example, we determined that one of the most important improvements we could make was applying technology to the process of reviewing and approving documentation. A majority of the effort and elapsed time in the oversight process was spent dealing with documents, yet we used only the most rudimentary mechanisms (such as email) to facilitate the work. Even though we want to streamline documentation requirements as much as possible, they will not go away entirely; there will always be some level of documentation that needs review and approval.
[ Is your DevOps team set up in the best way? See our related article, DevOps success: A new team model emerges. ]
To address this, we have already begun to apply much-needed workflow capability to the process of reviewing, approving, commenting on, and editing acquisition artifacts: documents used to plan and validate mission need, operational requirements, and acquisition strategy, among other things. This automated workflow will streamline the entire process while providing data on every step, such as review timelines, overall throughput, and bottlenecks. That visibility should lead to better individual accountability and drive ongoing improvement to the entire workflow.
Of the 18 action plans, external observers might conclude that many are obvious prerequisites to modern software delivery – and they would be correct. For example, an agile IT organization needs the ability to manage requirements in a more flexible fashion. Similarly, testing policies and practices must support integrated and ongoing incremental testing.
However, by undertaking this in-depth collaborative exercise, we achieved several important things that could not be taken for granted. First, we demonstrated conclusively, through data and real-world experience in the DHS environment, that we actually needed to make these changes. The recommendations were not simply an academic application of industry best practices from the confines of an ivory tower.
Second, and even more importantly, the inclusive nature of the process and the cross-functional teams achieved a level of buy-in that would have been impossible otherwise. Participants from all relevant stakeholder groups, including non-technical folks such as finance and procurement, were involved from the beginning.
Since culture change can be the hardest part of moving to agile methodologies and processes, or indeed of any major transformation initiative, such buy-in is critical. This was our key to “getting everyone on board” the battleship!
In part two of this article, we will look at another key part of our agile transformation initiative: Developing the right metrics.