The continual evolution of DORA

I interviewed Nathan Harvey, who currently leads DORA at Google. We talked about recent changes to the key DORA metrics and performance clusters, and about how the research program continues to evolve.

My conversation with Nathan made me want to write up a short post about DORA’s remarkable history (listen to my podcast episode with Dr. Nicole Forsgren for the full story).

The annual State of DevOps report began at Puppet in 2013, years before the company DORA or the book Accelerate came into being. Nigel Kersten, who was Puppet’s CTO at the time, tells the story (source):

In 2012, we reached out to Gene Kim who had been a supporter and friend of Puppet since the early days. We knew he was about to publish his groundbreaking novel on IT and DevOps, The Phoenix Project, and couldn’t think of a better person to collaborate with. Gene brought in Jez Humble, the guy who literally wrote the book on Continuous Delivery in 2010 (2010!). Over 4,000 technical professionals completed our 2012 survey — an unheard-of response rate for any IT survey at the time. We launched our first State of DevOps report together in 2013, and the appetite for the data blew us away.

In 2013, we had a chance encounter at LISA (Large Installation System Administration Conference) with an academic researcher who specialized in IT impacts. Dr. Nicole Forsgren joined our scrappy little team and took the research to another level. We produced the 2014, 2015, 2016, and 2017 State of DevOps reports together. By that time, Nicole, Jez, and Gene had formed DORA (DevOps Research and Assessment) and wrote Accelerate: The Science of Lean Software and DevOps: Building and Scaling High Performing Technology Organizations based on these reports and other research.

DORA was a bootstrapped company that Google acquired in 2019. The DORA research program and the State of DevOps reports continue at Google today. The program has changed in many ways over the years; here are some key highlights:

  • The first report in 2013 referred to “IT performance,” a term that was gradually phased out in favor of “software delivery performance.”

  • In 2018, availability was added as a fifth key metric to capture operational capabilities, and DORA’s overall performance measure was renamed “software delivery and operational performance.”

  • In 2019, statistical analysis showed that Change Failure Rate (one of the key metrics) did not fit cleanly into the software delivery performance construct.

  • In 2021, the availability metric was replaced with reliability: “Historically, we have measured availability rather than reliability, but because availability is a specific focus of reliability engineering, we’ve expanded our measure to reliability so that availability, latency, performance, and scalability are more broadly represented.”

  • In 2022, researchers were no longer able to identify an “Elite” performance cluster. As a result, the report omitted benchmarks for this group.

  • In 2022, researchers found that software delivery performance alone did not predict better organizational performance. Operational performance was also required.

Published August 9, 2023
