The One Number You Need to Increase ROI per Engineer

The Developer Experience Index (DXI) measures the key predictors of engineering effectiveness and can be directly linked to financial impact.

One thing that CEOs, CTOs, and CFOs can all agree on is this: software engineering is both a critical investment and an expensive one. And, compared with other critical business functions such as sales or operations, measuring and optimizing the efficiency of software engineering feels like a black box.

When talking to CXOs, we hear two recurring challenges with conventional metrics such as lead time and mean time to restore (MTTR). First, while useful in certain contexts, these metrics capture only a narrow slice of software engineering. Second, they can’t be translated into financial terms (i.e., dollars), making it difficult to link them to business impact.

At DX, we’ve helped solve this problem for leading tech companies including Dropbox, Twilio, and Toast, as well as traditional enterprises like Pfizer, P&G, and Tesco. Along the way, we’ve pioneered research on what actually drives productivity and how to measure it.

Building on these efforts, we have developed a measure of software engineering effectiveness we call the Developer Experience Index (DXI), which pinpoints the most critical factors in achieving higher software development efficiency and quality. The DXI is a predictive measure developed by our researchers, drawing on over four million benchmark samples from hundreds of organizations across the world.

Developer Experience Index (DXI)

The path to higher productivity lies in a company’s ability to remove friction for its developers, enabling faster delivery and innovation. The DXI therefore measures engineering effectiveness through the lens of developers, providing first-hand insight into engineering processes and workflows.

The DXI measures actionable areas of software delivery that are most predictive of outcomes such as development speed, ease, and quality. Other sources may tell you to measure developer experience differently—but no other measure is proven to link to business outcomes like the DXI. In fact, the DXI is the only validated measure of developer productivity that can be directly tied to dollars.

A single-point increase in DXI score correlates to a 0.7% increase in engineering efficiency in terms of reduced time loss.

The development of the DXI is based on years of accumulated quantitative and qualitative research, and on data from over 40,000 developers across 800 organizations. Analyses have included cognitive interviews, descriptive statistics, criterion-related validity analyses, reliability analyses, regression analyses, and predictive modeling.

Our research reveals that top-quartile DXI scores correlate with engineering speed and quality 4 to 5 times higher than bottom-quartile DXI scores (Figure 1). In addition, top-quartile teams are more engaged, scoring 43% higher on employee engagement than bottom-quartile teams.

Most notably, DXI scores strongly correlate with time loss: a single-point increase in DXI score correlates with a 0.7% gain in efficiency through reduced time loss. The DXI is thus the first validated measure of developer productivity that can be directly linked to dollars, enabling leaders to concretely understand and communicate the ROI of improvements.
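
To make that relationship concrete, here is a minimal sketch of how a leader might translate a DXI improvement into dollar terms. Only the 0.7%-per-point coefficient comes from the research above; the headcount, fully loaded cost per engineer, and point gain are hypothetical placeholders to swap for your own figures.

```python
# Hypothetical illustration: translating a DXI gain into annual dollar terms.
# Only the 0.7% figure comes from the research above; every other number is a
# placeholder to replace with your organization's own values.

ENGINEERS = 500                     # hypothetical engineering headcount
FULLY_LOADED_COST = 180_000         # hypothetical cost per engineer per year (USD)
DXI_POINT_GAIN = 5                  # hypothetical improvement in overall DXI score
EFFICIENCY_GAIN_PER_POINT = 0.007   # 0.7% reduced time loss per DXI point

annual_capacity = ENGINEERS * FULLY_LOADED_COST
recovered_value = annual_capacity * EFFICIENCY_GAIN_PER_POINT * DXI_POINT_GAIN

print(f"Estimated engineering capacity recovered per year: ${recovered_value:,.0f}")
# -> Estimated engineering capacity recovered per year: $3,150,000
```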

Figure 1 titled 'Developer Experience Index (DXI) predictive model' displays the correlation between the Developer Experience Index (DXI) and key business outcomes.

The DXI is a formative measure of how, and to what extent, the right conditions exist for developers to be able to deliver effectively. The DXI measures 14 dimensions (Figure 2) found to be actionable (and changeable) at both the manager and organizational level, such as deep work, local iteration speed, release process, and confidence in making changes.

Past research by DX identified a comprehensive set of factors underlying developer experience. In designing the 14 dimensions included in the DXI, our researchers took into account both the actionability of issues for management and their predictive power against outcomes such as engineering velocity, quality, and efficiency.

Figure 2 titled 'Developer Experience Index model' displays a structural equation model.

The 14 dimensions are combined into a single overall DXI score, providing a balanced and transparent indicator that is protected from the volatility of individual metrics. Each of the 14 dimensions is also scored and tracked individually, enabling clear understanding of specific drivers impacting performance.
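
As a simple illustration of that structure (not DX’s actual scoring model), the sketch below tracks a handful of dimensions individually and rolls them into one overall number. The sample scores and the equal weighting are assumptions made purely for the example.

```python
# Illustrative sketch only: per-dimension scores rolled into a single overall
# score with equal weights. This is not DX's actual DXI scoring model.

from statistics import mean

# Hypothetical survey results: dimension -> individual developer scores.
dimension_scores = {
    "deep_work": [72, 65, 80],
    "local_iteration_speed": [55, 60, 58],
    "release_process": [68, 70, 74],
    "confidence_in_making_changes": [61, 66, 59],
    # ...the remaining dimensions would follow the same pattern
}

# Each dimension is scored and tracked individually...
per_dimension = {dim: mean(scores) for dim, scores in dimension_scores.items()}

# ...and combined into one overall score (equal weighting assumed here).
overall = mean(per_dimension.values())

for dim, score in sorted(per_dimension.items(), key=lambda kv: kv[1]):
    print(f"{dim:30s} {score:5.1f}")
print(f"{'overall (illustrative)':30s} {overall:5.1f}")
```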

The DXI is measured through self-reported data collected on a quarterly or semi-annual cadence. To help contextualize data, DX has developed industry benchmarks for the DXI—leveraging over four million data samples from thousands of organizations—that are tailored to specific industry sectors, regions, and company sizes.

Turning data into action

The DXI provides leaders at all levels of the organization—from C-level executives to frontline managers—clear signals to identify improvement opportunities, spot trends, and measure the impact of investments. Data is just the starting point, however. Turning data into action and improvements requires diligent strategy and execution.

One common trap is the misconception that improving developer experience is primarily about better tools. Our research shows that cultural and structural factors such as product management and collaboration have an outsized influence on developer performance.

Similarly, leaders must recognize that issues need to be tackled at all organizational levels. Problems such as excessive build times or infrastructure gaps often cut across an organization, requiring attention from senior leaders or dedicated teams. By contrast, more local concerns, such as minimizing interruptions or clearly defining work, require improvement efforts driven by frontline managers.

Below, we list several key recommendations for utilizing data to drive maximum impact. These recommendations derive from our experience in helping hundreds of organizations measure developer experience and leverage data to drive transformational change.

Provide reporting to all leaders and managers.
None of the metrics or data an organization collects has any value unless it’s put in front of the leaders who can use it to make decisions and take action. It’s critical that organizations provide reporting to leaders and managers as soon as possible after data is collected. Reporting should be delivered through easily consumable dashboards as well as readouts, slide presentations, and executive summaries.

Enable teams with best practices and recommendations.
With access to data and reporting, teams are able to identify key opportunities for improvement. Organizational leaders can enable self-directed improvements by providing teams with recommendations and best practices. These best practices can include those identified in high-performing teams within the organization, as well as outside industry examples.

Integrate DXI into a broader measurement strategy.
There is acceptance among technical leaders that good developer experience is critical to effective software delivery. Yet, at some organizations, proposed investments to improve developer experience struggle to get buy-in as less technical business stakeholders question the value proposition of improvements. "What is developer experience?" many of them challenge. "And why does it matter?"

To steer clear of these challenges, it’s critical to frame developer experience within broader engineering productivity and performance measures that executive leaders are familiar with and place importance on. The DX Core 4 framework (Figure 3) is a unified approach that combines DORA, SPACE, and DevEx into a set of standardized metrics. The DX Core 4 helps organizations achieve alignment around developer productivity across the business, leading to faster action and change.

Analyze data by team and persona.
A common mistake made by organizational leaders is to focus on organization-wide results instead of breaking data down by team and persona (e.g., role, tenure, seniority). Metrics are highly contextual and can differ significantly across teams or roles. Focusing only on aggregate results can lead to overlooking problems that affect small but important populations within the company, such as mobile developers.
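
A minimal sketch of this kind of breakdown, using pandas on hypothetical survey data (the team names, roles, and scores below are placeholders):

```python
# Illustrative breakdown of DXI survey results by team and persona.
# All column names and values are hypothetical placeholders.

import pandas as pd

responses = pd.DataFrame({
    "team":   ["payments", "payments", "mobile", "mobile", "platform"],
    "role":   ["backend", "backend", "mobile", "mobile", "sre"],
    "tenure": ["<1y", "3y+", "1-3y", "3y+", "1-3y"],
    "dxi":    [68, 71, 52, 55, 74],
})

# The org-wide average hides the mobile developers' much lower scores...
print("Org-wide DXI:", round(responses["dxi"].mean(), 1))

# ...so break the same data down by team and by persona.
print(responses.groupby("team")["dxi"].mean().round(1))
print(responses.groupby(["role", "tenure"])["dxi"].mean().round(1))
```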

Compare results against benchmarks.
Comparative analysis can help contextualize data and drive action. For example, developer sentiment toward tech debt is commonly negative, which makes it difficult to identify genuine problems or gauge their scale from sentiment alone. Teams with lower sentiment scores than their peers, however, or organizations with lower scores than their industry competitors, flag notable opportunities for improvement.
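
As a sketch of that comparison, the snippet below flags teams whose (hypothetical) tech-debt sentiment scores fall below their internal peers or an assumed industry benchmark value:

```python
# Illustrative benchmark comparison; all scores and the benchmark are hypothetical.

team_tech_debt_sentiment = {"payments": 48, "mobile": 31, "platform": 55}
industry_benchmark = 45   # assumed industry median for this dimension
peer_average = sum(team_tech_debt_sentiment.values()) / len(team_tech_debt_sentiment)

for team, score in team_tech_debt_sentiment.items():
    flags = []
    if score < peer_average:
        flags.append("below internal peers")
    if score < industry_benchmark:
        flags.append("below industry benchmark")
    print(f"{team:10s} {score:3d}  {', '.join(flags) or 'at or above comparators'}")
```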

Getting started

Companies are feeling increased pressure to measure and improve developer productivity as markets are putting greater emphasis on efficient growth and ROI. Yet, compared with other critical business functions such as sales or operations, getting good data on performance is difficult.

The DXI provides leaders with a comprehensive measure of engineering effectiveness to help them lead with confidence and drive higher value delivered per engineer. Additionally, the DXI’s link to financial impact helps leaders concretely understand and communicate the ROI of improvements.

Organizations should get started with measurement as early as possible, even if they have not yet established or planned a formal developer productivity investment. Measurement can help organizations understand trends, decide the right time and place to make investments, and navigate macro shifts such as remote work and AI-powered programming.

The DXI has been adopted successfully at hundreds of organizations across tech, financial services, consumer goods, and pharma. To learn more about the DXI and DX platform, please get in touch with a representative from DX.

About the authors

Abi Noda is co-founder and CEO of DX where he leads the company’s strategic direction and R&D efforts. Laura Tacho is CTO at DX and leads the company’s executive advisory practice.

The authors wish to thank Dr. Liz Pavese, Dr. Michaela Greiler, Dr. Margaret-Anne Storey, and numerous DX customers for their contributions to the research.