
The AI adoption playbook: Lessons from Microsoft's internal strategy

Brian Houck from Microsoft returns to discuss effective strategies for driving AI adoption among software development teams. Brian shares his insights into why the immense hype around AI often serves as a barrier rather than a facilitator for adoption, citing skepticism and inflated expectations among developers. He highlights the most effective approaches, including leadership advocacy, structured training, and cultivating local champions within teams to demonstrate practical use cases. Brian emphasizes the importance of honest communication about AI's capabilities, avoiding over-promises, and ensuring that teams clearly understand what AI tools are best suited for. Additionally, he discusses common pitfalls, such as placing excessive pressure on individuals through leaderboards and unrealistic mandates, and stresses the importance of framing AI as an assistant rather than a replacement for developer skills. Finally, Brian explores the role of data and metrics in adoption efforts, offering practical advice on how to measure usage effectively and sustainably.

Show Notes

Barriers to AI Adoption

  • The biggest obstacle to AI adoption in engineering teams is not a lack of tooling or access, but developer skepticism rooted in hype and unrealistic expectations.
  • 30% of developers say their primary concern is that AI tools will not deliver on their promises, which discourages initial or continued use.
  • When developers try an AI tool and don’t immediately experience transformational results, they often abandon it entirely.
  • Broader narratives—like AI replacing developers—add to the skepticism, even though only about 10% of developers are actually concerned about job displacement.

The Role of Leadership

  • Leadership advocacy is a powerful lever for AI adoption. Developers are seven times more likely to be daily users when leaders actively promote and normalize the use of AI tools.
  • Leaders should clearly communicate which AI tools are approved, what kinds of tasks they’re suitable for, and that developers are encouraged (and expected) to use them.
  • Messaging that overstates the capabilities of AI tools tends to backfire, creating disappointment and further resistance.
  • Ongoing communication—rather than one-off announcements—is essential to build trust and maintain momentum.

Team-Level Strategies and Local Champions

  • Adoption is more successful when it’s supported at the team level through peer learning, not just centralized rollouts.
  • Developers benefit most when “local champions”—respected, experienced team members—show how they are using AI tools in the context of real workflows.
  • Organizations that rely on local champions for internal knowledge-sharing are about 22% more likely to have all or most of their developers adopt AI.
  • Brown bag sessions, peer demos, and informal walkthroughs are especially effective for making AI adoption relevant and practical.
  • Managers play a key role in identifying early adopters who are both enthusiastic and influential within their teams.

Measurement and Data-Driven Adoption

  • Successful adoption efforts rely on clear, ongoing measurement—tracking who has installed AI tools, who is actively using them, and how frequently.
  • Rather than boiling engagement down to a single score, Brian recommends tracking usage at multiple tiers: daily, weekly, monthly, and lapsed users.
  • Dashboards and scorecards help leaders visualize progress and encourage healthy, team-level competition.
  • Monitoring usage across groups—not individuals—helps foster psychological safety while still promoting accountability and improvement; a minimal sketch of such a team-level rollup follows this list.
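
For a sense of what such a rollup can look like in practice, here is a minimal sketch in Python. The record format, field names, and tier labels are assumptions for illustration, not a description of any particular organization's telemetry; the key design point is that the dashboard aggregates by team and never names individuals.

```python
from collections import Counter

# Hypothetical per-developer records; in practice these would come from
# the AI tool's usage telemetry or a license export.
records = [
    {"team": "payments", "tier": "daily"},
    {"team": "payments", "tier": "lapsed"},
    {"team": "payments", "tier": "weekly"},
    {"team": "search", "tier": "monthly"},
    {"team": "search", "tier": "daily"},
]

def team_rollup(records):
    """Aggregate usage tiers per team so no individual is singled out."""
    by_team = {}
    for record in records:
        by_team.setdefault(record["team"], Counter())[record["tier"]] += 1
    rollup = {}
    for team, tiers in by_team.items():
        total = sum(tiers.values())
        # Report percentages per tier, per team -- never per person.
        rollup[team] = {tier: round(100 * n / total) for tier, n in tiers.items()}
    return rollup

print(team_rollup(records))
# {'payments': {'daily': 33, 'lapsed': 33, 'weekly': 33},
#  'search': {'monthly': 50, 'daily': 50}}
```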

Common Pitfalls and Developer Concerns

  • Public leaderboards or pressuring individual developers to adopt AI tools can lead to disengagement or mistrust.
  • Developers often worry that AI-generated code may introduce bugs, vulnerabilities, or degrade quality. This concern is second only to the fear that AI is overhyped.
  • Some developers fear their skills may atrophy if they rely too heavily on AI; this highlights the need to reframe AI as a partner in development, not a replacement.
  • Organizations should address these concerns head-on through honest, nuanced messaging that balances enthusiasm with realism.

Long-Term Impact and Ongoing Research

  • Initial studies show AI tools can improve code-writing efficiency by 5% to 30%, but broader productivity gains are harder to measure due to the multifaceted nature of software engineering work.
  • Brian highlights the need for more research on how AI impacts long-term code maintainability, especially when it accelerates initial development phases.
  • Developers still spend only about 14% of their time writing code, so increasing throughput in that slice doesn’t automatically translate into overall productivity gains (see the worked example below).
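
A back-of-the-envelope Amdahl's-law calculation (a worked illustration, not a figure quoted in the episode) makes the last point concrete: even at the top of that range, speeding up only the coding slice yields a modest overall gain.

```python
# Amdahl's law applied to the coding slice of a developer's day.
coding_share = 0.14    # ~14% of time spent writing code
coding_speedup = 1.30  # 30% faster coding, the upper end of the range

overall = 1 / ((1 - coding_share) + coding_share / coding_speedup)
print(f"{(overall - 1) * 100:.1f}% overall speedup")  # ~3.3%
```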

Timestamps

(00:00) Intro: Why AI hype can hinder adoption among teams

(01:47) Key strategies companies use to successfully implement AI

(04:47) Understanding why adopting AI tools is uniquely challenging

(07:09) How clear and consistent leadership communication boosts AI adoption

(10:46) The value of “local champions” demonstrating practical AI use

(14:26) Practical advice for identifying and empowering team champions

(16:31) Common mistakes companies make when encouraging AI adoption

(19:21) Simple technical reminders and nudges that encourage AI use

(20:24) Effective ways to track and measure AI usage through dashboards

(23:18) Working with team leaders and infrastructure teams to promote AI tools

(24:20) Understanding when to shift from adoption efforts to sustained use

(25:59) Insights into the real-world productivity impact of AI

(27:52) Discussing how AI affects long-term code maintenance

(29:02) Updates on ongoing research linking sleep quality to productivity


Transcript

Abi Noda: Brian, thanks so much for coming on the show for a second time, so recently. I’m really excited to chat with you today.

Brian Houck: Yeah, absolutely. Thanks for having me back.

Abi Noda: Something I’m seeing right now from a lot of companies is figuring out how to drive adoption of, particularly, software development AI assistants. I’m hearing from more and more leaders that are actually doing top-down mandates from the CEO or CTO saying, hey, developers have to use these tools. I’m talking to just as many other companies that are trying to figure it out more from a grassroots, bottom-up standpoint. What are you seeing companies trying to do?

Brian Houck: Yeah. So great question, because many of the organizations that I talk with are also grappling with this. And even internally at Microsoft, we have looked at how we can increase adoption amongst our own engineers. Developers are often skeptical by nature, so you have to be able to show value to them to get them to change their ways. And we have been able to measure, not just within Microsoft but across other organizations as well, some strategies that are effective.

And while I’m not sure that mandates are going to be as well received, something as simple as having leadership strongly advocate for the use of AI tools makes developers seven times more likely to be daily users. And so just communicating out the value of it seems to drive adoption a lot.

Some other things that I’ve seen are things like offering formal training. We’ve seen that AI tools are not necessarily equally usable for all kinds of tasks. They are better at certain tasks than others. And learning which of those tasks AI is better suited for will change your outcomes. And so offering formal training makes organizations about 20% more likely to have most or all of their developers adopt AI.

And learning from your peers is something else that’s been, I think, incredibly powerful. Something that many organizations have found useful is not just having some central infrastructure team say, hey, go and use AI, but rather having local champions within a team say, hey, I’ve tried AI, it’s great, this is how I use it. Actually have brown bag sessions where they share their screen and walk through their real-world use cases that will resonate with that team. We’ve seen that organizations that use local champions like this are about 22% more likely to have all or most of their developers adopt AI. And so those are just some of the strategies we’ve been able to measure as effective.

Abi Noda: So leadership advocacy, formalized training, local champions. Before going further into these, I want to ask you, in some ways this problem of driving adoption is no different than driving adoption of any other development tool or practice in an organization, but in other ways it is new because AI seems to be just very top of mind for folks in the press and it’s constantly shifting. So I’m curious, is there anything particular about AI that you think is fundamentally different in terms of how you think about this problem, compared to say, driving adoption of another type of tool?

Brian Houck: Yeah. The hype around AI is in many ways the biggest barrier for adoption. When I ask developers what they are most concerned about with AI, it’s that it’s a gimmick. Something like 30% of developers say that the number one concern they have about AI is that it’s not going to live up to its promise. And so having it be so top of mind for everyone sort of builds up this expectation. You either think there’s no way it’s going to be that good, so I’m not going to try it. Or you try it once and it isn’t as transformational as maybe you had in your head and you’re like, all right, I’m never going to try this again.

And so I do think it is different than driving adoption of some new build technology or a new IDE or something like that, because it is talked about so prevalently from both experts and non-experts.

Abi Noda: Yeah, there’s just a lot of pent-up expectations and judgments and pressure on all sides of it, I think, which sort of heighten the tension and stakes.

Brian Houck: Definitely. And I think even just the constant narrative around how AI is coming for our jobs puts another interesting spin on it as well. Whereas few developers who are using AI day-to-day are actually concerned that it’s coming for their jobs. Only about 10% of developers in some of my recent work have said that they are concerned that AI might replace them at any point in a foreseeable time horizon.

But it adds to the skepticism, where it’s like, oh, if people are saying this is so good, it’s like having another human. It’s like, no, it’s an assistant, it’s a coding assistant. Those are very different things. But it adds to this barrier that has to be cleared.

Abi Noda: So let’s get a little more tactically into your approach, your learnings around how to drive AI adoption and impact. First of all, maybe take us back to the beginning of one, or collectively all, of the different areas of work you’ve done here. How did you think about attacking the problem at the start?

Brian Houck: So we’ll talk about the advocacy angle first, because I think one question that many developers have is, am I even allowed to use these tools? Are they available to me? Am I allowed to use them? Am I expected to use them?

And so I think having organization-wide, certainly team-wide, messaging on what the expectations are, what is available to me, what kinds of things am I allowed to use it for, what kinds of things shouldn’t I use it for, sort of helps get over this initial fear that’s like, ooh, am I even supposed to be feeding any of this into AI assistants? Yes or no?

And so I think having that advocacy come from the top, not just to encourage use, but to say, no, no, this is okay, we actively want you to do this in these ways. It’s about framing the opportunity for developers and making them feel safe to try it. This is a tool to help you achieve your best work. This is something that will help solve the problems that you have told us you have in your day-to-day workflows.

And I think that messaging is step number one, and it’s got to be consistent from leadership. Having leadership not just send out one email blast to the org, but once every couple of weeks say, hey, we really encourage use of these tools, try them, here are all the resources available. That is the foundational piece that we had.

And none of that really helps unless we can track our adoption. And so not only do we want to roll these out, but we want to be able to make sure that we can measure which of these approaches are having an impact. We do that first and foremost by just looking at who has installed the tools and who has tried them, and then we try to increase the level of usage from there.

Abi Noda: In terms of messaging from leadership, team managers, you mentioned sort of tying it to the developer’s own day-to-day, like here’s how this can help you. What other messaging have you seen? Are there any dos or don’ts? For example, messaging to the tune of, if you don’t do this, you’re going to be left behind, or if we as a company don’t do this, we will be left behind.

Brian Houck: I think the biggest don’t is don’t overpromise. Don’t promise that this is going to solve all of the toil that you have in your workflows or that this is going to be able to completely take away any responsibility you have to find bugs in your code and things of that nature.

And so the biggest challenge is combating the skepticism that many of us have around AI tools, so don’t overpromise. Really framing what it is good for, and the ways you should expect it to help you in your day-to-day, seems to be the better approach.

Abi Noda: So we have that messaging from the top, advocacy from the top. We have some visibility into who’s actually using it. What are some of the other components of driving success here? So you have some people at the top, team leaders saying, hey, we encourage you, it’s good, here’s why. We’re looking at the reports and still only 15% of people are using these things on a weekly basis. What are the next components to success?

Brian Houck: Yeah. So one of the things we have seen is that, as I previously mentioned, AI tools are not equally well-suited to all kinds of tasks. They are better for certain kinds of tasks than others. Coding assistants in particular are great at repetitive, more mundane tasks, great at getting you started, generating some boilerplate to build that flywheel. And for some of the more complex, novel tasks, it still requires devs to be more front and center.

And so the more developers use AI, the better they are able to learn which tasks it’s better suited for and which it’s maybe not as good for. And teams being able to share that knowledge amongst themselves is sort of the next key: how do we optimize the usage we are getting out of it in order to create that flywheel effect, where we learn how to use it a little bit better, we teach our peers, we share our best practices, they learn more, and we get that momentum going? And this is where those local champions really come in.

And in our workflows, unique and specific to our team, what are the ways that we have actually found AI to improve our developer experiences? That’s where finding more senior champions whose voice carries a lot of weight within the team, who have found some of those best practices, who have found successful ways to integrate it into the workflows, and making sure that they have a forum to share those best practices, is sort of that next level. That seems to be very helpful.

Abi Noda: Brian, you’re far more of an expert on this than I am. So a question I have for you is, is there a world in which there’s universal guidance around those optimized use cases? Or, just based on the nature of the technology and how LLMs work, is it really important that this be a locally-driven process to discover those optimal use cases?

Brian Houck: It is such a great question, and I think the world today is going to be different than the world a year from now. And it’s hard to predict what will happen a year from now. I certainly can imagine that at some point in time the use cases are going to converge to more of a universal truth.

But what we’re finding right now is that because developer experiences are so fragmented, so diverse across organizations, and even within an organization they can be incredibly diverse, it really requires those local champions. Because not only is it a function of finding how it is most locally relevant, but those local champions’ voices carry a lot of meaning, they have a lot of respect. And so if they say it, that’s able to give it a lot more credibility.

But because AI coding assistants in particular are not as universally good across all languages, and they certainly aren’t equally good across different kinds of applications, you are going to see some very specific local differences.

Abi Noda: How do you go about identifying these local champions or initial pilot teams? Is it purely based on feel, whoever is raising their hand and is most excited? Or have you thought about this, or seen it thought about, in a more proactive, deliberate way?

Brian Houck: So we often find that it is effective to have managers within teams identify very strong, more senior devs who can fill this role. Certainly that’s going to include the kinds of developers who would otherwise raise their hand and say, oh yeah, I’m interested in helping. But it’s also which developers were early adopters because they wanted to be, which ones are passionate about mentoring and sharing their learnings with their peers. And the best way we have found of scaling that out is asking managers to identify, within their own teams, who would be these ideal candidates.

Abi Noda: Can we dig into that a little more? Have you really programmatized this, where there’s a spreadsheet of the champions for each team, and these champions have X, Y, and Z responsibilities?

Brian Houck: I think it’s certainly going to be different across organizations, and even within Microsoft it is fairly inconsistent across the company. I think there are some advantages to trying to programmatize this in a more formal way, where there are clear accountabilities and things of that nature.

But in general, I have seen it be successful to have it be more grassroots. And while organizations may be tracking adoption and managers are getting some pressure to find ways to drive it, letting them and their teams figure out the rhythms around champions does seem to work as well.

Abi Noda: Once we have these champions established, how do you really drive that adoption? We talked about one failure mode earlier, which is skepticism from developers: if you overpromise and lean into the hype, there’s a good chance it’s going to backfire and create a poor experience, and even more skepticism from developers.

What are other failure modes you’ve seen? For example, is setting individual targets for every developer, or putting up a big leaderboard of adoption, a failure mode?

Brian Houck: Yeah, I’m not convinced that shaming the individual developers who aren’t adopting AI is a path to success. I haven’t actually proven that that’s a failure mode, but I would strongly suspect that it is. I do think that, like other developer experience metrics, tracking adoption at broader, larger team levels isn’t inherently a bad thing, but calling out the specific individuals probably would be a bad thing.

I think that some of the other big concerns that devs have are around things like AI introducing vulnerabilities or bugs into your code. And so your messaging should include the notion that it is still our responsibility to ensure quality. Ultimately it is the humans who are the last line of defense for quality, and we ensure it through rigorous testing and things of that nature. I think that oftentimes the conversation doesn’t touch on that. It is often about, oh, how can we just ship code faster, ship code faster, ship code faster? And I have had devs say, but what about quality? I know that concern about quality is the number two thing that devs are most concerned about.

Abi Noda: Yeah, after hype.

Brian Houck: Yes. And then something else that’s a bit of an antipattern is not being clear that leveraging AI is not atrophying your skills; it is changing what it means to be a highly productive developer. Many developers are concerned that, oh, as we start to rely on AI more, our own technical skills are going to atrophy. And we need to reframe that conversation: what it means to be a developer is changing. AI is a coding assistant or a pair programmer. It is not replacing us. It is someone that can sit next to us and help us and help guide us. And we have to learn how to leverage that and listen to that advice. And so the skills we need are going to change. It’s not a function of atrophying our existing skills.

Abi Noda: Yeah. So we’ve talked about some cultural and procedural methods for driving tool adoption: local champions, formalized training, advocacy. Any technical methods, like reminders or nudges or building, I don’t know, AI into… It’s already integrated in the IDE, I’m just thinking aloud here, but any technical solutions you’ve investigated or attempted?

Brian Houck: I haven’t, but it’s such an interesting idea. I do think in general nudges seem to work for lots of aspects of the developer experience, and so I could totally imagine that giving nudges would help. I think something else I have heard as a barrier to adoption is just having the time set aside in order to try it out.

And so I think that’s where nudges could potentially help, where if you notice a developer has a free block of time, oh hey, maybe try installing whatever approved AI coding assistant is out there and give it a shot. But I haven’t actually done any formal studies on that.

Abi Noda: And we talked about some of your recommendations around reporting, looking at it more at the group or team level. I’m curious to ask you a little more tactically, how do you use data in this process? Are there any specific reports, scorecards, or readouts that you’ve found effective to help with this organizational process of driving adoption in particular?

Brian Houck: I think the most effective thing we have found is to have dashboards for leaders that just track the raw amount of usage. It turns out that humans are fairly competitive, and so leaders are going to be very motivated to drive adoption within their teams, both because the evidence does seem to be pretty clear that AI is helping us be more effective at our jobs, so they want their teams to be more effective, but also because they want to drive it faster than their peers.

And this is a place where I think that competitive behavior actually leads to good outcomes, because devs, once they have adopted AI, seem to overwhelmingly like it. In fact, 80% of developers who use AI say they would be sad if they could no longer use it. And so it’s a question of how you get that initial adoption to get that hook in.

And so having leadership scorecards, where organizational leaders see: what percentage of my team has adopted it? How many people have tried it and then fallen off? How many people are daily users? All of those things, leaders seem to want to drive more.

Abi Noda: Getting really tactical, how are you tracking active usage? Are you looking at daily active users? Are you looking at the number of people who have used it X amount of days in X amount of days of the month? How are you slicing the data?

Brian Houck: Yeah. So we don’t boil it down to a single score. We look across lots of different columns: who has installed it, who has tried it once, who uses it once a week versus once a month versus every day. And so we have levels of usage. And not just within Microsoft but across the organizations I’ve talked to, that seems to be a fairly good breakout: who are the daily users, using it at least four days a week; who uses it once or twice a week; versus monthly users, once or twice a month. And then who has used it before but hasn’t used it for a month, who has fallen off that train. And so I wouldn’t try to say just, oh, you are an AI adopter or not. It’s not quite binary like that.
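
In code, the tier breakout Brian describes might look something like this minimal sketch. The input format (a list of dates on which a developer used the tool) is hypothetical, and the weekly/monthly thresholds approximate his description rather than any official definition.

```python
from datetime import date, timedelta

def usage_tier(usage_dates, today):
    """Bucket one developer by recent AI-tool usage: daily = 4+ days in
    the past week; weekly = any use in the past week; monthly = any use
    in the past month; lapsed = used it before, but not in the past month."""
    if not usage_dates:
        return "never used"
    past_week = [d for d in usage_dates if today - d <= timedelta(days=7)]
    if len(past_week) >= 4:
        return "daily"
    if past_week:
        return "weekly"
    if any(today - d <= timedelta(days=30) for d in usage_dates):
        return "monthly"
    return "lapsed"

today = date(2025, 6, 30)
history = [today - timedelta(days=n) for n in (1, 2, 3, 5)]
print(usage_tier(history, today))  # -> daily
```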

Abi Noda: Right, makes sense. You mentioned infrastructure teams often being put in the role of rolling these things out. Infrastructure teams don’t necessarily have direct authority over what teams are doing and how much developers are using these tools. This is a broad question, but as an infrastructure team or leader, how do you get these local teams and leaders to do things that you don’t have direct authority to tell them to do? How do you think about that problem?

Brian Houck: This is where partnering with leaders up the chain for that advocacy plays such an important role, and that is why I think we see that those organizations where leadership does strongly advocate for it are substantially more likely to get teams and individuals adopting AI.

And so the central infrastructure teams absolutely can roll it out. They can try to make the case for how this is going to improve developer experiences, but it does seem to really get supercharged once people further up that management chain start advocating for its use, in combination with the rollout from the central eng teams.

Abi Noda: And at what point do you call it done? So this initiative to drive adoption, at what point does someone like you, someone like the infrastructure team sort of stop being the instigator of this and sort of say, all right, now the organization can just kind of carry on?

Brian Houck: When I get there, I’ll let you know. Unfortunately, it’s not so cut and dried as saying, oh, when we have 70%, 80%, 90% adoption we can wash our hands of it, because we have also found that the more you use AI, the better you are able to use it, the more you learn from it.

And so once we get a large blanket of adoption over the vast majority of engineers, say 70%, 80%, then it’s a function of how do we get that frequency of usage up? And I think, at least for many of the organizations I’ve spoken to, we’re still somewhere on that curve. So it is not done yet. Because at some point, to your question, the vast majority of developers are using AI tools, they’re using them very frequently, it’s seamlessly integrated into their workflows. And then it’s going to focus more on how do we integrate AI into some of our central infrastructure tools, or how are we able to better locally train some of these models, and things of that nature?

Abi Noda: Just a couple of months ago when you were on the show, we talked a little bit about the real impact we’re seeing from AI tools. Even just since then, at least in the data we see at DX, we’ve already seen shifts in the adoption numbers and the impact numbers. I’m curious, what’s your view at the current moment? If an organization asks you how much more productive their developers can be right now with the tools available, what’s the answer to that, in your view?

Brian Houck: Like with many things around productivity, the answer is: it’s complicated. Because even for things like AI coding assistants, there are lots of studies out there that have confirmed that AI coding assistants allow you to write code faster. It increases your code throughput, maybe fairly dramatically, maybe more modestly. But developers only spend about 14% of their day coding. And so it’s like, oh, how much more productive am I? It’s tough, because I may be helping with part of what a developer’s role is, but a software engineer is a lot more than just a coder nowadays.

And so I do look at things like, what is the increase in our coding efficiency? Not necessarily how much more productive we are overall. And if I look across a wide range of organizations, I’m seeing numbers from 5% to 30% more efficient with their coding.

Abi Noda: Yeah. That’s what we’re seeing as well. Something I’ve been thinking about, just personally for the engineering team at DX is just this idea of, there’s a cost to write code, but then there’s a cost to maintain it over time. And I’ve just been thinking a lot about, we’ve been doing some greenfield work that’s been incredibly accelerated through AI tools, but then inevitably a couple of months later we have to go back and modify that code and build on that code.

So it’s just made me think about, okay, we accelerated zero to one with AI tools, but did we decelerate what comes after? To what extent is the 1 to 1.3 then hindered or accelerated? I’m just curious if you’ve seen anything around this in your research?

Brian Houck: This is why it’s so complicated to answer the question, how much more productive are we? And one of the things I am looking at right now, in fact, with a study, is the impact of AI on code maintainability and other aspects of code quality. Unfortunately, I don’t have anything quite ready to share yet. But my mental model is that these coding assistants are like a pair programmer. And if you aren’t doing it as a partnership, that’s going to be potentially problematic, at least in the short term, while you figure out what it is that you let your partner go off and do.

And so really doing this as a partnership I think is helpful. Because if you just say, hey, write everything for me, and then the dev isn’t sort of still involved in making sure that it works and it satisfies all your quality criteria, that does seem to come due.

Abi Noda: Brian, thanks again for coming on the show and sharing your wisdom and insights. I’m really excited to share this knowledge with leaders and organizations more broadly. So again, thank you for your time.

Brian Houck: Perfect. Thank you so much for having me.