Jonathan Biddle, Director of Engineering Effectiveness at Wayfair, shares the story of how his team found repeat success and subsequently grew in size and scope. He shares lessons they’ve borrowed from startups, including understanding the adoption curve and knowing your core users, and offers advice for other platform teams looking to move to the next stage.
Abi: Jonathan, thanks so much for coming on the show today. Really excited to chat.
Jonathan: Yeah, happy to be here, Abi.
Abi: The theme of our conversation today is around driving adoption of developer platforms. And I think you've been on such an interesting journey building different platforms, so I'm really excited to dive in. I know that your journey really started with the Python Platform team at Wayfair. This was your first time working on a platform team, so I'd love to hear how you came into that role and why Wayfair formed this team in the first place.
Jonathan: Yeah, it's an interesting story. I never actually applied to work at Wayfair. I came in through an acqui-hire situation back in the summer of 2015, and I landed on a team that was a lot of fun, marketing engineering. I ended up focusing on the notification systems there, and I also established myself as a bit of a resident Python expert. Python's easily the language I've used the most in my career as a software engineer. And there was a point at which Wayfair recognized that a small but important subset of its systems were running in Python. We were largely productionizing data science models through Flask. And those systems were supported, for the most part, by volunteer efforts from other teams. There wasn't a strong, robust platform team to own and operate those things, recommend best practices, and drive it forward.
Most of the systems at the time were PHP, with a small subset being Java and C#. So there was a point where there was a bunch of revenue and value flowing through these systems, and it needed to be taken a bit more seriously and with more intentionality. And when a meeting was called together, I was, at the time, leading a bunch of folks under marketing engineering on notifications, being the Python expert of that group. We had one Python system that we owned there. We were discussing: what do we do? Do we form a platform team for Python? Where does this go? And I remember them describing the job and thinking, oh my God, I think that's my dream job, to help a company figure out how to do Python well and actually see it become a tool that was available to anyone.
So in the course of that meeting, I raised my hand. I was like, "If you are looking for someone to start said team, I would really enjoy being that person." So it was a bit of a gamble. It was going from leading all of the notifications to starting a brand new platform team, something I had no experience doing. So starting totally from scratch, but that was the start of the journey. That was, I think, at the beginning of 2017.
Abi: I'd love to hear more. As you describe that story, one common thread when I talk to a lot of platform and DevX leaders is that they're all really, really passionate about the domain they work in. And not to say that other engineering leaders aren't as passionate, but I think it's a little bit different. Given the way you told that story, I'd love to know: what was it that got you so excited? Why did shifting into that role attract and excite you so much?
Jonathan: Yeah. So I didn't start development platforms. That team existed long before I joined Wayfair. My team was Python Platforms, which was one of the teams within development platforms when I kicked it off. I think that, early on, the team was essentially a bit of a grab bag of general-purpose tooling and things that fell between the lines of all the teams out there. It was just, well, who's going to own that? Well, development platforms can own that.
I don't think, initially, it was really looked at as a key investment area to drive developer productivity across the entire organization. It was just ... That thing is used by a lot of engineers, doesn't have a clear owner, give it to that team. I think a really cool part of the journey was actually playing a key role in turning that team into a team that's a very key investment that has a very broad, high leverage impact across the entire engineering organization and, ultimately, helps the business get more business returns on its engineering investments. So that transition happened while I was on the team. It didn't happen overnight, but it was a fantastic journey to be a part of.
Abi: That's awesome. Well, excited to get into it. And I loved the way you talked about how you led that shift. The analogy I like to use is shifting from just being a cost center, platforms just maintaining a bunch of stuff, to being a multiplier on overall developer productivity that should be driving business-dollar returns. Sounds like you think of it similarly.
Jonathan: Yeah. If I can use my analogy briefly on this one, it would be: if you think of software engineers, software engineers generally don't provide any direct business value. In Wayfair's case, we don't pay software engineers to handle the shipment of goods and bring them to a customer's house, or take orders, or anything like that. Instead, they're building these systems that allow us to scale and be a successful business with many, many more customers than you could have without that technological innovation and automation. You can think of software developers as working on this meta layer of the business. They don't provide any direct business value, but when you use them well, your business can scale way higher than you could if you were, say, a catalog being mailed to people's houses with a manual bookkeeping process to actually ship the goods. That's just not going to get you very far.
I look at platform teams as almost like this meta layer on top of the meta layer, so their leverage gets even larger. But what happens at the same time is that the further you remove yourself from the revenue-generating, value-creating activity the business does, the more unclear it is how you actually apply that leverage to create business value. So I look at any platform team as, ultimately, trying to help the business scale and be more successful. You're in a position of immense leverage, but you're also in a position of immense ambiguity, and it's not clear how and where you can find opportunities to drive the creation of business value. But if you can find those, it's incredibly impactful.
Abi: Well, I love that analogy and description, and I think it leads into the beginning of, I think, the story of your journey. So you had mentioned to me once you came into this role, you volunteered, got the role, you had to start thinking about how to run a team like this. And you mentioned to me you thought of it like a startup. Can you share just the early beginnings and what you meant by thinking of it as a startup?
Jonathan: So part of it was almost born out of necessity: me never having been on an infrastructure or platform team before and being faced with, "Okay, well, now I'm responsible for this team," which was just three of us at the start, succeeding and not being dismantled in six months. We had to figure out how we could actually create value.
And so what I did was reach for what I was most familiar with, which was startup entrepreneurial thinking. And I'm like, well, if we look at the team, we've got potential customers. We've got investors. And if we don't satisfy enough customers to satisfy our investors, we're in trouble. And the more I thought of it through that lens, I'm like, okay, actually, a lot of that startup entrepreneurial thinking still applies. It's just a different incarnation of almost the exact same systems problem.
So we started to look at it that way and think: how do we create a startup? How do we get this thing off the ground? How do we make sure that we're providing sufficient value to justify our existence, or hopefully our growth? And that led us to this general thinking of, "Okay, well, right now at step one, we don't really know where we can provide the most value." We had a ton of assumptions and a bunch of potential users: people who were currently using Python, and people who might want to use it. So how do we find out where we can actually provide value?
Our answer was: well, let's get really close to them. Let's literally partner with them. Just find a team and say, "Hey, we want to help you with your project," just like a free engineer showing up. And while we're doing that, we're seeing what sucks, what we could make a little easier. Just through getting super close to the customer, so to speak, a whole bunch of opportunities showed up: "Oh, we can make that better. That really is awful when you have to go do it. Bringing a new dependency into the environment is a rough experience." All of those things created this really firsthand, experience-driven backlog of where we could provide value.
And that was what drove a lot of our initial work. It was admittedly fairly shortsighted. It was like, how do we get value out the door in the next two months so that we can show people, "Hey, look, this is getting better now that we exist"? And we had a way of thinking about it, which we called transactional value versus recurring value. Transactional value would be: I go and sit with your team and help you debug something. That value is created once. That team is super grateful, but it doesn't really help anyone in the future. Recurring value would be: "Hey, people keep running into this problem over and over again. I'm going to give them a library they can just use. And if that works well, no one really has to deal with this again."
What we found early on was that we would do transactional value-creating activities to, one, demonstrate the value and expertise we bring to the organization and, two, identify opportunities to drive further long-term recurring value. It was a bit of a cyclical effort: we were not going to make long-term investments until we validated them with that transactional value, by getting really close and making sure we were solving a real problem.
Abi: Well, first of all, I love that model of transactional versus recurring. I mean, to go off this startup analogy, I can think of startups that launched in this exact way: they began as a service, went out and provided that service to some market segment or customer base, then identified a common pattern or problem, and ended up creating a solution or product that brings in recurring revenue instead of transactional services revenue. So it sounds remarkably similar. I really like it.
Jonathan: Yeah, and honestly, when you really take a step back, to me it still feels like the same problem. I would actually say the biggest difference is that startups have the benefit of observable revenue. You have a bank account, and it is shrinking. You can see when you put more money in the bank and you can see when it evaporates, and that gives you a very clear, real feedback loop on how you're doing at creating value.
Platform teams generally don't have that. But I think the same dynamic still exists: if you don't have buy-in from the organization because you're not creating value, eventually that team probably isn't going to exist anymore. So you still have the runway concept as a team, you just lack the easily observable measurement of revenue, runway, and a bank account. And I think you have to supplement the lack of that with very intentional thinking: where are we creating value? How is this helping the business succeed? What do our customers think of our product? That kind of thing.
Abi: As we talk about it, a lot of platform teams we talk to have to work really hard to even become a team. You certainly, it sounds like, had high stakes to prove the value of your team. But in the eyes of the leaders who allowed your team to come into existence and take a shot at this, were they just viewing this as a bet? Like, we have 90 days of runway to prove this wasn't a mistake?
Jonathan: So I think my ambitions were a little higher than the initial investment was. The initial investment was more or less, "Hey, can you stabilize this edge case of the Python systems?" I think there were 13 or so services that were running in production, and they were like, "Can you create a nice little foundation layer for that?"
My ambitions were: I want Wayfair engineers to feel like they can reach for Python when it feels like the right tool for the job, and for it to be an accessible and non-controversial tool. And I thought, if that can happen, Python is actually a pretty strong match for Wayfair's cultural strengths: high pragmatism, enjoying speed of iteration, those types of things. It's a really good fit, and that was my ambition. So I used that initial investment to say, "I think we can go a lot further than just creating stability in this space. I want to see it unlock a bunch of new value."
Abi: That makes sense. Well, I want to continue asking you about this journey then. So early on, you were balancing the transactional with some recurring. How did this begin to evolve? What were the larger goals you started to set, or the larger projects, maybe those recurring solutions, that you set your sights on as you moved out of that very early initial phase?
Jonathan: So initially, our documented mission statement was to make Python at Wayfair incredibly accessible, low effort, and high output. That was where we started. We quickly, I think with our early adopters, started to see a lot of success there, and then we had to move the goalposts for ourselves a little bit. At the same time, one thing that was happening at Wayfair was we had a large monolithic PHP ecosystem, and deploys at this time were getting pretty congested: a couple thousand software engineers trying to get their deployments out to a single shared monolith. Like a logjam of deployments trying to get out to production. And we saw that as an opportunity, too.
Well, we have this new ecosystem, this new option we're building over here. We could use it to give teams an alternative so that they could continue to release business value, because that deploy congestion was creating some sizable risk.
So then our goalposts moved to establishing ourselves as a beachhead for decoupling from the monolith. That ended up becoming a pretty big opportunity for the organization, and it led to a lot of growth of Python, I think in 2018 in particular. The majority of Wayfair's production systems even today are still Python because we created such a surge of adoption early on. That might not be widely known, but Wayfair is probably one of the lesser-known big Python companies. We have many, many production Python systems. It's, by a few measurements, the most used language in the organization.
So we essentially accomplished that. We gave that deploy congestion issue a timely release valve. Then we said, "Well, let's take it to the next step. Let's lead Wayfair's developer productivity transformation," because that's something we're really passionate about. That's really what these teams are doing. At this time, our scope was expanding beyond Python. We were starting to look at C#, Java, PHP, just anything where we could help developers become more productive. That's when we rebranded to Developer Acceleration and started to go after anything that could unlock developer productivity or take friction out of developers' workflows, those types of things.
And then today, as you know, we're continuing that journey. I would say our mission has evolved to making Wayfair a world-class destination for global engineering talent. We want people to come here and feel like it's just the best developer experience: they feel really well enabled, and they feel like they can bring an entrepreneurial spirit to technical problems. That's our aspiration at this point. We want to make this a standout engineering shop where you can come and really enjoy solving real business problems, bringing that engineering perspective and talent to the table.
Abi: Well, it's really an inspiring journey. I mean, to say it jokingly, you're like a startup that's made it. I mean, the journey you've been on-
Jonathan: That's what it feels like.
Abi: You started with this very specific scope and ambition. And to continually expand that scope and charter to what you're doing today, I think that's what every platform team dreams of. So that's awesome. And we'll loop back around to what you guys are doing currently, but I want to go back then to that early success. You'd mentioned to me, it felt like a resounding success. It felt like a startup that took off. And then you mentioned how you really asked yourself, what are the theories that led to that success?
And I'm hoping that's what we can share with listeners today is, what are the lessons they can apply to have the same success wherever they are on that journey? I'd love to start with, you mentioned reading Crossing the Chasm and then Diffusion of Innovation. I've partly read both of these books. Can you share more about your view on the journey you went on and particularly around adoption?
Jonathan: Definitely. So, yeah, this is coming from a time when we were seeing a lot of success. It absolutely felt like the closest I will probably ever come to being at a startup that just takes off. That's literally what it felt like. And I was not comfortable with the fact that if someone asked, "What did you do? What was your strategy that really worked?" I'd be like, "Oh, we trusted our instincts." That's terrible advice. So I wanted to understand it much better. I wanted to have an answer if someone asked, "What were you doing here that was working so well?"
And so as part of that, I wanted to do some reading and really tease apart what part of this strategy was working. And Crossing the Chasm, I hadn't actually read it; it was on my list of books to read, and it was one where I was thinking, "Oh, this might be pretty familiar," the technology adoption curve. And that's basically what happened here. Before I read a book, I like to read the Wikipedia article, because reading books is expensive from a time standpoint. And then I saw that Crossing the Chasm was basically a modern telling and take on a book written in the '60s called Diffusion of Innovations. And I'm like, "Well, which one do I read?" And I landed on, "Well, I'm just going to go to the source," the nucleus for all of this.
And I'm happy that I did. I'll put it out there: it's a bit of a difficult book to get through. It errs on the academic side. But there are some sections that gave a seeing-the-Matrix level of clarity: that is the exact pattern we saw take place, and that's why our approach was working here. After going through that book, it became so clear, "Oh, these are the dynamics that are happening." And I felt very comfortable that we could take that knowledge forward, and we have since then, to figure out why something isn't being adopted or why something is working really well.
And this started to feel like, "Okay, now I have a story I can tell." Now I can go to another team and tell them, "If you're struggling with an adoption-related problem, here are some ways to look at it that can help you better understand what's going on."
Abi: So I'd love to get into those. Let's say you're a platform team fairly early on, where you guys were: just formed, you've gotten past that proving point, you've maybe started working on some new tooling, but no users yet. How can you apply your theories and learnings in that scenario?
Jonathan: I think there are three key concepts to understand. The first is what the adoption curve means. Basically, early on when you don't have a lot of users, you're looking for a particular type of user who is interested in a particular type of product. Take the iPhone as an example. When the iPhone first came out, it didn't even have copy and paste. It was very limited in features. And there were a bunch of holdouts, myself included, who were like, "I'm going to wait a few generations for this to mature a little bit." And then I eventually bought one.
But there are people who are just like, "Oh my god, this is amazing." They're willing to pay the high price, and they're okay with a bunch of features being absent. That's not for everyone. And I think you have to just recognize that that's the stage you're in. Don't worry about the people who are waiting for the more mature product. Your job right now is not to satisfy their needs. Your job is to find your innovators, the people who fit the archetype of someone who's ready to try something, to be first over the line, that kind of thing, and solve their needs first. You'll worry about the other group later.
So that's one thing is to be aware of the adoption curve.
Happy to go into more detail on that in a moment, but I also want to talk about the second concept: what makes people fit into that earlier segment. A lot of it comes down to their risk appetite. There are a bunch of theories you can read on this; I won't go into all the details. But, for example, if they perceive high relative advantage, they can look at this and say, "That makes my life a lot better." That makes them good for that early adoption phase.
If they have high expertise, that makes them great. If they perceive their situation as untenable, that helps too. The iPhone probably isn't a good example of this, but it was actually applicable in our situation. We didn't even realize it at the time, but we were going after teams who were like, "We're struggling to reach our quarterly goals over and over because of deploy friction." They were the teams we actually partnered with first, because they saw that they were in a really bad situation and needed to find a way out. That can make someone great. That, combined with high technical expertise, puts them squarely in your innovator category, that first group of users.
If you end up targeting the wrong group, it can often feel like you're trying to sell and sell and sell someone on using a thing and they're just not interested. Then the third concept is the qualities of the thing you're trying to sell them. That one gets really interesting: how easy is it to understand? Can you try it out first? How easy is it to actually implement? Are the benefits observable? In some cases, an innovation can be fantastic, but the benefits just aren't observable. And you'll see some people adopt it and then abandon it because they weren't able to actually see the value it was creating, even if it was creating real value.
So there are a bunch of things on the solution side that cause adoption to stick, and it pays to be aware of what those look like as well. You piece those three things together and you can start to be very intentional about: what do I need early on to satisfy the needs of that really early group? And then, how do I apply the adoption curve to know where I need to transform my thing over its journey?
Abi: I love that model and way of thinking about the initial beginnings of building out a solution or a platform tool. I'm curious, and this is a question someone asked me last week: how do you think about personas? Even for, let's say, a Python script or CLI tool you're building, you probably have to think, "Okay, there are 30 different teams doing Python and they actually work in slightly different ways." Was it difficult to really pin that down? Did you have to focus on some teams instead of others? How did you navigate that, or what would your advice be?
Jonathan: So to start, we do have some personas. We had them more in the beginning, and then we ended up landing on a concept that I got from a book called Let My People Go Surfing, a Patagonia leadership book. It's a fun read. But there's a concept in there I really like called core customers. From what I remember, the gist of it is: when they design a backpack, and I have a Patagonia backpack I use to take a train into the city carrying a laptop, they're not designing that backpack for me, even though I might make up a larger percentage of their customers. They're designing for someone who's taking that backpack over the Appalachian Trail or something like that, having it in the rain, getting it muddy, dropping it, all those types of things. They establish that as the core customer that they're designing for.
I happen to be a happy byproduct of building a good product for that core customer. So we started to borrow that same concept, and we established our core customers, call them senior engineers: people who have been around the block a few times, know what they're doing, and can be trusted with a sharp tool. We want to look at what their needs are, what type of documentation they need to be effective, solve for their needs first, and then help others close any gaps to get to that place.
So if someone's unfamiliar with Python, or Java, or anything like that, how do we get them to the point where they are skilled and knowledgeable in that system, versus building a safety-scissors version of the solution? I think that's a trap you can fall into unless you really cement the idea that these are the core customers, we're going to solve their problems in an uncompromising way, and then figure out how we can help other people make the journey to that archetype.
Abi: Got it. So building the professional-grade tools, not the big-box-store-grade tooling.
Jonathan: Right. I think if I remember correctly, the idea was that if Patagonia was to start building backpacks for me, they would alienate their core customers because the quality goes down. And then eventually, the reason that I like it starts to slip as well and they lose everyone. I think that was the theory in general.
Abi: Yeah, no, that makes a lot of sense. I'm curious, as you started to get user adoption of your system, and you mentioned earlier this program where you had people on your team embed with other teams, but besides that, what were some of the feedback-gathering mechanisms you had? Were people just coming to you, shooting you emails, or were you doing... I don't know, what were you doing?
Jonathan: A lot of things. So that embedding with other teams actually became a really key part of how we ran the department. We would generally try to have at least one person from the platform team just contribute to, as I called it, a feature team's work: gathering perspective, maybe helping show them how to use a new tool. But one of the most important things it does is create a really tight product feedback loop for us to ask, "Where is it rough? What could we do to improve their situation?" We also go out and do customer interviews, but I find that while we do get great feedback from "customer interviews," there's nothing quite like walking in their shoes for a quarter. You'll see what it's really like, what's really a problem.
Sometimes you'll see things that people might assume are not addressable, and they totally are. Unless you're actually there living that life, you would not realize it, because no one complained about that specific thing that creates a bunch of friction. Every time we do these engagements, we do a lot of documentation around what we're learning. There's always, I think, something interesting that we take away from them.
Abi: Yeah, that's really interesting. I was talking, a few weeks ago, to a former leader at Netflix and he used the phrase like a pebble in your shoe. If you just go ask developers what's painful or what's slowing you down, they might say one thing, but if you were to actually observe how they're working and experience it, you would find these hidden pebbles in the shoes, so to speak. So that makes a lot of sense.
I would love to ask, you've talked about adoption and so I imagine that you were tracking adoption-type metrics. I don't know what that looks like, but I'd love to know what types of signals or metrics you tracked to understand your own success and how those metrics or signals maybe have evolved over time.
Jonathan: Yeah, there are a few different signals we've looked at over any product's journey, and adoption is a key one. Let's say you have 5,000 engineers in a company. Not every single engineer at the company is necessarily a potential user of the thing you're building. We borrow a term from, I think, the business side of things called the serviceable addressable market, which basically describes the users who are legitimately potential users of your thing. The way I think about it is that the relative advantage is there: if they were using it instead of whatever they're using today, hopefully their effectiveness would go up.
So have at least a broad-strokes concept of what that number is, your SAM (serviceable addressable market), and then look at how many people are using it today. In some fashion, we do try to measure adoption, and that percentage will land you somewhere on the adoption curve. So if we're in the first 2.5% of adopters, following Diffusion of Innovations theory, we're dealing squarely with the innovators. At that point, what we try to do is say, "Okay, look, right now it's about testing assumptions. It's about trying to surface assumptions we aren't even aware we're making and making sure that what we're doing is actually creating value." What does adoption of this thing even mean? Does it improve their ability to deploy code or reduce some friction in some way? And how do we measure that?
This is where we actually try to figure a lot of that out. And then we get to the beta phase. We call it a beta, but we're very intentional about what that means: that's the next 13.5%, which brings us up to 15% of users using it. At that point, what we're trying to do is stabilize it, refine it, and get it ready for general availability.
Once it's GA, you're just trying to drive adoption toward 100% of that SAM, or as close as you can get. So just looking at the percentage of users who have started using the thing, you can get a lot of understanding of where you're at in that journey. In addition to that, like I said, I do think it's very important to be aware of what adoption of this thing actually means in terms of value, because adoption alone is not necessarily the creation of value. It can be a good proxy for value, but you should be aware of, "Oh, if they're using this tool, they should deploy faster. Are we actually measuring that? Can we differentiate that from teams who haven't used it yet?" That kind of thing.
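The staging Jonathan describes maps directly onto the Diffusion of Innovations adopter categories: the first 2.5% of the SAM are innovators, the next 13.5% (up to 15% total) are the beta audience, and beyond that you drive toward general availability. As a rough sketch of that arithmetic (the function name and the phase labels are illustrative, not an actual Wayfair tool):

```python
# Sketch of the adoption-stage thresholds described above, following the
# Diffusion of Innovations adopter categories. Names and labels are
# illustrative; adjust to your own team's vocabulary.

def adoption_stage(adopters: int, sam: int) -> str:
    """Classify where a product sits on the adoption curve, given its
    current adopter count and its serviceable addressable market (SAM)."""
    if sam <= 0:
        raise ValueError("SAM must be positive")
    pct = adopters / sam
    if pct <= 0.025:   # first 2.5%: innovators
        return "innovators (test assumptions)"
    if pct <= 0.15:    # next 13.5%: beta / early adopters
        return "beta (stabilize and refine)"
    return "general availability (drive toward 100% of SAM)"
```

For example, with a SAM of 1,000 engineers, 20 adopters (2%) would still put a product in the innovator phase, while 120 (12%) would put it in beta.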
Abi: Aside from adoption, are you measuring CSAT scores or NPS, things like that?
Jonathan: We do have NPS scores for a few of our products. A developer portal is one of the things that we offer, and we try to keep tabs on how people are feeling about it, keeping track of an NPS score there. We also do a lot of interviews, and we're starting to do more with a sentiment-focused survey. I was actually just listening to your podcast with Shopify about how they take an approach on that. So I think we have a lot of opportunity there to get even more mature in how we understand and observe shifts in the general happiness of engineers and how they feel about the tools at their disposal.
There's some broad-stroke stuff we need to do with surveys. I think there's also a lot that we've always done with just pulling someone aside, having a 20-minute conversation with them, getting their feedback, documenting it, sharing it.
Abi: That makes sense. Well, changing themes a little bit. When we were speaking before the show, you had mentioned this tradeoff between organic adoption and mandated adoption, and you mentioned that you think, early on, you shouldn't force people to adopt your tool. I'm sure there are some cases where you would want to force or mandate adoption, but can you share more about your opinion on how adoption should occur in an organization?
Jonathan: Yeah, definitely, especially because this is a concept that's not in the Wikipedia overview of the book, but I think is extremely relevant for platform teams. The Diffusion of Innovations book contrasts what it calls centralized and decentralized diffusion. The gist of centralized diffusion is this: you can think of it as a team that basically says, "Hey, all of you out there, you're using this tool from now on. You have two weeks to comply." That's centralized diffusion.
Decentralized diffusion is the free market approach. It's like, "Hey, we've got this tool. We hope you like it." You send an email out to the organization and see who starts to use it. That's very decentralized. There are tradeoffs between the two; neither one is correct or incorrect. You lean centralized when you have low levels of uncertainty and centralized expertise. And if consistency is important, you basically have no other choice.
If you need to have everyone rowing in the same direction, you have to centralize. If that's not important and you can afford to have people adopting things at different times and inconsistencies are okay, then decentralized can be a good option. In that case, if uncertainty is high or you have strong decentralized expertise, where all of your potential adopters are very smart, capable software engineers, as they often are, then decentralized diffusion becomes a really appealing choice. What you get from that is higher degrees of innovation and generally stronger local optima. You get people who lean in on autonomy. Autonomy feels great: they use the tool, they maybe contribute a feature back if you have an innersource-style contribution option, and you just end up with a better, more effective solution at the end of the day. So decentralized diffusion has a lot of value.
An example of where we've applied this is there was a point in time when we migrated or made a decision to migrate our source control provider. That's definitely not a thing you want to decentralize. You don't want to have people confused about where source code is. So we went through a centralized decision-making process and once a decision was made, we tried to roll it over to 100% adoption as quickly as we possibly could. It still took a little bit because that's a big migration, but that's an example of where centralized adoption is really useful.
With decentralized, which was, I'd say, the majority of our projects, we're looking for people to choose to adopt something because they see it as so valuable. That's a great signal back to us that the thing that we're getting people to start to use is actually solving real problems for them. If something's not compelling enough to get them to use it, then maybe we should go back to the drawing board. Maybe we should revisit some of our assumptions and try and figure out why aren't people using it, how can we make it more valuable, how can we make it cheaper? That kind of thing.
Abi: I love that and it made me think of the difference between bottom-up, product-led startups and enterprise software, because enterprise software, by its nature, is usually mandated adoption once a company purchases it. And not going to name any names, but a lot of enterprise software has a reputation for not being the most delightful to the end users. So with your model and approach of decentralized adoption, like you're saying, I think you get much better signal and probably much better ongoing accountability around upholding a high bar of quality.
Jonathan: Yeah. And if I could add one more piece of nuance: you don't have to go all one way or the other. In fact, you don't even have to stick to one or the other. What we generally like to recommend is, early on in building out a solution, lean in on decentralized diffusion. Then, if you can get to the point where 50% of your addressable market is using the thing, pivot more towards incentivized or maybe even mandated adoption to close that gap, because at the end of the day, it can be expensive to maintain a proliferation of options out there. And to mitigate some of those costs, collapsing redundant solutions becomes very appealing.
As an organization, you can lean in on that. Just don't do it early on. Leave it for the end, to get over the finish line, or explore other options, like allowing people to take ownership of their local solutions versus having, say, a centralized team managing something that only 10% of the organization is using. You shift that burden to those teams so they have a natural incentive to move off when the timing is right.
Abi: So wait until the end to bring down the hammer.
Jonathan: Yep. Don't start with the hammer if you can avoid it. Unless the situation requires it.
Abi: Makes sense. Well, in the startup community, people always talk about one of the best places where lessons are learned and knowledge shared is in learning about the failures of startups. And so I know you mentioned that you've done this for a long time, launched a lot of things. So you've seen a few cases where adoption either failed or fell short. So I'd love to hear maybe just one of those lessons or stories and what you can take away and what listeners can take away from that experience.
Jonathan: Yeah, we have quite a few where either adoption fell short or just struggled and languished much more than we would expect. We're going through one right now, which I can give some details on, where we essentially have a tool for spinning up ephemeral integration testing environments for Kubernetes. And when this tool is running, it helps teams go a lot faster. It's really great, and you'd expect this thing to take off like wildfire. But it got a few people onboarded and then it plateaued, and we're like, "Why aren't more people using it?"
And so we're in the process of debugging that right now. Where the diffusion of innovations concept becomes super helpful is that you can look at what's actually getting in the way of adoption. A naive approach would say, "Oh, it's just awareness." If we made more people aware of it, sent out a whole bunch of emails or pinged Slack a whole bunch of times, then we'd get more adoption. That's not necessarily true.
Awareness is just the very first step in the adoption journey. And there's a whole bunch of reasons after that that someone could just choose, "Yeah, I don't think I can do this." In our case, one of the things we're looking at is the cost of adoption. As consumers, we'd like to think cost is just the sticker price of the thing. But for a software team, cost is generally some amount of cognitive load. You have to build a mental model of this thing and understand how it's going to interface with your systems. You also have to spend some amount of time and energy integrating with the thing.
And so those two costs can be really, really prohibitive for some teams. And there's a third one, which is just propensity for change. If teams feel like there are too many changes being thrown at them, too many options, they get overloaded and they just shut down on change. They want a period of stability before they start to introduce something new. So there are some interesting hidden costs there that might not be as obvious upfront.
So one of the things that we're looking at is: yeah, this tool is really great, but one of the big costs is you might have to mock out a lot of dependencies if they haven't started to use it already. And what we're doing there is saying, "Okay, can we better understand who has the least amount of work to do?" Right now, it could swing from two weeks of work to three months of work. That's a pretty wide range. And if we just come to a team saying, "Hey, you have between two and 12 weeks of work to adopt this thing," I'm probably going to assume it's 12. I'm just going to be a pessimist and assume the worst-case scenario, because we're all software engineers: estimates usually turn out worse than given.
What we can do is go after the ones who have the simple cases first, get them onboarded, and then look for ways to bring that adoption cost down: lower the cognitive load, maybe develop tooling to automate a portion of the adoption. Anything we can do to address those costs, other than the change-absorption-rate issue, just makes it cheaper. We're essentially lowering the price of the product. And that's one option we see as potentially a big avenue to unlock further adoption of this thing.
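The strategy of going after the simplest cases first is essentially sorting candidate teams by their estimated adoption cost. As a minimal sketch (team names and week counts here are hypothetical, invented purely for illustration):

```python
# Estimated weeks of integration work per team (hypothetical numbers);
# in practice this might come from counting how many dependencies each
# team would need to mock out before using the tool.
adoption_cost_weeks = {
    "storefront": 12,
    "checkout": 2,
    "search": 3,
    "recommendations": 8,
}

# Onboard the cheapest adopters first: quick wins build momentum and
# surface tooling gaps worth automating before the expensive cases.
rollout_order = sorted(adoption_cost_weeks, key=adoption_cost_weeks.get)
print(rollout_order)  # ['checkout', 'search', 'recommendations', 'storefront']
```

The sorted list also tells you where tooling investment pays off: anything that shrinks the worst-case estimate narrows the "two to twelve weeks" range that scares teams away.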
Another one is making it more visible to leaders: what kind of improvement to deploy velocity or change velocity this creates for a team. If you clue a leader in, like, "Hey, this might take four weeks to onboard a system, but once you do, your teams go twice as fast," that's a pretty compelling pitch. Getting that information in front of the right people can be a big tipping point in driving further adoption.
Abi: Well, thanks for sharing this story, and I love the advice you've given and I love the way you're explaining how teams should maybe think about, no matter how great of a tool or solution you build, if you don't make it worth it or easy enough to actually adopt it, then adoption's going to struggle. And that makes a lot of sense.
Jonathan: Yeah. It's something that I think is baked into our team's culture at this point: we don't just own building the tool, we own driving adoption of the tool as well. So our job's not done once we've built the thing. Our job is done once we've seen teams adopt it, benefit from it, and actually go faster, be happier, more productive employees, with everyone generally winning. That's when we feel like, "Okay, now we can pat ourselves on the back." We're not allowed to pat ourselves on the back when we've only built the thing.
Abi: I love that and can definitely share that. I meet with a lot of platform teams that might be celebrating a little prematurely or are really excited about what they've built but aren't maybe thinking as deliberately about adoption in the way you've explained.
Jonathan: Yeah, it's scary. All of it. Because you can't control adoption. Like we said, if we're leaning in on decentralized diffusion, we can't force anyone to adopt our thing. What we're doing is essentially signing up to be held accountable for something that we don't control. And that's scary, but that's the exact situation startups are in. I can't force my customers to start to spend more money on the new feature I'm rolling out. I can't force more customers to show up. I have to solve a real problem with real externalities that I don't have direct control of. And I think that's really, really scary.
And one of the leadership principles on the team is you have to recognize that and you have to bring the right mentality of support and empathy to the team to create almost psychological safety for saying, "Yes, go out there and be prepared to hold yourself accountable for things that aren't in your direct control and know that you have a leadership layer that's recognizing the scariness of that and understand that sometimes we're just going to get it wrong, and that's okay."
Just like startups get it wrong, we get it wrong, too, and we have to be okay with that. As long as we're learning, it isn't necessarily a failure. I think that's a really important cultural quality to support the kind of risk-seeking mentality that's sometimes required.
Abi: I love that. Well, the last thing I want to talk about today is, like we said at the beginning, you're the startup that's made it. I mean, you've been a part of this thing from seed stage all the way through multiple rebrandings and expansion of scope. And so I'd love to ask you for your lessons and advice on how other teams can make that leap. I mean, to use another startup analogy, it's going from your series A to your series B to IPO. I feel like you're an IPO with where your organization is at today, but how do you make that leap from working on a tool or a language to rebranding to developer acceleration and then engineering effectiveness? How do you do that?
Jonathan: I think a big part of it is where you set your sights. I think a lot of our success is probably due to the fact that we had people who were highly ambitious. A lot of our leadership team had startup experience and just wanted to have a big impact. And whenever we started to achieve some of our ambitions, it very quickly turned into, "Okay, well, how do we go even higher? How do we take this even further?" And I think that's a big part, because my general perspective is that most tech companies of decent size today are probably underinvesting in the developer productivity space.
This feels like a corner we're just in the process of turning. Just the creation of this podcast is one of those data points we looked at and like, "Oh, here's a cool new podcast on this topic." I think the tech industry is starting to get a little more clued in that there's a real strong investment opportunity area there that I think a lot of us might be neglecting. So chances are if you're on one of these teams, there's a lot of untapped value just waiting to be realized and your team could be the one to go and make a strong case for that and what that takes.
Another one would be to prioritize measurement. Measurement is so important because it's the closest you can get to approximating a revenue stream that you'll never actually have ... Maybe there's an interesting business idea there, where you start to charge software engineers for the service. That's not what I'm interested in. But you're never going to have that real, tangible, rubber-hits-the-road dollars-and-cents measurement. So you have to find ways to make up for that.
Also, having alignment with the leadership group on when things are getting better, when they're getting worse, when they're stagnating. There was another one of your podcasts, I want to say it was Dropbox, where they talked about the value of that: creating a shared understanding with leadership of what the value potential is and what the right amount of investment is. I would suspect that most organizations are underinvesting, and measurement is probably the tool that's going to unlock further investment. If you can't measure it, you can basically only use rhetoric at that point. And the problem with rhetoric, on top of the fact that you're making it up, is that anyone could steal that approach. Anyone, at some point, can come in with stronger rhetoric and challenge what you're doing. Unless you have the data to go back to, whoever is the best salesman is going to win.
Abi: Yeah. Well, I love the insights you've shared and completely agree that there's an enormous opportunity for impact that these teams have the potential to have. And love the tips that you've shared to help folks, hopefully, be able to continue growing their scope and impact in the same way that you've been able to. Really enjoyed this conversation today, Jonathan. I think a lot of leaders are going to find it really inspiring. I love the way you think about adoption and all the tips you've shared today. Thanks so much for coming on the show.
Jonathan: Yeah. If I can give one last parting word: I know I focused a lot on myself in this, but I'm standing on the shoulders of giants, other parts of the team, other leaders, other brilliant engineers. So I just want to make sure that this is not all me. I'm just the one on this podcast. This is very much a group effort, and there are a lot of amazing people who, if they weren't involved, I don't think we would've been nearly as successful as we are. So just a thank you to all the people who have been part of that journey.
Abi: Awesome. Well, thanks again for coming on the show.
Jonathan: Thank you.