Thansha Sadacharam, who leads Tech Learning and Insights at Peloton, walks us through the journey of building the company’s developer experience survey. She shares what went into the survey’s design, rollout, and maintenance, as well as the different teams involved.
Abi: Thansha, thanks so much for coming on the show today. Super excited to chat with you.
Thansha: Yeah, excited to be here. Thanks so much for having me.
Abi: So I’m so excited to dive into your journey with your first developer experience survey that you put together at Peloton, and I know you’re coming up on one year, and you’re rethinking things, so it’ll be awesome to get your lessons and the changes you’re thinking through now. I want to go way back to the beginning, when you were first exploring running a survey. Could you start by sharing a little context about what your role and your teams were focused on at the time you began thinking about running this?
Thansha: So when I joined Peloton almost two years ago now, my team’s mission was to stand up our tech learning function. The tech learning function didn’t exist at Peloton, and so I was brought in to build that. And so one of the very first things that I looked at was onboarding. In building my case for developing an onboarding program for all of our engineers, one of the things I looked at was our time to first and 10th pull request, and it was really, really interesting to get that number and do a bit of a comparison to things I’d seen in my past life at Shopify and to benchmarking in the industry of, this is what a good time to first PR is, this is an average, and building that case.
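To make that metric concrete, here is a minimal, hypothetical sketch of how a time-to-first and time-to-10th-PR measure could be computed from merged pull request dates. The data shape, names, and numbers are invented for illustration and are not Peloton’s actual pipeline.

```python
from datetime import date
from statistics import median

# Illustrative data only: each engineer has a start date and the dates their PRs merged.
engineers = {
    "eng_a": {"start": date(2023, 1, 9),
              "merged_prs": [date(2023, 1, 20), date(2023, 1, 27), date(2023, 2, 3)]},
    "eng_b": {"start": date(2023, 2, 6),
              "merged_prs": [date(2023, 2, 8 + i) for i in range(12)]},
}

def days_to_nth_pr(record, n):
    """Days from start date to the nth merged PR, or None if there aren't n merged PRs yet."""
    merged = sorted(record["merged_prs"])
    if len(merged) < n:
        return None
    return (merged[n - 1] - record["start"]).days

for n in (1, 10):
    values = [d for r in engineers.values() if (d := days_to_nth_pr(r, n)) is not None]
    print(f"median days to PR #{n}:", median(values) if values else "n/a")
```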
But in presenting that information to our engineering leadership team, engineering leaders love numbers, we know that, but I think numbers don’t often tell the whole story. And so I did a ton of interviews. I think I ended up talking to over a hundred folks at Peloton, which was a great way to meet people, but it was also a really, really great way for me to build a very holistic picture that included qualitative and quantitative inputs, to really understand what the onboarding experience was like and build the right onboarding solution for all of the engineers joining Peloton. And in that work, I was working with our VP of platform at the time, and we were spending a lot of time talking about what are some of the things that we wanted to be able to use to understand the developer experience at Peloton.
And so then when my former boss, Jason, joined, he was super passionate about creating more of a qualitative input for actually understanding that developer experience. And I think we’re still looking at and trying to understand a lot of different metrics, but at the end of the day, I very strongly believe, and I think a lot of our engineers also really appreciate this, that engineers aren’t robots, they’re humans. And just looking at numbers or looking at certain key metrics doesn’t tell the whole story. So for us, having a really comprehensive survey that helped us understand that entire developer experience was really important to not only shaping tech learning programs, but also feeding into our platform team, and for our engineering managers and our engineering leaders to understand where they should focus their energies to drive developer efficiency, effectiveness, and engagement, essentially.
Abi: Well, first off, that’s an incredible number of interviews you personally conducted. You’re basically a human survey tool. I want to ask you, you mentioned your former VP Jim, right? I know him as well. I know he was part of the inspiration for this. Can you talk more about how you guys were all inspired to even go down this path? It sounded like there were conversations before he had left, he had maybe had some prior experience or you were looking at some other companies. Where did the inspiration come from?
Thansha: Yeah, so when I joined, there was a deck that he had put together on the vision for the platform team and there was something there about measuring, having a developer satisfaction score. And so he worked at Spotify before that and had a ton of experience in the industry. And I think this was definitely something that Spotify was passionate about. I’m also well connected with some of the folks at Spotify from my previous lifetime as well. And so that was definitely something that he had always wanted to measure along with some of this other dev insights work. And so he had done a good job evangelizing that even before I got to Peloton. And the really great thing I think about the leadership team on our engineering side at Peloton is they very much wanted these insights as well.
And so that was really great because, when we had to start, at the end of the day we’re asking all these engineering leaders if we can have their developers’ time to fill out this really long survey, with the promise that useful things are going to come out of this and that we’re going to be able to focus on the right things and prioritize.
And so it was really good that some of that groundwork had already been laid. And then when we approached the rest of engineering leadership and said, we want to do this, we didn’t actually get a ton of pushback, which is really amazing. And I think in fact our engineering leadership to this day, they’re still quite hungry for those metrics and for ways to understand the developer experience. And we can get to this, but when we actually did our first couple surveys, we did an entire software review on it where we sat with our SVP of engineering and all of our VPs of engineering to actually go through all of the results and present back: these are all the different results, these are the key factors that are driving satisfaction or not driving satisfaction. And so having that groundwork laid and then having that buy-in, I think, was super helpful and important.
Abi: Well, that’s awesome to hear you had such great buy-in, that was going to be my next question. I think leaders at other companies sometimes struggle to get buy-in for these types of initiatives. What advice would you share with some other leader out there who listens to this and is inspired and wants to do it? How can they convince other executives and leaders that this is something that’s valuable and worth doing?
Thansha: Yeah, I think we were really lucky because of the way that the timing of this all worked out. During 2020 and 2021, Peloton was hiring a ton of engineers while we were going through this wild hyper-growth phase. And there were a lot of pain points that engineers were running into. And so I think the first thing that we did right was we found a problem and a pain point that a lot of engineering leaders were feeling, because they were hearing it from their engineering managers and their new hires, or if they were a new leader who got brought in, they understood how painful the onboarding process was. And I’ve built onboarding programs before at other companies. I’ve run onboarding programs before at other companies. I probably had an idea of what I thought we needed to do, but instead of just going and building that and being like, look, we got all these great results, we were very, very data driven and informed in our approach.
And so I partnered with our dev insights team to get those first couple early metrics on time to first and 10th PR, and we did a comparison to industry standards and benchmarks and were able to show our engineering leadership, we’re losing this much time every single time we bring a new engineer in because of our onboarding process. And then we paired that with those qualitative insights from all of those interviews that I did. And I think presenting it in that way made that conversation with our engineering leaders really easy, because they were seeing all of this data and understanding the value that it was bringing us as the team that was building these programs, because then we were able to shape our solutions around that. And I think that approach built a lot of trust. I think that’s really, really important to do.
And so the way I think about it, you’re giving your leadership a little bit of a taste of, these are what insights can do for us and these are the problems they can help us solve. And when I first started talking to some of our engineering leaders about building out one onboarding program for all of Peloton engineering, I got a little bit of resistance of, well, that’s going to be really difficult to do because we have lots of different types of engineers and they have different needs. How are you going to build one program? And there’s obviously ways that you can do that, because lots of large companies do this. And I think having all of those conversations, having some of those insights, really helped us drive how we created that solution. So I think that helps.
So you don’t have to go full-blown hundred-question survey and go straight to measuring DORA metrics immediately. You can actually maybe think about, what is that one pain point? What is that thorn in my engineering leadership team’s side that maybe is not the most top-priority thing for them in this current environment, but is a thorn in their side that, if you picked it out, would make their lives a little bit easier?
Approach that in a data-driven and informed way, and I think that can help get some of that buy-in. And I also think we were really lucky that we were thinking quite a bit about product thinking for our platform team. And even in the way that I approach program management, I always try to think about my programs as products. How do we make sure we’re building the right things, in the way that a product manager thinks about product-market fit?
How do we think about ensuring that if we’re running a learning program, people actually want that program and actually need that program? For our platform team, this is super important. How do they identify the biggest pain points for our developers and attack those and fix those? How do we drive efficiency and engagement for our developers? These are all big questions you have. And when you’re a small organization and there’s 10 people, you can obviously talk to 10 people, or try and do what I did, which is hours and hours of interviewing. This is a more efficient way to do that. And so I think starting small and then really showing that value can really help.
Abi: Well, I love the piece you shared around thinking about what’s the thorn in the side of your leadership team. That’s similar to advice I always give DevEx leaders when they’re getting started: really pick out the thing, like you said, that the C-suite is staying up at night worrying about as far as it pertains to developers. And if you can align your effort with that thing, then you’re much more likely to get buy-in. So I completely agree with the advice you shared. I think that’s really insightful. I want to move into the design process for the survey, which I know was extensive. I want to start at the beginning. Once your team decided, we’re probably going to do this thing, what was the first thing you did to get started from a design standpoint? I recall you sharing that you spent quite a bit of time just doing research into how other companies do it. Is that what you did at first?
Thansha: Yeah, so I call this phase desk research. Or I think whenever you’re building processes or programs for teams inside a company, I think it’s important to look both outside and inside. I’ve been in tech for 10 plus years, and so very lucky to have a bit of a community that I could reach out to. So I actually just spent some time reaching out to other folks at other tech companies that do this just to understand what were they thinking about, how had this helped them, how had they sold it to their leadership teams? What value did they see that came out of it? So I spoke to folks at Google, spoke to some folks at Spotify, spoke to folks at Shopify, did a ton of reading. And it’s really interesting because I think in this space there’s a lot of gray area and it’s constantly evolving, but there’s so much research and literature out there on this topic.
So I just did a ton of reading, obviously read the SPACE paper and learned all about that framework, all about DORA metrics, all that stuff, and just spent a ton of time reading. And I think there’s so much thought leadership in this space, it’s really, really great. And so that really gave me an idea of how other people are thinking about it, this is how other people have rolled it out and it’s made sense for them. Then I brought that back internally and thought about it: now we need to do some research to understand internally how we’re working. And so internally, what our research looked like was a little bit different than our external research. But internally, I have a designer on my team and every time she sees these slides, she’s like, “Thansha, these are terrible.” But I made essentially a journey map.
It’s a very terrible looking journey map, but a journey map regardless. And we called it a day in the life of a developer at Peloton. And so we have lots of different types of developers. We’ve got Android, iOS, backend, frontend, all of it. We’ve got a bunch of different types of developers. And so we actually broke it out by type and created user types essentially. And we mapped out this journey from the moment that they opened their laptop all the way to when changes are released. What are the different tools, the different processes, what does their day look like? And I think that was really, really interesting. And again, a ton of conversations, like a ton of just setting up interviews and having developers walk me through their day, but it really helped us narrow down what that current journey looks like.
And then we mapped that to, what’s our desired outcome? This is what it currently looks like, but where do we actually think we want to go? And that process helped us identify the different tools that our engineers were using, the different ceremonies and rituals depending on the team that they were a part of. And so that was really, really helpful in actually shaping our survey. We were able to basically take all of this external research that we had on how other companies were doing it, all this stuff on the SPACE framework, and then how our engineers were going through their days, and map it to a survey. And so when we got to the designing part or prototyping part of this survey, we essentially created, nothing fancy, a Google spreadsheet, and we broke down essentially each letter of the SPACE framework into a section. So we had a section, and then within each of those sections we created different factors.
For example, within, let’s say, our satisfaction section, we might have put tools, and then we basically went back and looked at all of those maps, my hideous maps that my designer would make fun of me for, this really, really rough map. And then we mapped it into our survey of, these are all of the tools that our engineers are using when it comes to testing, these are all of the tools that our engineers are using when it comes to collaborating. And we essentially bucketed all of these things out into different factors. And it was so helpful because it gave us an understanding of, there are things that our engineers are doing currently because of the way our systems, our processes and our tools are built, and then there’s what we know is a bit of the gold standard. And we essentially created factors and questions that merged both of those, and that was really, really helpful as a starting place for us to prototype. And that was really informed by all of that research that we did.
Abi: Well, I really love the day in the life of a developer mapping and using that to keep yourself centered on the developer, on the state of developer experience. One quick question to loop back a little bit. You mentioned talking to folks at companies like Google, Spotify, Shopify. What were your impressions of Google’s approach?
Thansha: Yeah, I think Google is so much further along in this journey than we were, and so that was really interesting just to see the rigor that they took. And I think this is why it’s important to balance that external and internal research. I wasn’t necessarily like, we have to go and copy exactly what Google’s doing, because Google’s so different from Peloton, just the scale, the number of engineers. And so for me, I was like, wow, this is really amazing and really interesting to see that there’s such a rigorous approach. But it was also really interesting, so I used to work at Shopify, and speaking to colleagues at Shopify, they were like, yeah, we don’t use the SPACE framework. We really think about our own developer experience and we really think about looking internally. And so that was really interesting too, I think, for me to look at both those approaches: super rigorous and tons of research behind it from that Google perspective, and then Shopify, which was much more inward looking.
And so I think we tried to balance that. I think at Peloton in particular, we’ve got a ton of leaders who have worked at places like Google and Meta and big FAANG companies, but then we’ve also got people who’ve been with Peloton for 10 years, for seven years, and remember when it was a small, scrappy startup, when there were just 50 engineers and they all fit into the same room. And so I think we wanted to marry that spirit. And so I think it’s helpful to have that context of Google and how they were thinking about things, but I think it was really important for us to look inward too. And I would say that to anyone who’s thinking about doing this: really think about your company and where your company is, and customize your survey to that. I think that that’s really important.
Abi: I remember when we spoke a few months ago, you were talking about how you wanted to use the same questions as other companies so you could potentially benchmark. Was that something you ended up doing?
Thansha: Yes, so actually it was really interesting because we like benchmarking. I think I mentioned earlier the way that we were thinking about our time to first and 10th PR, which I think is really important. I think it’s really common across the tech industry for us to do that. But it was actually when I partnered with our talent team that they said, we’ve got a bunch of benchmarking data that we can share with you. And so that was actually really, really helpful in saying, we don’t necessarily have to ask all of the same types of questions, but there are some general questions that we want to ask so that they can help benchmark. Satisfaction, I think, was the big important one that we wanted to benchmark. And I think even now with our engineering leadership team, that’s still something that’s really important. And even with our talent team, I think it’s still really important to be able to say, are we sitting above or below where the industry is sitting?
I think when you’re thinking about developer experience, you’re obviously thinking about it from an efficiency and productivity perspective, but I think you’re also thinking about it from an engagement perspective, with the mindset that engaged engineers and engaged developers are happier and more effective, which is one of our hypotheses anyway. So it was important for us to use some of the same questions, because we obviously didn’t want to ask things that were so left field that we couldn’t benchmark at all, but we wanted to have a bit of a balance so that we could say, satisfaction is really, really important for us to benchmark, but the way that we do, I don’t know, testing or whatever, maybe just because of the stage that Peloton is at, might be very different.
And so we want to ask questions that’ll actually give our teams the most advantage in being able to leverage some of those. I would be curious though, maybe five years from now or maybe three years from now as Peloton grows and scales, if our questions will change and become more similar to, let’s say, what Google is doing or something like that.
Abi: You brought up measuring satisfaction. So I want to double click on that because it’s so interesting. A lot of companies say they measure satisfaction or developer satisfaction or have DSAT as even a C-suite metric, but when I ask them, what’s the actual survey item? They’re actually very different. Well, how is satisfaction measured exactly with you guys?
Thansha: Yeah, so there’s two questions that we ask, and that’s being able to recommend Peloton as a great place for engineers specifically to work, and being really proud of the work that you do at Peloton. So those are the two big questions that we ask. Again, DevSat is important to us. It’s definitely an important metric. And the way that we look at it is there are so many factors that impact DevSat. It’s one metric, but then you need to look at so many others to really understand the holistic picture, so I try, at least with my leadership team, to not over-index on it. It’s obviously the headline that we share with our leadership team, but the way that I like to present it is, this is a headline, but let’s double click.
Let’s click into what is maybe contributing to that. And I think the greatest thing about this is being able to read all of the comments and the feedback and seeing the things that really bother engineers, and you’re like, oh, that’s a thing that we can fix. And it’s so particular often. And so I think that’s a really important part of developer satisfaction and productivity, and I might have read this somewhere too, so I might be quoting somebody, but I don’t think one stat or one metric can give you all of that. And so I would encourage people, as helpful as it might be to do some benchmarking and things like that, just remember it doesn’t tell you the whole story, and don’t feel bad if you’re a little bit lower than others. And so I think that’s important to keep in mind.
Abi: Don’t reduce productivity down to one metric, from the SPACE framework. Good memory.
Thansha: I was like, I read this somewhere.
Abi: And I only know because I’m working with Nicole and Margaret on a new paper, so that’s top of mind. But I’m going to go back to the survey design process. So as you were doing all this research and talking to your talent team, I’m guessing you ended up with a spreadsheet with a hundred questions.
Thansha: So many questions.
Abi: What was your process for beginning to refine these, test them perhaps? How did you get down from that initial list of probably a hundred, if not hundreds, to your final set?
Thansha: We still have a lot of questions. And as I was mentioning to you earlier, we’re in the process of evolving the survey. And now we have partners coming to us wanting to add questions, so I’m like, got to go back to thinking about the right size. One thing that was really helpful is that on our talent team, we’ve got folks with really strong backgrounds in psychology and organizational design. And so it was really, really helpful to partner with them to actually think about how we even ask those questions. I think that’s another really hard part about survey design. And I get this all the time when I talk about the state of developer experience survey, that people are like, well, how did you structure your questions?
And I’m like, that’s really hard to tell you because you’d have to read all of it to really understand. I wouldn’t be the right person to tell you that. It would definitely be the people on our talent team that would be able to do that. And we also had a couple data scientists on our dev insights team partner with us too and think about survey design, think about the way that we were asking questions, and really refine that. So that was one thing, I think, with the questions. And they were really helpful in helping me, anyway, understand: what is this question? What result will it give, and is it actionable? How are you going to be able to drive action from these? And so that was really helpful, because I think sometimes when you’re creating these, it’s easy to be like, well, I’m just curious and I want to know.
And it’s like, that’s actually not that helpful. And so how do you make it actionable? I think that was one thing that we definitely really thought about: how do we make this actionable? And so that was really helpful. And then also really partnering with our platform engineering team and our product managers on the platform engineering team at the time was really helpful too, because they’re going out and speaking to customers, our developers, all the time. And so they’re also hearing pain points and different things like that. And so in places where we knew we had enough information, we were like, we don’t need to constantly keep digging into that. We don’t need to put salt in that wound. We know it’s a pain point, and so we can maybe reduce something there. And so that’s how we thought about it.
I think I’ve mentioned this to you before too, our survey’s still quite long, and that’s part of the reason that we think about splitting it up, so that if you’re a developer, you only get it once a year. Because we do want people to feel like, I only have to fill out this long survey that Thansha sends me once, and that I don’t have to constantly fill this out. Survey fatigue is so real, and it’s why we partner with our talent team so that we’re not sending out this long state of dev survey when they’re about to send out our Peloton engagement survey. And so survey data is very useful and helpful, but you obviously don’t want to abuse how often you’re asking people for this information and things like that. And so that was definitely something that we were thinking about when we were designing.
Abi: So much there that you shared. That’s interesting. Just to recap, how long was the final survey? How many questions approximately and how long to fill out?
Thansha: I want to say it’s like 50-something questions now, but I don’t know if that is completely accurate because it also depends on how you answer some of the questions. You might get separate drill-downs. So give or take, I would say anywhere from 15 to 25 minutes, and it also depends on how many comments. I’m not a commenter in surveys, I do not comment. I’m the worst survey taker, I guess, because I never drop comments in. I always just answer everything and then send it off. So it also depends on how many comments you write.
And so it’s really interesting too when we’re looking at the survey results, sometimes we get paragraphs and paragraphs of comments and then other people are just like one-liners or they’re like me and they don’t write any comments at all. So assuming people like me are going through it a little bit quicker and then people that are actually spending a lot of time with it are probably spending I would say at least 25 minutes on it, which is a chunk of time for sure.
Abi: Absolutely. So I want to ask you, what exactly was the trimming process? Was it just you reordering things in the spreadsheet and deciding what would make that top-50 cut? Or did you do it by committee, or were you going to stakeholders and iteratively trimming it down?
Thansha: Yeah, it was definitely the latter. So going to stakeholders and iteratively trimming it down, and it’s this fun little dance that you do, right? Because sometimes you go to stakeholders and they’re like, actually we want to add this question. And you’re like, well, if we’re going to add that, then what are we going to remove? And so that was a very iterative process with these big stakeholder groups. I started with the talent team because I thought the talent team would be able to really help shape some of those questions and reduce some of those things down. The other helpful thing with the talent team too is they’ve got another survey that they send out, so we didn’t want any repetitiveness in our survey. And so there were a bunch of questions where our talent team was like, oh, we can actually ask this, or we’ll put this in ours, and I’m like, great.
We’re still going to get that information, but in a different format at a different time, which is amazing. And so I think some of those more general things like learning and development, growth, all of that stuff that does contribute to happiness and retention and effectiveness, all of those things we don’t necessarily feel like we have to ask in our survey. So that was really helpful. And then we met with our platform PMs, really working with our product managers on the platform engineering team to say, what do we already have enough information on? What can we remove? What do we want to add? And actually even now, in the evolving of the survey, we’ve either removed a bunch of tools or added some new tools. So working with the platform team has been super helpful even now in evolving it, so that we can cut down on some of that stuff.
And then it was also working with our data scientists to say, where can we pull something that is more quantitative, so maybe we don’t need it in the survey, and things like that too. And then of course, leadership reviews. So Jason reviewed it and added his thoughts, and we added some of his input. And so I think that was really helpful. And then when we got the survey back, we actually took some questions out because we were like, this actually didn’t give us anything. And I think that process was very, very long and very, very iterative for us to be able to get to a place where we’re like, this feels good. And even now, as we’re going into a year of it, we’re thinking, how does this evolve? What do we need to keep in there? What do we need to take out? What are new things that we need to ask? So it’s a bit of an evolving process.
Abi: Speaking as someone who’s written a lot of bad survey questions, I want to ask you personally, were you nervous about some of the questions just flopping? You were the face of the survey.
Thansha: Oh yeah.
Abi: And I’m curious, how did that feel? And also for you personally, what are the most interesting things you learned about survey item design through this process?
Thansha: Yeah. So when you’re sending this out, I’m not an engineer, but I work with engineers every single day. I was just waiting for that really blunt, logical, “this is a bad question” feedback coming back as a result. And we got a couple of those. So I think you have to be prepared for that, because everyone will have their opinion. So I think that’s something to keep in mind. It’s definitely a nerve-wracking process. And in addition to that, you’re asking people for all this time, so there was also just a little bit of nervousness on my end of, I think this is going to be really impactful, I think this is really going to provide a lot of value, and there’s a lot of people bought into that at the organization.
So I’m hedging my bets here. I think that this is going to really help, but it is really nerve-wracking because you’re asking your leadership team and your engineering managers to get behind having their teams pull away from work to fill this information out, with the promise of, if we go slow now, we’ll be able to go fast. Give me this 15 to 20 minutes now, it’ll help us go faster, it’ll help us focus on the right things. And so that is really, really, really nerve-wracking. I think some of the things that I learned about survey design in general is that I’m definitely not an expert and that I need to rope in the experts. And thankfully we had those inside of our business. I wish I could go back and look at the change history in Google and look at the original version of the survey I made.
Some of those questions were so terribly worded. And just learning about scales, and do we want to do open-ended versus not, and how do you gather comments? I think that was really a huge learning for me. And so for us, one of the things I learned was, let’s go with a scale, shaping all of our questions so that they could be answered on some type of scale, because that’ll help us easily view trends when we’re looking at results. And when I first designed the survey, I had a ton of open-ended questions and got feedback from my talent team of, this is going to make your life really hard when you’re then analyzing a ton of data, and your data scientists are going to hate you. So I think that was really eye-opening in terms of the way that we even just wanted to structure those questions. So that was definitely one of my key learnings.
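As a rough illustration of why scale-based answers make trends easier to see, here is a minimal sketch that averages 1-to-5 responses per factor across survey rounds. The round names, factors, and scores are invented for the example, not Peloton’s actual data.

```python
from statistics import mean

# Invented survey rounds, factors, and 1-5 scores; real data would come from the survey tool.
responses = {
    "year1_round1": {"tooling": [4, 3, 5, 2, 4], "testing": [2, 3, 2, 3, 2]},
    "year1_round2": {"tooling": [4, 4, 5, 3, 4], "testing": [3, 3, 4, 3, 3]},
}

# Because every answer lives on the same scale, trending a factor is just a mean per round.
for round_name, factors in responses.items():
    factor_means = {factor: round(mean(scores), 2) for factor, scores in factors.items()}
    print(round_name, factor_means)
```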
But honestly at the end of it I was like, thank God for all of the partners that we had inside the business that are experts on this and that we were able to build this with. Yeah, because I definitely don’t think I could have done it on my own.
Abi: Yeah, I’m sure we’ll swap notes later, but same experience as you getting into… And it’s so funny, of course I work on a survey product and I remember when we were getting started on the initial design of it, so many engineering leaders would be like, what? This is just a survey tool? Behind the scenes we were working nights and weekends just painfully crafting and refining these questions. And I think until you actually get into that work, you don’t realize how difficult it is.
Thansha: And when I go to conferences and when I talk to other engineering leaders about this, the question I get all the time is, oh, what are the questions that you ask? So even though you might initially get a reaction of, oh, this is just a survey tool, the real work is the questions in it. I think that’s the hardest part about designing the survey, because especially when it comes to developer experience and understanding that experience, shaping that is definitely the hardest part.
Abi: So looking at this big picture, how long did this process take in terms of human hours collectively that went into it? And also just in terms of the time span? Was this weeks, was this months of design? What’s the summary of the process?
Thansha: It was definitely months. It was definitely months of design and a lot of human hours, I would say, on my part, and human hours, thankfully, by the graciousness of all of the partners that I worked with at Peloton too, our dev insights team and our data scientists. So a ton of human hours went into designing it. And as I was mentioning earlier, it’s not done, because we talked about the design process, but we didn’t even get into the launching process of having to build out an entire communication plan to then explain what this is. We had talked about it to all of our VPs, and our VPs and our SVP were bought in, but it was like, we have to now explain this to directors and managers, because they’re the ones actually managing the time of our engineers, and we have to tell them why we think this is important.
And so it was often me going to a lot of engineering manager meetings and director-level meetings to explain what this was and why we were doing it. And then building an entire comms plan around it and launching it. And then managing the launch of it, sending out reminders, watching that participation bar painfully slowly crawl up every day, and bugging Jason to send out reminders. Even after the design, there’s a bunch of stuff that you have to do, and then you get the data back and you have to analyze all of it and decide what you’re going to do with it. And thankfully we had a data scientist that I was working with, and so we were able to pull some really, really great metrics, and then we ended up writing a report. And so we call it the state of developer experience at Peloton, because we actually produce this pretty long report that then goes out to our leadership and platform team.
And then it’s sharing all that data with all of our leaders and coming up with action plans and communicating that back to engineering, because you want to build that trust of, every time I send out this survey, we’re actually going to do something about it. It’s a ton of human hours.
Abi: I want to ask you about the communication plan you talked about. What’s your advice there? A lot of leaders, again, similar to survey design, don’t think about the fact that there’s going to need to be a communication plan until they get to the 10-mile line right before they’re launching the survey. What’s your advice, what worked for you or what didn’t work?
Thansha: I’ve been thinking about communication plans a lot lately because we’re rolling out a bunch of different processes. And again, I read this somewhere and I don’t know who I’m quoting, but I think it’s like, you need to communicate the same thing seven different times in seven different ways. And I think that was definitely the case for this survey, because we’ve got our Peloton-wide engagement survey, everyone knows what that is. But now we’re saying, oh, we’re going to launch something that’s specifically about your experience as a developer. And people are like, what are you talking about? What does that mean? We had to think about how we communicated that. And I think the approach that we took was, obviously, we made sure we had leadership buy-in, and I think that was really, really important. And again, I was so lucky to have some of that buy-in built in already.
So that was really, really important. But then we also had to explain to our engineering directors, our engineering managers and all of our engineers, why this was really important too and what’s in it for them. You’re asking me to fill out a survey, what am I going to get out of this? And so we really tried to tie it back to our platform engineering team’s mission and really tried to tie it back to there’s a million things we could work on, but we want to work on the things that are the most important to you and that will drive the most impact for you. And we want to build the right processes and the right tools and systems that will drive efficiency, effectiveness, engagement for all of you. And so that’s why we want to build this.
And as you can tell from a theme of this conversation, I spent a lot of time talking to our engineers, and I think by the time the state of developer experience survey started, we had started having those conversations. The concept of time to first PR and 10th PR, and then a bunch of metrics around incident management, so mean time to mitigation and resolution, these metrics around engineering performance became part of the vernacular, and people started to understand, we’re becoming a bit of a bigger company, we’re starting to look at these types of metrics. And in my conversations with engineers, where that always went was, but these metrics don’t tell the whole story, and how do we get that holistic picture? It’s actually by hearing from you, and that’s how we help shape some of these things.
I always go back to that onboarding example. I never would’ve been able to build the onboarding program that we built if I had just looked at time to first and 10th PR. That would’ve given me maybe one course that we built, but everything else was actually from these qualitative insights that we were able to gather. And so we really tried to bake that into our communication plan too: we want to get a holistic understanding and a holistic picture. We don’t want to be a company that just becomes obsessed with a metric and driving down that metric, and as soon as you give people a metric, they’re just going to try and game it anyway. So we don’t want to become that, right? We really want to think deeply about how we drive developer experience and understand that it’s a gray area, and that at the end of the day we’re working with people.
And I think having those inputs was really, really helpful, and building that into our communication plan helped us get that buy-in. And then it was also thinking about, what are we going to communicate once we’ve got all these results back? And so we actually publish our report out to all of our developers, so our leadership team gets it at the same time that everyone else gets it. And then we meet with our leadership team to go through it and dig in and go through any questions and things that they have. Now you can go onto our Confluence and go to the state of developer experience landing page and see all of the results and see the actions that we committed to.
And we always report on, this is what we committed to, this is what we hit this quarter. And I think that helps build that trust too. So actually, I spent a lot of time talking about trust with our talent partners, of how do we build trust into this process. Again, that was another learning with survey design: how do you build trust, obviously into the design of the survey, but then also into the way that you communicate out and what you do with those results. All of that is so important.
Abi: A question I have, you said in the state of developer experience report, you talk about what we have done with the results. Who is the we? Is that all engineering leadership, or is that just the platform team? I think that’s something I’ve seen organizations struggle with, is who is this data for and who should be taking action based on it?
Thansha: So our thinking on this is evolving. When we first started this, I was actually a part of the platform team. I’ve now moved and report into software programs. So initially this was very much a platform engineering project, and so all of the actions were actually created in partnership with our platform PMs. So the results would come out, we would present them to our PMs, and we would go through that data, and they would be like, oh, yeah, we’ve been hearing this and seeing this. We’ve got this on the roadmap already. This is new, let’s think about how we can tackle that. Maybe this quarter we can come up with a plan or whatever. And so it was definitely our platform team. And in the evolution, we’re now thinking, how do we empower our engineering leaders with some of this information, how do we make sure we present that information back to them, and how can they drive actions within their own teams or their own departments?
That’s something that we’re starting to think about, because now that this vernacular is part of the organization, people know what to expect. It’s like, now I think we can push some of that action down. And along with platform engineering, our leadership team, they’re obviously huge stakeholders in this, so there are definitely actions that they were thinking about driving or wanted to help the platform team prioritize. And now we’re definitely thinking about, how do we evolve this so that it doesn’t just have to sit at the VP and platform engineering level to drive actions on this? How can we empower our engineering managers or our directors of engineering to drive some action on this?
And there’s obviously going to be things that are very much, this is a platform thing or an infrastructure thing or something, but there are a lot of things around documentation, the way that teams collaborate, all of that stuff that is not necessarily related to tooling, that managers have a direct impact on or directors have a direct impact on. And so I think that’s going to become much more of the focus in the evolution of this survey as we go into year two: really thinking about how we empower our managers and our directors to do that.
Abi: I love that. Well, we’ll probably have to do a follow up episode to hear about how that goes. I want to ask you earlier you talked about stressfully watching the participation rate meter take off. So I want to ask, what participation rate did you get? And I’ve heard of some companies giving out bribes to people for filling these things out. What are your keys or lessons for how to drive participation?
Thansha: So our participation rate is, I think, close to 60%, which is okay. I think when I first designed this I was like, we’re going to get 80% participation, it’s going to be amazing. And we did not, and again, that’s because we’re sending out a long survey. There are some things I know we could do that would be a little bit more heavy-handed that we haven’t necessarily done. We could definitely turn it into a competition. We’re Peloton, we love competition, and we definitely do do that for our engagement survey. And I definitely try and ensure that my team has it filled out immediately so that we’re always leading the race. So there are those kinds of strategies that I’ve seen used even just at Peloton. But because our talent team has leveraged those, I just want to be mindful of how many times we’re doing that.
I think going into the second year, we definitely want to rethink that and think about, how can we drive that number up further? I think for us, because it was the first year that we were doing it, the data that we were getting back was so rich and so helpful that it wasn’t one of those things where I was like, this is really killing us.
And interestingly enough, a lot of the results were not, I don’t think, super shocking. We looked at the state of DevEx results, especially for our platform team, who spend all of their time talking to developers, and they were like, a lot of this makes sense. Now we just have the numbers, which is really great, and it allows them to dig a little bit deeper. So for us, that rate is not exactly where I wanted it to be, but for the first year, it was fine and it got us the information we needed. As we think about evolving, we’re definitely thinking about, how do we increase that number? How do we think about sample size? So we were originally planning on sending this out three times a year, every single year. We’re now shifting to twice a year, just so we can increase the number of people that we’re sending it out to and see how that impacts numbers and responses.
The interesting thing is we’re seeing an upward trend. The first time we sent it, I think our participation score was, I don’t know, something like 52% or something and it’s like gradually gone up and it’s gone closer to 60. So I think again, building that trust of filling this out, it’s going to make a difference, all of this stuff. I think that’s another strategy that we thought about and that was the approach I leaned on in the first year of like, let me try and build some trust as my strategy versus turning it into a competition or giving out free t-shirts and stickers. I don’t know, I might have to give out stickers. We’ll see. We’ll see how year two goes.
Abi: Yeah. First of all, it’s interesting that you mentioned you’re seeing the participation rate go up after subsequent surveys, because that’s so counter to what most people assume. They assume, oh, after the first one it’s just going to fall. But I’ve seen the same trend with other companies as well, and I think it’s because the survey program improves and becomes better communicated, it becomes more a part of the organization, like you were talking about earlier. I also wonder, as you explore, you mentioned one of the goals is to make the survey more for the frontline managers, directors, et cetera. So I wonder if, as you make the survey more for the rest of the organization rather than just for the platform team, you might see even more uptick.
I wanted to follow up on the frequency piece. You mentioned you’re going from three times a year to twice a year. One thing we didn’t touch on earlier in this conversation was sampling. So if I recall, you used sampling in your previous surveys. Can you explain why you did that and why you’re now maybe moving away from that?
Thansha: So we’re still doing sampling, but I think a little bit differently. So in the first year that we ran it, we essentially took our entire developer population and divided it into thirds. So we would sample a third of the population, and it was super random. We wanted it to be like we were truly just taking a random sample. We wanted it to be a variety of tenure, a variety of roles, a variety of all of that. And so we sliced basically the entire developer population into three. So if you were part of cohort A, you would get the survey in, let’s say, Q1, and then if you were part of cohort B, you would get it in Q3, and then cohort C, you would get it in Q4. And so we decided to essentially do something like that.
The thing that we found with that sampling and dividing it into three is, just based on the number of developers that we have, the sample actually turned out to be quite small when you factor in that lower participation rate, right? You come back and you’re like, is this significant? We had our data scientists review it and they’re like, yeah, this is significant. You can call this significant, it’s fine, you can make some conclusions from this. But you’re still like, this number could be better. And so this year we’re thinking, let’s split it into two. So instead of taking the entire population and splitting it into three, we’re taking the developer population and cutting it in half and sending it out. We haven’t landed on the times yet, but let’s say you’re a part of cohort A, you get it in Q1, and maybe if you’re a part of cohort B, then you’ll get it in Q3.
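For anyone curious what that cohort split might look like mechanically, here is a minimal, hypothetical sketch of randomly dividing a developer population into survey cohorts. The IDs, cohort counts, and seed are invented, and a real rollout would also stratify by tenure, role, and so on rather than relying on a plain shuffle.

```python
import random

def split_into_cohorts(developer_ids, n_cohorts, seed=42):
    """Shuffle the population and deal it round-robin into n roughly equal cohorts."""
    ids = list(developer_ids)
    random.Random(seed).shuffle(ids)
    return [ids[i::n_cohorts] for i in range(n_cohorts)]

developers = [f"dev_{i}" for i in range(300)]  # stand-in for the real developer population

cohorts_year_one = split_into_cohorts(developers, 3)  # three cohorts, one per survey round
cohorts_year_two = split_into_cohorts(developers, 2)  # two cohorts, larger sample per round

print([len(c) for c in cohorts_year_one])  # [100, 100, 100]
print([len(c) for c in cohorts_year_two])  # [150, 150]
```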
And we’re doing this for a couple reasons. One, obviously, like I mentioned, to increase that sample size. The other thing is that we want to align with our talent partners, who are also sending out an engagement survey and are moving to twice a year. And so we want it to feel less like you’re constantly getting surveys. I think it’s also really important to think about digestibility for your leadership team and for your leaders and for folks that are then going to action this survey. When we did the very first state of DevX survey, it was just so much. There were so many things that we could have acted on, and prioritizing that with our platform engineering team and our product team was like this whole thing.
Again, when you talk about how human-intensive it is and the amount of hours, that was quite a bit of a process. And so one of the things is, we want to give time for our teams to implement those solutions, and that’s, I guess, one of the cons of big, massive surveys like this: you just get so much information. And so the interesting thing with this year, and I’m curious to see how it goes as we go into year two, is that we know there’s a bunch of big-bucket things that we’ve ticked off. Speaking of thorns again, we know the thing that was creating the most pain for our developers, and we’ve implemented this new thing, we think it’s better, and what we’re hearing is that it’s better. So, great. We know that we’ve tackled that top whatever, 5%.
This year, we’re hoping it’ll be a little bit different, and already in the conversations that I’m having with leadership and some of our directors, I’m starting to see that shift already in the types of pain points that we’re talking about for our developers and the types of opportunities that we have for our developers. So I’m curious to see if we still feel like, oh my gosh, there’s so much in here, and if we can get that survey down. Like I mentioned earlier, we’re now at the point where, because people know about the survey and have seen the results it can put out, we’re getting partners from lots of different areas of the business coming to us and saying, “Hey, can I add this? Can I add this?” And so I think there’s a little bit of work there of figuring out, what’s that balance of the right amount of information so that the team and the leadership team don’t get indigestion.
Abi: Makes sense. I want to rewind a little bit and go back to what we were talking about earlier around the state of developer experience report. I want to specifically ask you, how did you present this information to executives? That’s something I think a lot of dev leaders out there are probably really nervous about doing. You only get one shot at that. You’ve invested all this time into this program. How do you succeed at communicating the data, the results, the takeaways to executive leadership?
Thansha: Yeah, so we’ve got this very, very, very long report that we send out to the entire organization, and it’s really fun actually. When we send the report out, people will pop into the tech learning channel and be like, oh my gosh, I’m so surprised that this tool got a low satisfaction score, because I actually love it. And so it starts this really fun discussion. And so that goes out to the entire organization, and our execs also get that because they’re a part of our software mailing list. But immediately after we send that out to our entire software organization, I actually follow it up with a bit of a one-pager or a one-slider. One slide of, here are the top-level things that I think are the most important for us to discuss and action on. And so I send that out.
And again, I’m very, very lucky to work with a very engaged group of leaders. I actually sent that out, and it was actually our SVP of software who read it and said, this is great, let’s schedule a software review on this, and let’s have you take the rest of the leadership team through this so we can really dig in and ask some questions. So I think if you can get that invitation, that’s always really, really great. And so I think thinking about how you share that information and get that invitation is really important, because if you’re starting with an invitation, it’s always a little bit better. And so it was actually his idea. And so we put together a deck that ran through the survey. Again, so much information, so the way that we spread it out was, we have a headline, which is, this is our satisfaction score, and then these are the three things that are driving that score up or down.
These are the three factors and these are the three things that we’re going to dig into. And then we also do some ground setting of, there are some things that have come out of this survey that are things we control as a platform team or as a developer experience team, that we already know we’re going to take action on or are planning to take action on based on this survey. Then there’s stuff that actually the engineering organization can drive impact on, whether it’s engineering leadership, managers, directors, and that’s not stuff that we necessarily control, but we want to highlight and bubble to the top so we can inspire action. And so then it becomes about influence and inspiring action at that level. And then finally, there are also things that are Peloton-wide that, again, we’re bubbling up and sharing, because the results of the survey also go to our talent team and our talent partners, and they also take a look at it.
And so again, those are not necessarily things that are within our sphere of control, but we want to bubble them up to the top so that we can continue to influence them and drive action on those things. And so we do some ground setting there, and then we go through each of those factors and we do a bit of a deep dive. And so I’ll use the example of, let’s say, tooling. We actually go through all of the tools and we’ll do a bit of a mapping of, these are the ones that are driving the highest levels of satisfaction, these are the ones that are driving the lowest levels of satisfaction. Like many technology companies our size that grew very, very quickly like we did, we were tool agnostic for a while. Engineering teams were empowered to use whatever made sense for them as long as you went through the right security process. You get to a certain size and then that becomes less desirable.
And so bubbling those things up to leadership of, these are all the different tools that our engineers are using, these are the ones that are actually driving satisfaction, these are the ones that aren’t, like, how do we have those conversations? So we actually just dive through each of those factors in that exact same way and have a discussion with leadership around, this is what we’re doing. And I found when we did it with our software review, it was actually a very productive conversation, in that we were able to get a lot of feedback from leadership of, we actually think this is the thing that we need to focus on, let’s focus on that, these are the things that we can do. And so I found that really, really helpful. I think finding ways to engage your executive leadership team is super helpful, but at the same time, not gatekeeping that information. I think it was important that we didn’t gatekeep it and that we shared it immediately, as soon as we could, with the rest of the software organization and engaged our leadership team afterwards.
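As a small illustration of the kind of tool-to-satisfaction mapping described here, this sketch ranks tools by their mean satisfaction score to surface the highest and lowest performers. The tool names and scores are made up for the example.

```python
from statistics import mean

# Made-up tools and 1-5 satisfaction scores for illustration.
tool_scores = {
    "ci_system": [2, 3, 2, 1, 3],
    "code_review_tool": [4, 5, 4, 4, 3],
    "local_dev_environment": [3, 2, 3, 2, 2],
    "observability_stack": [4, 4, 3, 5, 4],
}

# Rank tools by mean satisfaction, lowest first.
ranked = sorted(
    ((name, round(mean(scores), 2)) for name, scores in tool_scores.items()),
    key=lambda item: item[1],
)

print("lowest satisfaction:", ranked[:2])
print("highest satisfaction:", ranked[-2:])
```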
Abi: The advice around focusing on getting that invite from executive leadership, I love that piece of advice, because it makes you think about the steps you need to do prior to presenting to leadership to get that invite, rather than doing the survey and being like, oh, how do I get this in front of people? No, you have to show the value and then get that invitation to dive deeper. I’ve loved this conversation because we’ve woven in so many of your reflections as you’re thinking about your next DevEx survey. One thing you had mentioned to me before the show was that you had gone into those previous surveys, and even the time period following those surveys, with some nervousness around the value of these surveys. So what’s your perspective on that now?
Thansha: Yeah, I think actually my entire time at Peloton has been this theme of standing up a new function, and a lot of the things I do are a lot of, I hope this is going to be useful, I think this is going to be useful, and I hope it is. And a couple weeks ago, as I started to think about what we’re thinking about for year two, I actually put the survey back in front of a bunch of leaders at Peloton, and it was really interesting. I was having a conversation with one of our directors of engineering, and he was like, oh, all of these areas we’ve actually improved on, which is really great. I know that that’s been addressed, and I know that that’s been addressed. And so that was really rewarding, to know that all of these conversations that we had a year and a half ago, and what started as an idea on a slide that Jim had, turned into this. I think it was really, really rewarding to know that we were able to drive a lot of change.
So I think I’m feeling much more confident going into year two that there’s a lot of value that we can continue to get, and in a natural way, my mind has immediately gone to, where are the areas that we want to focus on now that we know we’re feeling much better about these other areas? Again, going back to thinking about that survey design and making that survey as useful as possible, how do we either condense the survey, or how do we focus in on different things? And so there are a couple themes and a couple factors that last year we knew were pain points, but were maybe a P2 compared to something else that was really, this is on fire, we need to fix this immediately. So maybe we had one or two questions on those P2 areas that we’re now actually really fleshing out and adding a lot more questions to, because we know that that’s something that we want to focus on.
So it also helps when you’re thinking about roadmap and planning for your team. It helps you think two, three years out, because you’re like, I know this is a pain point, it’s coming up, and there’s not much we can do about it right now just because of the time-space continuum and we need to focus on these other things. But you can start having conversations about those as those other things wrap up, and it naturally leads you into, great, now we’ve got this survey, let’s input a bunch of other questions that we have and decisions that we need to make so that we can get that input from the people that we’re building for, which I think is so important. And so I think that’s really been the most rewarding part of it and has really made me very much a believer. And I was obviously a believer because I built this, but it’s really cemented that, yeah, what we thought was right.
So I think that’s really helpful too, just from a roadmapping perspective and thinking about the future of your organization and those things. I think it’s important and helpful. And also, I spend so much time, I do anyway, in those painful moments with our developers, understanding what those painful moments are, and with our leadership team, spending a lot of time understanding what those painful moments are. It’s really nice to get a moment where you’re like, there is a happy moment there, and I definitely find that personally very motivating.
Abi: Well, Thansha, I’m so impressed with the way you’ve approached this from the beginning, and so grateful for you sitting down with me to explain it all in such detail. Listeners ask me all the time, what are your tips? What questions should I use? The same questions you hear all the time. So I’m excited to be able to point people to this conversation to hear about how you approached this, and I think it’ll be so valuable. Thanks so much for coming on the show and speaking with me. This was really enjoyable.
Thansha: Yeah, thanks so much for having me. It was great.