Podcast

How “instructional engineers” improve developer onboarding at Splunk

This week we’re joined by Gail Carmichael, Principal Instructional Engineer at Splunk. At Splunk, Gail’s team is responsible for improving developer onboarding, which they do through a multi-day learning program. Here, Gail shares how this program works and how they measure developer onboarding. The conversation also covers what instructional engineers are generally, and how Gail demonstrates the impact of her team’s work.

Timestamps

  • (1:16) The Engineering Enablement & Engagement Team at Splunk
  • (8:01) What an Instructional Engineer is
  • (14:36) The developer onboarding program at Splunk
  • (16:05) Components of a good onboarding program
  • (21:11) Why having an onboarding program matters
  • (28:17) Measuring onboarding at Shopify (Gail’s previous company)
  • (31:39) Measuring developer onboarding at Splunk

Listen to this episode on Spotify, Apple Podcasts, Pocket Casts, Overcast, or wherever you listen to podcasts.

Transcript

Abi Noda: Gail, so great to have you on the show today. Thanks so much for your time.

Gail Carmichael: Great to be here.

 

Abi Noda: So we originally connected when we were in a Slack group together, and I had seen you post a comment about how you were focused on improving onboarding at Splunk. I looked you up on LinkedIn, and you had a title I'd never seen before: Instructional Engineer. I thought, I've got to talk to Gail. So I'm excited to finally have you on the show. I want to start by diving into what your team does. It's unique. I haven't had anyone on the show who does exactly what you do with the types of backgrounds and focus that your team has. So let's dive into it. Share with listeners a little bit about what your team does and what the makeup is.

 

Gail Carmichael: Yeah, for sure. So our team is the Engineering Enablement and Engagement Team at Splunk, and our mission is basically to build a world-class engineering organization through learning and knowledge-sharing. So that really sums it up in a nutshell, but I’ll tell you about some of the programs that we do, and it’ll really give a better sense of what we’re all about. So one of our biggest programs is our onboarding, our technical onboarding. So we call it engineering bootcamp, at least for the kind of core that we do for the first week. So when engineers come, they do their HR onboarding, but then they come to us, and we try to give them a really good overview of what all engineers need in our products and technology organization, which isn’t everything they need ever, because of course, everyone’s a little bit different, all the teams work a little bit differently. So we’re trying to bubble up a high-level abstracted view of our systems, our products, all that kind of stuff.

But then after that, you need to keep going. You need to keep saying, “Okay, well, what about my product? What about my team? How does that work?” So there’s a bit of a balance between us supporting other teams and creating what they need to support their new hires and us creating materials as well for that, depending on hiring needs and all that kind of stuff. So that’s kind of onboarding. We really want to make sure that new hires get confident in their jobs in their first 90 days. That’s kind of one of our big metrics, which I’m sure we’ll talk about. But yeah, so that’s onboarding.

So there’s other programs too, though. We do have things like tech talk programs, so trying to make sure that we have other engineers feeling empowered to share their knowledge. We also have a lot of really great events, which kind of cross between maybe learning and engagement. We have a grand hackathon that is done every year where people put down their regular work, typical sort of thing, work on something that makes engineering life better at Splunk or makes customer lives better. It’s really cool, because a lot of that stuff we try to get into actual product development, so actually impactful.

Another fun one that we did this past year is I organized a bit of a group for Advent of Code. That was really great, because we got Splunk to sponsor the event. We had daily threads sharing our puzzles, and it was just such a great community event. And although learning does happen as part of that, I felt like that community feel, that feeling of being part of something bigger was really especially great for that engagement piece.

And so the makeup of our team is really interesting. We have five people on the team, but we all have technical backgrounds, so we all have some kind of engineering, code-writing, or computer science type background, at different levels and with different specifics. That's super interesting because we've lived the life, we understand engineers, and it gives us a head start in being able to help them learn and get the knowledge they need.

Abi Noda: The mission you stated at the beginning, building a world-class engineering organization through what your team does, makes me want to ask: where does your team live? Is it within the HR organization? Does it report directly to the CTO? Where does it sit?

Gail Carmichael: Our team is actually part of Products and Technology, so we are in the same organization as engineers. I think that is super helpful. Being part of HR, as a lot of learning teams are, even some of the learning teams that support engineers, does make it more challenging, because you're kind of outside where the action is, if you will, and the leadership you report to doesn't necessarily have the same incentives as engineering leadership would. It's really great that we are part of that.

Abi Noda: And tell me about the title, Instructional Engineer. I don’t think I’ve seen that anywhere else. How did that even come about at Splunk?

Gail Carmichael: That is such a good question, and honestly, kudos to my manager for making that one happen. When we first started forming our team, we knew that we wanted to have these people with technical backgrounds, and originally it was looking like we would be bucketed, in terms of job ladders, into a more traditional HR learning sort of role. But because my manager, our director, really recognized the importance of having that technical background, he wanted us to be recognized as engineers in any way that we could. So we got a little bit closer to the engineering job ladder, I think technically we're TPMs, but pretty close to engineer, and especially we got that "engineer" in our title. Even just having that title when reaching out to subject matter experts gives you a rapport where they know you have a technical background, and that really helps start conversations a little more easily. So that's basically how that happened.

Abi Noda: I’d love to hear about the history of this function at Splunk before you joined. The instructional engineering role and concept is kind of unique, and you’ve mentioned to me this has gone through a few iterations at Splunk before you joined. Would love to hear about that.

Gail Carmichael: Two iterations ago, this team was originally led by somebody who had learning expertise but not an engineering background. They had more of, I'll say, a sales background. And so that ended up leading to a bootcamp for all of our products and technology disciplines. So it included UX, product management, that kind of thing, and didn't necessarily target engineers specifically. And it ended up being kind of a collection of lectures, if you will, guest lectures from different people in the org. So that ended up being not so holistic and maybe not so impactful, especially if you're a product manager sitting in on an engineering-centric session.

The next iteration of the team I think recognized this fact. It was led by an engineer, who didn’t have as much learning background, and in that case, they did start to try to get a little bit more hands-on type code-lab style workshops into the bootcamp, so that was really good, but still had this sort of collection of lectures, and it was still a little bit for everybody. And on that team, their scope was bigger. I can’t remember exactly what it was, but I know that they had some developers, so they might’ve been creating something a little bit like internal tools or productivity type tools. I’m not a hundred percent sure.

So they had developers, they had documentation people. So the team was trying to improve our documentation story, which is probably the bane of all companies’ existences. And at one point, that leader decided, “Hey, I enjoyed trying this, but maybe learning isn’t my passion.” So he decided he wanted to go back to being an engineer in product, and that’s when he hired our current director, who does have that technical and learning background, and he was the one who kind of said, “Okay, we need to focus exactly on engineering, because you can’t boil the ocean. Trying to do too much for too many people doesn’t necessarily work well.” And he set out to hire technical-plus learning type people, these unicorns. So yeah, that’s kind of the history.

Abi Noda: And what’s interesting is that title’s not a facade. You do have a CS background. Share a little bit about your personal background and how you personally got into this role.

Gail Carmichael: It's kind of a fun journey. So like you said, I do have a CS background, so I spent a lot of years actually at university, because I got convinced to stay for grad school as well. But during grad school, I really had a chance to get into lots of teaching and outreach and mentoring and everything about learning, and eventually I even started doing more projects, even research projects that were related to CS education and so on. Ended up being a faculty instructor for a couple of years on term. Really gained a lot of really interesting background in learning as well.

And so eventually it didn't work out to stay in academia. I had tried to stay permanently as an instructor and missed it just by a hair, I like to say. So I ended up going back to industry, and that worked out really well, because I started off at Shopify with a little bit of development experience again to supplement what I had done during my co-op years in undergrad and then got a chance to quickly start working on learning programs again, but now from the industry perspective. So that was really cool. So throughout the years, I've really been putting together the technical knowledge and the learning in really interesting ways.

Abi Noda: You’ve kind of been in this role at several organizations now before Splunk. You were at Shopify. In your perspective, who is strong in this type of role? Is it people without technical skills? Can they thrive in this type of role, or in your view, does it really require engineers who lean towards helping others and teaching? What’s sort of the ideal person for this type of work?

Gail Carmichael: And I think that is such an interesting question, and I think about it a lot. So as somebody with the unicorn skills, I kind of like to think that’s necessary for the job, or at least that’s the best kind of thing you can get. But I will say I have seen people who maybe didn’t have the engineering background or weren’t as technical, but who, let’s say, did a few courses on data science or built a little bit of technical knowledge and had really strong learning background. I have seen people succeed in roles like this.

But that said, I also find that in the end, if you don’t have the skills yourself, the engineering or technical skills, you have to rely more on subject matter experts, so the engineers that you work with, to create a lot of the learning material. And you will be able to give good feedback to them from certain perspectives. You probably can tell them whether it’s organized well or whether the language is clear or whatever, but if you want someone to create, let’s say, a code lab, and you don’t know really too much deeply about what they’re writing about, how can you make sure that it’s really achieving the learning goals that you set out to have?

So I do find that if you have the engineering expertise and the learning expertise, you can take things a little bit further. For example, when engineers put together a tutorial, I've noticed it tends to be very handheld, if you will. It's kind of step-by-step, "Here's how you do this," but they're not necessarily thinking about, "Okay, well, how can I turn this into active learning? How can I play on people's misconceptions? How can I get them to practice things?" Typical engineers obviously don't have a learning background. And being able to suggest, "Hey, here you're having them walk through this. Why don't we flip that? Why don't we give them an error, have them make a mistake on purpose, and then have them figure out how to fix it?", I don't know to what extent you can do that without some of the technical knowledge as well.

Abi Noda: Really interesting. It kind of makes me think of a similar problem with, for example, internal platform tooling: many organizations don't have product management roles within that function. And you hear a lot about challenges with platform teams building different tools, but then adoption is a struggle or the way the tool is designed is not quite right, because they're not really thinking about the customer as deeply as they should. What you're describing sounds similar, just applied to learning and enablement content and courses. An engineer could create that content and do a good job building the technical components of it, but to actually package that up and have impact, you need to think about the learner in this case, or the customer. Is that how you kind of see it?

Gail Carmichael: Yeah, I think that's actually a really interesting insight. I don't know a lot about the equivalent team at Splunk, but during my Shopify days, I know that our internal tools development team did have a PM, and I think that really did make a positive difference. It's similar with learning: keeping the end user in mind from the learning perspective, but also, there's so much knowledge about how learning works that a lot of people don't have. And so if you don't have that knowledge, you're not going to be able to apply it in the stuff you create. So perhaps there's also an analogy there with the PM and the kind of knowledge that they have.

Abi Noda: As someone who's now in this type of role, the instructional engineer that combines technical skill and instructional expertise, and we'll get more into what you're actually doing at Splunk later in the show, how would you advise others to try to make the case for this type of role within their own organization?

Gail Carmichael: I guess in some ways I have tried to make the case myself as well. If you are able to give some examples, it can help, and if you don't have the examples yourself, you can go to one of these communities that are out there for technical educators, and they might be able to help you come up with some. But even like I said, the typical sort of, let's say, live coding that engineers do tends to be coding on the screen and talking but not interacting, for example. Maybe a learning person without the engineering background could say something like, "Hey, maybe you should make this interactive," but they don't necessarily know the specifics of how, right? So yeah, you'll probably get a better result even with someone who has a learning background and not a technical one, because maybe the engineer will think of some way to make it interactive. But with that little extra bit of technical knowledge, they can start to be more specific.

They start to know how teaching coding works. They know that you can do some of the things we talked about before, right? Introduce an error on purpose, play on misconceptions. There are cool things you can do with even just polling or multiple-choice type questions to get people thinking. So I think the case to make is that you can get good results without this sort of role, but you can always take it further, it's worth the investment, and from what I've observed, you can probably get there faster, too.

Because again, you’re not relying completely on a subject matter expert who’s very busy, who doesn’t have a lot of time to give you, whereas if you have somebody on the learning team whose job is to put together learning, they can do some of it themselves. They don’t have to fully rely on an SME or maybe not at all, and they can take that learning to be, like you said, a little bit more impactful, just get a little bit further. So I think that’s a big piece of it. 

Abi Noda: So earlier you talked about how onboarding, this onboarding bootcamp, is a big focus of yours. I'd love to really unpack this in detail and take listeners through what this is and how it works. So first, maybe talk about the problem that you guys were setting out to solve with designing this program. What was happening at Splunk? Why did you need to create an onboarding program such as the one you've created?

Gail Carmichael: We technically had an onboarding program, but it was for more than just engineers, and that just wasn't working super well. It was difficult to run, because you constantly had to get your volunteer speakers, make sure that they were giving relevant talks, and not all disciplines really benefited from all these talks necessarily. So when this latest iteration of our team started, we recognized that we really needed to focus on engineers specifically, and that included creating a new onboarding program that was designed specifically for them.

So to figure out exactly what that looked like, we kind of approached it with a research mindset. I guess it helps that I also came from academia at one point, because that’s kind of my instinct anyways, “We have to research this,” just not as rigorously as you do for a paper. So we kind of got an early set of questions looking at feedback we had from our previous iterations of bootcamp, talking to some of our steering committee members, so that’s just a group of leaders from the different product lines that can help advise us, and then we went out and just talked to a lot of people who onboarded recently. What was missing, what do we need to fill in the gaps for? And that really informed how we designed the overall program.

Abi Noda: The last couple of companies I’ve joined, I recall my onboarding experience was, “Here’s your laptop, here’s your Slack account. Open a ticket if you need any software.” I don’t recall anything that resembled anything like a program. So for folks out there listening, zooming out a little bit, what do you even mean by onboarding program? What are the components of what a good onboarding program looks like?

Gail Carmichael: Yeah, so most companies will hopefully do some kind of company-wide onboarding. And ideally, your company would also introduce you really well to things like company culture, but also just generally what your product is and who your audience is for that product and so on. I do find companies vary in how good they are at that. So that's sort of your company-wide onboarding.

For our engineering bootcamp, the way we were looking at it is, actually I’ll zoom out, engineering onboarding, because bootcamp is technically just a portion of that, we kind of have it in phases. So engineering bootcamp is for that first week. So after day one where they get their company onboarding, they spend four days with us and a little bit more if you count some little self-paced learning and things that they can continue doing after bootcamp. And then they have different things, like a buddy program so they can try to have somebody there for them and guide them through team specific stuff, onboarding into their product, and so on. And then sometimes we have things like team tracks and so on.

But coming back to bootcamp, the way that that looks is based on the research that we did, we realized we did need to dive a little bit deeper into who our customers were. So even though that doesn’t sound like something engineering-centric, it’s something that was kind of missing from the general process. We have an entire session called Voice of the Customer, for example, to help people understand who our customers even are and kind of try to get excited about the product. That’s something that you maybe take for granted when you’ve been at a company for a while, but you want to know why you should be excited about a product when you first join.

But then there’s also the technical side. So in our case, we have, for example, a little session on our technical history, we have a session about our product ecosystem, so trying to give a bit of a systems-level view of all our products and services, or at least the most important ones, how they all fit together without getting too detailed, how we do software development and so on. So this bootcamp is generally done through live sessions, or we did have to do a self-paced version for a while, which is essentially recordings of live sessions. So this is sort of your week one. It’s very structured. The intention is to focus your time on onboarding, try to ignore other meetings, don’t try to go to your team meetings, all that kind of stuff. Just really get that context, and then after that you can start diving a little bit deeper.

So we have different levels of materials, so to speak, for the different product lines that we have right now. For example, one of our product lines is called Platform. It's kind of the core product in a way. We had a lot of hiring for that, so we have more deep-dive materials for them, but over time we need to get more for other product lines and so on.

Abi Noda: This is a great overview of what the bootcamp looks like. Definitely far more comprehensive than anything I’ve ever experienced. I have a few questions. One question that popped into my mind as you were explaining this, how did you decide or make the case for doing live sessions as opposed to asynchronous or self-guided learning?

Gail Carmichael: It's funny you ask that, because we actually had to do self-paced for quite a while. But at the time we started this, we knew we were getting a decent amount of hiring, and there were large enough cohorts that it felt worth it to run a bootcamp every couple of weeks. It wasn't a hard sell, because the previous iteration of bootcamp was live as well, maybe less effective with just that kind of mishmash of lectures, but it was live, so continuing that wasn't such a hard sell.

You can put together a really effective bootcamp with self-paced or e-learning designed specifically for that purpose. I think it’s a little bit harder for a lot of people to create that kind of learning. It takes a lot more upfront preparation, because if you want an impactful learning experience, you really have to anticipate what kind of questions people have and so on. Whereas if you start with live, you can be a little bit scrappier, because you can kind of answer questions live or kind of correct the mistakes you’ve made in designing it on the go. So that makes it a little bit easier to start with live anyway.

I also personally have noticed that it’s really nice for new hires, for their first experiences to be with a real person in real time for various reasons. One is that your calendar gets blocked for the session that you’re invited to, so it’s a little easier for everybody else to leave you alone. Whereas if you’re doing stuff self-paced, you’re starting to get pulled and your focus is lost, right? It’s also nice just to feel like you have a cohort. The more that we can build a cohort, the more you have a network of people that you can turn to later on in your career. You’re like, “Oh, yeah. I remember that person came from this product area,” or like, “Oh, this person works on a similar team to me. They might know the answer to my question.” Or even just people who might be friends in the future, and especially because we are remote-first too, I think it could be very isolating to do everything completely virtual. So I think these all kind of factor in to why that first week is a little bit more live when we can.

Abi Noda: And as someone who’s done this in a really comprehensive way, you mentioned the mission of your team earlier, I want to ask you how does good onboarding contribute to the goal of having a world-class engineering organization?

Gail Carmichael: I think good onboarding gives people a really good head start. Just for a moment, imagine there was no onboarding. You're sort of thrown into this world, and we hire smart people, most companies try to hire smart people, so maybe they could figure a lot of this stuff out on their own. But is it a good experience, especially remotely, when you are in this new realm and you just don't have any mental model of how anything works, especially if the underlying systems and processes aren't super well documented, which they usually aren't in any company? If there's no consistency between how the teams do things and so on, it feels bad. It feels very confusing. So I think that even just being able to give people this high-level view is super valuable.

And honestly, speaking of that, something really interesting that we've noticed here is that nobody else in the company seems to even have this high-level view of everything. Everybody knows their area, and they often know it very deeply, but if you want to zoom out and, for example, explain what our product ecosystem is, what all the pieces are, how they connect, how they talk to each other, what's an app on the platform versus what's a standalone SaaS thing, nobody knows. And so if we didn't have this onboarding team working on onboarding, we would never actually create an artifact like that.

And it’s so interesting because once people start realizing you ask these questions, “I didn’t know that,” they start learning, and sometimes people start realizing they have to be like, “Oh, we should probably fix the fact that nobody knows, because then maybe that’s something that would be helpful for everybody, and we should actually determine this.”

Abi Noda: Yeah, that’s really interesting and resonates the piece about how, by solving onboarding, you’re actually creating knowledge artifacts that are useful for everybody, because it’s only during onboarding that you even maybe think about the fact that this knowledge doesn’t exist anywhere, but really that knowledge should exist for all points of the journey of working at a company. I know one other challenge you mentioned to me before this call was figuring out that right abstraction level. How do you draw the line between what your centralized onboarding program solves versus what the individual teams focus on? How do you think about those lines?

Gail Carmichael: It's a really big challenge, and it's going to be different at every company, too. Just as a point of comparison, Shopify was a little bit easier because it's a little bit more of a consistent tech stack that we could teach to, whereas at Splunk, every product line and maybe even every product does things differently, and not all the software is even SaaS. There's enterprise software that gets installed on computers too, and that really makes things confusing. So it was really hard in our case to actually figure out even what that high-level view is. And so part of how we figured it out was lots and lots of research, talking to lots of people, and just figuring out even what the stories were in all these different areas. So that came through interviews, through trying to figure things out from documentation and so on, and then trying to see what's common.

In some ways, it’s not a lot, if you want to be really pedantic, I guess, because like I said, everyone does everything a little bit differently, they’re using different languages, different frameworks and so on. But I did mention that, for example, we needed to learn more about the customer, and that is something that came through the research as well. It’s something I personally wanted, too, for my onboarding and didn’t get it, so I was on a mission to fill that gap.

There’s also a consideration that it’s worth learning stuff about what other people are doing, even if you’re not going to do it yourself. So I mean that in two different ways. So in one sense, we from our research had a lot of people say, “I didn’t join to work on the platform. I didn’t learn it during onboarding. I always meant to go back and try to learn about it, because it’s so important in a lot of ways, even though I don’t need to know it for my day-to-day, but I don’t have time. After onboarding, I’m just too busy all the time.” Classic story, you never have time to learn when you’re busy shipping. “I always wish that I could do that.”

So I was like, “Okay, that makes sense. So let’s make sure everybody in onboarding learns about the platform and just emphasize to them this is why even though you may not need it in your day-to-day and you’re not working on that, this is why we specifically picked this one for everyone to learn about.” So that’s sort of one sense. The other sense is, like you said in your question, what’s the right level of abstraction? So thinking about our technical history, for example, I mean obviously it’ll be interesting, well, hopefully it’ll be interesting to everybody to learn about all the different areas of the company through this technical history just because it’s kind of cool. It’s kind of cool to see the iterations, the failed attempts we had in the past at trying to get to cloud, all that kind of stuff.

It’s neat, because then you learn a little bit about just even what other products exist and so on as well through the fact that you’re talking about them for this history. But then the product ecosystem as well. So I’ve kind of mentioned this one a couple of times. Finding that right level of abstraction is really interesting because you ask people like, “Hey, can you give me a little bit of info about your architecture?” or, “This is what I’m doing. Do you have anything for me?” Of course, they’ll have super-detailed architecture, way beyond anything that you could even comprehend because all the words on it are products you don’t even understand yet. At the same time, you don’t want just the voice of the customer level, the marketing kind of speak of what all the products are, because you do need that too, but it’s not enough. It doesn’t give you any idea of the technical side.

So we have the platform, and it can be deployed in two different ways, so let's convey that. Let's give a little bit of a hint of how it's deployed in the cloud, because there's some weirdness there and some interesting nuances. Some of our premium solutions are actually just apps on the platform, little apps someone can make for free, so let's convey that. Try to color-code everything. Try to show, if it's a standalone thing, how it's communicating. Is it using Kafka to communicate with this other thing? Not in a very formal diagram kind of way. We're not doing anything crazy that way.

So what I just described with the product ecosystem, I think every company needs something like that, and usually it doesn't have it. We get to kind of build stuff like this over time for onboarding. For Shopify, it could have been as simple as, "Hey, my backend is a monolith. It's in Rails. Let's draw that on a picture. It talks to the React front end through GraphQL. Let's put that on a picture. And then we have this Shop app. Oh, it's just another React mobile app, a full React Native app, and it's talking through the same GraphQL. Just have it on a picture, put it all together." So yeah, that's where we're at there.

I think you do have to draw the line somewhere. You can ask them a little bit, like, "Hey, just trust us that even though this doesn't seem like something you're working on, you'll be glad over time that you have this knowledge of the platform even when you're not working on it." But there's a point where you're kind of like, "Okay, I've used up that level of trust." If you're like, "Here are all the nitty-gritty details about how the platform deploys, even though I'm working on O11y, which is all cloud-based, and I want to know about O11y, do I really need to know that?" At that point, you kind of have to diverge. If there is some sort of high-level view of the software development lifecycle, it's great to be able to share that high-level view, but for diving deep on the specifics, you really have to diverge.

Abi Noda: Previously, I asked you how does your work actually drive creating a world-class engineering organization? And I think part of that in your role is actually making that case and being able to prove that at your company to get continued investment for the work you’re doing. So I want to talk about measurement, and of course, there’s no silver bullet answers to this topic, but what you’ve shared with me is definitely interesting, and I think listeners will get value. I want to actually start with you mentioned to me how different ways in which you approach this at Shopify, so maybe start there. How did you all look at measuring developer onboarding or the impact of the work to improve onboarding on engineering?

Gail Carmichael: So in learning and development, there are different levels at which you can measure the impact of what you're doing. Kirkpatrick is the guy who came up with one set of levels that you can use. It starts off with, "Did they like it?" That doesn't tell you whether they learned, but hey, if they like it, that's a good start, because if they absolutely hate it, they might ignore the learning parts anyway. The second level is, "Did they learn?" The third level is, "Are they going to be applying that?" And then the fourth gets bigger: the business impact.

At Shopify, we hit the first level and maybe to an extent the second level through simply asking what they thought of each session that we did through a survey. And that was really great because we could bake it right into the live session at the end, or even if we had a self-paced course right at the end of the course, say, “Hey, can you fill our survey? Basically, how would you rate this, one to five? Any comments you have for suggestions? What was good, what was bad, whatever.” And that was really great, and we did get good qualitative data as well about the effectiveness of learning.

I think, personally, most places kind of skip level two and don’t actually check if they’re learning anything, especially if their learning is not as interactive, because then you have no way to see what the learner’s learning. So I always personally try to bake in sneaky ways of finding out if people are learning what I want them to learn. For example, you might put little quizzes that aren’t designed as quizzes for assessing knowledge, for passing a test or something, but fun ways just to be like, “Hey, what do you guys think? Which one of these?” And you write a really good question and get them to think about it.

And then you start to see, “Okay, do they understand this?” Maybe they don’t at first, but you have a chance to correct it. If you have projects that they’re doing, that’s a really great way to see if they’re learning. If you actually look to see, “Did they achieve the goals that I set out for them to do? Does it seem like they need a lot of help for it? Did they produce the thing in the end that we wanted them to?” So you can definitely bake ways in, and if you care to, you can actually decide whether they’re learning or not. “Are they applying it and is it having business impact?” A really common metric for that is time-to-first or 10th MR or PR, depending on what you’re using, GitLab or GitHub, and Shopify was working on that. I can’t remember if they actually ended up implementing that as a metric they could look at.

Interestingly, we have recently been able to do that at Splunk, so I can speak to that, and we can talk about the Splunk case if you want. It's not a perfect metric, because your first PR could be one line of code or it could be a hundred or it could be a thousand. Who knows, right? It's not perfect, but in the absence of anything better, it does give you a bit of a sense, because you can start to see whether people are taking less time to get to that first or 10th PR. And if so, the hope is that what you're doing is helping them do it faster, because you're giving them the context, the knowledge, getting them set up with their tools, whatever, that they now don't have to go figure out for themselves.

Abi Noda: So curious how you’re now approaching this at Splunk, and I know you had mentioned to me you’re doing an NPS survey after these sessions, and you have mixed feelings about it. It’s helpful for reporting up, but not maybe telling the full story. How’s it going at Splunk?

Gail Carmichael: Yeah, that's exactly right. NPS is a very, very, very common thing to do for learning and evaluating your learning programs. I guess the reason I have mixed feelings about it is because to an extent, it's kind of a lot like the "Did you like it?" At the same time, "Who cares if you liked it?" Because what we really need to know is, "Did you learn and did you apply that, and am I making something better for the business?" But it can give some kind of high-level indications, nonetheless. If a lot of people are saying, "I would not recommend this," you can at least dig into why, right? You can go and ask them or maybe they left comments, and you can kind of uncover some kind of mistake you made in your understanding of what you should have built or something like that.

So yeah, when we started our onboarding program, I actually did borrow from the Shopify idea and did a survey at the end of each session. It wasn’t NPS per se. It was again, “How would you rate the quality?” and, “What would you suggest?” And that was really, really helpful at the beginning of the program, because we were brand new to delivering it, we wanted to be able to iterate quickly to see if we can make it better, and it was very helpful for that.

We didn’t continue to do that, because what we really needed is a single score for bootcamp as a whole. So if we ask them at every session and then ask them again at the end, well, it’s already fatigue. You’re probably not going to get as many answers at the end as you want. So at the end, we do ask for that recommendation again to get our NPS and we get our comments and so on. And then in terms of actually seeing longer term, are we having an impact, we do what we call our 45-day survey. The goal there is not to assess just what we’ve done necessarily, but just generally what their onboarding process has been. 

So what we do is in our 45-day survey, we do ask, just generally, “Would you recommend how you onboarded to other people?” But it turns out, we’re actually this year not using that as our main metric. We’re actually using a confidence score instead. So we ask them, “How confident do you feel in your new role between one and five?” And we want to aim to be, let’s say, four average. And I think that’s really interesting because they could be confident in spite of us, potentially. Maybe they felt like boot camp wasn’t quite right for them, because maybe they came in with a certain level of experience that made it redundant or whatever. Who knows? Or maybe we haven’t got to creating materials as much for their team yet because maybe their team isn’t hiring as much for whatever reason. Or maybe our buddy program really helped boost it, or maybe their team is just really awesome, which is great too. Because if we see someone who has low recommendation and high confidence, we can probably ask them why, and then maybe we can extract that and try to spread it around.

But yeah, so we have that, and then we also, as you mentioned, recently have been able to start measuring time to first and 10th MR, in our case, because we're on GitLab. That's been really interesting. Just from an early look at the data, it does actually look like that's come down quite a bit since we started our bootcamp. So it'll be good to get more data the longer we run the program, but we want to get that as low as possible and keep it low on average. So we look at, of all the people who onboarded in a month, what's the 50th percentile for those times? That's kind of where we're at there.
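As a rough illustration of the metric Gail describes, here is a minimal sketch that takes each new hire's start date and merged-MR dates and reports, per onboarding month, the median number of days to the first (or 10th) merged MR. The data shapes and names are hypothetical assumptions for illustration, not Splunk's pipeline or the GitLab API.

```python
# Sketch of a time-to-first/10th-MR metric grouped by onboarding month (hypothetical data shapes).
from dataclasses import dataclass
from datetime import date
from statistics import median


@dataclass
class NewHire:
    start_date: date
    merged_mr_dates: list[date]  # dates of this engineer's merged MRs, sorted ascending


def days_to_nth_mr(hire: NewHire, n: int) -> int | None:
    """Days from start date to the nth merged MR, or None if they haven't reached it yet."""
    if len(hire.merged_mr_dates) < n:
        return None
    return (hire.merged_mr_dates[n - 1] - hire.start_date).days


def p50_by_cohort_month(hires: list[NewHire], n: int = 1) -> dict[str, float]:
    """Median days-to-nth-MR for each month's onboarding cohort, e.g. {'2024-03': 10.0}."""
    cohorts: dict[str, list[int]] = {}
    for hire in hires:
        days = days_to_nth_mr(hire, n)
        if days is not None:
            cohorts.setdefault(hire.start_date.strftime("%Y-%m"), []).append(days)
    return {month: median(values) for month, values in sorted(cohorts.items())}


# Usage with made-up numbers: two hires who onboarded in March.
hires = [
    NewHire(date(2024, 3, 4), [date(2024, 3, 15), date(2024, 4, 2)]),   # 11 days to first MR
    NewHire(date(2024, 3, 18), [date(2024, 3, 27)]),                    # 9 days to first MR
]
print(p50_by_cohort_month(hires, n=1))  # -> {'2024-03': 10.0}
```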

Abi Noda: Do people feel confident? Do they get up and running and productive quickly? Those all make sense. You're someone who's been doing this for a while, so I'm curious, what other attempts at measuring this have you seen not work? For example, I've heard people talk about correlating onboarding experience to retention of developers. Have you tried looking at other relationships and business outcomes throughout your career, both at Splunk and at Shopify, and has that ever surfaced any interesting insight?

Gail Carmichael: Yeah, so we did attempt to do that at Shopify. I wasn't leading that, so I don't 100% know what happened with it. I think one of the hardest parts is that you need access to HR-type data that people from HR have to help you get, and they're usually too busy to help you. So probably the hardest part of doing those kinds of correlations, at least the first hurdle, is just getting the resources to make it happen. And because it didn't happen there and we haven't attempted it at Splunk either, I don't know if they actually work. I think it would be really interesting. You'd want to be careful about drawing too strong a conclusion from it, remembering correlation doesn't equal causation, but I think it would still be really, really interesting just to get insights into what's going on and how your program might've affected things.

But there’s so many conflating factors too, right? That’s actually, or sorry, succeeding because of you.

Abi Noda: I love that, and it really resonates. The confounding factors problem, I think, is such a common challenge with measuring stuff in software engineering organizations. So yeah, having some signal that's really direct, "Did you learn from this?" or, "Did you find this useful? Is this saving you time?", those types of direct signals, I feel, are really the only things that can give you a very confident look at whether what you're doing is providing value. And then the more correlational, downstream business outcomes are, I think, really good for telling a story and justifying investment in your program overall. But as you said, deep down, you never know: is that in spite of or because of the work we're doing? So it sounds like you've had a similar experience.

Gail Carmichael: Yeah. I think that's a challenge throughout the industry; trying to get those higher-level indications of whether you're having impact is just so hard. And just to pull out one thing you said, you mentioned, "Is it useful?" Even just understanding whether something's useful to someone is actually really hard, because yeah, you can ask them and they can tell you, but there are even issues there, because, for example, a lot of learning research shows that people are really bad at knowing whether they learned something or whether something was actually useful, which is super interesting. Just asking them, "Did you learn?" or, "Is this useful?" is actually fairly unreliable. It's still useful to ask, but you have to keep that in the back of your mind. Again, that's why I love having some kind of objective measure where we can see, "Did you actually produce an artifact or some kind of concrete illustration that you learned?" Now, that also relies on the fact that we've done a good enough job in deciding what you should learn, because we're measuring against that. So many layers, as you can see. It goes so deep.

Abi Noda: Yeah. Well, Gail, this has been a great conversation. I get asked from people all the time, “How do you tackle developer onboarding? How do you measure it?” This is going to be something I direct people to for a blueprint of how to approach the problem and some of the unsolved challenges, particularly around things like measurement. Thanks so much for the tips and for your time today. Really enjoyed having you on the show.

Gail Carmichael: Yeah, thanks so much. It was a lot of fun.

Abi Noda: Thank you so much for listening to this week's episode. As always, you can find detailed show notes and other content at our website, getdx.com. If you enjoyed this episode, please subscribe to the show on Apple Podcasts, Spotify, or your favorite podcast app. Please also consider rating our show, since this helps more listeners discover our podcast. Thanks again, and I'll see you in the next episode.