27. Colin MacArthur of the Canadian Digital Service

In this episode of Dollars to Donuts I chat with Colin MacArthur, the Head of Design Research at the Canadian Digital Service. We talk about bureaucracy hacking, spreading the gospel of research throughout government, and embedding researchers in complex domains.

Often the idiosyncrasies in people’s research and the sort of surprises that don’t fit within the template are the most important things that our researchers find. – Colin MacArthur

Show Links

Follow Dollars to Donuts on Twitter and help other people find the podcast by leaving a review on Apple Podcasts.

Transcript

Steve Portigal: Welcome to Dollars to Donuts, the podcast where I talk with the people who lead user research in their organization.

I just read the 2011 book “It Chooses You” by filmmaker and artist Miranda July. It’s one of the best books about ethnographic research that isn’t really actually about ethnographic research. In the book she describes a period of her life where she was creatively stalled in finishing the screenplay for her film “The Future.” As a way to either get unblocked or just avoid what she should be working on, she develops another project, to call people who have placed ads in the free classified newspaper the PennySaver, and arrange to come to their homes and interview them.

She reports on each of the interviews, including excerpts of the transcripts, and some amazing photographs. The interviews are sort of about the thing being sold, but because she’s getting outside of her cultural bubble, she takes a wider view, asking people about a period in their life when they were happy and whether or not they used a computer (since even in 2011 a newspaper with classified ads was a relic of a previous era).

These interviews are confounding, hilarious, disturbing, touching – everything you’d hope. And July is honest about what makes her uncomfortable, about her own failures to properly exhibit empathy when it’s needed, or her challenge in exercising caution in some dodgy situations while still being open to connecting with strangers. She incorporates her feelings about her own life as she hears from people about their hopes and their reflections back on their lives, lived well or not so well. She articulates her own judgements about the people she met and how that informs her current thinking about her own life and her aspirations for her future.

In one chapter she meets Beverly, a woman with Bengal leopard babies and birds and sheep and dogs. Beverly was clearly excited for Miranda’s visit, and prepared an enormous amount of fruit-and-marshmallow salad which neither July nor her crew want but accept out of politeness, eager to get away from Beverly and her home. They then head straight to a gas station and throw the marshmallow salad in the trash, covering it up with newspaper in case Beverly stops by. Reading it, I felt my own judgment of Miranda for her judgment of Beverly, but I can imagine doing the exact same thing in a similar circumstance, and I appreciate July’s ability to observe her own judgment and infuse it with compassion at the same time. Ultimately, she views her struggles to connect as her own personal failure, saying “the fullness of Beverly’s life was menacing to me – there was no room for invention, no place for the kind of fictional conjuring that makes me feel useful, or feel anything at all. She wanted me to just actually be there and eat fruit with her.” In articulating something so nuanced and personal, we learn an awful lot about Miranda July as well as all the people, like Beverly, that she meets.

I can’t believe it took me this long to finally read this book, and I can’t recommend it highly enough.

Through Dollars to Donuts, I’m gathering stories about maturing research teams, looking for best practices, emergent approaches, and insights about organizational culture. This is of course highly related to the services I offer as a consultant. In addition to leading research studies in collaboration with my clients, I also help organizations plan and strategize about how to improve the impact research is having. Whether that’s working as a coach for individuals or teams, or running workshops, or advising on best practices, or leading training sessions, there’s a number of different ways you can engage with me. Please get in touch and let’s explore what the ideal collaboration would look like.

You can email me about this, or with feedback about the podcast, at donuts@portigal.com.

Coming up soon is the first Advancing Research conference. Rosenfeld Media is putting on this conference March 30th through April 1st 2020, at the Museum of the Moving Image in Queens New York. I’ve been working on the curation team helping to assemble the program and it’s looking like a really fantastic event. I’ll put the URL for the conference in the show notes. You can use my discount code Portigal dash ar10 to save 10 percent on your registration. I hope to see you there!

All right, on to my interview with Colin MacArthur, the Head of Design Research at the Canadian Digital Service.

Well, thanks so much for being on the podcast. I’m really happy to get to speak with you.

Colin MacArthur: My pleasure. Thanks so much for having me.

Steve: All right, let’s just start with who you are. What role do you have? What organization do you work for? I’ll just throw all the launching stuff onto you and we’ll go from there.

Colin: Absolutely. My name is Colin MacArthur. I’m the Head of Design Research at the Canadian Digital Service. We are an office of the Treasury Board of Canada that works with departments across the Canadian federal government to improve the way they serve their constituents. We do that on my team using design research and by helping people inside the government get closer to the folks who they’re trying to serve. We do that in partnership with designers and engineers and product managers and policy experts on interdisciplinary teams, but the perspective that the researchers bring is a hard and fast focus on the people who we’re trying to serve and what their experience with our services is like.

Steve: Why is this under Treasury?

Colin: (laughs) Good question. The Canadian federal government, or government of Canada, is organized with departments and then with a number of central agencies. And Treasury Board is one of those central agencies. Its name is a little odd in that it’s not actually the Department of Finance. It’s not the Ministry of Finance. It plays a central role in sort of managing and consulting with other departments about how they run themselves. It’s a management board of government. And so we are in kind of an interesting place, because from Treasury Board we have a view of lots of interesting things happening across government. We’re positioned kind of naturally to give advice and also learn from departments that we work with, and then to work across lots of different departments in a way that would be a little more unusual if we were nestled inside a dedicated department itself. So, Treasury Board is sort of one of the central agencies of the government and that’s why we’ve ended up where we are.

Steve: Is that where the Canadian Digital Service originated?

Colin: That’s exactly right. So, we’re relatively new. Founded just a couple of years ago and we started right in the same place we still are, inside Treasury Board, reporting to Canada’s first Minister for Digital Government who was also the President of the Treasury Board.

Steve: What’s the relationship between digital service and, I don’t know, what would you call like regular service. Like the things that government does. Because you included policy in kind of the mix of people that you collaborate with. So, everyone else you listed seemed – design, engineering, research, PM – seems very – yeah, this is how sort of software is made. How digital services are made. But policy – and this is from someone outside government, so maybe it’s a naïve question, but policy just sort of begs the question for me, like oh what’s digital versus just services?

Colin: Yeah. What a good question. I think the way we choose to look at it is we’re interested in improving services, period. So that means the elements of those services that are online, but also the elements that are offline and drift into paper processes and drift into things that are more typical policy problems. But in this day and age it’s pretty hard to have a meaningful discussion about service in general without talking about the digital pieces of those services. So, when we put together teams to work with departments we absolutely come with a digital emphasis. That’s one of the strengths that we can bring. That’s one of the pools of expertise that we have. But we’re just as interested in looking at the non-digital sides of that service. And in reality, they all fit together and attempts to kind of separate them out into the website and the paper part are often pretty hard and don’t end very well for the people we’re trying to serve. So, we kind of view ourselves as tied up in both and we try to staff our teams with expertise that allows us to do that. That said, our name is the Canadian Digital Service and I think often our entrée is our digital skills. But we try to be more than that. We don’t think digital problems can really be solved by just looking at the technology piece.

Steve: That seems to me analogous to so many efforts to do transformation or introduce new services or new products – you know, private and public sector around the world – where – I mean, you’re kind of hinting at the power of, like, the construct of digital. That it invites maybe a different mindset in approaching a problem, one that can extend beyond the boundaries of what is digital. Or, as you said, they’re intertwined.

Colin: Yeah, absolutely. I think we view the computer part as only part of the digital mindset and approach that we try to bring. I think departments appreciate that. I think that we are often able to look at a problem from different angles because we’re not just looking at the IT systems involved. We’re looking at all the pieces related to the problem they have.

Steve: So, what is your role specifically?

Colin: My role as head of design research is, first of all, to lead and coach the researchers on our team. So, I help them, as they’re embedded in product teams, chart the direction they want to take with their research and improve the quality and the quantity of their research. I make sure their research is fitting into the product lifecycle in the ways that we hope, and also make sure they’re developing as professionals. We are happy that folks don’t just come here to build services. They come here to build skills, and we do that with our researchers, like with all of our other disciplines. So, helping and supporting the researchers is one big piece of my job.

Another is to convene the government-wide community of practice around design research. So, we know that design research happens in places other than CDS. We know that research is a tool that lots of other departments are using. But what I noticed when I first arrived was that there were very few conversations between people doing research in different departments, or even within a particularly large department. So, I use my role and my limited additional time to convene folks across the government of Canada who do this kind of work and talk about the challenges of doing it and talk about ways that we’ve worked through some of those challenges. The joy of that is getting to see all of the sometimes hard-to-spot places where people are doing interesting things. You know, taking interesting methodological approaches, having new discussions that aren’t visible if you don’t get them all in a room and get them to start talking. So, convening that government-wide community of practice is another key part of the job. And related to that is trying to create some tools and some policy change that helps that whole community move forward. We try not just to talk about what the problems are and swap stories on the solutions. We try to learn from that community and then use our position and knowledge to nudge forward policy changes or government practice changes that can help researchers across government be more effective.

Steve: Do you have an example of a policy change or process change?

Colin: Sure. So, one of the central pieces of research regulation in the government of Canada is something called the public opinion research rules and guidance, known as POR. I’ll try not to use that acronym, but public opinion research is this kind of class of research that the government has a process for managing – very thoughtfully and deliberately over a relatively – I don’t want to say slow, but certainly a longer timescale than most user research or design research would happen on. And so one of the key areas of confusion we saw when we started doing design research with partner departments was wait, can you really do this without going through the whole process for public opinion research? Isn’t what you’re doing just public opinion research? Why are these things different? Why should we believe you that they’re different?

And so we went to our colleagues in another part of Treasury Board who actually own the policy on public opinion research, and we said to them, “look, these are the kinds of things we do. These are the kinds of questions we ask. If these are the kinds of questions involved, is this really public opinion research?” And their response was, “if that’s really all it is, then probably not.” And what was great was that we could then work with them to write a clarification to the guidance around the policy, which really just meant updating a webpage to have a section that talks about how POR relates to user or design research and further explains to departments that they could be doing user or design research that didn’t need to fall within this typical public opinion research cycle. And that kind of guidance is so important to loosening structural challenges to doing research across government, right? That kind of guidance from the center is what helps people trying to do research at the edge of their departments make the argument to their manager that this isn’t quite as risky as they thought it might be.

Steve: The perceived risk is in doing the research? Is that what you mean?

Colin: Exactly. And I would say risk is probably a strong word. People say “well we have these rules for doing public opinion research. I don’t really know about any other kinds of research, so you better follow these rules.” And I think that what we’re trying to do is say “well there are actually some other kinds of research that are useful and not quite the same thing”. And think about those and realize that maybe what you’re trying to do falls within those umbrellas and thus doesn’t need to go through the same process and make management and the executive layer of departments a little more comfortable with that fact.

Steve: I think implicit, and maybe you said this explicitly, the kind of processes and procedures and things that would be required to do “design research” are less onerous than to do public opinion research. Is that correct?

Colin: That’s right. I think that public opinion research is often interested in policy and government-level questions about public opinions, right? And design research is often focused on questions like what’s it like to interact with this department or this service? And those are different things that require different methods and different rules of evidence, and so what we try to do is keep people from getting them mixed up in one another. Now I think if my colleagues from other parts of Treasury Board were here, they would remind us that there are some forms of design research that can kind of verge into typical public opinion research, and then it’s important to engage the public opinion research process. But there are also lots of things, like usability testing of a new service interface, that are clearly not in the realm of public opinion research and therefore we’re really happy to encourage departments to do.

Steve: So, in this convening that you describe, sort of finding these, you also mentioned that there are maybe hard to find – I may be putting a word in here – surprising…

Colin: Yeah.

Steve: …sort of areas of research. How have you come to find those people and that work?

Colin: Well you know it’s funny Steve. I think as we were setting up the community of practice, I realized that recruiting participants for research was kind of the best practice I could have for recruiting members of a government design research community of practice. So, like when you’re recruiting people for research, you put out a call for the kinds of folks you’re interested in, but you also – you snowball, right? So, we started with people that we knew and we said hey, do you know anyone else that does this kind of work, or is interested in these kinds of questions and do they know anyone else and do they know anyone else. Often the third or fourth degree from there, you get to folks who are like, “oh, I didn’t even realize what I was doing was design research, but it is and I’m excited to find this group of people who’s all trying to do something similar.”

Steve: So, by bringing these people together, some of whom wouldn’t even have identified with the labels that we would put on what you’re doing, and creating a chance to share and improve practices, this seems like it spreads far beyond what design research within the Canadian Digital Service alone would involve, right?

Colin: Right.

Steve: The reach is much broader.

Colin: That’s absolutely right and that’s the reason why we did it, right. So, our mandate isn’t just to build services with departments. It’s to continue to improve how the government delivers service more generally. And one of the ways we do that is through finding the folks involved in doing what we do and trying to enable them, right? Give them more tools, whether they’re policy tools or methodological tools. Whether it’s – give them a space to vent or give them a space to celebrate. One of the hard things, I think, about pushing something like design research within government is it’s – it can be hard to find a place where there’s a group of people like you who are really excited about the kind of work you’re doing. And so I think there’s some practical benefits of the community and there’s also some sort of emotional support that happens in the community of practice that’s really heartwarming.

Steve: Are there ways to find – this is maybe the counter example, or the other part of the set – so that teams or departments or groups that should be doing this, that maybe don’t know that it exists, or don’t know that it’s accessible to them, that it’s reasonable or feasible, but would help them in their efforts to improve the way that they serve the people that they’re serving? How do they fit into – I mean it’s kind of a boil the ocean question since you’re already finding everyone that is doing it, but sort of opportunities to build the practice, whether you all are providing that, or you’re enabling them the way you’re enabling the people that you’ve convened? How do you think about that?

Colin: Yeah. It seems like your question is really like what about the people that aren’t doing research and should? And they are also an important group. So, I’ll say a couple of things about them. I think that most major departments do have some groups of people trying to do research by some name, right? So, they’re trying to do client experience improvement, or their digital transformation group has a group of user-centric specialists, right? So, there are people trying to do something sort of related to design research, at the least. I think what we try to do is find those folks and then we try to introduce them to the body of knowledge that’s specific to conducting research. Like, we believe research is a craft in and of itself and it’s hard. And it takes some work, but it’s also accessible and something many folks can learn. So, with that in mind, we try to locate the folks who are working in the generally related areas and inspire and equip them to do this kind of work, or continue to improve how they do this kind of work. You know, we talk about the challenges to not only design research, but sort of digital best practice, at multiple levels. One of them is the policy or structural level, and things like our POR guidance clarification, the public opinion research clarifications, those certainly help folks, but they’re not enough, right? Folks also need the skill to do the work and they need opportunities to learn. We try to create lots of informal opportunities to see what other people are doing. And then, you know, the bottom layer, beyond just policy and skill, is, for lack of a better word, inspiration. Showing that it’s possible, right? Like showing that this work can happen within the government, within all of the unique factors of the government, and that it’s useful.

And so I think we try to not just give folks kind of on the edge some policy tools, but we also try to expose them to the skillset, and we frankly just try to show it’s possible and continue to cheer them on as they push it forward.

Steve: That’s good. I think we’ll probably come back to the ways that you’re working with teams and departments, but I want to go back to one of the other things you said early on. You were kind of describing the two main things that you’re involved in. We’ve talked about sort of convening the community of practice aspect here, but you also talked about leading and coaching researchers. You were really emphasizing building skills was almost a core value. Like this is a thing that you’re really thinking about being an outcome for people that work in research. Where does that come from? That is not a universally held belief I think in groups of researchers.

Colin: Yeah, well I think part of it is born out of the broader mission that we’ve been discussing, right? So, I think we are not just here to do research that helps create better services. We’re here to help the gospel of research spread around the government. And so, in order to do that, I think I need to care both about continuously building the skills of our staff and then taking those same tools and making them available across the government. So, for me it’s hard to figure out how I could say, “ah, broader government, you should be building your skills in this way” if I wasn’t also practicing that as a leader in our own community. And I’ll say we’re not just interested in helping the researchers within our group build their skills. We’re interested in them learning how to teach other people about research. So, I think we’re trying to build the empowering and enabling and teaching ethos into the folks that come into our group, which makes it easier for them to interact with our partner departments. It makes it easier for them to have useful conversations across the design research community of practice. So, I think that, from a mission angle, is why that’s been a central part of how I view my work. I also just fundamentally believe we can all continuously become better researchers, and that one of the ways to do that is focusing on skills building as a continuous improvement approach, not as a sort of set-it-and-forget-it professional development activity.

Steve: There’s often service design happening, of various kinds, without there being research happening. Even though we should all clutch our pearls at that idea, it still happens. So, at some point research is given a title, given a mandate. Someone like you is involved. Can you talk a little about sort of your role and your own trajectory? Where it came from and how it got to where it is today?

Colin: We’re relatively new and relatively young. I think we are blessed to have a dedicated research group and research leadership, given our size and our age. So, how did we get there? You know, when CDS was a young and scrappy startup of a handful of people, I think the roles were not as clearly defined and people did whatever they could do to help the partnerships move forward. And luckily one of the skills in that mix was research. There were people there who thought that design research was important and who liked going out and talking to people about services. And so from the very beginning that was a part of how we did our work. And so as we grew, and then we had to kind of start dividing into more discrete communities of practice, and being a little clearer about what people’s roles were, I don’t think there was ever a question that research was an important dedicated skill. I think it’s very related to design, and so I think that we’ve gone through some iterations of figuring out exactly how we relate to design. But right now we’re a parallel community at the same level as design or product management or engineering. And I think the more we exist that way, the more we like it, because researchers need different kinds of support and coaching than other folks do, right? Research is a really different skillset than designing, at a service or an interaction or even a visual level, right? And so I think our researchers are really happy to have a dedicated group where they can get feedback on their craft and have managers that are rewarded and selected for not just their expertise in general design, but in doing research in this world. So, I think we’ve been happy with how that’s shaken out. And I think we’re just lucky that research was part of CDS’s way of working from the very beginning, down to the people who were founding the team.

Steve: Did you come into CDS in the role that you have now?

Colin: I did. So, I came to CDS with experience at a similar organization in the U.S. federal government. And when I arrived it was to lead the design research team with a recognition that that was a distinct team with sort of distinct support needs and a need for a manager that knew research and knew about doing research in government. So, I know that that’s not always true and so I feel very blessed to have ended up in a situation like that.

Steve: So, there was a team, or a nascent team, and that team needed leadership?

Colin: Yes. It was a small team at that point. I think there were three of us when I arrived, and we’ve grown substantially. I think we’re now 10 people or so, and continue to be a part of the growth plans for the org.

Steve: So those 10 people – you sort of described early on about how they would be working closely with a specific department or team that they were kind of part of.

Colin: Yeah.

Steve: Maybe you can talk about what that cycle looks like or sort of how projects or jobs or roles and researchers are matched up over time. What does that look like?

Colin: Sure. Yeah, so researchers are embedded on interdisciplinary teams. We call them product teams. And those product teams work with partners through some phases. So, the first phase is a discovery phase, and that is a phase where researchers really lead in open-ended research about the nature of the problem that the department is trying to solve. We usually get set up with departments who know they have an issue and are interested in kind of our different approach to things, or they have a goal and they’re interested in us helping them achieve it. But researchers lead their teams in unpacking what that means, particularly for the humans involved, right? There’s important technical discovery that happens in parallel, but we really try to get the whole team involved in talking with both the members of the public implicated by the service, but also the staff involved in delivering a service. You know, government services are this wonderful sociotechnical network of people, and we try to understand how those all fit together in discovery. It’s very rare that we would just talk to the public. We do spend a lot of time doing that, and we really emphasize it, but we also really try hard to see the other side of the service and how all those people work together to create the experience the public sees. So, that’s discovery, right? Getting the lay of the land and understanding perhaps some of the possible roots of the problem.

And then a team transitions into alpha and beta phases of building something. What that is varies tremendously based on what the problem is. I think one of our product teams with Immigration, Refugees and Citizenship Canada did a bunch of work on a letter that was involved in a rescheduling process. That was sort of classic paper content design, tested by a researcher. It happened in conjunction with some digital work they did, but was just as important. So, that could be part of an alpha or a beta, or it could be about building a new digital service, like we did with Veterans Affairs Canada, building a new directory of benefits for veterans. Those products take different shapes, but regardless, the researcher is trying to bring the voice of all of the people we talked to in discovery back into the product development cycle. And so that can mean usability testing or content testing. As our products get more mature, we also use quantitative methods. We run randomized controlled trials. We try to use analytics on things that are out in the wild. They use all of the methods at their disposal to try to keep bringing the voice of the people we’re serving into the product process. And I think what that looks like, as I say, just varies tremendously, and we kind of like it that way. I think that’s one of the cool parts of being a researcher at CDS: we say, well, you have this team, they have their needs, you need to serve them and make sure they have the right information about people to make good product decisions, but there are lots of different ways we can get at those answers, and we agilely – probably to overuse that word – assemble the methods and the timelines accordingly.

Steve: So, a researcher working on, for example, the Veterans Affairs, for the duration of that program that’s the thing that they’re working on?

Colin: Exactly. Exactly. So, we’ve been really fortunate to basically have one researcher per product team and to be able to keep them on the same team. Which – and sometimes that’s not possible, but you know research, as hard as we try to not do this, sometimes becomes a practice of implicit expertise, right? I think researchers who have a long history in an area, or with a product, kind of know things about how people will respond to the service that are hard to articulate or sort of systematically articulate in reports or the other ways that we try to codify that knowledge. And so we see huge benefit to people having some longevity in the product teams and thus in the domain that they’re working in. Not always possible, and I understand not possible everywhere, but for us I think researchers really enjoy kind of building a deep expertise that comes with doing 10, 15 or 20 studies for a particular product in a particular area.

Steve: Depending on the type of organization, the carryover from one product team to another – I mean in my mind government is sort of an example of a category where there’s a lot of different things that you’re doing.

Colin: Yeah.

Steve: And obviously there’s cultural things and organizational things. Whereas if you’re working in some commercial enterprise that maybe makes a lot of different products and maybe serves different customers, the breadth might be less.

Colin: Yeah.

Steve: So the value, I guess, if I’m doing the math in my muttering – the value of that sort of hard won knowledge is maybe necessary to preserve or cherish it in a different way, just given the breadth of what you’re doing.

Colin: I think you’re right. And one of the fun things about getting to support these folks is that I can have conversations on a given day that range from how the Royal Canadian Mounted Police want to handle cyber crime to how Canada Revenue Agency processes tax returns for low income people to how Employment and Social Development Canada issues benefits, right. And those are incredibly different, both business processes and missions, but also involve very different people and very different kinds of concerns on the parts of those people. So, it’s pretty neat to get to hear about all of that, but it also creates a challenge in that when we start on a new team, or a new researcher comes to a new team, I think there’s some sort of basic government knowledge they’ll bring with them, but often they’re trying to come up to speed on a pretty complicated area, pretty quickly.

Steve: When you described earlier your role is to – you know in that kind of coaching relationship you have with the different researchers – but it sounds like you’re the one that has the overview or has the window into these different product teams and what they’re doing. Are there researchers – what kind of interaction do they have that isn’t through you necessarily about what each is working on or what kind of challenges they’re facing?

Colin: Yeah, absolutely. I am – one of the things I learned very early on was that I was not the best conduit for all of that information between them and each other. And so I increasingly view my role as creating opportunity for them to share directly with each other and across the org. So, what that boils down to – there are first of all researcher standups, where we talk about what research we have done, every week. And we try to focus on not just kind of what we did, but what we’re learning. And that’s usually not enough to create understanding, but it’s enough to create a hook, right. It’s enough for one person to say, “oh, that person is really talking about the experience of submitting a form that’s an application for benefits and that’s actually kind of similar to what we’re doing, so maybe we should go have a more deep conversation, or I should ask for some documentation.”

We also have rotating, dedicated research critique groups. So, researchers form groups of 3 or 4. They meet weekly to give each other feedback on their work. It’s a little trickier to give critique on research than it is on design artifacts, but just as important. So, they’ll go through research plans with each other. They’ll go through recordings of interviews and sort of talk about the approach that the researcher took and different ones other people might have taken. They’ll go through reports and deliverables. And although the kind of stated purpose of those sessions is to help people grow and share skills with one another, I’ve observed that a real common output is better cross product knowledge between them, right. If you’ve spent the time really thinking about the pros and cons of someone’s research approach you tend to understand their product a bit better. So we have critique groups.

And then one of the other kind of structures we have at CDS is these research community meetings that are actually open to everyone from the organization. So, every other week we host these 45 minute kind of brown-bag, lunch style meetings where researchers give talks. They give talks on either their product work, when they’ve recently finished it up, or on their – or on new or interesting kind of methodological or government logistical things that they’re working on. So, we try to create lots of channels for that information to flow. I think that we’re still kind of a size where some of those meeting and interaction driven approaches work. As we grow, I suspect we’ll have to get more creative.

Steve: Right, critique circles for ten is different than doing that for 80, or brown-bags. Which is crazy that I would throw that number around and everyone would nod, like yeah, that’s the size of research groups in some organizations now.

Colin: Sure, sure.

Steve: It’s not that long ago that that was an absolutely ridiculous idea.

Colin: Absolutely.

Steve: So, at ten, you can have some kind of communal knowledge just based on – well, you’re putting formal things in place for sort of semi-formal knowledge exchange.

Colin: Yeah. Yeah, that’s right. I think that we’re certainly beyond the totally informal everyone talks over lunch about what’s happening size. I think we are at the edge of what works well for critique groups and sort of community meeting exchanges. But again, it’s a little trickier for us because our domains are so different, right. And so it’s harder to say we’re building a common understanding around the user of a particular product that we all study. Often, we’re studying very different things and there are things to learn across them, but there’s also real differences between them.

Steve: I’m going to switch gears a little bit. I think it builds on some of what we’re talking about, but as you talk about the team having grown, and maybe growing into the future, what do you look for? What makes for a good researcher for CDS?

Colin: So, we focus broadly on a couple of things. I think the first is craft, for lack of a better word. So, when we hire people, and when we’re going through interviews, we really dig into the details of their previous work. You know what – how did you construct the research questions? Why did you pick those questions? How did you pick the methods that you used to answer those questions? Why did you pick those methods? We spend cumulative hours working through that with folks because we really believe that the basic skills are so important because they’re hard to – if you don’t have them down, they’re hard to maintain in our challenging context, right? So, you really have to be a great baseline researcher and know the basics really well to succeed here because – and this gets to the kind of second factor – it’s hard to do research within government and on product teams. And I think it’s hard to research everywhere, but there are a number of challenges that folks need to be ready to meet. One of those that we look for and also spend a lot of time talking about is ability to recruit research participants and build relationships and structures around doing that. So, because we’re changing domains and we’re going into new areas, it’s not uncommon for us to start a product and put a researcher on that product that has a very specific kind of group of people that we’re trying to talk to, and would be very hard to pay a commercial recruiter to access. And so then they have to get creative. They have to go make friends in the advocacy worlds related to that department. Or they have to work with that department to use administrative data to find people. All of these things are much harder than hiring a recruiter to do the work for you. That’s not to say we don’t use recruiters. Sometimes we can, but a lot of what we do is so narrow that we need to find people who don’t just love the craft of doing research, but love the craft of recruiting. I think over time we continue to look at ways to make less of the job, but it frankly just still is a reality given how many contexts we operate in and what people have to do. So, recruiting is one piece of it.

And I think the other piece of it is broadly what we call bureaucracy hacking and that is being able to navigate a bureaucratic process to get your work done within the time that we need it to get done and with enough cognitive flexibility to kind of think through what the really important pieces are, where there might be alternative routes to the really well trodden one, and ultimately get a result despite a complicated, multi-person, multi-process situation. So, we look for people who are excited to do that and who have some demonstrated skill navigating situations like that.

Steve: Is the public opinion research story you talked about earlier, is that an example of bureaucracy hacking?

Colin: I think it’s an example of institutional level bureaucracy hacking. I think that product teams themselves and people who are working on those teams often have to do the same thing, but kind of at a lower level. So, they’re working with a partner who says, “oh, like we have a departmental process around talking to this group of people. We need to go through that process in order to do this research. That process usually takes nine months. Your research is supposed to take 3 weeks, how do we make that happen?” And it’s kind of an interesting skillset, right, that’s required to navigate those situations. It’s not just sort of knowing what good research is. It’s also your ability to think analytically about a process and understand the reasons for the pieces of those processes and then look more broadly. I will say, as we grow, we try to do less of that at a departmental level and more of that at kind of a broader institutional level, but that’s still a work in progress.

Steve: It’s interesting, and maybe just coincidental, but you talked about looking for recruiting skills, very specific kinds of skills, in talking to researchers, and then this bureaucracy hacking example that you gave is around the logistics of recruiting participants.

Colin: Yeah, yeah.

Steve: You know and – I mean I’m glad we’re highlighting recruiting. I think it’s sort of a neglected – just get people and talk to them sort of is sometimes the belief in research. You’re talking in some cases about getting to maybe harder to find groups of people, or groups where there’s a specific relationship. And we also talk a lot, in research in general, about operationalizing some of those things. We haven’t talked about that. What’s your view on – whether it’s the recruiting part or just in general, in the context that you’re in, how does that idea play out?

Colin: Yeah. So – I mean I think that it’s not surprising and certainly important that the industry at large is increasingly focused on operationalizing processes like recruiting and thinking about, I think, ways to divide that work and to scale it more efficiently. That seems totally reasonable given the size of research teams that are now part of the modern organization. I will say, at CDS I approach it with some caution, and I’m worried I’m going to sound a little old and cantankerous, despite not being much of either…

Steve: You have me though on the call. So, just compare to me. I can be old and cantankerous. I’ll take the heat on that.

Colin: I appreciate that, Steve. No, I – for a lot of our work, the work of doing the recruiting is part of the research itself, right. So, actually going out and making connections with community organizations or professional boards, or senior centers, or all of the places where we do our research, that’s part of how we understand who we’re trying to serve and the social structures that surround them. And so I – when I think about how we’ll scale, I try not to think about ways that we would totally take that out of researchers’ hands because I think they would lose part of the picture of who they’re trying to study if they were to do recruiting in its own right. If they were to sort of separate that into a different role. I think that – the other thing that strikes me is that so little – like when I look at the substance of our research, at the substance of what we do, some of it would I think be possible to kind of mechanistically do faster, right? Like if we had better templates and better knowledge stores and sort of better processes to string those things together, like it could speed things up a little bit. But often the idiosyncrasies in people’s research and the sort of surprises that don’t fit within the template are the most important things that our researchers find. And you know perhaps that’s a sign of our organization’s maturity, but I get worried about kind of whisking all of it into a well-oiled, sleek process for – you know I wonder about the sort of edge of whiteboard, straggly sticky-note on the margins insights that we miss. And those – and for us, in our work, those are so often key, right. Those are – we’re still building an understanding of the space and so often those things that don’t fit well within your predetermined framework are the most interesting parts for us. And I worry about losing that. That said, making the logistics of scheduling people and where they go easier, I am all for it and we continue to look for ways to do that. 
But I think we have to be thoughtful about what we automate in our industry, just like all knowledge workers should be, I suppose.

Steve: You made a comment a few minutes ago when I asked – when you just said research in general – you said, “it’s hard.”

Colin: Yeah.

Steve: I just want to – I want to go back to that. I don’t know if I agree or disagree. Can we just reflect on that notion of research is something that’s hard? Like what about it is hard? Should it be hard? Is that a bug or a feature?

Colin: Yeah. As it came out of my mouth, I’m like oh that’s kind of a complicated statement, I wonder what you meant by that – to myself? I would say this. In some ways research isn’t and shouldn’t be hard. I think one of the things I love about this work is that I can sit down with someone from any one of our teams and explain some basics and they can go out and start learning things in a more systematic way with some good pointers from a good 30 minute discussion. And that’s great. And so I don’t want anything I’m about to say to be read as kind of discouraging making research accessible to everyone. We say everyone at CDS can be a researcher and I really believe that. I will say that research is also a – one of those things that’s very easy to do mediocrely. I think that there’s a lot of subtleties to how you make people comfortable in research sessions. There’s subtlety to how you digest lots of qualitative information. There’s subtlety to how you arrange your research, so it influences your product in the most responsible, but impactful way. That’s all – that’s all – I think there is subtlety and trickiness to that and so I think it’s important to have an appreciation for people who are really good at doing those things and to – I don’t necessarily count myself as one of them for all of them – and to recognize that that is a real skill and that is a skill that we should – that we should celebrate in kind of our broader industry community. That is, I think we can do that without also saying hey, there are some basic things that people can do that are easy and quick and help make more people gather more data to make better decisions. I think those are sometimes placed in a false binary in the Twitter sphere, for example, and I am not as interested in that.
At CDS, and I think my comment was somewhat related to the Canadian Digital Service in particular, there are some things that are quite – I won’t say uniquely hard about research, but are specific to our context. You know one of those is the sort of incredible domain switching that we expect researchers to do every 3 to 9 to 12 months. And that’s, I think, not unusual in consulting, but it is somewhat more unusual in a product driven organization like ours. There is also, I think, the hard work of figuring out how to fit your research into an interdisciplinary team. You know we don’t just write reports and make presentations and give them and sort of hand them off to designers. We don’t let people off the hook at that phase. They’re kind of in the trenches with all those other people as the product is being developed. And we expect the way that teams run their agile development process to reflect research and for researchers to be a voice in that. And I think there are lots of organizations that have that expectation. I think we’re trying to do that in combination with domain switching and with the third part of it which is, again, we’re trying to build skills. Not just in ourselves, but in our departmental partners. So, you’re trying to learn a new domain, fit your research skills into this rapidly evolving, interdisciplinary team, and help a partner learn the basics of research and appreciate research. I think that’s hard. I think it’s fair to say that’s a difficult thing to do.

Steve: I want to switch a little bit. I would love to hear if there was a point in your own personal, professional path when research was a thing that you identified with? Like I want to do that, I am that, that’s for me? I don’t know, is there a moment or a stage at which you connected with the field that you are in now?

Colin: Yeah. There was. I conducted my first usability test when I was in 7th grade. I was – this was pretty early in the days of such things. But I was working on the school’s website. That was sort of my hobby. I helped the computer teacher with that. And I was reading Jakob Nielsen’s Homepage Usability book. There was this beautiful book with multiple sort of printed out homepages and he talked about his methodology in the back and I read that. I was like oh, this usability test is sort of something we could do. So, I got someone into the computer lab and tried it out and there was this moment of like wow, when you ask other people to try something, and when you ask them questions about what they’re thinking and what their goals are, you challenge yourself in ways that you – that I didn’t expect. And it’s also incredibly rewarding. It’s an incredible high. I don’t know how else to describe it. And it continues to drive me and the folks on my team. So, from that moment on I knew that I wanted to do this kind of research in some way professionally. There were lots of steps between there and here, but I knew pretty early in life that I really loved learning about how people used computers and services and understanding their approach. And, you know, I’m really blessed to work alongside lots of other people who are similarly jazzed to learn those things about people.

Steve: And that’s an astonishing story both – to me – in the early point in your life at which this happened and the specificity when it happened. I think often these stories are about oh I went to a farm and then I realized I wanted to be a marine biologist. Like the connections are more diffuse. But you, at a very young age, were doing exactly, or pretty close to exactly the thing. I mean it’s not even a metaphorical discovery.

Colin: Yeah.

Steve: You literally discovered the work.

Colin: It’s odd to me too. And I do look at friends’ and family’s career trajectories and I’m like hah, I didn’t think that’s how it would work out for me, but I really – I just loved it. And I looked into other things, but kept coming back to this, this being what I really just enjoy doing.

Steve: Are there any points, whether it’s 7th grade or older or younger – what kinds of things can you look to in your earlier parts of your life that were, I don’t know, maybe weak signals or kind of nascent superpowers that connect to what you’re passionate about and what you’re spending time in now? Does anything exist earlier in your life?

Colin: You know, one thing that has always been somewhat odd about me, and I do think relates to my passion for this work, I have always loved seeing the metaphorical or physical backroom of the process. So, like when I’m flying, I’m always like hah, like what does the computer system that the, you know, gate agent used look like? And like what do they do and when do they do it and why? And I’ve always – I’ve always been like that to a point of I think some good-natured teasing from my family about my desire to understand the details of how the parking ticket machine dispenser system works and what the numbers on that thing mean. So, I think, especially to do this work in government, you have to kind of have a love of uncovering process and humans and how they interact with that process. And the trick is you don’t have to love like process, but you have to love kind of post modern process. Right? You have to love the fact that like it is different things to different people and kind of is this great mirror game of clarity and unclarity all at the same time. That’s something from an early age that I’ve always really enjoyed. I also would say that I – so early in my career I worked for the U.S. National Park Service and one of the things that I did there was being a park guide. So, be out on the trail talking to people and listening to them and answering their questions and seeing how they interact with a place. And there – it’s real joy – I feel real joy to seeing people in an environment and both helping them discover things and see how they discover things. And often I feel like my work today isn’t that much different, right. I think I’m more open ended than park interpreters are, but it’s still here is this new land and this new world that you are encountering, and I get to be with you as you’re doing that and help you think through what you’re doing.
And so I think the kind of odd areas of joy I found in that work certainly kind of echo through to what I’m doing now.

Steve: Is there any example of a notable success? Something that, you know, maybe that your researchers have done and kind of completed that you feel proud of? Or just kind of a notable example to share?

Colin: Yeah. I think that – it’s funny. You know we’ve done a lot of research that I’m proud of and I’m proud of our researchers for doing at CDS. But the things that make me most excited are the moments when I see our researchers teaching and coaching people and departments to start doing research themselves, right. So, for example, with the Veterans Affairs Canada project they – the department identified someone who would be a researcher and sort of continue on the research with that product after we handed it back off to them. And I, you know – those moments when those people were working together and getting to support them through that, I think those probably make me feel prouder than any particular deliverable or product that we’ve gotten out the door.

I will say the other exception there is I’ve looked – I was looking at our sort of internal database of research before this interview and I’ll say the other thing I’m particularly proud of our team for doing is becoming more and more diverse and creative in the types of research it does, right. I think when we started, we were trying to make the basics happen and so we did lots of interviews and we did lots of usability tests. And those are great, but there are other things to do too. I think we’ve become not just qualitative, but mixed methods. I think that we have embraced the complications of trying to kind of test services in more live ways, right. So, not sort of just usability test prototypes, but wire something up so that someone can complete an actual transaction with it and see the back end experience and orchestrate kind of a Wizard of Oz experience for them. Those, those increases in our skill and our ability to mix and match tools responsibly, that’s incredibly rewarding as a researcher at heart.

Steve: Is there anything that I should have asked you about that I didn’t, and you want to make sure we talk about?

Colin: No. I think your questions were well rounded. Nothing comes to mind from my list.

Steve: Do you have any questions for me?

Colin: What’s the most surprising thing you’ve heard during one of these interviews?

Steve: Well, this is going to sound like I’m pandering, but the fact that you did a usability test in 7th grade is – I’ve just never come across that in my career, let alone in one of these interviews. But actually – I will say – I said I would take the weight of being the old curmudgeonly person. So, here we go. I don’t like the what’s the most surprising thing question ‘cuz it presupposes that the person – and I realize this is not exactly research since we’re doing a podcast, but as an interviewer in general, I’m not trying to be surprised. I’m trying to be focused and interesting and sort of the most benign sort of factual thing should also be exciting and triggering of interpretations. So, you know what I mean? Like the surprise is me bringing my judgement into it. So, of course I gave a concrete example. I was surprised by something that you said, but I also…

Colin: Yeah.

Steve: People ask researchers that, right? When you did this study what’s the most surprising thing? And I think sometimes that question can be good if it provokes what are we learning, or where are our assumptions being challenged? But, how do we – I also want there to be space for like hey I didn’t come in here with assumptions. I came in here with just curiosity.

Colin: Yeah.

Steve: So, I’m kind of talking out of both sides of my face here, I guess. I do that.

Colin: Well, I agree. And I do think there’s a real risk to boiling research into a set of surprising and non-intuitive insights, you know. Like the idea is that we should be manufacturing surprising little blobs of knowledge that can be passed off into the ether. And I worry about that. I worry about the work getting boiled into that. And so I hear you. And I guess I agree with your curmudgeonly instincts.

Steve: Aren’t I a nice person? Any questions for me? I mean I’m going to slap you for the first one that you asked me. So, I’m just living up to the curmudgeonly mantle I chose.

Colin: I expect nothing less. Well, that’s good. No, I would say I am curious as to what your views on the burgeoning research ops movement are? As someone who I think has been in this world for many of its evolutions, what’s your reaction to the increased operations emphasis?

Steve: I mean I have – learning about research ops provoked an uninformed critical reaction. And you said it really well when you talked about the value that the recruiting can bring to the researcher. And, you know, we hear researchers talking about the value that they get out of doing their own transcripts. Personally that’s not a thing that I could deal with, but immersing yourself in the subject matter and the people in the community and the data and the conversations is super labor intensive and the idea that we’re going to fix that by kind of operationalizing things scares me because it takes away the quality and – you talked about subtlety, I think. But if you look a little deeper into what research ops people are championing, they’re not championing that. They’re – that’s my chauvinistic misinterpretation. And I think just by creating that word, I think it invites other people to also do that and maybe even implement that. But the people who are championing it are thinking about process and thinking about how does this exist? And, you know, having spent my entire career as a consultant, so not embedded in the organization, not building these processes – maybe advising on them, or kind of giving feedback, it’s sort of an interesting thing to watch from the outside. It comes up in a lot of groups. I talk to people who are kind of early on in their organizational maturity. You know I read something recently – oh, some research ops people were sort of declaring that the first researcher hire should be an ops person which, you know, from the outside seems like well of course you would say that. That’s sort of a discipline that you are evangelizing for. But I found myself working with a group where I just mentioned that idea as an idea that exists. I wasn’t in a position to say they should do this, they should not do it. But I said, given what you’re talking about your barrier here is ops, is infrastructure, is recruiting, is not reinventing the wheel.
And I said, you know – they had headcount to hire one person and I said well what proportion, from zero to 100, is that person going to be dedicated towards this? You know, if the same way that you are – that you talked in the beginning, about your organizational change efforts, you know convening people, finding processes, sharing best practices. I think a research ops person can also be doing that, you know at a different level and using kind of different processes than you’re talking about, but…

Colin: Yup.

Steve: I mean that is championing the growth of research as a thing that we have skills to do at various levels efficiently. That’s what you’re – that’s one of the things you’re doing. That’s one of the things a research ops person could be doing. Taking away the hard, labor intensive, inefficient tasks that lead to the subtlety that makes research great, that you described – that’s not a thing I want to do, but I – it doesn’t have to be what ops is, so.

Yeah. If there’s nothing else, then maybe we’re at the end. Really great conversation. We covered so much interesting stuff and I learned a lot and I’m really so happy that we had the chance to speak today.

Colin: Likewise. Thanks so much for your interest. It’s been a pleasure.

Steve: Thanks for listening! Tell your colleagues about Dollars to Donuts, and give us a review on Apple Podcasts. You can find Dollars to Donuts on Apple Podcasts and Spotify and Google Play and all the places where pods are catched. Visit portigal.com/podcast to get all the episodes with show notes and transcripts. And we’re on Twitter at dollarstodonuts, that’s D O L L A R S t o D O N U T S. Our theme music is by Bruce Todd.