28. Laura Faulkner of Rackspace
In this episode of Dollars to Donuts I speak with Laura Faulkner, the Head of Research at Rackspace.
I’ve never just sat and done just what I was asked to do. I’m always looking for something new, something else. It’s probably just part of how I’m built, but it’s also a conscious choice; just doing my current job is simply not enough for me, and even that extra 10 minutes of curiosity, or desire to see something else or learn something new while still doing my job and delivering on it, has opened so many doors and helped me step into a lot of opportunities and learn new things. – Laura Faulkner
Show Links
- Public Speaking
- Laura on LinkedIn
- Laura on Twitter
- Rackspace
- Blink by Malcolm Gladwell
- Applied Research Laboratories
Follow Dollars to Donuts on Twitter and help other people find the podcast by leaving a review on Apple Podcasts.
Transcript
Steve Portigal: Welcome to Dollars to Donuts, the podcast where I talk with the people who lead user research in their organization.
I just watched Martin Scorsese’s 2010 documentary “Public Speaking,” a profile of author Fran Lebowitz. I honestly didn’t know very much about her, but she’s a smart and acerbic social critic whose persona, and perhaps her life, is built around her strong opinions and her skill in expressing those opinions. She repeatedly emphasized in the film how there are too many books, and thus too many poorly written books; she’s pretty critical of the ability of most people to have an interesting story to share and to communicate it in a valuable and valid way. While her career is indeed public speaking in one form or another, for the rest of society she advises public listening. But really this is because she wants to be listened to; she doesn’t necessarily want to listen, she wants to talk. This is in no way meant to be an indictment of her, she very clearly exists in a world of her own making. The mode of the times, and indeed the work that many of us do, emphasizes the importance of everyone’s stories, and we put real effort into inviting people in and giving them space. I mean, this approach, this mindset, is at the core of user research! But I found it remarkably refreshing to hear a different point of view.
It reminds me of an experience I had in the field last year interviewing a head neurologist, someone who was very successful, who had written multiple books and taught surgery around the world. Before this project, I was kind of warned to expect a lot of ego from surgeons especially, and I guess in some ways this particular interview subject was the Platonic ideal of that. He would not sit through my carefully scripted concept presentation. He interrupted. He pointed out his bookshelf of published books, he told us about his children attending an Ivy League school. He gave examples of where he’d been influential, how many times he’d taught, how much money he had saved the hospital with his process innovations. I had to adapt my approach for him so that we could get through the session. After we left, my colleague, who worked frequently in this domain, half apologized and half checked in with me to see if I was okay, given how quote awful and difficult the interview had been. But I didn’t have that feeling at all. The interview was amazing! We learned so much from him! He was incredibly accomplished, and very very opinionated. And like Fran Lebowitz, a total New Yorker about it.
But I wasn’t having lunch with him, I wasn’t trying to be friends with him. I was trying to learn from him. And I approached him – as I do everyone I interview – as an expert. He was just more able to act in that role than most people are.
I am very sensitive to people who don’t listen to me, or who let their ego run all over me, or otherwise disrespect me. But this was an interview situation, and his so-called arrogance and ego were amazingly helpful and honestly didn’t feel disrespectful in any way, so it was just fascinating to see how someone else in the same session felt about it.
I’m glad there are people like Fran Lebowitz and this surgeon. They are in the world to do amazing work, to inspire people, to make a difference. That doesn’t mean I want to hang out with them and try to tell them about myself. But that’s okay, isn’t it?
So, we’re back with some more episodes. As of this recording, we’re in a particularly bizarre period. This episode was assembled quickly, with the budget I have available for this, which is zero dollars. I’ve forgone my professional editor and my professional transcriptionist. You will hear some digital noise and distortion on my part of the upcoming interview, but it comes and goes, and I’m not talking too much, so I hope you don’t get too bothered by it.
It is a challenging time for businesses of all types, and certainly for small consultancies. It’s a challenging time for my own practice. I hope if you’re listening to this that you’re in a position to consider me as a potential collaborator for user research, training, coaching, and facilitation. Thanks!
All right, let’s get to my interview with Laura Faulkner, the Head of Research at Rackspace.
Steve: Laura, thanks for being on Dollars to Donuts. It’s great to speak with you.
Laura Faulkner: Thanks, my pleasure.
Steve: So I think a great starting point would be to ask you to introduce yourself. Tell me and everybody a bit about your work, and we can kind of dig in from there.
Laura: So I’m Laura Faulkner. I have a doctorate in research psychology focusing specifically on human computer interaction. I got that from the University of Texas about 10 years ago, as I was already working in the field. As of just a few weeks ago, I celebrated 24 years in this field. So I was in it before it was a job, when we were just doing it and practicing it alongside whatever else we were doing, especially in technology. Now I’m the head of research at Rackspace, which is rather fun in a couple of ways. One is that it’s an enterprise organization, and we do really deep technology; we essentially provide the internet and manage that for folks. So we’re doing really complex interfaces. And I love that problem space, with all those bizarre technical details and lots of complexity to manage, and our job is to make that easy for customers who are sitting in their chairs, working hours and hours every day, and often in high stress situations. It’s just really fulfilling. So I’ve been here about three and a half years, and our team works as an internal agency, really, for the entire organization of Rackspace. So we’ll do pretty much any kind of research for anybody who asks.
Steve: That’s a nice position to be in, or not? I don’t know. “We’ll do research for anybody who asks,” how did that come about?
Laura: So within the organization, we sit alongside design, and we work with them, and we have some close partners with a content team. So we’re always at the ready for them, but that ends up being just a fraction of our work. We will work for product, for any of the product managers or product teams that want to market test their product ideas and the directions they’re trying to go, or to prioritize their products, to discover demand in the market, what’s going to make things easy and best for them. And then we’ll do larger scale market research. We’re currently running a large scale market survey. It lets us flex to whatever questions anybody in Rackspace has where there are unknowns and they can’t move forward to make a decision. We’re all about that; our mission statement is to spark fast, confident decisions. And so that’s where we come in. Sometimes it’s just to confirm unknowns that certain teams are having, or to break ties. We break ties a lot with the data, and we also support business cases based on what’s best for our customers and our users.
Steve: So, over the 24 years. I was going to ask about this later, but now you’re making me think about the arc of your experience in the profession over 24 years, and also your observation of what the profession looks like. You’re describing an interesting state. This mission statement and the positioning and the interest and the provision, I guess, to help the whole company, wherever this critical decision information is needed, you’re there to do it. If you look at that positioning, where you are now and what you’re able to provide, how would you describe the arc of just this one part of it? Who do we work with, and how? I keep saying positioning, but I guess that’s really what it’s about. How do you think that has evolved over 24 years?
Laura: That’s a great question, and I get really excited about this question because I see this as one of the possible futures of our profession. So we began, and even the DNA of the team I currently lead began, wholly as a design testing kind of organization. At the beginning of my career it was really a unicorn position; there was the design and the research. And I have a special passion for the research part because I have degrees in sociology, anthropology and psychology. I just love, love, love studying human beings, and I love research methods, finding that question and applying the best method to it. So as the field and the practice have evolved, and as the need has evolved for answers from people who begin to be seen as reliable, as valuable, as somebody who can really help get the right answers, the freedom to apply the best method possible is what’s so exciting about how we’re working today. And I see this as one potential future for our field: the opportunity to drop everything having to be specific methods, having this one toolkit that’s just user experience, and open up to the broader possibilities of any kind of method that’s going to help deliver the best experience in the end. I’ve watched that happen, and I also landed in rather a dream position here at Rackspace, in that I inherited a team that was very deeply educated. There are masters and PhDs, which is just what we happen to have. That’s not to scare anybody, that that’s what you need to practice in the field, because that’s not accurate. But because we had a lot of deep experience in a lot of different kinds of methods, and the ability to find a best fit method for whatever question is out there, in whatever way you need to prove or be convinced it’s true, it’s really opened up our possibilities and it’s also accelerated our cadence. We get very little pushback on the back end once we’ve delivered our results, because we’ve chosen a great method at the beginning that actually gets to the heart of the questions.
Steve: So in this future that we’re talking about, where do these methods come from?
Laura: A broad range of research capabilities. For me, they came from my education in sociology, anthropology and psychology. All of those different approaches. The very fact that you know how to ask a question well means that you can put it in multiple different formats. Perhaps it’s a large scale survey, because you need a shallow data set that just has a whole lot of yes and no answers up front. Or maybe the best way to answer the question is a deep qualitative interview where you generate a large number of what I call vertical data points, where you create these deep, deep pits of thousands of unstructured data points. So, gosh, I don’t remember, did I answer your question? Ask me again.
Steve: Well, I’ll say you started to. It was, where do the methods come from? And I think your answer starts to beg the other question, which is, what’s the granularity of the phrase “method”? Are there millions of methods, are there eight? What’s underneath the term method?
Laura: Wow, that is such a great question. And now I have to take an expert mind and try to quantify something that after all these years I do rather naturally. I don’t know if you’ve read Malcolm Gladwell’s Blink, but it talks about how experts can’t always put a word on how they do something. So let me see if I can parse that out. What do I mean by method? We’ll take the problem. What’s the problem to be solved? From a business perspective, what’s the business value that needs to be delivered? We look at that from a project perspective, and then we drill down on it: what is holding you back? What are the unknowns holding you back? Maybe they just need to get some sentiment, and they just need some words on some sentiment. Maybe they need to know, does this workflow work or does it break down, and that would be a qualitative test where you’re watching people, or a remote test where you get a lot of data recorded online and you look at multiple people having done task testing, that sort of thing. And then maybe it’s, look, our CEO just doesn’t believe this because it only has a few people, and he’s really wanting to see the numbers. And so then we would convert whatever questions and methods we have up to, okay, we’ll do a survey. Which, by the way, surveys are hard. I think it’s one of the hardest kinds of research to get right, and it takes the longest to get it right before you release it, even though it’s fast on the data collection end and the analysis end. So does that begin to get at what you’re asking about methods? Does it begin to codify what is in my head?
Steve: Yeah, it does. So maybe I can reflect back a couple of things, and then maybe I’ll try asking you a question with a hypothesis, which we might call leading, or you might call good interviewing. I don’t know, we’ll see how we go.
Laura: Sounds good to me. I love that you’re asking this by the way, love it.
Steve: Because I feel like when you started off talking about the team and the deep level of education, and your own background, and even you saying earlier on, look, we do market research, we do all these things that sometimes exist under other umbrellas. It seems like approach A in building up a superset of methods is to look at a range of disciplines, whether they’re professional disciplines or academic disciplines, and your set to choose from just gets larger. I guess the leading question is to ask what you think about inventing methods, generating new methods.
Laura: Oh, my gosh. Absolutely love that. And it’s really necessary to this approach, and it can get you the best possible thing. So a great example of that recently: we did a study where we were studying a concept of domain trusts, which is where administrators have to hand over the permission to do administrative tasks to other people. And what they wanted to know was, would people trust this? Would they be willing to do it? The stakeholder who came and asked for it in that case was very adamant about how they wanted to do it, and in that case we went ahead and rolled with it. They also wanted to do some of it themselves, and part of how we scale in the organization is that sometimes we will also mentor DIY research, and this person had done some sessions. So we were willing, and we went with it, but we weren’t sure that the task testing method was going to get to that underlying question of trust. They believed that it would come up naturally as they were going through, and that it was a way to fake people out if they really questioned whether these system administrators would trust somebody else to have the permissions, and not just somebody else, but somebody else’s somebody else that they might not know about. So, like, I pass it off to Joe and Joe passes it off to Mary, but I don’t know that Joe ever passed it off to Mary. So that was the original method, and it was one of those cases where it failed miserably. On the surface it looked like they all trusted it, but it was very clear that they didn’t quite get it. And by showing them the interface, and the interface seeming easy, because it was beautifully designed and the workflow was great, it was misleading. We were quite sure that something was missing, and they came back and said, well, this didn’t answer the questions we needed. So we went into a room and we just brainstormed and thought through, okay, how can we get at the psychology of this trust thing without showing them something tangible, or having them work through a tangible workflow that’s beautifully designed and easy, so that we’re not testing the design. And suddenly we started mapping it out on the whiteboard, and we came up with this method of creating diagrams of people and messages and connections, and created these stories around it. And we had them put themselves in the story. So we would show this one box, and there’s a little icon, and it’s them, and we put their name on it as they told us their name. And then we said, okay, who is somebody that you might turn over permissions to, that you might allow to do this? They would say that, and we built out the diagram as they talked. Okay, well, here are these other icons, and these are these other three people you’re going to turn it over to, like your level B people. And so we went from there and continued the story, and we had them tell us a story of their own organization; we would live map out the organization as they talked about it. And then we got down to the real heart of it. We had, like, person number five in their first degree connections. Say, okay, you’ve turned it over to them.
Now they’ve turned it over, and they have these third degree connections from you, and they’re turning it over to them. And they said, well, that’s probably okay, because they wouldn’t turn it over to anybody that wasn’t trustworthy, and this and that. And then they said the key words: and I assume I would know who that was, and I’d be able to see who that was, because then I could go back and check and see who that was. That was the key that broke the whole thing open, which is that the truth was, they were not willing to have the mechanism work in that manner without their permission. That was another piece that they would often just volunteer: well, I’m assuming that my first degree connections will come back to me and say, okay, I want to grant permissions to this person, or you’re going to notify me that they’ve done that and tell me who. And we blew up the whole model that way, and it turned out to be exactly what they needed. So that was live creating a model, and we actually did it live, in an abstract way, through the combination of these diagrams and the storytelling.
Steve: And were you showing this beautifully designed interface in this round of sessions?
Laura: We did, at the very end, as a bonus piece. Like, okay, now given that, assume that those problems were solved. Now let’s experience this. And so we just got extra data for them about the design in that process.
Steve: Can you say a little more about going into the room with a whiteboard and thinking through, like, the process of creation of this new method? And maybe that’s another unkind question from me, I don’t know. Can you reflect on that process?
Laura: Not at all, not at all, that really begins to get to it. So this is where having some understanding of how a human is built and how a human works and thinks really comes in handy. And then also understanding that one of the key aspects of any research design, of any research method, is being able to know that you’re measuring the right thing. So how do you get to where you’re measuring the right thing, and you’re getting at what the actual human behavior is going to be? That particular day, what we were looking at was, okay, we knew that showing the interface was getting in the way of getting at, where’s that fear button in there? We were trying to see, is there a fear button in there, and have we pushed it? So this was a combination of the minds in the room that understood how human beings work. We have a couple of people with human factors degrees, and others just have so much experience in the user experience field. And then just some common sense about what would get at that fear, that concern, which is what we’re actually trying to measure. So that’s really the big question there: what are we trying to measure? Looking at your various methods and saying, okay, if we use this method, whoops, this thing might get in the way. Or if we use that method, this thing might get in the way. Just like when you do surveys, the order of your questions can change the results that you get. There’s one bit of guidance that I like to apply, which is that in a survey, you avoid asking the demographic questions up front if you can, the ones that have people telling you about themselves, because that can actually bias how they think about what they’re getting ready to do. You have suddenly put them into a certain box and they’ve stepped into that point of view. You may be able to get a slightly more pure response if you just dive in and do the basic questions that you’re trying to get at first, and then add the demographics at the end. So yeah, that’s some of how we do it: make sure that you’re getting at what you need to get at, and that whatever you’re doing is not getting in the way of getting at what you need to get at.
Steve: And then part of the story, I think, is your collaboration with the others, your stakeholders, your clients, however you label them, who have something in mind, who have a goal in mind for this research. So how do you have that conversation? You just told the story, but maybe we could talk about what happened before the part of the story that you’ve told. How do projects like this get formed?
Laura: That is absolutely one of my favorite, favorite things. So, after many years of pain, of studies like that one, which again is a really good example of one where this happened, where we didn’t set the foundation well up front for what was really needed. We believed we’d gotten it, but it turns out, as we had the conversation after the first study failed, we went back to our discipline’s conversation. After some years of having those kinds of experiences in my career, of a study not getting what they needed after we did all of this work, I now have a series of questions that I ask up front. It’s what we call our research intake process. We do it in about 20 minutes. So anytime somebody comes to us for research, it’s, oh, that’s great, we’re so excited about what you want to learn, let’s have a 20 minute talk. We keep it really simple and lightweight for them. And there are six areas that we ask about and drill down on, the first really being the most important one, and I’ll drill down into each of these in a little more detail. The first and most important question is, what problem are we trying to solve? A lot of times they’ll think that they have one thing that they need, and it turns out there’s another, so what problem are they trying to solve? Who is asking them to do it? Who are they accountable to? Because that can change the questions too. Are there any constraints on the solutions or anything? Is there something that’s in the way, like a technical constraint or a business constraint, and are those negotiable? Are there any risks? What do we know or already have? Because sometimes they don’t need a test at all, because they already have it. And then, what is the timeline on this, and are there dependencies in that timeline? So those are our six: What problem are we solving? Who’s asking you to do it, your stakeholders? Constraints? Risks? What do we know or have? And what’s the timeline?
Steve: Yeah. That sounds like a lot; that’s 20 minutes, to cover all of that?
Laura: Yeah, we’re really good at it now. I think our first ones maybe took an hour, but we’ve really got it down, because we’re able to get down to the heart of it very, very fast. And the single most important one, the one that we have to have in terms of a conversation, that you can’t just ask over email, where you have to really hold their feet to the fire, is: what problem are we trying to solve? We’ll sometimes have people come in and say, well, we need to deliver this new ticketing application, or we need to make improvements to that ticketing application, or we need to test our design. That’s not really the business problem that you’re trying to solve. That’s what you’re wanting us to do, or that’s what you’re going to deliver. But what is the business problem that that application is trying to solve? In this domain trust case, we needed to get down to what their real concern was. It wasn’t, would they do this, would they be willing to do it, does this workflow work? It was more, does this frighten them? Is there something about this that they’re worried about, that could go terribly wrong? So we really have to hold them to that on the way out: what business value does this deliver? What are you trying to do? Who are you solving it for? So there are kind of three questions in what problem we’re trying to solve. What problem are you trying to solve, that’s question number one. Who are you solving it for? Is it for a certain kind of customer or an internal person or a certain kind of user, what’s that persona? And then, what is the desired outcome? What do you hope will happen, not as a result of the test, but as a result of actually releasing this thing that you envision, that you want to release out into the wild? If we do nothing else, we’re really going to get them down on that. They often think they can do that in a one line email, and then they find out in the conversation that it’s not quite that simple. I think one of the best compliments we’ve had is from one repeat customer, a product manager who comes to us all the time, and he said, okay, so I need you to ask me those questions that make product managers cry.
Steve: Is it all of these questions, or where do the tears come for them here? That problem one seems really interesting.
Laura: Yes, that’s definitely the first and biggest, and if we get to nothing else, that one can deliver all sorts of value. Then we also get them to the stakeholders, who’s asking you to deliver this, because what they really need can often change with that; they may be driven by something external. One of the frustrations that we sometimes have when we don’t ask this question is that we’ll go and do this and deliver it, and they’ll go a certain direction, or they don’t believe us, or they won’t take our findings and they won’t do anything with them. And we find out after the fact, oh, their boss asked them to do this specific thing. So if we know who they’re accountable to, like with this product manager, after a while I got to know whether his boss or the engineers were asking him to do this thing, and based on that we could structure the study so it would actually get the answers that would convince that person. So we actually do that skip level over our stakeholder by asking them, who’s your stakeholder, and then we arm them with the data that will help them be convincing to their person. So that’s really another layer.
Steve: And you mentioned this, but I feel like it bears reiterating: none of these questions are anything about how we will approach this. And I think you said this as well, that’s often the starting point for people who approach you, right? We need to do step X, or task Y. And you’re setting that aside and asking a lot of very specific whys and hows and whens and whos.
Laura: That’s correct. So yeah, this is all developed not just out of pain in user experience testing over the years, but also out of having gotten to be a program manager and being able to think from that space, at various times and places where I had engineers as well as user experience and other folks working with me and for me. So this concept of who are we accountable to, what are the constraints, what are the risks, that sort of thing does actually come straight out of business management experience. But now we’ve armed researchers with it.
Steve: I can’t help but think that there are either small teams, or research teams of one, or just lots of researchers whose role is very much to take assignments, and that may be a methodological assignment or a problem assignment. And you’re describing a context for research at Rackspace where that dynamic has changed entirely, and you’re there to, I guess, work as a partner and form an approach together. You’re guiding them, you’re facilitating, you’re not being directed, go do this, we need it tomorrow. If I’m characterizing where you’re at right now, I’m wondering if you have thoughts about whether there is a way to get from A to B. If A is delegated assignments and B is a facilitative, empowered partner who has a mission statement, what does it look like to go from one to the other?
Laura: So, we have found that even our most beginner, junior researchers are able to have this conversation, because one, it’s kept very short, and we don’t reveal these things. We ask them more from that empathetic perspective, like, wow, what do you need to do in this, and what do you know about this? Just as a discovery, it may not change a whole lot of what you do, but it can provide really valuable context for what you do. So somebody may have asked you, here, can you do a usability test on this really quick for me? We just need you to run these workflows. Go ahead and ask these questions up front. It really builds out the goals and objectives sections of your test plan, so that you get to a real goal of what the ultimate goal of this thing you’re delivering is. It could actually end up changing your script and how you ask those questions. So even if you’re a very frontline researcher who’s just doing what people ask, this can really inform your test planning and give it great context.
Steve: It’s interesting to think about that case. Like you said, you’re not really revealing how this works. It’s not, well, before we agree to do the usability test, we need to ask these questions. It’s kind of a “yes, and.” That’s great, tell me more, tell me more, tell me more.
Laura: Yes, it’s taking what we do, from a we-ask-great-questions perspective, but from this beginning planning point of view and place in the study.
Steve: Yes. And would there be a scenario where this frontline researcher is being asked to do a usability test, asks these questions, and might identify a different approach? And if so, what would they do?
Laura: Absolutely. And this is where you really get to shine, because a lot of what we’re about is keeping people from making really serious mistakes and delivering things that don’t work. So this is where you really have that opportunity to guide, encourage, and mentor your stakeholder, even if they’re bigger than you, by saying, okay, if we do that user experience test, here’s what might happen: you would get these kinds of answers. But if we were to test this other way, and it’s good to kind of build it out first and get an idea in your head. We don’t typically reveal the method at that time; we typically have them know the method within about an hour after this, after we’ve sat and thought about it. Sometimes we know instantly and we’ll reveal it, but rarely do we actually reveal the method in that situation. We’re taking it in, we’re being a sponge, and we’re encouraging them to think in a different sort of way. It’s only then, when we’re looking at how that might work, that we see how the method they were asking for isn’t going to quite work. And that happens a lot, like when we have people come to us and say, we need a survey. There were times when we used to just say, well, actually, you probably don’t need a survey. That didn’t work. Telling them that they’re wrong and redirecting them without context didn’t work, without hearing them first. So by asking these curious questions and diving down, honestly, we found those people really enjoy the conversation, because they get to talk about their work and their challenges, who they’re accountable to, and what success looks like for them in this project or this effort. Sometimes they’ll actually walk out of just those few questions thinking about their own work a little differently. So that’s the first part: we’ve listened, and we’ve applied empathy to it, and we’re very kind when we do this. That’s really the first foundational piece, that we’re giving them an opportunity to speak about things that nobody thought to ask them. That’s step one in the success measures of this. And then second, if we develop another method, we then go have another conversation, like, okay, looking at this, it looks like this is going to get you more of what you’re going to want, and they may or may not buy that. No, I just need this, run like this. It’s like, okay, run an experimental session, run a pilot session with both and see how they work. Sometimes the stakeholders are right. And I think we’re more on point most of the time simply because that’s our superpower; we’re the methods people. But there are times when we thought it needed to be a certain way, and we were wrong. But when we’re delivering something more effective and we can show it, then we have this opportunity to do it. In that particular case with our domain trust case study, we went back and told them. We didn’t say, okay, you all screwed up by making us do this method and all of that. We actually took it on ourselves. We said, look, we heard you, we believed that maybe there should be a different approach, and we didn’t follow that up strongly.
Would you be willing? We’re going to try this other experiment, we’re going to go do these other studies, and we’re going to see if we can get you exactly what you need; we’ve developed this other method. And so in that case they felt the pain and they were willing, and by the time we got there, they were like sponges with our data, and it really answered the question. So those were the steps: that empathetic listening up front, really using the conversation as an opportunity to discover and to hear, and then, if they’re insistent, go ahead and do it, perhaps run a pilot of the alternate method and see if those look different and if that gets them closer to what they want. And then if there is a fail, you have the new method in your back pocket and can apply it at that time.
Steve: You know, I have this not particularly brilliant theory, but I guess this observation, that user research, given who’s in the profession and how it works, is a helping profession, the way that reference librarians are, or some medical people. And I hear that in your story, that this is not about being adversarial towards whatever client, stakeholders, people in other departments or disciplines. You’re really approaching them to help them, and it’s framed very collaboratively. It seems to me that that starting point, that kind of philosophy, like you said, of being very kind and using empathy, and we use those words in research all the time, they get thrown around a lot, but when you describe how it’s implemented and how it’s a philosophy to live by and work by, it’s really exciting to see how that creates a working relationship that doesn’t have a lot of the challenges a lot of researchers raise in their own work. You’ve created a culture or context of cooperation that is flavored very, very differently because of that.
Laura: That’s right. And thank you for seeing and recognizing that that is how that works. And we actually have one more superpower in our toolkit for that. After we had a big pushback situation, where a product manager got really upset by our results, we found out that he was under other pressures that we didn’t even know about. In fact, our results were correct, and they needed to be applied and used, and they knew that, but they just had a really reactive response to it. In that case, and this was on another project than that one, we really sat down and we did a post mortem ourselves about that interaction, and how there was some unfortunate bad blood created between us and that person, and he was just really wounded for a while. We were right about it; they needed to do it, and they did apply the results. But still, we had that relationship fail. And so we wrestled with this for some hours at an offsite over my kitchen table one day. What we were doing was coming up with core value approaches that we could apply to solve a situation like this and to respond to it, and even before that, how could we keep from creating it in the first place. And the big thing was that we wanted to do that without giving up our power, our authority as researchers, and our knowledge. So it wasn’t just about caving and saying, okay, you’re right and our data is wrong, or you can take it or leave it. It’s, okay, what can we do in a situation like this that’s proactive, without giving away our authority? And the two words that we finally came up with, which are our watchwords: be curious. Be curious. That is one of our research superpowers. When a stakeholder is pushing back on our results, or on our method, or anything about that, we might start getting defensive on the inside. And we will stop and recognize, okay, we don’t want to give up our authority as the methods experts, as the data experts, as the findings experts. But in that moment, what can we do? There may be something more going on for this person or in this situation that we simply don’t understand, and there’s a reason why they might be pushing back in that way. So we turn it around into a question. It’s like, wow, that seemed hard for you. What is that about? What are you afraid will happen? Is there somebody that might come back at you or push back on you? Is there a deadline you might not make? Is this going to make things more complex? We pause and seek to be curious, and that has made a huge difference in our stakeholder relationships. So even the most basic researcher can be curious about the people who are asking them to do something. It’s a simple tool.
Steve: That’s really wonderful. It reminds me of so many of my own experiences where I have an interaction that is weird or uncomfortable or unpleasant, even a minor thing, and I might hold on to it for a while and then try to have that realization that even though I’m experiencing it, it’s probably not about me. Right? There’s something going on. And I think your team has got a really lovely framing for it, that we can take the opportunity to answer that question, to find out, be curious, what is that thing that’s going on. I would imagine there’s some self care in that, because it sort of proves that they don’t hate you, or they don’t think you’re stupid, or whatever the thing is we do to ourselves when we get those signals. But also, back to your kindness and your framing around helping, if you understand what the issue is for that person, or even give them the chance to articulate it, maybe they haven’t articulated it before, you’re providing value in every direction and at a number of different levels with that mission.
Laura: Thank you. And that is correct. It’s having that opportunity to speak about those things, and we learn so much more about the business and how it works, or about the technology and how everything works, or the product direction, by doing this. The other element, which I was starting to talk about a little bit, was retaining your own power in this. In that same offsite session we really looked at, okay, how do we retain and keep that. So yes, we’ve listened, we’re curious now, and the data is still the data; the answer is still the answer. There are a couple of things that we do. One, we’ve taken the heat off of that by using this phrase, the data. The data is telling us, the data is telling us. Instead of saying, yeah, but the users are saying this, or we found this in the study and this was the output, and really starting to get our backs up about that, we use that more generic and really research based answer of, yeah, I hear you with all of those challenges and things, and this will be complex, and we just want you to know that the data is telling us this. That way it’s agnostic, and it doesn’t say you have to go do this. The third thing we do is we treat them as grownups who are accountable for their own jobs, and so the ultimate decision on what to do we leave to them. And then the final thing that we do, other than saying this is what the data is telling us, and then giving them the courtesy of making their decision and living with that, whatever it is, is that we retain our data. Sometimes we will even take special care after a study is done, on one that somebody has pushed back on, and we’ll make it an even more attractive and pleasant and more complete report. So even if we’ve done just a quick turn report, we’ll actually turn it into something more formal and archival, and then we save that. I’ve actually had this happen more than once, but there was a large one, for a large corporation that I was consulting with, and this was a large scale refactor of the main navigation of their whole site. It was intended to increase click throughs to their product sets. We tested this, and we actually traveled nationwide and tested, you know, 18 year olds to 80 year olds, all walks of life and everything. We got all this stuff, and the data told us that there were these critical fails in it, and that it was going to fail. But they were on their timeline, there were business reasons they wanted to release it, and so they rushed and did it. We told them, here are the findings, and here are the bad things that might happen; we gave them the courtesy of making their decision, and they did. And then they went back to look at the analytics and discovered that not only did click throughs not go up by the 30% that they estimated, the click throughs went down by 30%. So something had gone drastically wrong. They spent the first 18 hours or so scrambling with technology and all sorts of things, and all sorts of finger pointing and fighting, and then somebody said, maybe we should ask research. And so they came back to research and they said, do you have any ideas about what may have gone wrong?
We said yes, actually. We documented it in this report, and here are the things that were risky that you can look at. And they looked at those. There was some dissent in the organization about how these got through; we didn’t go there. We just gave them the data, the information, we helped them interpret it, helped them come up with designs to fix it. And they did fix it, and within 36 hours the click throughs had not only corrected, but they had gone up the original 30% that was intended for the project. And we never went back and said I told you so. We lead with the data tells us this, give them the courtesy of making the decision, and then come back and be ready in case the risk, the bad thing, does happen. We gave it to them without judgment and said, yeah, here’s how to fix it. This is not only very humane, it’s great business.
Steve: There’s something here about when are the moments when people are willing to hear. We are sometimes change agents; we are sometimes telling people things that break their worldview.
Laura: Yes, that’s a great way to say it.
Steve: So sometimes you can choose your moments, and in that story you couldn’t really choose your moment; the moment came to you and you were prepared to respond. But your earlier attempts to influence and sort of change minds really didn’t land, and you were ready when it came back around.
Laura: That’s correct. And most of the time the research results are supported, and the bad thing does happen. Sometimes, because we’re intentionally observing this, and we’re really going for high quality for our customers and end users, almost every researcher has discovered that there are times when maybe that bad thing happens, but it doesn’t have as big an impact as we thought it would with somebody not fixing something. So sometimes that happens too. But most of the time, that’s what we’ll do. And actually, because we’ve had that intake conversation up front, we review test plans with our stakeholders before we run them, and we invite them to the sessions, so they’re in there too. We’re very transparent. We even put all of our working documents in shared directories, even our working reports and things; we don’t hide anything from them. Most of the time they don’t go look at the details, but the fact that we’ve made it transparent has created a lot of trust. And then ultimately, sharing the data with them, most of the time they’re convinced, because we’ve set up such a good foundation up to there. And when they’re not, we have this failsafe mechanism for how to handle it when they don’t take the advice.
Steve: And as you said a couple of times, you’re treating them like adults; the decision about what to do or not to do is theirs.
Laura: Well, yes. And it’s not just that. It’s recognizing that there are other drivers that maybe they’ve not articulated in all those conversations, or maybe they haven’t even identified themselves. We just don’t always know what those are.
Steve: And I think you’re surfacing, you know, what does success look like for researchers? You’ve articulated in a specific way that I haven’t necessarily heard, that there’s a lot of pressure we put on ourselves as a practice to have influence, and that word means so many things, but one of those is people doing what we think they should do. But saying, oh, they’re adults and they’re going to make their own decision, and there are factors that we don’t know. Right? I mean, even your mission statement, where you use the word spark, is kind of what you’re doing. A spark plays a certain role in a fire being lit or not lit, but it’s right there, kind of outside, to make it happen. It’s not the fire, right? The fire is the next thing that happens.
Laura: Yes. So that’s the spark fast, confident decisions, because that’s what we’re about. And actually the fast part is another element of the method; sometimes we need a faster method to get at the same sort of thing, so that would be another consideration. But ultimately, yeah, we’re trying to find, where is the unknown that is keeping them from moving forward, and moving forward to the right thing in the right direction. And once we’ve discovered that, by setting it up that way up front and having that in mind, we have more success and influence on the back end.
Steve: You spoke earlier in this conversation about, I think, bringing tools and mindsets from business management to research, because that’s what we’ve talked a lot about, right? Sort of the management of making research effective and successful inside your organization, with all of your colleagues.
Laura: Yes. So the two pieces I brought specifically from business were the constraints and the risks, because those are two things that you really have to understand. Well, and then what is the business outcome. When we ask about the desired outcome in the problem solving conversation, we’re going for what is the business outcome, not just that we need to release these ticketing changes. Why? What do you hope will happen as a result of those? So that’s one aspect, what’s the business outcome, or the organizational outcome, whatever is driving the bigger outcomes. And then the constraints: what are the technical and business constraints, and are they negotiable? That comes from basic business management, which really anybody can learn to understand: sometimes there is just a hard schedule or budget constraint, or a legacy technology constraint. We all hit that a lot, don’t we? But then the question to ask is not just to say, well, that’s just how it’s going to work and you can’t give them what they want, or what they need, that the users are going to have to make this bigger investment. Before you go there, you say, okay, is that negotiable, and to what extent is that negotiable, that technology or that constraint? Then you begin to get them thinking in those other directions. And then the third one is the risk. I use a basic risk assessment and management model. We ask, what are the potential risks of this solution? What bad things might happen if this gets released? Because a lot of times when they’re asking you to test something, they’re not necessarily thinking about the potential implications of it. Or maybe that’s what they’re actually trying to get at in the test, they just didn’t really say it. It’s like, we’re afraid that this workflow will break down at this point. Well, what bad things will happen? What’s the risk? Because they don’t always articulate that part; they’ll just say that it won’t work. Then, okay, what’s the likelihood of that? Is that really something that would happen? Is that a high, medium, or low likelihood? And then, what would be the impact? Okay, yeah, that bad thing might happen, there’s this usability problem, and it would be there and they’d have to work around it, and they would still have another way to do it. So that would be a low impact; the bad thing might happen, but the bad thing that happens as a result is really not so huge. Or they might have a data loss, or the system will crash, or somebody will die; those would be really high impact. So once we have that high, medium, low likelihood, high, medium, low impact, and we know what that risk is, then we can have that conversation: how do we mitigate that? And sometimes they’ll actually change their solution live while they’re talking to you, simply by asking these questions. And sometimes we even ask ourselves these questions about our own studies.
Is there a risk in this study? So we ask the likelihood, high, medium, low, and the impact, high, medium, low. If both of those are high, then we may need to add more participants, because if this is a really risky thing, we need to find all the possible failures. Maybe the first 15 people won’t reveal it, so we need number 16. So if it’s a really high impact, that’s one of the decision points; it tells us we may need to add more participants. If it’s a really low impact, sure, we might be able to get feedback from six, or we might only get feedback from three, or do hallway testing if it’s a low likelihood, low impact. And then, how can we mitigate that? How can we fix that? So yeah, it’s a great little conversation to have, and it’s not really complicated. Risk management sounds like such a big, scary business-y thing, but the risk, the likelihood, impact, and mitigation, it’s really rather simple. And it’s so valuable.
Steve: Now, just to step back a level a little bit. We started off with you talking about your degree in research psychology, and you mentioned sociology, you mentioned anthropology, and by the end of the conversation we’re talking about business. So for you, in your career, where did that come from? How did you pull that in, in addition to all the other things that you’ve incorporated from the social sciences? Where did the business stuff come from?
Laura: By taking opportunities that came along, and by being curious. Specifically, how that happened for me was that I was in a defense consulting organization with the University of Texas at Austin, the Applied Research Laboratories, a very old and respected laboratory that started with the Navy and then expanded to other services. There was a pitch day we got to do, with people coming in, and I thought, well, I want to pitch user experience to them. So I managed to weasel five minutes in with these people who were looking for high value things that they could invest in toward this specific effort that they wanted. And out of that whole day, in the little five minutes I was able to wrangle out of somebody else’s 30 minute session, because I came in late to the game, they were lit up by this whole user experience concept, and it exactly matched what they needed. So because they had a need and I presented it, I brought in the contract, and then I started asking my boss, how do I do this? How do I get a contract? What do we do? So I started learning about the business aspects just because I put myself out there. There was an opportunity. I’ll tell you, I was terrified, but I was also thrilled, exhilarated, like being at the top of the hill on a roller coaster, in the best way. So I took that on, and by doing that I began to have to learn that sort of thing. That was the start of it. But the really big pivot was when, in addition to that, they had a project whose manager left, and they needed somebody to pick it up really fast and keep it going. They thought it was going to die in six months, because it was not going well; the stakeholder wasn’t happy. So again, I was curious. Okay, why is the stakeholder not happy? Well, it’s because things were not delivering on time. All these requests were like a year overdue, they were $100,000 over budget, and they just didn’t see the value anymore, except for this one guy doing little ad hoc fixes for them. So I started looking at their contract and seeing what they signed up for, and discovered that we were doing a whole lot of things that were out of scope, but we weren’t doing the things that they had signed up for. And it was a really simple way to begin to learn business and budgets and that sort of thing. I was just curious, and I just started asking the question like we do as researchers: why? Why are they unhappy? And I had empathetic conversations with them. When I started having those empathetic conversations with the people who were in charge of these projects, even as a researcher, asking them, well, why are you concerned about that? What’s that about? What kind of pressures are there? That’s where I began to learn about this. I’d go out and try to find, well, how do you manage this and this sort of thing, and I paid attention when somebody did a risk study at some point, and I was like, oh, what’s that? So I went in and said, can I listen in on your post mortem, on that risk that happened, that bad thing that happened? And that’s where I learned those basic steps. Even though they weren’t basic in that situation, because we’re user experience people, I boiled them down to those basic questions. That was how I did it.
And it fits in this one tiny little part of this conversation; really anybody can do it. So it’s taking the best of who we are as user experience people, and especially as researchers, which is paying attention, asking questions, and boiling things down to their simplest forms, and then applying those to make a difference. I’m really proud of myself. I’m glad you asked all these questions.
Steve: You know, we keep coming back to “be curious” as a mantra for good collaboration and for developing skills. We bump up against all kinds of stuff that may be intimidating, but you’ve shown how to learn by being curious, and, as you said, paying attention and boiling it down. So all the skills of a researcher are being applied to all these different contexts. It’s inspiring, and I think also very actionable. I really like it for that.
Laura: Yes. And I’ve never just sat and done just what I was asked to do. I’m always looking for something new, something else. It’s probably just part of how I’m built, but it’s also a conscious choice; just doing my current job is simply not enough for me, and even that extra 10 minutes of curiosity, or desire to see something else or learn something new while still doing my job and delivering on it, has opened so many doors and helped me step into a lot of opportunities and learn new things.
Steve: Is there anything we didn’t talk about today that we should have?
Laura: We could talk about all sorts of things. But this is a lot, and this is really a lot of the heart of how my own practice has developed, and how I would love to see the field develop: applying the best possible methods, being free to use any or invent them based on the real questions, knowing those real questions to ask up front, and then how to treat people along the way, and grabbing the opportunities along the way. Yeah, I think we’ve summed up a lot of my practice really, really well.
Steve: Well, it’s really beautiful. As I said, I find it really inspirational. So thank you, Laura, for being so open and sharing so much good information. I really enjoyed the chance to speak with you today.
Laura: Oh, thank you so much for asking. It’s my pleasure indeed.
Steve: Thanks for listening! Tell your colleagues about Dollars to Donuts, and give us a review on Apple Podcasts. You can find Dollars to Donuts on Apple Podcasts and Spotify and Google Play and all the places where pods are catched. Visit portigal.com/podcast to get all the episodes with show notes and transcripts. And we’re on Twitter at dollarstodonuts, that’s D O L L A R S t o D O N U T S. Our theme music is by Bruce Todd.