Posts tagged “ethnography”

NPR: The Art of the Interview, ESPN-Style

This NPR story looks at the “interview coach” at ESPN and his attempts to bring a high level of quality to their interviewing. The page links to the NPR program and an exemplary interview and critique.

Now, every single editorial employee at ESPN is expected to attend a three-day seminar, where they encounter a lanky, slightly awkward 58-year-old man with little flash. In his efforts to illustrate what he considers the “seven deadly sins of interviewing,” John Sawatsky methodically eviscerates the nation’s most prominent television journalists.

“I want to change the culture of the journalistic interview,” Sawatsky says. “We interview no better now than we did 30 years ago. In some ways, we interview worse.”

There’s a lot to be learned from journalists; not everything applies to other forms of interviewing (say, interviewing users), of course.

I haven’t listened to this yet; checking it out will be first up tomorrow.

[via kottke.org]

Cross-Cultural Research

Notes from my UXWeek session are online. I’m not sure how helpful they are without hearing me talk through the issues (Update: you can hear me talk through the issues here), but if you want to, check out Dan Saffer’s notes or the notes from the wiki.


Hawai’i makai/mauka signs
different orientation toward navigation: toward mountain, toward ocean
difference in how we move through space

so what?
mundane observations reveal differences in cultural needs or drivers

Get Out Of The Office

Refreshing piece in the NYT (since it omits the usual players and the jokes about anthropology) about the importance of getting out of the office and getting to where your customers are.

Once a year, though, he organizes a different kind of hunt – which he calls a “branch hunt.” In it, the entire organization turns its attention from the suite to the street – and, by scrutinizing the fine details of how banks interact with their customers, sees the market from a new perspective.

“The most thoughtful and articulate strategies tend to come from the big banks,” Mr. Brown explained. “But their actual results seldom bear that out. When you walk the streets and look at what’s happening, the gap between strategy and execution becomes obvious. We can’t just listen to what executives say. We have to see with our own eyes what customers are experiencing.”

The dress code for a branch hunt is casual, but the approach is rigorous. For its fourth annual hunt, Second Curve pinpointed the location of every branch of every bank on the East Side of Manhattan, from 25th to 86th Streets.

All the firm’s employees – the analysts, the compliance officer, the computer geek, the receptionist – divided into teams, were assigned specific avenues and streets and set out with digital cameras, audio recorders and four crisp $100 bills for each team. They spent time at the branches, chatted up bank employees, opened checking accounts with the company-issued cash, snapped photographs – not a popular practice with bank security – and captured the flesh-and-blood experience of being a customer.

After the hunt, the teams returned to headquarters and described what they saw, from stories about horrible or remarkable service, to reports on flat-screen televisions that were meant for customers’ viewing but were occasionally found in truly bizarre places where the public could not see them.

Design Research: A Conversation With Steve Portigal (pt. 2)

Here’s part 2 of my conversation with LukeW. Please see Functioning Form for part 1.

Luke Wroblewski

Let me try to clarify by taking a step back. I see field research as a way to remove some of the tarnish that comes with more “traditional” market research like focus groups and surveys. The common perspective is that people in a focus group or survey won’t really tell you what they want or how they do things because, oftentimes, they can’t. They are a level removed from the actual activity and as a result may leave out key details or considerations they use to make decisions.

The classic example, and I can’t recall where I first read it, is the washing machine manufacturer that polls thousands of potential customers and asks them “what features do you want in a washing machine?” The responses they get back are: “just the basics”; “I just want a simple setting for colors and whites”; “nothing too fancy”; etc. So the company makes a bunch of no-frills, feature-lite machines and they don’t sell, because when it actually comes time to buy a machine, the same people who said they want “simple above all else” fall prey to feature-sheen. “Oooh, but this one has more features…” I’m sure you’ve heard a similar tale or two.

So what we have here is people saying they do one thing and then going out and doing something totally different. Field research should ideally be there at the point of sale, in context, to enable the company to see what really happens.

Now let’s go back to my original question about digital context. In all the methods you described above (great list, by the way!), we’re asking people to tell us what they’re doing rather than being there, in context, when they are doing it.

Maybe I’m picking nits here but I know there are lots of “hidden” subtleties within digital social systems that govern how people behave. There are contexts of when and where that alter behavior. As an example, during a home visit a buyer on eBay may tell you: “I leave positive feedback when I get an item in good condition.” Their actual behavior, however, differs. They may or may not leave feedback based on the type of seller (professional or amateur), how much feedback they have, how much feedback the seller has, the category they are buying from, their intentions for the item after they get it (resell, return), and so on.

I guess when I think of people that spend hours every day immersed in something like World of Warcraft I feel there’s more to their behavior and motivations in that digital space than they can explain in words. How can we be a fly on the wall within that digital context? Or is what I’m looking for already covered by the methods you outlined?

Steve Portigal

Any situation where you have someone telling you about their own behavior is going to include some amount of bias (and let’s pretend for the sake of discussion that our own bias isn’t an additional factor). In focus groups, those influences are hard to leverage (complex peer dynamics, sterile environment, closed-ended discussion), but in contextual research, we can try to take advantage of showing and telling, for instance. Having someone walk through their previous feedback log, and explain, is illustrative of patterns that person may not explicitly be telling us.

Q: Are you leaving feedback for the seller?

A: I leave positive feedback; it’s really important, I usually will look at the condition and decide based on that.

Q: [points to computer screen] Can you walk us through some examples of feedback?

A: Sure, umm, here’s one I left last week. The item was in pretty good condition, so, well, I only left 2 stars, because I didn’t get it right away. And this one here, I left 4 stars because the last time I bought from them it was great. Yeah.

Q: What’s that icon over there?

A: Oh, there’s a bunch of items awaiting feedback.

Q: How often does that happen?

A: Well, I’ve got about 35 in there, some of them, let’s see, oh yeah, some of them go back about 3 months, I guess.

Now, I’ve totally made that up to support my point so let’s not treat it as data, but as a likely scenario for a dialog in an interview. So much of the process involves triangulation – asking the question at different points in the interview, getting demonstrations as well as declarative statements as well as stories. When you come out of the session, you have to ask yourselves what you think that person’s approach to feedback is. And it lies somewhere in between, but it’s ultimately an interpretive answer.

We can measure people’s TV viewing habits, let’s say, with a Nielsen box, and we can ask them about their behavior. And history shows (like the washing machine example) a disconnect. People under-report their TV watching. It’s easy to think about why: TV is bad, and it’s better to show yourself as someone who reads books and goes on hikes than as someone who watches a ton of TV. The insight comes out around the delta between the observed and the reported. Of course, not every gulf is an insight, or one that you can use.

And maybe to make it simpler, it’s easier to talk to people about what they did in the past, and why (i.e., leaving feedback) rather than their overall attitude. People make generalized statements, but the actual examples contain a lot more subtlety. This of course leads to one reason that companies like to reject any form of customer research – because people can’t talk about what they will do or what they will want. My short answer to that is that the researcher is the one doing the interpretation; in all of this we aren’t simply collecting responses verbatim – we are dynamically choosing different questions and making inferences in order to build our own model of how people will behave.

That said, I had an interesting conversation the other night with Zachary Jean Paradis, a student at the Institute of Design. He described how he had tried to do an ethnography of World of Warcraft (an MMORPG), where presumably a lot more of the behavior to be understood is taking place inside a virtual world. He felt like his traditional tools of ethnographic research didn’t hold up, and he was wishing for another month to better refine his methods. Gaming is an interesting example (and one where I don’t have a great deal of personal experience) of the online-behavior-studied-offline that we’re talking about. I’ve heard that some researchers will videotape the faces and body language of people while they are playing; I imagine you could play those back along with the matching gameplay and have people reflect on what they think was happening at the time. You can see that technique in Gimme Shelter, where Mick Jagger watches the footage from Altamont while they interview him. A UK company called Everyday Lives does this sort of user research exclusively, preferring to passively observe and then only interview when there’s a video record of the event to be discussed. I think it’s an interesting tool, but we need an ever-expanding palette of methods to deal with new situations as they emerge, rather than dogmatically relying on a single approach.
Am I still dodging your question? Or are we any closer?

Luke Wroblewski

Maybe I’m dodging your answer! One thing you said, though, really resonated with me: “getting demonstrations as well as declarative statements as well as stories.” Since we can’t actually be a fly on the wall within complex digital systems yet (and I say that because the tracking and log analytics software I’ve used is still a ways off from being nuanced and effective enough to match what we can do in the real world), that’s how we need to understand context: through demonstrations, declarations, stories, and of course observation of what people are actually doing on screen. Personally, I do think that as digital environments become even more immersive and complex, we’ll need additional methods.

That said, let’s jump into the other topic I wanted to bring up with you. Without getting into pure semantics, why do you think a lot of ethnographic or field research is being characterized as “design research”? Is it user experience design teams within large companies trying to own the research process and data? Is it an attempt to differentiate the type of customer insights a human-centric problem-solving approach can uncover from the types of insights Marketing departments have traditionally owned, like customer segmentation? Or does this type of research intrinsically belong in the “design world”?

Steve Portigal

I agree that there’s more to behavior than can be explained in words. It’s up to us to look for the deeper meanings between the words: what is said, what isn’t said, and how it’s said. As far as terminology goes, I agree with your suggestion that the label can be an attempt to distinguish the methodology and/or the results from market research, and from the departments that do market research. I’m so frustrated by the chaos around methodological labels. I’m sure organizations can create a locally relevant nomenclature (well, they can; I’m not sure they do), but once those labels leave the boundaries of the company (through any industry discussion, conference, or online group) they end up sowing confusion. The vendors, of course, who move between companies are even more guilty. There’s a desire to differentiate from the other providers by claiming some proprietary take on doing research: Context-Based Research, PhotoEthnography, Rapid Ethnography, etc. (some of those may be actual methods claimed by actual firms; others may just be me riffing). It’s tough to balance exploring the ideas and staying “on-message,” isn’t it? I guess that’s why I don’t take kindly to the terminology wars; they seem to make it more confusing for people.

Luke Wroblewski

So this time you did dodge my question! I’ve consistently heard “design” added to “research” when describing the type of activities we’ve been discussing. Any thoughts on the inclusion of the design label? I know you find yourself at lots of designer-focused events like Design 2.0 in San Francisco, Core77, Overlap… is there really that strong a connection between design and ethnographic research? Why doesn’t this type of research feed business models more than mock-ups? From my experience, designers eat this kind of data up while the business folks are slow to act on it. What’s your take on that?

Steve Portigal

One suggestion is that the term is historical. Bringing the tools of ethnographic research into product development was led by a few firms that were self-described design firms (like my old company, GVO) or that had ties to design (Doblin, with their connection to the Institute of Design). I would also say (and this is a gross generalization) that market research tends to focus on the evaluative and design research tends to focus on the generative. That’s more about the goals of the research sponsors than anything inherent in the methodologies (since ethnography is ethnography).

And I think those goals or orientations that differ by discipline will affect the gusto with which each group eats this stuff up. In many cases, designers are faced with tangible goals. They are committed to acting, since the product has to be designed and is going to launch. The information they gain from research can help them solve a near-term problem (i.e., what is the organizational framework for navigation through a space?). Even though my strategic recommendations can be as tangible as my tactical ones, they ask for actions that are much slower (i.e., launch a newsletter that addresses the transparency concerns of customers), more tangled up in organizational (same example) and resource issues (same example), and with many degrees of freedom to create a good solution (and for non-designers, that can be paralyzing).

Luke Wroblewski

Thanks Steve!

Design Research: A Conversation with Steve Portigal

Over at Functioning Form I’m in conversation with LukeW.

To help me work through some recent thoughts I’ve had about Design Research, I asked Steve Portigal (founder of Portigal Consulting and all-around bright guy) to talk about context within digital products and the connection between ethnographic research and design. Part one.

Part two will be here (although where “here” is remains in slight flux, as this blog is soon to move to portigal.com but is not yet ready). We’ll announce the move when it happens and we’ll make sure you find part two of the conversation!

The Ethnography of Marketing (or, rather, the marketing of Ethnography!)

The Ethnography of Marketing is another BusinessWeek piece about, well, ethnography. (It should be entitled The Marketing of Ethnography, perhaps).

The Institute of Design…[has] developed the User Insight Tool, an ethnographic methodology designed specifically for business. It relies on disposable cameras, field notebooks, and special software that teases out new understandings from consumer observations.

How does the User Insight Tool work? Researchers decide what human behaviors they want to observe. They give observers disposable cameras to take photos of those activities. With pictures in hand, researchers talk to the people using a standard framework outlined in their field notebooks. The goal is to understand each person’s activities over a number of dimensions such as comfort level and product use. The notes are analyzed and entered into the software along with general insights and the original field notes.

The software lets the researchers look for similarities among all the insights gleaned from the different subjects. It organizes them graphically on the computer screen so large patterns of similarities appear as dense patches or clusters. The value of clustering is that it can reveal hidden patterns of behavior.

Interesting. The Institute of Design has been talking about this tool for a while now, and this is as close to an actual description as we’re probably ever going to get. It’s still remarkably opaque. Is this some advanced Artificial Intelligence system that does Natural Language Processing? That would be surprising to see emerge from the ID, wouldn’t it? If not, then perhaps the article is suggesting that the “observations” entered into the system must be put into a set of categories (pre-defined?) and that it then does some rudimentary sorting on them? For it’s the creation of those categories that seems enormously challenging.

In science, you can determine your parameters ahead of time; you can even set up all your stats before you do your data collection. But in fieldwork, you don’t really know what the categories are. You can hypothesize, but the pattern recognition has to let you go broader than you imagined (that’s why you are doing this in the first place!).
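
If I had to guess at what that rudimentary sorting looks like, it might be something as simple as the sketch below: observations hand-coded with tags, a crude overlap score, and clusters formed wherever the overlap is high enough. To be clear, this is entirely my own invention (the observations, the tags, the similarity measure, the threshold), not anything the Institute of Design has actually described.

```python
# A guess at "rudimentary sorting" of coded field observations: hand-applied
# tags, a simple overlap measure, and greedy merging into clusters.
# Everything here (observations, tags, threshold) is invented for illustration.

from itertools import combinations

observations = {
    "P1: keeps laundry list on fridge": {"kitchen", "planning", "family"},
    "P2: folds clothes while watching TV": {"living room", "multitasking"},
    "P3: kids' schedule taped to washer": {"laundry room", "planning", "family"},
    "P4: watches TV during spin cycle": {"living room", "multitasking", "waiting"},
}

def overlap(a, b):
    """Jaccard similarity between two tag sets (0 = nothing shared, 1 = identical)."""
    return len(a & b) / len(a | b)

THRESHOLD = 0.3  # arbitrary cutoff; a real tool would need something smarter

# Start with each observation in its own cluster, then merge any pair whose
# tag overlap beats the threshold (merges chain together transitively).
clusters = {name: {name} for name in observations}
for a, b in combinations(observations, 2):
    if overlap(observations[a], observations[b]) >= THRESHOLD:
        merged = clusters[a] | clusters[b]
        for name in merged:
            clusters[name] = merged

for group in {frozenset(c) for c in clusters.values()}:
    print(sorted(group))
```

The point isn’t the code; it’s that the hard part, deciding what the tags should be in the first place, happens before anything like this runs, and that’s exactly the step that stays opaque.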

I’m always a little nervous when I see a piece of technology emerge as the panacea for a complex human problem (and we see this all the time, whether it’s software, or hypnotism, or MRIs, or something else presumably objective). In this case, we’ve got messy people (those whom we study) and a slippery skill set (doing ethnography). And it seems that the story here is about throwing some gizmo at the problem to eliminate that messiness. Are the people doing the “observations” considered ethnographers, or are they simply data collectors working to a script?

There’s always a market for shortcuts, easy answers, quick-and-dirty solutions. Although their case studies sound intriguing from the little bit of detail we’ve been given, I would want to know much, much more about what they’re actually doing to get to these results.

When the Institute of Design compared the ethnographic data of both the P&G and Lenovo studies, it found that while the kitchen is the center of family activity in the U.S., the parents’ bed is the family social center in India. This is vital information for any company making global consumer entertainment products.

Is “the parents’ bed is the family social center in India” an ethnographic insight or something that any Indian would be able to tell you? On that note, Dina Mehta has documented a whole series of Indian cultural norms around business, consumption and beyond. It’s a brilliant reference piece. Check ’em out: part 1, part 2, part 3.

Scary stuff, kiddies

I’ve already blogged about last week’s BW story on ethnography, but I had to add another post once I saw the accompanying picture. Whoah.

Scary-looking people in lab coats peering into a dollhouse (with a real tiny family living separately)? Could we evoke anything more horrific and antithetical to the whole point of doing ethnography?

And I still never heard back from the author of the article who invited feedback. Ah, well. Even if I don’t agree with everything BW does, it’s nice that some folks there are extremely interactive with their colleagues, readers, public, etc.

Sam Lucente: The Ethnographer

Sam Lucente: The Ethnographer is an article in BusinessWeek’s IN magazine, a new thing they’ve launched – with a bit of hype and controversy – to focus specifically on innovation. They’ve got the usual set of folks, no doubt: Claudia Kotchka, IDEO, Marissa Mayer (and if this sounds bitter, it’s not, since I seem to be – on a much more mortal scale – included in the broader population of regular BW folks).

The story about Lucente is pretty good. I have liked and admired Sam since I had the opportunity to work for him on a project my old firm did for IBM many years ago. He’s done amazing things and is having an impact.

But he’s not, by any stretch of the imagination, an ethnographer. I would be enormously surprised if he claimed that identity for himself, and I would suggest he sees himself still and forever as a designer (just my impression of the guy).

I’m not going to get fussy and try to define what the heck an ethnographer is or isn’t, but I’d say that it’s like innovation, art, or p0rnography – we know it when we see it.

I’m not being territorial here. I’m not at all comfortable when people label me as an ethnographer, either. I think that BW’s ongoing enthusiasm for design and now ethnography and of course innovation is making them a bit careless with their terms, and that’s frankly going to simply devalue and commoditize the special things they are talking about. I don’t know how we in the community can help BusinessWeek – I want us to encourage them to keep writing about these great examples of people doing good work, but to keep their enthusiasm in check long enough to look more deeply (what do these words mean), broadly (who are some more usual suspects), and judiciously (maybe some of what we’re hearing has been hopelessly idealized for PR purposes).

BW on ethno

BusinessWeek has a new article about ethnography. The author posted a blurb about it on a mailing list I’m on, asking for feedback (I guess some on the list provided input into the piece) and expressing interest in continuing the conversation. So far my comments have gone unanswered, so I’m summarizing them here.

It’s nice to see some fresh examples of success in the application of ethnography. The GE example is very cool; it goes beyond the usual “fix a product” case study and into “evolve a business’s culture” territory, which really rang true to my own experience.

However, I was disappointed to see the article buy into the ethnography = anthropology myth and the corollary that all ethnographers are anthropologists. Indeed, the article incorrectly attributes the anthropology credential to some people, such as Tony Salvador, who I believe was trained as a psychologist, or the people at Steelcase, some of whom I know to be graduates of the Institute of Design and definitely not anthropologists. IDEO may have anthropologists, but a great many of the people involved in “human factors” (as they term it) come from other educational backgrounds.

It’s tempting to see a conspiracy of highly-placed anthropologists who work behind the scenes to ensure that any conversation about user research in product development and consulting succumbs helplessly to this myth, but I think really sloppy reporting is more likely the culprit here.

John Thackara writes about the article in his typical sanctimonious style (seriously – I will have to give up on In The Bubble because it’s filled with mean-spirited judgment of one profession or endeavor on one page, and then a capricious about-face on the next page to drool over another effort that meets his opaque standards).

Do ethnographers need exotic names to do well in business? A story in Business Week features two guys called ‘J. Wilton L. Agatstein Jr’ (who runs Intel’s new emerging-markets unit) and ‘Timothy deWaal Malefyt’ (an anthropologist who runs ‘cultural discovery’ at ad firm BBDO Worldwide).

Whoah. Racist much, John? Portigal is a pretty funny name. So is Thackara. What of it?

User Research Theory

As I’ve written about recently (here and here), this issue of theory within the practice of user research is challenging and, at the least, provocative. To that end, I’ve started an online study group that will be looking at theory from, well, the social sciences, I imagine. We’ve got a diverse group of people signed up, more than 80, with a good range of experiences, and some great discussion around where to start (i.e., who to read) and how to interact together.

We’ve just launched our first assignment and I’m reading Clifford Geertz’s essay on Thick Description. Here’s an interesting passage:

And it is in understanding what ethnography is, or more exactly what doing ethnography is, that a start can be made toward grasping what anthropological analysis amounts to as a form of knowledge. This, it must immediately be said, is not a matter of methods. From one point of view, that of the textbook, doing ethnography is establishing rapport, selecting informants, transcribing texts, taking genealogies, mapping fields, keeping a diary, and so on. But it is not these things, techniques, and received procedures, that define the enterprise. What defines it is the kind of intellectual effort it is: an elaborate venture in, to borrow a notion from Gilbert Ryle, “thick description.”

He goes on to explain the difference in meaning between two physically identical gestures: an involuntary eye-twitch and a wink.

Good stuff!

Ethnography and new product development

From Innovation Weblog (via The Business Innovation Insider)

Simply put, ethnography – as it applies to innovation – is the process of doing observational research, going into the field to watch how customers utilize your products. Often used in consumer new product research, ethnography is an excellent way to uncover new opportunities for product improvement.

For example, speaker Pam Rogers, who is corporate director of global customer excellence and innovation, explained how the inspiration for a pedestal/storage unit for its Duet front-loading washers and dryers came from observing a woman who had placed her Whirlpool dryer upon cinderblocks, to make it easier to load and unload it without having to bend over.

Okay, yes, I guess, but really, no. It’s not simply about observation. That seems to be the easy part to explain, and so that’s the part that gets spoken about. I’ve written a bit about ethnography here.

So often, companies go to the trouble of studying customers, only to address the opportunities revealed by usage. For example, an award-winning snow shovel was redesigned when the design team went out to watch how their product was being used, found that women rather than men were doing the shoveling, and so made the handle smaller.

But there’s much more that can be revealed. What is the shoveling occasion (or, if you will, ritual) really about? What meanings does it hold? Does it hearken back to childhood? Or does it represent female independence? Or the nurturing of motherhood? Or the abandonment by men? Probably it’s none of those, but the point is that within the ordinary activity of shoveling we can find deep meanings that can provide enormous opportunities for innovation as we question the basic assumptions about what the product could possibly be.

I’ve found the word ethnography to be a troubling one, frankly. It’s a mouthful, and it reeks of academic snootery and hand-waving inconclusiveness. It gets confused with anthropology, and various parties have tried to claim the pure methodology only for those with the right doctorate. And I’ve been an advocate for stepping aside from the word and pointing to the key elements (getting out of your own context, observing and interviewing, and synthesizing something new). But that is troubling for some.

Grant McCracken has written a strongly worded piece about the coming-of-age of ethnography in business in 2006, and there’s a spirited discussion in the comments below his piece, including several entries from me, one of which advocates ignoring the word and just getting to the root of it (as I said above). Grant doesn’t take too well to that.

It’s a very troubling issue that is perhaps eating away at the development of an excellent practice and community of practice around that excellence. But I do think the terminology wars and the discipline battles are painful, frustrating, and perhaps fruitless. I look at the “interface” community which has split into many different professional networks based on what term they agree with (IxD, UxD, UX, UD, IX, ID, etc.) or what end of the egg they prefer to break open.

Yesterday I was in a conference call with a prospective client. We were proposing some work and hadn’t used the word ethnography at all. An internal person from another part of the organization was very interested in displaying her own mastery of the research process, and made numerous references to some ongoing work as “my ethnography.” Only she couldn’t even comfortably pronounce ethnography. And she wasn’t doing it; she was sending it out to the “only” provider that did this, apparently (?). And what were they doing? Inviting cool kids to an art gallery in Miami. [Okay, I don’t get this at all].

At a conference the other week I participated in a side conversation that included this snippet “Oh that’s not ethnography, that’s just depth-interviewing.”

I may be coming around to Grant’s way of looking at this. We have a problem. I’ve got my explanation, sure, but so does everyone else, whether they have more experience than I do, or worse pronunciation than I do. We’ve got experts like the Innovation Weblog getting it badly wrong, Pam Rogers perhaps missing some of the point, my recent encounters (presumed experts in their own peer group?) with their own versions of what we’re doing, and on and on.

Unfortunately, I have no solutions. And I don’t see a culture that is ready to reach a solution, establish a common language, speak in one voice (not millions), establish standards, or even work together on this.

T.O.

I’ll be in Toronto tomorrow to give a talk about user research and cultural insight to a group of people working in the food industry (primarily) as sensory scientists; smell-and-taste researchers who I think work on groovy stuff like “mouth feel” and so on. Sounds like a great group; they’ve got a lot of registrants for our workshop and we’re going to do a bit of an observational walk-around exercise in some different neighborhoods in Toronto. I’m looking forward to it, despite taking a red-eye flight tonight.

Projective Techniques for Projection Technologies

Projective Techniques for Projection Technologies, my paper for the dux05 conference, has just been posted online. Check it out here!

To facilitate the development of a new home-entertainment device (a portable projector with built-in speakers and a DVD player) we conducted in-home interviews that explored home entertainment activities, presented a demo of a rough prototype, and brainstormed with participants about future refinements.

I don’t often get to talk about my consulting work, so it’s great to have a fairly detailed case study published and available.

Interviewing and research tips

Dina has a bunch of great tips for people wanting to be better at user research/interviewing.

# reading between the lines — don’t just go with what they ‘say’ – look for non-verbal cues that really tell you what they ‘feel’. Also, try and understand the rationale behind what they say – laddering down to end values is something that always helps. It doesn’t pay just to know that a Toyota Corolla = Amitabh Bachchan – we need to understand why the analogy is made
# agility – you’ve got to be so quick in your mind – pick up cues from what respondents say – and take them forward. Listen well and react quickly – you should never feel, when you listen to your tapes, “oh, how I wish I had probed this a little more.”
