Message isn't just message anymore

I remember back in 1999 working in a web shop that was a sibling company of a traditional ad firm, and thinking “do they realize that digital means more than just packaging copy & images for a new medium?”

Over the years since, I’ve continually been amazed that most advertising & marketing pros still don’t seem to get the difference between “attention” and actual “engagement” — between momentary desire and actual usefulness.

Then I read this quote from a veteran advertising creative officer:

Instead of building digital things that had utility, we approached it from a messaging mind-set and put messaging into the space. It took us a while to realize … the digital space is completely different.

via The Future of Advertising | Page 4 | Fast Company.

I guess better late than never …

I actually love advertising at its best. Products and brands need to be able to tell great stories about themselves, and engage people’s emotions & aspirations. It’s easy to dump on advertising & marketing as out of touch and wrong-headed — but that’s lazy, it seems to me.

I appreciated the point Bill Buxton made in a talk I saw online a while back about how important the advertising for the iPod was … that it wasn’t just an added-on way to talk about the product; it was part of the whole product experience, driving much of how people felt about purchasing, using and especially *wearing* an iPod and its distinctive white earphones.

But this distinction between utility and pure message is an important one to understand, partly so we can understand how blurred the line has become between them. Back when the only way to interact with a brand was either to receive its advertising message passively, or to purchase and touch/experience its product or service — and there was precious little between — the lines were pretty clear between the message-maker and the product-creator.

These days, however, there are so many opportunities for engagement through interaction, conversation, utility and actual *use* between the initial message and the product itself.

Look at automobiles, for example: once upon a time, there were ads about cars, and then there were the actual cars … and that was pretty much it. But now we get a chance to build the car online, read about it, imagine ourselves in it with various options, look for reviews about it, research prices … all of that before we actually touch the car itself. By the time you touch the car, so much interactive engagement has happened on your way to the actual object that your experience is largely shaped already — the car is going to feel different to you if that experience was positive than if it was negative (assuming a negative experience didn’t dissuade you from going for a test drive at all).

Granted, to some degree that’s always been the case. The advertising acts like the label on a bottle of wine — shaping the expectation of the experience inside the bottle, which we know can make a huge difference. But the utility experience brings a whole new, physical dimension that affects perception even more: the ability to engage the car interactively rather than passively receiving “messaging” alone. Now it’s even harder to answer the question “where does the messaging end and the car begin?”

Scaffolding and messy truth

I liked this bit from Peter Hacker, the Wittgenstein scholar, in a recent interview. He’s talking about how any way of seeing the world can take over and put blinders on you, if you become too enamored of it:

The danger, of course, is that you overdo it. You overplay your hand – you make things clearer than they actually are. I constantly try to keep aware of, and beware of, that. I think it’s correct to compare our conceptual scheme to a scaffolding from which we describe things, *but by George it’s a pretty messy scaffolding*. If it starts looking too tidy and neat that’s a sure sign you’re misdescribing things.

via TPM: The Philosophers’ Magazine | Hacker’s challenge. (emphasis mine)

It strikes me this is true of design as well. There’s no one way to see it, because it’s just as organic and messy as the world in which we do it.

I mean this both in the larger sense of “what is design?” and the smaller sense of “what design is best for this particular situation?”

Over the years, I’ve come to realize that most things are “messy” — and that while any one solution or model might be helpful, I have to ward against letting it take over all my thinking (which is awfully easy to do … it’s pleasant, and much less work, to just dismiss everything that doesn’t fit a given perspective, right?).

The actual subject of the interview is pretty great too … a case in point, for me, is its warning against buying into the assumptions behind so much recent neuroscience thinking, especially how it’s being translated in the mainstream (though Hacker goes after some hard-core neuroscience as well).

Courageous Redirection


I’ve recently run across some stories involving Pixar, Apple and game design company Blizzard Entertainment that serve as great examples of courageous redirection.

What I mean by that phrase is an instance where a design team or company was courageous enough to change direction even after a huge investment of time, money and vision.

Changing direction isn’t inherently beneficial, of course. And sometimes it goes awry. But these instances are pretty inspirational, because they resulted in awesomely successful user-experience products.

My colleague Anne Gibson recently shared an article quoting Steve Jobs talking about Toy Story and the iPhone. While I realize we’re all getting tired of comparing ourselves to Apple and Pixar, it’s still worth a listen:

At Pixar when we were making Toy Story, there came a time when we were forced to admit that the story wasn’t great. It just wasn’t great. We stopped production for five months…. We paid them all to twiddle their thumbs while the team perfected the story into what became Toy Story. And if they hadn’t had the courage to stop, there would have never been a Toy Story the way it is, and there probably would have never been a Pixar.

(Odd how Jobs doesn’t mention John Lasseter, who I suspect was the driving force behind this particular redirection.)

Jobs goes on to explain how they never expected to run into one of those defining moments again, but that instead they tend to run into such a moment on every film at Pixar. They’ve gotten better at it, but “there always seems to come a moment where it’s just not working, and it’s so easy to fool yourself – to convince yourself that it is when you know in your heart that it isn’t.”

That’s a weird, sinking feeling, but it’s hard to catch. Any designer (or writer or other craftsperson) has these moments, where you know something is wrong, but even if you can put your finger on what it is, the momentum of the group and the work already done creates a kind of inertia that pushes you into compromise.

Design is always full of compromise, of course. Real life work has constraints. But sometimes there’s a particular decision that feels ultimately defining in some way, and you have to decide if you want to take the road less traveled.

Jobs continues with a similar situation involving the now-iconic iPhone:

We had a different enclosure design for this iPhone until way too close to the introduction to ever change it. And I came in one Monday morning, I said, ‘I just don’t love this. I can’t convince myself to fall in love with this. And this is the most important product we’ve ever done.’ And we pushed the reset button.

Rather than whining and complaining, everyone on the team volunteered to put in extra time and effort to change the design while still staying on schedule.

Of course, this is Jobs talking — he’s a master promoter. I’m sure it wasn’t as utopian as he makes out. Plus, from everything we hear, he’s not a boss you want to whine or complain to. If a mid-level manager had come in one day saying “I’m not in love with this” I have to wonder how likely this turnaround would’ve been. Still, an impressive moment.

You might think it’s necessary to have a Steve Jobs around in order to achieve such redirection. But it’s not.

Another of the most successful products on the planet is Blizzard’s World of Warcraft — the massively multiplayer universe with over 10 million subscribers and growing. This brand has an incredibly loyal following, much of that due to the way Blizzard interacts socially with the fans of their games (including the Starcraft and Diablo franchises).

Gaming news site IGN recently ran a thorough history of Warcraft, a franchise that started about fifteen years ago with an innovative real-time-strategy computer game, “Warcraft: Orcs & Humans.”

A few years after that release, Blizzard tried developing an adventure-style game based on the Warcraft concept, called Warcraft Adventures. From the article:

Originally slated to release in time for the 1997 holidays, Warcraft Adventures ran late, like so many other Blizzard projects. During its development, Lucas released Curse of Monkey Island – considered by many to be the pinnacle of classic 2D adventures – and announced Grim Fandango, their ambitious first step into 3D. Blizzard’s competition had no intention of waiting up. Their confidence waned as the project neared completion …

As E3 approached, they took a hard look at their product, but their confidence had already been shattered. Curse of Monkey Island’s perfectly executed hand-drawn animation trumped Warcraft Adventures before it was even in beta, and Grim Fandango looked to make it downright obsolete. Days before the show, they made the difficult decision to can the project altogether. It wasn’t that they weren’t proud of the game and the work they had done, but the moment had simply passed, and their chance to wow their fans had gone. It would have been easier and more profitable to simply finish the game up, but their commitment was just that strong. If they didn’t think it was the best, it wouldn’t see the light of day.

Sounds like a total loss, right?

But here’s what they won: Blizzard is now known for providing only the best experiences. People who know the brand do not hesitate to drop $50-60 for a new title as soon as it’s available, reviews unseen.

In addition, the story and art development for Warcraft Adventures later became raw material for World of Warcraft.

I’m aware of some other stories like this, such as how Flickr came from a redirection away from making a computer game … what are some others?

Why We Just Don't Get It

In an article called “The Neuroscience of Leadership” (free registration required*), from Strategy + Business a few years ago, the writers explain how new understanding of how the brain works helps us see why it’s so hard for us to fully comprehend new ideas. I’ve kept cycling back to this article since I read it a few months ago, because it helps me put a lot of things that have perpetually bedeviled me into better perspective.

One particularly salient bit:

Attention continually reshapes the patterns of the brain. Among the implications: People who practice a specialty every day literally think differently, through different sets of connections, than do people who don’t practice the specialty. In business, professionals in different functions — finance, operations, legal, research and development, marketing, design, and human resources — have physiological differences that prevent them from seeing the world the same way.

Note the word “physiological.” We tend to assume that people’s differences of opinion or perspective are more like software — something with a switch the person could just flip to the other side, if they simply weren’t so stubborn. The problem is, the brain grows hardware based on repeated patterns of experience. So, while stubbornness may be a factor, it’s not as simple as we might hope to get another person to understand a different perspective.

Recently I’ve had a number of conversations with colleagues about why certain industries or professions seem stuck in a particular mode, unable to see the world changing so drastically around them. For example, why don’t most advertising and marketing professionals get that a website isn’t about getting eyeballs, it’s about creating useful, usable, delightful interactive experiences? And why, even if they nod along with that sentiment in the beginning, do they seem clueless once the work starts?

Or why do some coworkers just not seem to get a point you’re making about a project? Why is it so hard to collaborate on strategy with an engineer or code developer? Why is it so hard for managers to get those they manage to understand the priorities of the organization?

And in these conversations, it’s tempting — and fun! — to somewhat demonize the other crowd, and get pretty negative about our complaints.

While that may feel good (and while my typing this will probably not keep me from sometimes indulging in such a bitch-and-moan session), it doesn’t help us solve the problem. Because what’s at work here is a fundamental difference in how our brains process the world around us. Doing a certain kind of work, in a particular culture of others doing that work, creates a particular architecture in our brains, and continually reinforces it. If your brain grows a hammer, everything looks like a nail; if it grows a set of jumper cables, everything looks like a car battery.

Now … add this understanding to the work Jonathan Haidt and others have done showing that we’re already predisposed toward deep assumptions about fundamental morals and values. Suddenly it’s pretty clear why some of our biggest problems in politics, religion, bigotry and the rest are so damned intractable.

But even if we’re not trying to solve world hunger and political turmoil, even if we’re just trying to get a coworker or client to understand a different way of seeing something, it’s evident that bridging the gap in understanding is not just a peripheral challenge for doing great design work — it may be the most important design problem we face.

I don’t have a ready remedy, by the way. But I do know that one way to start building bridges over these chasms of understanding is to look at ourselves, and be brutally honest about our own limitations.

I almost titled this post “Why Some People Just Don’t Get It” — but I realized that sets the wrong tone right away. “Some People” becomes an easy way to turn others into objects of ridicule, which I’ve done myself even on this blog. It’s easy, and it feels good for a while, but it doesn’t help the situation get better.

As a designer, have you imagined what it’s like to see the world from the other person’s experience? Isn’t that what we mean when we say the “experience” part of “user experience design” — that we design based on an understanding of the experience of the other? What if we treated these differences in point of view as design problems? Are we up to the challenge?

Later Edit:

There have been some excellent comments, some of which have helped me see I could’ve been more clear on a couple of points.

I perhaps overstated the “hardware” point above. I neglected to mention the importance of “neuroplasticity” — the very fact that we inadvertently carve grooves into the silly-putty of our brains also means we can make new grooves. This is something about the brain that we’ve only come to understand in the last 20-30 years (I grew up learning the brain was frozen at adulthood). The science speaks for itself better than I can summarize it here.

The concept has become very important to me lately, in my personal life, doing some hard psychological work to undo some of the “wiring” that’s been in my way for too long.

But in our role as designers, we don’t often get to do psychotherapy with clients and coworkers. So we have to design our way to a meeting of minds — and that means 1) fully understanding where the other is coming from, and 2) being sure we challenge our own presuppositions and blind spots. This is always better than just retreating to “those people don’t get it” and checking out on the challenge altogether, which happens a lot.

Thanks for the comments!

* Yet another note: the article is excellent; a shame registration is required, but it only takes a moment, and in this case I think it’s worth the trouble.

Data vs Insight for UX Design

[Image: UX Insight Elements]

Funny how things can pop into your head when you’re not thinking about them. I can’t remember why this occurred to me last week … but it was one of those thoughts I realized I should write down so I could use it later. So I tweeted it. Lots of people kindly “re-tweeted” the thought, which immediately made me self-conscious that it may not explain itself very well. So now I’m blogging about it. Because that’s what we kids do nowadays.

My tweet: User Experience Design is not data-driven, it’s insight-driven. Data is just raw material for insight.

I whipped up a little model to illustrate the larger point: insight comes from a synthesis between talent, expertise, and the fresh understanding we gain through research. It’s a set of ingredients that, when added to our brains and allowed to stew, often over a meal or after a few good nights’ sleep, can bring a designer to those moments of clarity where a direction finally makes sense.

I’ve seen a lot of talk lately about how we shouldn’t be letting data drive our design decisions — that we’re designers, so we should be designing based on best practices, ideas, expertise, and even “taste.” (I have issues with the word “taste” as many people use it, but I don’t have a problem with the idea of “expert intuition,” which is I think more what a lot of my colleagues mean. In fact, that Ira Glass video that made the rounds a few weeks ago on many tweets/blogs puts a better spin on the word “taste”: one’s aspiration that may be, for now, beyond one’s actual abilities, until work and practice catch up.)

As for the word “data” — I’m referring to empirical data as well as the recorded results of something less numbers-based, like contextual research. Data is an input to our understanding, but nothing more. Data cannot tell us, directly, how to design anything.

But it’s also ludicrous to ask a client or employer to spend their money based solely on your expertise or … “taste.” Famous interior or clothing designers or architects can perhaps get away with this — because their names carry inherent value, whether their designs are actually useful or not. So far, User Experience design practitioners don’t have this (dubious) luxury. I would argue that we shouldn’t, otherwise we’re not paying much attention to “user experience” to begin with.

Data is valuable, useful, and often essential. Data can be an excellent input for design insight. I’d wager that you should have as much background data as you can get your hands on, unless you have a compelling reason to exclude it. In addition, our clients tend to speak the language of data, so we need to be able to translate our approach into that language.

It’s just that data doesn’t do the job alone. We still need to do the work of interpretation, which requires challenging our presuppositions, blind spots and various biases.

The propensity for the human brain to completely screw stuff up with cognitive bias is, alone, reason enough to put our design ideas through a bit of rigor. Reading through the oft-linked list of cognitive biases on Wikipedia is hopefully enough to caution any of us against the hubris of our own expertise. We need to do the work of seeing the design problem anew, with fresh understanding, putting our assumptions on the table and making sure they’re still viable. To me, at least, that’s a central tenet behind the cultural history of “user experience” design approaches.

But analysis paralysis can also be a serious problem; and data is only as good as its interpretation. Eventually, actual design has to happen. Otherwise you end up with a disjointed palimpsest, a Frankenstein’s Monster of point-of-pain fixes and market-tested features.

We have to be able to do both: use data to inform the fullest possible understanding of the behavior and context of potential users, as well as bring our own experience and talent to the challenge. And that’s hard to do, in the midst of managing client expectations, creating deliverables, and endless meetings and readouts. But who said it was easy?

The Deep Dive, 10 Years Later

It appears someone has posted the now-classic episode of Nightline about Ideo (called the Deep Dive) to YouTube. I hope it’s legit and Disney/ABC isn’t going to make somebody take it down. But here’s the link, hoping that doesn’t happen.

About 10 years ago, I started a job as an “Internet Copywriter” at a small web consultancy in North Carolina. By then, I’d already been steeped in the ‘net for seven or eight years, but mainly as a side-interest. My day jobs had been web-involved but not centrally, and my most meaningful learning experiences designing for the web had been side projects for fun. When I started at the new web company job, I knew there would need to be more to my role than just “concepting” and writing copy next to an art director, advertising-style. Our job was to make things people could *use*, not just look at or be inspired to action by. But to be frank, I had little background in paid design work.

I’d been designing software of one kind or another off and on for a while, in part-time jobs while in graduate school. For example, creating a client database application to make my life easier in an office manager job (and then having to make it easy enough for the computer-phobic clerical staff to use as well). But I’d approached it as a tinkerer and co-user — making things I myself would be using, and iterating on them over time. (I’d taken a 3-dimensional design class in college, but it was more artistically focused — I had yet to learn much at all about industrial design, and had not yet discovered the nascent IA community, usability crowd, etc.)

Then I happened upon a Nightline broadcast (which, oddly, I never used to watch — who knows why I had it on at this point) where they engaged the design company Ideo. And I was blown away. It made perfect sense… here was a company that had codified an approach to design that I had been groping for intuitively, but not fully grasped and articulated. It brought into sharp clarity a number of crucial principles, such as behavioral observation and structured creative anarchy.

I immediately asked my new employer to let me order the video and share it with them. It served as a catalyst for finding out more about such approaches to design.

Since then, I’ve of course become less fully enamored of these videos… after a while you start to see the sleight-of-hand that an edited, idealized profile creates, and how it was probably the best PR event Ideo ever had. And ten years gives us the hindsight to see that Ideo’s supposedly genius shopping cart didn’t exactly catch on — in retrospect we see that it was a fairly flawed design in many ways (in a busy grocery store, how many carts can reasonably be left at the end-caps while shoppers walk about with the hand-baskets?).

But for anyone who isn’t familiar with the essence of what many people I know call “user experience design,” this show is still an excellent teaching tool. You can see people viscerally react to it — the sudden realization of how messy design is by nature; how interdependent it is with physically experiencing your potential users; how the culture needed for creative collaboration has to be cultivated and protected from the Cartesian efficiencies and expectations of the traditional business world; and how important it is to have effective liaisons between those cultures, as well as a wise approach to structuring the necessary turbulence that creative work brings.

Then again, maybe everybody doesn’t see all that … but I’ve seen it happen.

What I find amazing, however, is this: even back then, they were saying this was the most-requested video order from ABC. This movie has been shown countless times in meetings and management retreats. And yet, the basic approach is still so rare to find. The Cartesian efficiencies and expectations form a powerful presence. What it comes down to is this: making room for this kind of work to be done well is hard work itself.

And that’s why Ideo is still in business.

You Are (Mostly) Here: Digital Space & the Context Problem

Here’s the presentation I did for the IA Summit 2009 in Memphis, TN. It’s an update of what I did for IDEA 2008; it’s not hugely different, but I think it pulls the ideas together a little better. The PDF is downloadable from SlideShare. The notes are legible only at full-screen or on the PDF.