
Quick and Dirty Usability Testing: Step Away from the Book

by Dana Chisnell

People often say to me, “I want to do usability testing at my company, but there isn’t time to do it the way it should be done, using all the steps in your book. So, since we can’t do it ‘right,’ we don’t do it at all.” They usually continue by saying how depressing that situation is. They’re stuck.

The Book they are talking about is The Handbook of Usability Testing, the second edition of which I wrote with Jeff Rubin. Wiley published it in April 2008. It’s 350+ pages of, as you might expect, instructions on how to plan, design, and conduct effective usability tests (just like the subtitle says).

But in the same way any good high school algebra teacher did, Jeff and I give readers the “long way” first. The idea is that after they learn the classic way, they can figure out shortcuts and variations that work for them and still get the “right” answer. The point is to do enough to find out what they need to in order to inform the design they’re working on, and no more than that.

You don’t have to do it by the book to get useful data. But it is different data from what you get from a formal method. There are trade-offs to be made, and you do have to understand where the data came from and what it means. You can conduct usability tests that are quick and cheap and that generate all the insights about your users and your design that you can handle.

So, I suggest my readers ditch the book (after they’ve bought it and read it) and do whatever level of testing works for their situation. I mentioned trade-offs. Here are some things to think about when considering no testing, by-the-book testing, and just-right testing.

No Testing: The Battle of Opinions

When I hear that teams are not doing usability testing, I wonder what they’re basing design decisions on. When I ask, I often get a mix of answers, from “The CEO wants…” to “market research tells us” to “we brainstormed in a team meeting.” It feels to me like they’re guessing.

Some teams have new products for which there is no customer service data, not much in the way of market research, and certainly no web analytics. Start-ups with novel products often begin this way. In my experience with start-ups, design decisions are made by whoever has the strongest opinions, the most clout, or a combination of the two.

Some teams may have market research. Others may have customer service logs available. Maybe some web analytics. I’ve met teams that rely on the way the chief engineer’s/product manager’s kid/mother/cousin/last boss/competitor approaches the problem the design is trying to solve.

All of those can be good things to base design decisions on if the team is designing for the product manager’s cousin’s last boss. But, even taken together, a design team might have a less-than-complete picture of how and why a design helps (or doesn’t help) real users in real situations reach their very real goals.

How to resolve the conflicting opinions and fill in the gaps? Go watch your users use the design.

By the Book: Detailed Data Based on a Rigorous Approach

To get away from the opinion wars and the guessing, the best teams base decisions on data. One way to get that data is a thorough, rigorous shakedown through classic usability testing.

Now you’re imagining a lab setting with recording devices everywhere, an experiment-like situation with a stable product to test, and a robot-like researcher to administer tasks and collect data. Well, even classic usability tests aren’t quite like that.

Jeff and I made usability testing look like work in our book because it is. Going through the steps is important in some contexts to support broad generalization of findings, meet management expectations, and ensure credibility.

You may be thinking, “I just don’t have time for all that. We’ve got a product to deliver!” You’re hearing the secret here first: you don’t have to do it by the book.

Just Right: Quick Insights as Needed

Even if teams don’t do classic usability tests, they still need insights on which to base design decisions. The secret to usability testing in the wild is that you can conduct usability tests following the basic methodology, just less formally. Call it Usability Testing Lite: sit next to someone using a design and watch them.

Teams that do testing in the wild don’t need a lab. They don’t usually record anything. But they do show up fully present, with at least one other person from the team joining each session with a user. They conduct sessions in cafés, malls, trade shows, street fairs, even their own reception areas—anywhere the users might be—and ask nicely for a few minutes of time to try out something new.

These are quick, cheap, and insightful sessions. And because these smart teams can gather a few insights in a few days rather than a few weeks, they just do another round as soon as they can. They repeat the process until they start to see trends, then adjust as more questions come up.

The thinking of the best teams is, how could having some insights be worse than doing nothing? At least they got out of the office, maybe got a reality check on some small part of a design, and started to make a case for having more contact with users. Sounds better than opinion wars to me.

About the Author

Dana is the co-author of the must-have book on usability testing, The Handbook of Usability Testing, 2nd edition. Though much of the Handbook is about the process of usability testing, one of the major updates from the first edition is about recruiting participants for usability tests.

Dana is an independent usability consultant and user researcher who founded Usabilityworks in San Francisco, CA. She has been doing usability research, user interface design, and technical communications consulting and development since 1982.
