Four Steps to Great Usability Testing (Without Breaking the Bank)

I’m involved in something called the Product Design Guild in San Francisco. It’s pretty cool. We talk about ideas and do guerrilla usability testing of concepts. There’s a really smart group of people there, and they know great products only come out of testing an idea over and over again.

At one of the events, I'm "training" developers and product types from a number of startups in usability testing. The document I wrote for it explains the basics of conducting usability tests, and I think it's good enough to share. I'm convinced anyone can be trained to run a usability test, or at least to write one. Usability tests don't have to be fancy (only a digital agency looking to bill hours or drain the retainer would run eye-tracking on a half-baked idea), and the feedback you get is incredible.

Here’s the document for download in Word format.


Step 1: Write The Usability Test (With Tasks that Actually Work)

  • Create five tasks you want the user to perform.
    Each task should require six to nine clicks to complete. Tasks could be signing into the system or uploading photos and setting permissions.
  • Write scenarios for each task.
    You should explain the task without telling the user what to do. An example would be, "You were at a family picnic where you took photos of your nephews and nieces. You want to share those photos with your family on the Internet on Flickr. Where would you go to sign in, upload those photos, and set the preferences so only your family can view them?"
  • List follow-up questions you might want to add.
    The magic of usability tests is the answers you get outside of the scenarios, so come up with additional questions for the screens you are testing. A sketch of one way to structure the whole script follows this list.
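
For the developers in the room, here's a minimal sketch (in Python) of a test script structured as data, assuming nothing beyond the five-task, six-to-nine-click guidelines above. The task name, scenario wording, and follow-up question are illustrative, not a required format.

    # A hypothetical test-script structure; the fields mirror the
    # guidelines above (tasks, scenarios, follow-up questions).
    from dataclasses import dataclass, field

    @dataclass
    class Task:
        name: str            # short label, e.g. "Upload photos"
        scenario: str        # the story you read aloud, without saying what to click
        min_clicks: int = 6  # rough size check: a good task takes six to nine clicks
        follow_ups: list = field(default_factory=list)  # extra questions to ask

    script = [
        Task(
            name="Share family photos",
            scenario=(
                "You were at a family picnic where you took photos of your "
                "nephews and nieces. You want to share those photos with your "
                "family on Flickr. Where would you go to sign in, upload those "
                "photos, and set the preferences so only your family can view them?"
            ),
            follow_ups=["What would you expect to happen after the upload finishes?"],
        ),
        # ...four more tasks in the same shape...
    ]

    # Sanity-check the script before the dry run in Step 3.
    for task in script:
        assert task.scenario and task.min_clicks >= 6, f"Task too thin: {task.name}"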

Step 2: Find Participants of All Sizes (Sort of)

  • Find people that fit your target audience.
    Very few sites have a target audience of "everyone." Ideally, you would test people who would actually use your site. In the example above, you want people who like sharing photos with their family and friends.
  • Find people with a range of experience.
    If you find five professional photographers that share photos constantly, you'll get a narrow view of how your site is performing. It's best to find a mix of expert users (a professional photographer) and users who might not have as much experience (an uncle that just bought a camera and uses Facebook sporadically). A sketch of screening for that mix follows this list.
  • Find people that will HATE your product.
    There's nothing worse than running a usability test where the user (read: idiot) at the mouse gushes about what they like and still can't figure out where the login button is. The best suggestions I've gotten have come from users who give more negative feedback ("You know, this is done better on this other site") than constant praise.
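
Since you'll probably track recruits in a spreadsheet or script anyway, here's a tiny sketch of picking a spread of experience levels. The candidate pool and the 2/2/1 mix are made-up assumptions, not a rule from this post.

    # Illustrative candidate pool; in practice this comes from a screener
    # survey. The experience labels are assumptions for the sketch.
    candidates = [
        {"name": "Ana", "experience": "expert"},        # professional photographer
        {"name": "Ben", "experience": "novice"},        # uncle with a new camera
        {"name": "Cho", "experience": "intermediate"},
        {"name": "Dee", "experience": "novice"},
        {"name": "Eli", "experience": "intermediate"},
        {"name": "Fay", "experience": "expert"},
    ]

    # Aim for a spread rather than five experts; the exact mix is arbitrary.
    target_mix = {"novice": 2, "intermediate": 2, "expert": 1}

    panel = []
    for level, wanted in target_mix.items():
        matches = [c for c in candidates if c["experience"] == level]
        panel.extend(matches[:wanted])

    for person in panel:
        print(person["name"], "-", person["experience"])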

Step 3: Perform the Usability Test (In 60 Minutes or Less)

  • Do a dry run.
    There's nothing worse than finding out on the first test that your script doesn't work. Run through the whole test yourself before performing it in front of test subjects, and work out any software or site issues ahead of time.
  • Vary the tasks and keep the tests in the 15-30 minute range.
    For most applications, you'll be testing three of the five tasks with each user. The longer the test runs, the more likely the user is to get bored. Keep the tests short and snappy.
  • Do not guide the user.
    Use words that explain what they need to do, but are not contained within the site. Say "sign in" instead of "login." Use "find files" on the hard drive instead of "browse." Remember that you are testing the terminology the site uses as much as the design of the site.
  • Ask questions.
    The beauty of usability testing is that you can ask questions outside of the script, because you don't necessarily know where the user is going to go. Instead of simple yes/no questions, ask open-ended ones, like "What do you think this would do?"
  • Let them talk.
    People are amazing once they start talking. They'll tell you all sorts of things. I've had participants tell me their complete business process, including profit margins, during tests. The best feedback is sometimes outside of what I was asking.
  • Record the sessions with some kind of screen recorder and take notes.
    I've used everything from CamStudio to WebEx to record results, but sometimes the best way to take notes is simply pen and paper. Don't get hung up on the tools. Eye-tracking software is ideal but also expensive; you can get close to the same results just by recording what the users do with screen-capture software.
  • Record the results on a range of pass/fail.
    In the past, I used pass/fail on each task but found that a scale of 0 (they passed) to 3 (they couldn't find anything) was a better approach to grading tasks step by step and in aggregate. The sketch after this list shows one way to rotate tasks and tally those scores.
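
Here's a short Python sketch of both ideas above: rotating three of the five tasks across participants, and averaging each task's 0-to-3 scores. The task names and ratings are invented for illustration.

    from statistics import mean

    tasks = ["sign in", "upload photos", "set permissions",
             "share with family", "find help"]

    def tasks_for(participant_index, per_session=3):
        """Rotate a window of three tasks so all five get coverage across users."""
        start = (participant_index * per_session) % len(tasks)
        return [tasks[(start + i) % len(tasks)] for i in range(per_session)]

    # 0 = they passed, 3 = they couldn't find anything. These numbers are
    # stand-ins for real session notes.
    ratings = {
        "sign in": [0, 1, 0],
        "upload photos": [2, 3, 2],
        "set permissions": [1, 0],
    }

    for participant in range(3):
        print(f"Participant {participant + 1}: {tasks_for(participant)}")

    for task, scores in ratings.items():
        print(f"{task}: average {mean(scores):.1f} across {len(scores)} attempts")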

Step 4: Evaluating the Results (By Reading the Tea Leaves)

  • Look for patterns.
    A focus group of one is not a good thing, yet too many usability professionals and product managers treat one as gospel. I tend to discount one-off comments as preferences of that particular user. If two or three users say close to the same thing ("The button is hard to find."), there's a usability issue you may want to look at more closely. The sketch after this list shows a simple way to tally those patterns.
  • Highlight comments that can make your site better.
    The best feedback from testing is something that you didn't expect. Highlight some of the comments that might be outside of the tasks but make great sound bites for describing issues with your site.
  • Remember results are subjective, so discuss them with the team.
    Put a team of three usability professionals together, and sometimes they'll come to three different conclusions about the test. Ideally, you would combine qualitative results with quantitative results (e.g., stats from watching the conversion funnel).
  • Combine testing with other data collection methods.
    I sometimes use a site called Attention Wizard that calls itself a visual attention prediction tool to see where users might click based on contrast of color values. If you combine testing with other validation methods, you'll get better results.
  • Test your assumptions again.
    Testing is not a one-time thing. You should test as much as possible (I've tested as often as every two to three weeks in an engagement), because the more you test, the more you refine your product. I've had a couple of clients where we tested with over 15 users on different tasks, creating a great amount of data that helped in product design.
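
As a rough sketch of the pattern hunt from the first bullet, you can code each complaint into a short label while reviewing your notes and tally them, treating anything two or more users raise as worth a closer look. The issue labels below are invented examples, not real study data.

    from collections import Counter

    # One normalized label per complaint, coded while reviewing session notes.
    session_issues = [
        ["login button hard to find", "upload icon unclear"],          # user 1
        ["login button hard to find"],                                 # user 2
        ["permissions page confusing", "login button hard to find"],   # user 3
    ]

    counts = Counter(issue for session in session_issues for issue in session)

    for issue, n in counts.most_common():
        verdict = "pattern: look closer" if n >= 2 else "one user's preference"
        print(f"{n}x {issue} ({verdict})")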

