When it comes to planning usability tests, it remains surprisingly difficult for many companies and individuals to answer the question, “How many participants do I test with?” When I first started conducting them, sample size was the hottest topic in the UX industry. Jakob Nielsen had just released his famous paper declaring that five users were all you needed to conduct an appropriately informative study. Most of the clients I worked with were looking to target more than one user group (typically around three to four), so I would spend a full week or two crafting a recruiting screener aimed at identifying the right set of 15-20 participants.
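For context, Nielsen’s five-user figure isn’t a magic number; it comes from a simple problem-discovery model (Nielsen and Landauer’s work), in which the share of problems found by n participants is 1 − (1 − p)^n. Here is a minimal sketch of that arithmetic, assuming the roughly 31% average per-participant discovery rate Nielsen reported; your own product and tasks may yield a very different rate.

```python
# Problem-discovery model behind the "five users" figure:
# the expected share of usability problems found by n participants
# is 1 - (1 - p)^n, where p is the average probability that a
# single participant uncovers any given problem.

def problems_found(n_participants: int, p: float = 0.31) -> float:
    """Expected proportion of usability problems uncovered.

    p = 0.31 is the average discovery rate Nielsen reported;
    treat it as an assumption, not a property of your product.
    """
    return 1 - (1 - p) ** n_participants

if __name__ == "__main__":
    for n in (1, 3, 5, 10, 15):
        print(f"{n:>2} participants -> {problems_found(n):.0%} of problems found")
```

Five participants lands at roughly 85% of problems found, which is where the famous claim comes from, and testing five people per segment is exactly the arithmetic that pushed those three-to-four-group recruits toward 15-20 participants.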
The more usability tests I conducted, the more I observed that differences in participant segmentation were not leading to differences in how participants performed. Usability studies test for ease of navigation, content comprehension and whether user needs are met. In the studies I conducted, if a critical navigation component was located below the page fold, participants had trouble locating it regardless of their age group or other demographics. It was a poor user experience: we reported it in our findings and the client fixed the issue.
Thanks to the Lean UX movement, companies (particularly startups) have begun to explore ways to conduct usability testing more cheaply. We’re told the “leanest” way is to just get out there and test: with passers-by on the street, with other employees in the company cafeteria, with friends and family. The participant doesn’t have to be a true “user” of the app or service; anyone, the thinking goes, is perfectly capable of providing honest feedback.
Now that I’m running my own startup, I understand all too well the budget and timeline challenges that exist when launching something new. In the beginning, I thought the “just get out there and test” approach was great. After all, “It’s better to test with somebody than with nobody.” But does this approach really hold true?
The Age-Old Reliability Question
Compare this approach to an all-important question I might pose to my husband: “Do I look fat in this dress?” His answer is always no. But as someone I’m trusting to deliver honest feedback, is he telling the truth? If, as a designer, you ask your friends and family to give feedback on the product you’ve created and that they know you’ve been working on for some time, will they give you the honest truth if something isn’t quite right? Or will they try to avoid hurting your feelings?
When I moved my company into its new office, I asked the first 20 people who visited what they thought about its design. Of course they “liked it”; they were my friends and family. I got the answer I was looking for: that the office was beautiful. So I stopped asking other people. But was it the truth?
Even if you’ve got the most honest friends (or husband) in the world, it can be hard to know how to ask a question that will elicit an honest answer. “Do you like the new office?” and “Tell me what you think about how this office looks” will lead to two very different answers. Sure, you could consider the former to be a “research” question — you asked a question and you received an answer. The latter, however, is going to give you insight.
What I tell my clients these days is: If you don’t have the time or budget to do a usability test with the right participants and a methodology that will honestly answer your questions, don’t force it. Take the time to really think through your approach, and don’t compromise the quality of your responses. Is A/B testing the way to answer your question? Maybe you can spend some time monitoring your customers’ direct feedback to hear what people are complaining about, and turn those issues into a focus for usability testing. Adding time to your planning process inherently makes it a little less “lean,” but you should always conduct usability research to find the truth rather than settle for the easiest path.