Facebook and CNET

I just finished two more projects I can't really talk about.

One was a user testing project for CNET (a CBS Interactive property). I conducted six remote interviews over two days and delivered the report a week later with recommendations and next steps.

The other was an onsite user testing project for Facebook. I conducted four (six were scheduled, but only four made it) in-person interviews over two days and delivered the report two weeks later, also with recommendations and next steps. If anyone asks, the snack options at Facebook are INSANE.

I'd like to thank Applause and Motivate Design for the opportunities to work on such interesting projects and for trusting me with their clients.

Now I'm ramping up another research project for Lucky Brand.

Magic Numbers

There are two numbers in research that are seemingly magical: 5 and 400.

According to the Nielsen Norman Group (one of the originators of user research), you only need five participants to understand whether your website, app, or software is ready for launch. Why? Because usability problems tend to be universal. If you run enough user tests, you'll find that the problems you discover with one participant start repeating with the next.

NNG actually did the math on the subject, and you can read their post if you are so inclined. If not, the short version is that a study with five users will identify roughly 85% of usability issues. If you choose to go beyond five, you will mostly start to see repeats of the same issues.
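For the curious, here's a minimal Python sketch of the model behind that claim, using the commonly cited assumption from Nielsen and Landauer that a single test participant uncovers about 31% of the usability problems in a design:

    # Nielsen-Landauer model: share of usability problems found after n users,
    # assuming each user reveals about L of them (commonly cited as 0.31).
    def problems_found(n, L=0.31):
        return 1 - (1 - L) ** n

    for n in range(1, 11):
        print(f"{n:>2} users -> {problems_found(n):.0%} of problems found")
    # At 5 users you're already around 84%; each extra user adds less and less.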

I usually recommend recruiting six people for the sake of safety, just in case one of your five doesn't show up or turns out to be an outlier whose data you can't use for some reason.

This can make your usability test deceptively cheap. The test itself is cheap, but as you can see from my process page, you are not meant to test only once. User testing is an iterative process. Design > develop > test > analyze > redesign > redevelop > retest .... over and over again until you solve your usability issues and the product is ready for launch. That can be expensive, but necessary for the product's success.

Another magic number is 400.

When doing quantitative research, you want to have enough respondents for the data to have statistical significance. This is your sample size. Below is the equation for determining your sample size. I know it looks scary, but don't worry: there are lots of online sample size calculators available to do the heavy lifting for you.

Sample Size Equation
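For reference, the formula most online sample size calculators use, and the one that reproduces the numbers in the examples below, is Cochran's formula with a finite population correction (assuming the most conservative 50/50 response split):

    \[
      n_0 = \frac{z^2 \, p (1 - p)}{e^2},
      \qquad
      n = \frac{n_0}{1 + \frac{n_0 - 1}{N}}
    \]

Here z is the z-score for your confidence level (1.96 for 95%), e is the margin of error, N is the population size, and p is the expected response split (0.5 is the safe default).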

The sample size is dependent on a few variables:

  • Confidence Level - This is how confident we are that our sample accurately reflects the population we are studying. 95% is acceptable in most cases.
  • Confidence Interval/Margin of Error - This describes the accuracy of your findings. I usually default to 5%, which means the true answer for the whole population should fall within plus or minus 5% of whatever the sample reports. In practice, if two options come back less than 5% apart, the difference is too close to call.
  • Population Size - This is the number of people in your entire population. If you are doing a survey of your friends and you only have five friends, then your population is 5. If you are surveying everyone in California, then your population size is 38.8 million.

Let's say we're catering a party at work and we need to decide between hamburgers and hot dogs. We work in a large office with 200 people.

  • Confidence Level  - 95%
  • Confidence Interval/Margin of Error - 5%
  • Population Size - 200

Sample Size = 132

Now let's say that we're organizing a party for everyone in my hometown of Vancouver, BC, and we need to make the same hamburger vs. hot dog decision.

  • Confidence Level - 95%
  • Confidence Interval/Margin of Error - 5%
  • Population Size - 2.4 million

Sample Size = 385

If we were to increase that population size to 5 million or even 5 billion, the sample size would still be 385. See, magic. I quote 400 as the magic number to give myself a little padding.
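If you'd rather check those numbers yourself than trust an online calculator, here's a quick Python sketch (my own illustration, not taken from any particular tool) that plugs the examples above into the formula:

    import math

    def sample_size(population, z=1.96, margin=0.05, p=0.5):
        """Cochran's formula with a finite population correction."""
        n0 = (z ** 2) * p * (1 - p) / margin ** 2   # sample size for an infinite population
        n = n0 / (1 + (n0 - 1) / population)        # shrink it for a finite population
        return math.ceil(n)

    print(sample_size(200))            # office party -> 132
    print(sample_size(2_400_000))      # Vancouver -> 385
    print(sample_size(5_000_000_000))  # 5 billion people -> still 385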

So now you know. If you want a successful user test, you need to recruit at least five people. If you want statistically significant data on a large population, then you need to get at least 385 survey respondents.

Megaphone Magazine Mobile App - Pre-Launch User Test

Client: Denim & Steel and Megaphone Magazine

Product: Mobile Application

Background: Megaphone Magazine is a not-for-profit social enterprise that empowers Vancouver's at-risk population through commerce. Magazines are produced at a fixed cost and sold to the vendor, who turns around and sells them to his or her customer base for cash at a reasonable markup. Vendors keep all of the revenue they make from their sales.

The growing ubiquity of credit and debit cards caused a decrease in sales for the vendors. Megaphone received a grant to develop a mobile application which would allow regular buyers to purchase magazines using their credit card without bypassing the vendor.

Denim & Steel asked the researcher to conduct the first round of user testing on the mobile application prototype on a limited budget, as is typical for non-profits.

Objective: Ensure that the Megaphone mobile application will work and be understood by magazine buyers, magazine vendors, and Megaphone administrators upon first use.

Research Process: A temporary lab was created in the Megaphone Magazine offices using a quiet meeting space, a simple hand-held voice recorder, and Reflector to mirror and record the actions taken on the app. A fake credit card was also used to assist in setting up the app for each participant.

The researcher instructed Megaphone administrators on who to recruit, how many of each, and how to schedule them. Buyers qualified by having a smartphone and being regular buyers of the magazine. Megaphone used its best judgement on which vendors to recruit, and admins could be anyone employed by Megaphone who would be likely to edit information on the backend or deliver payments to vendors.

Each testing session was structured as a multi-step process:

  1. Buyers - Setting up the app for payments, then buying a magazine, then editing an order, and confirming payment
  2. Vendors - Recording the transaction number and then getting paid by an administrator
  3. Administrators - Paying a vendor and then editing vendor information for the app

Participants were asked to go through each step in the sequence while the researcher and the developers watched where they succeeded and where they struggled.

Outcome: Most of the problems found with the app were minor, but one interaction stood out as a potential fault at a critical point in the purchase process: the confirmation screen. One user swept past the confirmation screen without even realizing it. While the researcher suggested an overlay, Denim & Steel went a step further. They created a benign disruption by turning half the screen upside down. This created enough cognitive dissonance to catch the user's attention and prevent a reflexive dismissal of the confirmation screen, and it also helped maintain personal space between the vendor and the buyer.

Before and after rendering of the Megaphone Magazine cashless payment mobile application.

Another insight from the user test was the need to train the vendors to act as an ad hoc help desk for the buyers when they use the app. Suggested subjects included:

  • What is an app? What kind of phones does it work on?
  • How do you get the app?
  • Credit cards vs. debit cards
  • Phone etiquette and personal space
  • How is the service fee calculated and who does it go to?
  • What to do if the customer skips the transaction screen?

"Denim & Steel Interactive turned to Lauren to run our user testing sessions for the Megaphone App. Though we often do user testing internally, we wanted an additional degree of certainty that we had properly tested and validated the app’s design. Lauren’s professional manner and ease of collaboration made the process easy. Her final report both gave assurance to Megaphone of the app’s quality and helped us identify areas for optimization that led to truly novel solutions we would not have otherwise explored. On the next project where we need a higher benchmark for research, we’ll be calling Lauren in." 
Todd Sieling, Co-Founder of Denim & Steel