A couple of months ago, the Test Pilot team sat down with six volunteer users, one at a time, and asked them to go through the steps of installing Test Pilot and submitting test results.

(Yes, that’s right: we were testing Test Pilot — feel free to make infinite recursion jokes.)

What we found was extremely valuable. The same problems happened again and again. Six users may not seem like enough to give you useful information, but believe me, after you’ve seen the fourth or fifth user in a row trip over the exact same usability problem, you’ll have a pretty good idea of how high a priority it is. (Statistics rule of thumb: if a problem affects one in three users, interviewing just five gives you roughly an 87% chance of seeing it at least once.)
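
For the curious, the math behind that rule of thumb is just the chance of at least one hit in n independent tries: 1 - (1 - p)^n. Here’s a quick sketch in Python (my own illustration, nothing Test Pilot specific):

    # Chance that at least one of n interviews surfaces a problem
    # that affects a fraction p of all users (independent trials).
    def chance_of_seeing(p, n):
        return 1 - (1 - p) ** n

    print(chance_of_seeing(1 / 3, 5))  # ~0.87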

This was my first time doing interview-style user studies. They’re a very different kind of usability research from the massive data collection we do through Test Pilot. One is qualitative, the other quantitative. One deals with individuals, the other with large groups. The two methods are complementary. You can collect all the statistical usage data you want, but it will never tell you what a user is thinking, or how they feel about your software, or how your software fits into their life, or what the meaning behind their interactions was. On the other hand, if you rely only on individual user interviews, you’ll fall into the trap of anecdotal data: you have some interesting stories, but you don’t know how typical they are of the whole population, or how important one user’s favorite use case is in the big scheme of things. Using both methods together produces a much more complete picture than either one can alone.

What we found out

  1. Most users weren’t sure what to do next after installing Test Pilot, since there was no clear call to action on the “Welcome” page.
  2. Almost everyone was confused by the many options in the Test Pilot menu.
  3. People were confused about the difference between a “survey” and a “study”, and why they saw both an “Accounts and Passwords Survey” and an “Accounts and Passwords Study”.
  4. They weren’t sure whether a given study was currently running, or what they were supposed to do about it.
  5. Most people went to the “All Studies” page and were disappointed to see that it didn’t really list all studies, only the ones currently running.
  6. Most people missed the notifications, accidentally dismissing them without even noticing that they had appeared.
  7. Many people requested a way to be notified when a study they participated in produced some kind of tangible results.

A funny story: several of the users in the study believed their data submission was being rejected when they clicked the “submit” button. They got understandably frustrated, asking what the point was of going to all that effort if their data wasn’t going to be accepted.

Actually, the data had been accepted. In fact, Test Pilot doesn’t even have a concept of rejecting a user’s data submission, even after a test ends. So what was the problem? Well, here’s the message that appeared after a successful upload:

Thank you for submitting your data!
This study is no longer collecting data, and the data that was collected has been deleted from your computer.

Oops! What I meant was that the local phase of data collection had been successfully completed, and that since it had been uploaded, the local copy had been wiped in accordance with our privacy policy. But it’s easy to interpret the message as meaning that your data was rejected because the study as a whole was over. A poor choice of words on my part.
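
If I were rewriting it, the message would state outright that the submission succeeded, and explain the deletion as a privacy measure rather than a consequence of the study ending. Here’s a hypothetical sketch of that logic in Python (the names are mine for illustration, not the actual Test Pilot code):

    # Sketch only: compose an upload confirmation that can't be
    # misread as a rejection. Names are illustrative, not real code.
    def upload_confirmation(study_still_running):
        msg = ("Thank you! Your data was submitted successfully. "
               "Now that it has been uploaded, the local copy has been "
               "deleted from your computer, as our privacy policy promises.")
        if not study_still_running:
            msg += (" This study has finished collecting new data, "
                    "but your submission made it in.")
        return msg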

In response to these findings, we did a total interface redesign, which has now been released as part of Test Pilot 1.0 alpha.

Left: Test Pilot 0.4 menu. Right: Test Pilot 1.0 menu.

Left: Test Pilot 0.4 study page. Right: Test Pilot 1.0 study window.

Tips for doing user studies

  1. Give the subject a task to do, but don’t explain how to do it. The idea is to observe the process as the subject tries to figure out the interface. We wanted to observe the Test Pilot install and first run process, so we started out with “Pretend you’ve just heard about Test Pilot and want to install it. What do you do?” and we went from there.
  2. While you talk the subject through the test, have an extra person sit in the back of the room taking notes on everything the subject does. This is much more reliable than trying to remember the important parts afterwards, and much less distracting than trying to both talk and record at the same time.
  3. Make the subject feel comfortable. People’s behavior changes if they feel nervous, uncomfortable, or self-conscious. You want them to be happy and relaxed so they can focus on the task.
  4. Never, ever make the subject feel bad about their mistakes! Make it clear that the subject is not on trial here: the software is. If the subject apologizes for screwing up, or if they feel dumb because they can’t figure something out, then redirect the blame away from the subject and onto the software where it belongs!
  5. Encourage the subject to “think out loud”: the more they can verbalize their thought process — what they’re looking for, why they’re looking for it in a certain place, what they expected to happen vs. what actually happened — the more useful info you can get.
  6. Resist the urge to tell subjects what they’re doing wrong. Bite your tongue if you have to. The whole point is to see what naturally gives them trouble. You’ll defeat the purpose of the study if you try to steer them away from trouble spots.
  7. Even if they ask you how to do something, don’t tell them! Instead, say “I’ll answer any questions you have after we’re finished with the test. For now, please try to figure it out.” Again, the idea is to see if they can figure out how to do the task on their own.
  8. However, if the subject gets completely stuck, you can give them a little nudge towards the next step. It’s better to continue with the study than to let the subject sit there stewing in frustration.

A more experienced usability tester could probably offer several more suggestions.
