As more consumer and enterprise technologies deploy mobile apps, there is growing demand for usability testing outside the lab, in real-world contexts. For a recent client project, we tested an e-commerce app designed for use inside retail stores. Here's the setup we used for a formative usability test conducted in a retail store.
GoPro for on-the-go shots
Photo: Wearing a stylish necklace-mounted GoPro
Problem: researchers lack a standard setup for video-recording mobile tasks
Solution: necklace-mounted GoPro (or glasses-mounted camera)
Video clips of people using the app in a real-world context were the most compelling deliverable to the client. A wearable necklace-mounted camera such as a GoPro gives a great shot of what people are doing with their hands on the phone, and has a wide enough angle to capture much of the surrounding context. Pointing the camera diagonally downward also protects bystanders' privacy by keeping their faces out of the frame.
See our sample GoPro video of mobile app usage.
A glasses-mounted camera like Google Glass seems like a promising future option for another first-person perspective, although it's important to have a swappable battery (as the GoPro does) so you can record video across several sessions over a long testing day.
Sit-down tasks using Reflector app + AirPlay
Photo: iPhone running Maps app mirrored to MacBook running Reflector app
Problem: participants can’t do all tasks standing due to fatigue
Solution: walking and sit-down portions of usability session
A typical hour-long lab usability study does not translate directly into a mobile study: people get tired after standing for more than about five minutes, or sooner for less physically able participants. Dividing the session into sitting and walking parts reduces the time spent standing.
Many mobile tasks and interview questions can be performed sitting. Sitting is a natural choice for browsing and entertainment-oriented activities that people do on their phones while waiting or on public transit.
During the sit-down portion, laptops make a great recording device.
To capture participants' interaction with the app, we used AirPlay to mirror the iPhone screen to a MacBook running a screen recorder. AirPlay is included in iPhone 4S and above (search online for directions); you need to install the Reflector app on the MacBook to enable it to receive the AirPlay stream. AirPlay requires WiFi, which was available in the retail store where we were testing. Had it not been available, we would have used another solution, such as a webcam on a tabletop tripod pointed down to capture the participant's hands interacting with the screen.
To capture participants' facial expressions and commentary, we used the laptop's built-in camera and built-in microphone.
To record the video, we used Camtasia for Mac. Camtasia allows you to composite the screen recording with the video of the participant's face. It saves the recording locally and offers basic editing tools to generate video clips. We considered WebEx for recording and compositing, but we didn't trust the reliability of the WiFi in the busy retail store, and we didn't need to broadcast live to observers.
We did not record a separate video feed of the participant's fingers interacting with the app. We preferred the crisp video feed directly from AirPlay, and we found that the video feed of the participant's face was sufficient to make the sessions feel real, evoke empathy in viewers, and capture expressions such as surprise and hesitation. To record fingers, we would most likely have set up a tabletop tripod and asked the participant to interact with the phone under the camera. We found it more natural to broadcast via AirPlay so the participant could hold the phone however they wanted, without the constraints of a camera angle.
Note-taking on the go
Photo: iPad running Pear Note with script and session notes
Problem: researcher can’t easily type notes on a laptop while walking around
Solution: tablet with notes synced to voice recording
Typically we use a printed script and a full keyboard for note-taking, but for this study we needed a way to read a script and take notes while walking around. Our solution was an iPad running Pear Note, an iOS app similar to Microsoft OneNote, which matches up typed notes with an audio recording. This let us jot down brief notes as we walked around with participants and fill in exact quotes later at a desk. We held the iPad like a journalist's notepad, fairly close to the participant's mouth, so their voice was audible in the recording. We used Pear Note for note-taking during both the sit-down and walk-around portions of the usability test.
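The core idea behind an audio-synced note-taking tool like Pear Note is simple: stamp each note with the elapsed recording time so it can later be matched to the corresponding moment in the audio. Here is a minimal sketch of that idea; the class and method names are hypothetical, for illustration only, and are not Pear Note's actual implementation:

```python
import time

class TimestampedNotes:
    """Sketch of audio-synced note-taking: each note stores the elapsed
    time since recording started, so a brief note ("hesitated here") can
    later be matched to that point in the audio to recover exact quotes."""

    def __init__(self, now=time.monotonic):
        self._now = now            # injectable clock, eases testing
        self._start = now()        # moment the audio recording begins
        self.notes = []            # list of (elapsed_seconds, text)

    def jot(self, text):
        """Record a note stamped with seconds elapsed since the start."""
        self.notes.append((self._now() - self._start, text))

    def around(self, seconds, window=5.0):
        """Return notes taken within `window` seconds of a playback position."""
        return [text for ts, text in self.notes if abs(ts - seconds) <= window]
```

During review, jumping the audio to a note's timestamp (or calling something like `around(62)` while scrubbing playback) lets the researcher replace a terse note with the participant's exact words.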
We also found it helpful to copy the interview script into the note-taking app. This provided a guide for the moderator without having to hold a separate sheet of paper, and it made it easy to browse the notes later. We typed notes instead of using a handwritten solution like LiveScribe smart pens because of the ability to easily copy the script into the note-taking app.
To speed up typing, we used fill-in-the-blanks for multiple choice questions. We considered more structured data entry on the iPad, but we have not yet found a solution that indexes form entry with voice recording.
Paper scoresheet and non-disclosure agreement
Photo: Paper scoresheet and NDA
Problem: researcher prefers not to carry an extra device for participants to enter ratings
Solution: paper scoresheet
We collected quantitative ratings of each task and the overall experience. We prefer that participants enter ratings themselves so that they can remember previous ratings and rate tasks consistently.
We found it easiest to have participants write scores on a paper scoresheet, while the moderator continued to take notes about their comments on the iPad. While it required manual entry later, it was one less device to worry about charging and carrying around.
We also used a paper non-disclosure agreement, which is consistent with our lab study protocol.
Get permission from space owners and notify bystanders
Photo: Easel with filming notice (source: Eden Project)
Problem: bystanders may have privacy concerns over video-recording in semi-public, semi-private places like retail stores
Solution: permission of owner and disclaimer at entrance
The store owner that we were working with put up a sign at the entrance of the store notifying customers that they might be recorded. There isn't a reasonable expectation of privacy in most retail stores (except in places such as a dressing room), so a statement about in-store filming and use of the footage was sufficient for our study. The store's lawyer reviewed the exact language of the notice.
I found that interactions with customers and staff went more smoothly when I immediately identified myself as a member of the team doing a customer feedback study on this particular app. This also helped with security staff, who were on the lookout for people taking pictures. I also wore my research firm's t-shirt to reinforce that identity. Our liaison at the retail store briefed the store staff before we arrived, explaining the study and the filming and emphasizing that their job performance was in no way being evaluated. Even so, there were many staff members in the store and not everyone remembered the notice, so I kept a quick introduction to the study ready each time I interacted with staff.
Your Turn
The fundamentals of usability testing don't change with mobile apps. The testing is motivated by the same reasons, is moderated with similar questions, and is analyzed and reported on similarly. We hope that these tips about the logistics of mobile video recording make it feasible for you to usability-test your team's apps out in the real world context of use, whether that's the great outdoors or the great indoors.
We'd love to hear your own experiences and tips-and-tricks for mobile usability testing and user research. Please share in the comments or reply to us @EchoUser.