December 15, 2016

How mobile usability testing and in-app analytics enable smarter growth

Mobile usability testing is hard to do properly, and in-app analytics aren’t all that easy to collect either. But without the user insights both can provide, you’re leaving the success of your business up to guesswork. We would never recommend blindly launching a website and then trying to improve usage by looking only at pageviews, so we don’t let clients toss together an app and then guess at improvements based on download numbers.

That’s why, even though it’s hard, we’ve developed a mobile user testing process. We start by collecting the best insights we can while we’re building the app, then we track usage statistics and analytics after it launches. That way we can learn why users do what they do. To make informed decisions that improve the product and grow the business, we need to start with mobile app usability testing.

What is mobile usability testing and why do we do it?

When we talk about usability testing for mobile, what we’re really talking about is a series of moderated user interviews where we observe test users interacting with the application. Moderating the sessions gives us a chance to ask mobile usability test questions. We’re always chasing the why. It’s not enough just to know what users are doing.

Once we understand how users are interacting, we can turn that feedback into user experience improvements before the app launches, helping us avoid negative reviews and low user adoption.

5 tips for conducting effective user interviews

How we approach mobile application usability testing

We’ve been refining our mobile usability testing tools for years, and our current toolkit gives us substantially more insight than we’d get on the average application project. We recently had an opportunity to deploy our system for mobile app usability research with Everything But the House (EBTH), a marketplace for estate sales. The team at EBTH came to us looking to build a new mobile auction iOS app, and we quickly realized that we were dealing with a partner who valued mobile user experience testing as much as we do. And since they didn’t already have a native app, just a responsive website packaged as an app in the App Store, we knew we needed to get our mobile usability testing service in front of end users to make sure we were getting things right.

With EBTH, we did the mobile usability tests in two chunks, so we could start getting feedback as soon as we had a product to put in front of people. In the first phase, our mobile usability testing methods involved having users find a couch and decide if they wanted to bid on it. In the second, we used mobile app usability testing tools to explore the bidding process. Both gave us insights we wouldn’t have otherwise gotten — and insights the EBTH team hadn’t gotten from their website usability testing.

[Image: mobile usability improvements]

How to test mobile apps: selecting the right mobile test user

EBTH already had a huge, passionate user base, and it would have been very easy to tap their existing community to test the app. For us though, mobile UX testing is about inclusivity. We treat user interviews as an opportunity to broaden the conversation. To be true to our usability testing methods, we went outside of their existing clientele to talk with people who were familiar with the idea of buying items in a virtual auction, but who didn’t have experience with the product.

We also took pains to balance out our own biases. We do our best to staff diverse teams, but this project had an all-male team. To make sure we weren’t missing something obvious to others, we made a point of recruiting more female test users. It’s a great way to fill in your blind spots. For the hour you’re interviewing them, the test user is part of the design team, and their voice matters most. It’s all about their context.


This is why we always push for user testing on mobile. We’re trying to get a diversity of reactions, and that means talking to people from all kinds of backgrounds. Even if someone isn’t necessarily the target user, you can learn genuinely interesting things when a person who’s not like the others gets a chance to talk.

Picking the right mobile user testing tools

I mentioned that doing usability testing for mobile applications is hard, but doing moderated remote mobile usability testing was, until very recently, almost impossible. Mobile usability testing best practices require that we see not just how the user reacts, but what they’re reacting to. That means we need to see their face and their phone screen at the same time. To get those diverse experiences, we also want to test with people from all over the target market, whether that’s the U.S. or the world.

Lookback has transformed how we test mobile applications. It’s mobile usability testing software that uses the phone’s camera to record the user’s reactions and their screen simultaneously. We’ve tested tons of remote usability testing tools and methods, but nothing else lets us capture all the information we need at once. That Lookback also lets us turn those videos into embeddable clips is just a bonus. We can basically cut a movie trailer of the mobile user experience.

[Image: mobile user testing recording]

You can tell people something isn’t working, but at some point you have to show them. It can be hard to conduct usability testing for mobile apps, then report back to your team and your client that something’s just not working and needs to be redone. With Lookback, we’re able to capture the exact moment of frustration. Showing people banging their heads creates such a visceral response: “Yeah, we have to fix this.”

What our mobile usability test taught us

We’ve gone into detail about how to write mobile app usability testing questions, but sometimes it’s the questions you ask before the test that prove most valuable. Our mobile app usability testing process starts with a getting-to-know-you questionnaire to learn more about the users and their motivations. With EBTH, we needed to validate our underlying assumptions about how people approach the idea of bidding. So we asked users to walk us through the last time they bid on an item on eBay. We learned right away that bidding is not like buying; bidding is a game. It requires nerve and skill, and you’re trying to get the best deal.

That insight totally shifted our thinking. The goal isn’t to complete a task; it’s to play the game. Moreover, there are dozens of strategies, all informed by how an individual likes to play the game and what tools they have to play it with. A mobile user might be desperate for push notifications that tell them when they’ve been outbid, whereas a desk-bound website user wouldn’t want constant updates. Someone else might be fixated on who they’re bidding against, relying on information about their competitors’ bidding history to make strategic decisions.
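To make the notification piece concrete, here’s a minimal sketch of how an iOS app might ask permission to send outbid alerts, using Apple’s UserNotifications framework from iOS 10. The function name and the fallback behavior are our own illustration, not a prescription:

```swift
import UIKit
import UserNotifications

// Ask permission to send alerts such as "You've been outbid."
// Asking in context (say, right after a first bid) tends to make
// the request easier for users to understand.
func requestOutbidAlertPermission() {
    UNUserNotificationCenter.current().requestAuthorization(options: [.alert, .sound, .badge]) { granted, _ in
        guard granted else {
            return // Denied: fall back to in-app outbid indicators only.
        }
        DispatchQueue.main.async {
            // Registration must happen on the main thread.
            UIApplication.shared.registerForRemoteNotifications()
        }
    }
}
```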

How we turned mobile user feedback into user experience improvements

The best part of mobile usability testing is the problems and solutions it turns up. At the end of our UX testing for EBTH, we wrote a research findings document with high-priority changes to make to the app before launch, and potential features to add down the road. We worked closely with one of EBTH’s designers, who was able to take the user feedback and come back with options to solve users’ frustrations. Together, we quickly iterated to get the app right while also getting it out on schedule. There were two big things we mutually agreed had to happen right away:

Spend more time educating mobile users

Our usability testing questions allowed us to uncover the range of strategies involved in online bidding. Those insights dramatically changed our design for the EBTH bidding page, and we now faced the challenge of fitting substantially more information on that one page than we had planned. People were getting lost in the bidding process, so we added detail and broke the screen up so they could more easily set maximum bids, create rules for automatic bidding, and make other strategic decisions that give them greater satisfaction and a higher probability of winning.
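To give a sense of what “rules for automatic bidding” means in practice, here’s a minimal sketch of eBay-style proxy bidding, where the system bids on the user’s behalf up to a maximum they set. This is illustrative logic under our own assumptions, not EBTH’s actual bidding engine:

```swift
// Proxy bidding sketch: respond to a competitor's bid with the smallest
// amount that retakes the lead, never exceeding the user's maximum.
func proxyBid(competitorBid: Int, myMaximum: Int, increment: Int) -> Int? {
    let nextBid = competitorBid + increment
    if nextBid <= myMaximum {
        return nextBid   // Retake the lead by one increment.
    }
    if competitorBid < myMaximum {
        return myMaximum // Use the remaining headroom.
    }
    return nil           // Outbid beyond the maximum: alert the user instead.
}
```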

[Image: mobile user testing improvements]

  1. Users wanted to be reassured that they were bidding on the right item once they got to the bid page. The item name alone wasn’t enough to assuage concerns, so we added an image. Yet again, a picture was worth a thousand words.
  2. We expected users to be familiar with the terminology around bidding. After all, everyone on our team had used eBay before. But that was our own blind spot: many people understood the concepts but didn’t know the lingo. To help them, we added plain-language explanations to data fields, so users wouldn’t have to second-guess themselves while they were bidding. We also added a link to more detailed explanations for the users who need them.
  3. In the user interviews, we realized how important it was for people to play the game, and to feel like they were winning. Seeing what their competition was doing proved to be a crucial tactic, so we added a clear “view bid history” button to the bid page to display the bids placed on the item so far. Now users can see how their opponents are bidding and use that information to decide when and how much to bid. If everyone else is increasing their bids by $50 each time, a $51 increment may be just the edge they need (there’s a quick sketch of that math after this list).
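As a toy illustration of that edge (a hypothetical helper, not shipped code), you could scan the bid history for the most common increment and suggest a bid that just tops it:

```swift
// Suggest a next bid by finding the most common increment in the bid
// history and topping it by $1. Purely illustrative strategy math.
func suggestedNextBid(bidHistory: [Int]) -> Int? {
    guard let currentBid = bidHistory.last, bidHistory.count >= 2 else {
        return nil
    }
    // Differences between consecutive bids.
    let increments = zip(bidHistory.dropFirst(), bidHistory).map { $0.0 - $0.1 }
    // Tally each increment and take the most frequent one.
    var counts: [Int: Int] = [:]
    for increment in increments {
        counts[increment, default: 0] += 1
    }
    guard let common = counts.max(by: { $0.value < $1.value })?.key else {
        return nil
    }
    // If everyone bumps by $50, suggest $51 over the current bid.
    return currentBid + common + 1
}
```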

Give mobile users clear and direct feedback

Building all of the options sophisticated players need to bid meant introducing more potential for errors. Our app usability testing revealed that our error messages were further confounding users who got stuck on the bid panel. We reworked where the feedback was displayed and made it more specific, so we could better help users fix the issue.
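Here’s the shape of that change as a small sketch. The error cases and copy are hypothetical, but the idea is the one the testing pushed us toward: validation returns a specific, plain-language message tied to the input that caused it, so the feedback can render right next to the bid field instead of in a generic alert:

```swift
// Field-level bid validation with specific, plain-language messages.
enum BidError: Error {
    case belowMinimum(minimum: Int)
    case aboveMaxBid(maxBid: Int)

    var message: String {
        switch self {
        case .belowMinimum(let minimum):
            return "Your bid needs to be at least $\(minimum)."
        case .aboveMaxBid(let maxBid):
            return "This bid is above your maximum of $\(maxBid). Raise your max or lower the bid."
        }
    }
}

func validate(bid: Int, currentBid: Int, increment: Int, maxBid: Int?) throws {
    let minimum = currentBid + increment
    guard bid >= minimum else {
        throw BidError.belowMinimum(minimum: minimum)
    }
    if let maxBid = maxBid, bid > maxBid {
        throw BidError.aboveMaxBid(maxBid: maxBid)
    }
}
```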

How mobile usability analytics let us continue making UX improvements

EBTH already has a sophisticated analytics operation powering its website, but the team had limited visibility into their mobile app usability. In-app feedback was unreliable and, for a lot of metrics, nonexistent. Our mobile team is constantly testing tools that can solve problems like this, though, and we were able to introduce three best practices to help EBTH learn more about their users’ behaviors.

  • iTunes Connect displays your App Store metrics: downloads, views in the App Store, views on the product page, and reviews. It also has some in-app statistics, but they’re unreliable because users can opt out.
  • App Annie imports the iTunes Connect data, then compares apps across its user base to determine where yours stands. We used this feature to figure out where EBTH ranked on certain keywords. That gave the marketing team ideas for keyword tweaks, and we were able to track the app as it moved from rank 420 to rank 11. We then tracked the resulting increase in downloads with iTunes Connect, and usage with the next tool.
  • Fabric.io’s Answers tool gives us in-app user analytics. With it, we can see how many people are using the app and how often, so once people have downloaded it, we actually know whether they’re finding it valuable (there’s a quick example of logging a custom event below).
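For behaviors Answers doesn’t track out of the box, its custom events API lets you instrument the moments you care about. Here’s a minimal sketch, assuming the Fabric/Crashlytics SDK is already installed; the event name and attributes are made up for illustration:

```swift
import Crashlytics

// Log a custom Answers event when a user places a bid, so we can see
// whether downloads turn into the behavior that actually matters.
Answers.logCustomEvent(
    withName: "Bid Placed",
    customAttributes: [
        "Category": "Furniture",
        "Used Max Bid": true
    ]
)
```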

[Image: in-app analytics]

With just these three tools, EBTH has so much more information about their usability on mobile than they had in the past. They’re able to track the effect of a marketing push, and we’re able to see how people are using the app so we can suggest improvements and monitor results.

Using user experience testing to build smarter mobile apps

Building intelligently is all about listening to the user. You can’t ask someone to love your product just because you do. You have to build something that solves a real need. With usability testing tools, we can ask the right questions before we build. And with in-app analytics, we can test every change we make to the app. Taken together, we’re able to gather insights we never had access to before and use them to build smarter products in the future.

Read our case study on Everything But The House

Published by: Will Norris in Business
