
3 Things You Would Really Want to A/B Test In Your Mobile App

By DeviQA on November 30, 2016

With fierce competition in the mobile app stores, it’s hard to build a loyal audience. Price optimization does not always drive more traffic, so you should constantly look for ways to convert the maximum number of users.

The essence of A/B testing for a mobile or web app is to find out which entry points fail to engage the audience. The topic sits on the border between marketing and mobile app development and requires ongoing involvement from marketers, designers, and programmers.

When conducting such a test, traffic is divided into several streams: some users see one option (for example, one icon), while others see a different one. Which option each group sees is random. The option that consistently brings more conversions is adopted; the others are dropped.
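One common way to implement this random split is deterministic hash-based bucketing, so each user always lands in the same variant across sessions. A minimal sketch (the function and experiment names here are illustrative, not from any specific testing library):

```python
import hashlib

def assign_variant(user_id: str, experiment: str, variants=("A", "B")) -> str:
    """Deterministically bucket a user into a variant.

    Hashing the user ID together with the experiment name keeps each
    user's assignment stable across sessions while splitting traffic
    roughly evenly between the variants.
    """
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    bucket = int(digest, 16) % len(variants)
    return variants[bucket]
```

Because the assignment depends only on the user ID and experiment name, a returning user never flips between variants mid-test, which would contaminate the results.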

The method involves some conditions:

For statistically significant results, drive at least 1,500 visitors to each version of the landing page;

Each landing page should receive a strictly identical volume of traffic;

The different versions of the page should be shown to your audience in parallel, during the same time period;

To understand clearly which feature gave the winning landing page its edge, test a single element in a single pass.
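Once both variants have collected enough traffic, the "statistically significant" condition above can be checked with a standard two-proportion z-test on the conversion counts. A minimal sketch (function name and the 1,500-visitor figures in the usage note are illustrative):

```python
import math

def conversion_z_score(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test on conversion counts.

    conv_a / n_a: conversions and visitors for variant A;
    conv_b / n_b: the same for variant B.
    Returns the z-score; |z| > 1.96 roughly corresponds to p < 0.05.
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se
```

For example, with 1,500 visitors per variant, a lift from a 10% to a 13% conversion rate yields a z-score above 1.96, so the difference would count as significant at the usual 5% level.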

What can you check with A/B testing?

You can test anything you want: the name of the app, its screenshots and icons, color blocks. Your imagination is the only limit. But try not to overdo it. There’s no need to test 40 icon variants; just pick the ones you like most, as your inner critic will tell you which options are clearly misguided. Testing everything is also wasteful, both in money and in testing time.


Can A/B test become a Failure?

A/B testing is often considered a failure when it does not produce statistically significant results. However, this is only one side of the coin.

It’s not very pleasant to face a situation where the results of an A/B test are statistically significant but do not match what you expected to see. Aim to gather the information you truly need, targeting precisely those elements of your product that can increase the app’s competitiveness and help you achieve specific goals.

How to Conduct a Mobile App Test?

Define your goals

It is important to keep your application’s business goals in mind when choosing the key parameters you plan to change through A/B testing. For most applications, the most important parameter is the user conversion rate.

Define the application KPI

KPIs (key performance indicators) are the data showing the success of your application. As part of A/B testing, clearly defined KPIs will show you what to focus on in each test.
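In practice, a KPI such as conversion rate is computed per variant from raw event logs. A minimal sketch, assuming a simple list of (variant, action) event tuples with hypothetical "view" and "convert" action names:

```python
from collections import Counter

def conversion_rate(events, variant):
    """Compute the conversion-rate KPI for one variant.

    `events` is a list of (variant, action) tuples; "view" and
    "convert" are illustrative action names for a page view and a
    completed conversion, respectively.
    """
    counts = Counter((v, a) for v, a in events if v == variant)
    views = counts[(variant, "view")]
    conversions = counts[(variant, "convert")]
    return conversions / views if views else 0.0
```

Computing the same KPI, the same way, for every variant is what makes the two arms of the test directly comparable.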

Remember that a good interface and user interaction are the keys to success

If you test an arbitrary set of page variants or messaging copy, your results are likely to be random and chaotic. Each option you include in an A/B test should be based on a thorough study of sound quality and design principles.

How to Test Mobile App: Successful Examples

Runkeeper: Attracting users by changing the home screen

Runkeeper is an application for tracking user activity, released back in 2008. In 2013, the developers decided that a new start screen would inspire users to monitor how they walk, run, and so on. The idea was risky, but it worked: the new graphical menu encouraged people to use the product not only for running but for other sports as well.

Runkeeper. A/B testing of start screen

Škoda: experiments with screenshots in the app store

Škoda was able to increase downloads of its “Little Driver” application by 50% through a simple rearrangement of screenshots in the app store. The developers found that potential users lost interest after about 8 seconds of viewing, so they moved the screenshots illustrating the application’s main advantages to the front, and buyers appreciated it.


Spreadshirt: the clearer it is, the better it is

Spreadshirt is a retailer specializing in buying and selling original T-shirts and prints. To grow its user base, the company changed the infographic illustrating how its web resource works. The new version was clear and, most importantly, concise. As a result, the number of people who clicked the "Start selling now" button increased by 60.6%!

Spreadshirt. Changing the infographic illustrating the mechanism

The Bottom Line

The main mistake in A/B testing is testing blindly, without an appropriate research base. When testing a mobile application, it is important to understand clearly what people want, rely on thoughtful product design, and keep testing until you get significant results.