DeviQA Case Study: WeHeartIt

WeHeartIt

Mobile Automation Testing, Software Performance Testing, Mobile Apps, QA Consulting

Project Overview

What WeHeartIt's testing process looked like when they came to us, and what it looked like after we started working together.

Before improvement

There was no formal QA process on the project
Bugs were poorly described

After improvement

Multi-threading reduced the runtime of all 2,000 scenarios from 12 hours to 2 hours
Integrated our web automation tests with bug tracking and team management systems
The statistics about all test runs are stored on the CI side
Designed a complex architectural solution for automated tests from scratch
Automated regression testing took 2 hours instead of 2 weeks of manual testing
64GB RAM server with 15 VMs
2,000 auto tests developed
45 parallel threads
15 virtual machines
400% efficiency
6 browsers

QA Team: 3 Automation Test Engineers
Project length: 1 year
Technologies & Tools
WebdriverIO
JavaScript
Linux
Jenkins
Multithreading
BrowserStack
REST API
Bamboo
xCode
Android Studio
JMeter

The Challenge

We Heart It is a social network for inspiring images. Users collect images to share and organize them into collections.

The task was to cover 95% of the application with automated tests to quickly and reliably answer the question: "Can we deploy?" We needed to design a test-suite architecture that allowed us to run tests in dozens of threads on 6 different browsers. At the same time, the scenarios had to be atomic to avoid collisions during parallel execution. We also planned to use the application's API to speed up repetitive actions and to create prerequisite entities directly in the database before the tests.
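Creating prerequisite entities through the API instead of the UI is what keeps atomic scenarios fast. A minimal sketch of the idea, assuming a hypothetical collections endpoint and payload (not WeHeartIt's real API):

```javascript
// Sketch: build a request for creating a prerequisite entity (an image
// collection) via a REST API before a test runs, instead of driving the UI.
// The URL, payload shape, and auth scheme are illustrative assumptions.

function buildCreateCollectionRequest(userToken, name) {
  return {
    url: 'https://api.example.com/v1/collections', // placeholder endpoint
    options: {
      method: 'POST',
      headers: {
        'Content-Type': 'application/json',
        Authorization: `Bearer ${userToken}`, // assumed token-based auth
      },
      body: JSON.stringify({ name }),
    },
  };
}

// In a test's before() hook one might then do something like:
//   const { url, options } = buildCreateCollectionRequest(token, 'Sunsets');
//   const res = await fetch(url, options);
```

Because each test creates its own entities, parallel runs never contend for shared fixtures.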

At the same time, there was no formal QA process on the project, so we needed to design it and familiarize the rest of the team with it.

Achievements

We developed quite a complex environment to run automated tests for this project. We were given a dedicated server with 64GB of RAM, on which we set up and ran 15 virtual machines. Three browsers ran in parallel on each VM, so about 40-45 browsers were running at the same time in total. Multi-threading reduced the runtime of all 2,000 scenarios from 12 hours to 2 hours and, as a result, the team was able to answer the question "Can we deploy?" more quickly.
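A fan-out like this is typically expressed in the test runner's configuration. A minimal WebdriverIO sketch of the idea, where the hub hostname and the per-browser counts are illustrative, not the project's actual settings:

```javascript
// wdio.conf.js (illustrative sketch, not the project's actual config)
exports.config = {
  // Grid/hub that distributes sessions across the 15 VMs (assumed host).
  hostname: 'selenium-hub.internal.example',
  port: 4444,

  specs: ['./tests/**/*.spec.js'],

  // Global cap on concurrent sessions:
  // 15 VMs x 3 browsers each ~= 45 parallel threads.
  maxInstances: 45,

  capabilities: [
    { browserName: 'chrome',  maxInstances: 15 },
    { browserName: 'firefox', maxInstances: 15 },
    { browserName: 'safari',  maxInstances: 15 },
  ],
};
```

With `maxInstances` set both globally and per capability, the runner keeps each VM at its three-browser limit while saturating all 45 threads.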

Tests were run on 6 different browsers and several versions of each. We also integrated our web automation tests with the bug tracking and team management systems: when a test run completes, bugs are created automatically in the bug tracker and the corresponding test cases are marked as passed or failed for that run. Statistics for all test runs are stored on the CI side, so the history of every build is always available. At the end of each test run, a full report with detailed information about passed and failed scenarios was sent out.
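Wiring results into a bug tracker usually hangs off the runner's test hooks. A hedged sketch using WebdriverIO's `afterTest` hook, where the bug-tracker client (`bts.createBug`) and the payload shape are hypothetical:

```javascript
// Sketch: turn a failed test's metadata into a bug-report payload.
// The payload fields here are assumptions; a real bug tracking system
// (Jira, etc.) would define its own schema.

function summarizeFailure(test) {
  return {
    title: `[auto] ${test.parent} > ${test.title} failed`,
    description: test.error ? String(test.error.message) : 'no error message',
  };
}

// In wdio.conf.js, the hook might look like:
//   afterTest: async function (test, context, { passed }) {
//     if (!passed) await bts.createBug(summarizeFailure(test)); // bts is hypothetical
//   },
```

Keeping the payload builder as a pure function makes it easy to unit-test independently of the tracker.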

Services Provided

A team of 3 senior automation test engineers worked on the project and designed a complex architectural solution for automated tests from scratch. From the first days, our framework was integrated into the client's continuous integration process. The number of tests grew daily, and the client saw immediate progress.

Mobile Automation Testing

After a few months of work, part of the team switched to performance and load testing. This was a one-time engagement, intended to show the client how fast and reliable the product was.

Software Performance Testing

Although the client had an in-house manual testing team focused on the mobile application, we helped them periodically, especially close to releases.

Mobile Apps

Initially, the client hired us as quality assurance consultants to audit the current testing processes and propose improvements. We analyzed the existing approach and made a proposal; all of our recommendations were implemented.

QA Consulting