QA for image-based social network
Setting up a test automation solution from scratch for confident software deployment.
About project
We Heart It is an image-based social network that positions itself as ‘a home for your inspiration’ and a place to ‘organize and share the things you love.’ Users can collect (or ‘heart’) their favorite images, share them with friends, and organize them into collections. The app enables users to discover, curate, and connect through shared visual interests. It is available for both iOS and Android, so it can be easily accessed and enjoyed on the go.
Before DeviQA
There was no formal QA process on the project
Manual regression testing took about 2 weeks
There was no well-structured system for test result collection and analysis
With DeviQA
A comprehensive architecture for automated tests was designed from scratch, and ≈ 2,000 test scripts were developed
15 virtual machines ran on a dedicated 64GB RAM server, enabling parallel test execution on 40-45 browsers in total
Automated regression testing took only 2 hours
Our automated web tests were integrated with bug tracking and team management systems
Statistics for all test runs were stored on the CI side
Our contribution
Team
3 automation QA engineers
Project length
1 year
Technologies and tools
WebDriver.io
JavaScript
Linux
Jenkins
Multithreading
BrowserStack
REST API
Bamboo
Xcode
Android Studio
JMeter
Our engagement
Our key task was to cover 95% of the application with automated tests to quickly and reliably answer the question: ‘Can we deploy?’
First, we had to design a test suite architecture that would let us run tests in dozens of threads across 6 different browsers. At the same time, we wrote atomic scenarios to avoid collisions between tests executed in parallel. We also considered using the API to speed up repetitive actions and to create the entities a test needs in the database before it runs, instead of going through the UI.
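As a rough illustration of that last idea, the sketch below seeds test data through the REST API in a WebdriverIO before hook rather than the UI; the axios client, the /test-users endpoint, and the API_BASE_URL variable are placeholders, not the project's actual API.

```javascript
// wdio.conf.js (excerpt) — a sketch of seeding test data over the REST API
// so each parallel scenario starts with its own isolated entities.
// The endpoint and environment variable below are hypothetical.
const axios = require('axios');

exports.config = {
  // ...other WebdriverIO options (specs, capabilities, etc.)...

  before: async function () {
    const api = axios.create({ baseURL: process.env.API_BASE_URL });

    // Create a unique user for this worker instead of registering via the UI,
    // which keeps parallel threads from colliding on shared data.
    const response = await api.post('/test-users', {
      username: `wdio-${process.pid}-${Date.now()}`,
    });

    // Make the seeded user available to the specs run by this worker.
    global.testUser = response.data;
  },
};
```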
As there was no formal QA process on the project, we needed to design and set it up, as well as familiarize the rest of the team with it.
We developed quite a complex environment to run automated tests for this project. We were provided with a dedicated 64GB RAM server, on which we set up and ran 15 virtual machines, with three browsers running in parallel on each VM. In total, our tests ran on 40-45 browsers in parallel. Thanks to this multithreading, we reduced the time needed to run all 2,000 scenarios from 12 hours to 2 hours, so the team could answer the ‘Can we deploy?’ question much faster.
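A minimal sketch of what such a setup can look like in WebdriverIO configuration is shown below; the grid address, browser names, and instance counts are illustrative and only demonstrate how 40-45 concurrent sessions can be spread across 6 browsers.

```javascript
// wdio.conf.js (excerpt) — a sketch of the parallel execution settings.
// The grid address, browsers, and instance counts are illustrative values.
exports.config = {
  hostname: 'selenium-grid.internal',  // hypothetical grid in front of the 15 VMs
  port: 4444,
  path: '/wd/hub',

  maxInstances: 45,                    // overall cap on concurrent sessions
  capabilities: [
    { browserName: 'chrome',        maxInstances: 9 },
    { browserName: 'firefox',       maxInstances: 9 },
    { browserName: 'MicrosoftEdge', maxInstances: 8 },
    { browserName: 'safari',        maxInstances: 7 },
    // ...remaining browsers and versions, 6 browsers in total
  ],

  specs: ['./test/specs/**/*.js'],
};
```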
Tests were run on 6 different browsers and multiple versions of each. We also integrated our web automation tests with the bug tracking and team management systems, so on test completion, detected bugs were automatically logged in the bug tracking system, and the corresponding test cases were marked as passed or failed for that specific test run.
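The natural integration point for this is the test lifecycle itself. The sketch below uses WebdriverIO's afterTest hook; the TMS_URL and BTS_URL endpoints and the payload fields are placeholders, not a specific vendor's API.

```javascript
// wdio.conf.js (excerpt) — a sketch of pushing results to the test management
// and bug tracking systems after each test. Endpoints and fields are placeholders.
const axios = require('axios');

exports.config = {
  // ...capabilities, specs, other hooks...

  afterTest: async function (test, context, { passed, error }) {
    // Mark the matching test case as passed or failed for this run.
    await axios.post(`${process.env.TMS_URL}/runs/current/results`, {
      case: test.title,
      status: passed ? 'passed' : 'failed',
    });

    // Log a bug automatically whenever a scenario fails.
    if (!passed) {
      await axios.post(`${process.env.BTS_URL}/issues`, {
        summary: `Automated test failed: ${test.title}`,
        description: error ? error.stack : 'See the attached test report',
      });
    }
  },
};
```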
Statistics for all test runs were stored on the CI side, so we could always see the history of our builds. At the end of each run, a test report with detailed information about passed and failed scenarios was sent out.
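One way to wire this up, sketched below with a placeholder webhook and output path, is WebdriverIO's onComplete hook, which runs once after all workers finish; the CI job can then archive the summary file so the run history stays available build after build.

```javascript
// wdio.conf.js (excerpt) — a sketch of publishing an end-of-run summary.
// The output path and webhook URL are placeholders; the `results` fields
// follow recent WebdriverIO versions and may differ in older ones.
const fs = require('fs');
const axios = require('axios');

exports.config = {
  // ...capabilities, specs, hooks...

  onComplete: async function (exitCode, config, capabilities, results) {
    const summary = {
      finished: results.finished,
      passed: results.passed,
      failed: results.failed,
      exitCode,
      timestamp: new Date().toISOString(),
    };

    // Written to the workspace so the CI job can archive it and keep
    // a per-build history of run statistics.
    fs.mkdirSync('reports', { recursive: true });
    fs.writeFileSync('reports/run-summary.json', JSON.stringify(summary, null, 2));

    // Send the end-of-run report notification with pass/fail totals.
    await axios.post(process.env.REPORT_WEBHOOK_URL, summary);
  },
};
```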
Services provided
Performance testing
After a few months of work, some members of the QA team switched to performance and load testing. That was a one-time task: the client wanted to know how fast and reliable the product was under load.
Web automation testing
A team of 3 senior test automation engineers worked on the project and designed a comprehensive architecture for automated tests from scratch. From the first days, our framework was integrated into the continuous integration process. The number of tests grew daily, and the client saw immediate progress.
Mobile testing
Although the client had an in-house manual testing team focused on the mobile application, we supported it periodically, especially before releases.
Dedicated QA team
Initially, we were hired as quality assurance consultants to audit the project's QA processes and propose improvements. We carried out a comprehensive analysis and identified areas for improvement, and all our recommendations were implemented. Drawing on our extensive experience, we then built a comprehensive architecture for automated tests and test report generation.
Facing similar challenges to We Heart It?
Schedule a call to see how we can help you