QA for image-based social network
Setting up a test automation solution from scratch for confident software deployment.
About project
We Heart It is an image-based social network that positions itself as 'a home for your inspiration' and a place to 'organize and share the things you love.' Users can collect (or 'heart') their favorite images, share them with friends, and organize them into collections. The app enables users to discover, curate, and connect through shared visual interests. We Heart It is available on both iOS and Android, so the app can easily be accessed and enjoyed on the go.
Before DeviQA
There was no formal QA process on the project
Manual regression testing took about 2 weeks
There was no well-structured system for test result collection and analysis
With DeviQA
A complex test automation architecture was designed from scratch, and ≈2,000 test scripts were developed
15 virtual machines were run on a dedicated 64GB RAM server to enable parallel test execution on 40-45 browsers in total
Automated regression testing took only 2 hours
Our automated web tests were integrated with bug tracking and team management systems
The statistics about all test runs were stored on the CI side
Our contribution
Team
3 automation QA engineers
Project length
1 year
Technologies and tools
WebdriverIO
JavaScript
Linux
Jenkins
Multithreading
BrowserStack
REST API
Bamboo
Xcode
Android Studio
JMeter
Our engagement
Our key task was to cover 95% of the application with automated tests to quickly and reliably answer the question: 'Can we deploy?'
Firstly, we had to design a test suite architecture that would allow us to run tests in dozens of threads on 6 different browsers. At the same time, we developed atomic scenarios to avoid collisions during parallel test execution. We also considered using the API to speed up repetitive actions and to create the required entities in the database before testing.
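Below is a minimal sketch of the kind of WebdriverIO configuration we mean. The spec paths, parallelism figures, API endpoint, and payload are illustrative placeholders, not the project's actual values.

```javascript
// wdio.conf.js: a minimal sketch, not the project's actual configuration.
const axios = require('axios');

exports.config = {
    specs: ['./test/specs/**/*.js'],

    // Each spec file is an atomic scenario with its own test data,
    // so dozens of parallel sessions never collide with each other.
    maxInstances: 45,
    capabilities: [
        // One capability shown here; the real suite targeted 6 different browsers.
        { browserName: 'chrome' }
    ],

    framework: 'mocha',
    reporters: ['spec'],
    mochaOpts: { timeout: 60000 },

    // Create the entities a scenario needs over the REST API instead of the UI;
    // this is much faster than repeating the same preparatory clicks in every test.
    // The endpoint and payload below are hypothetical.
    before: async () => {
        await axios.post('https://api.example.com/collections', {
            name: `autotest-collection-${Date.now()}`
        });
    }
};
```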
As there was no formal QA process on the project, we needed to design and set it up, as well as familiarize the rest of the team with it.
We developed quite a complex environment to run automated tests for this project. We were provided with a dedicated 64GB RAM server, on which we set up and ran 15 virtual machines. Three browsers ran in parallel on each VM, so in total our tests were run on 40-45 browsers in parallel. Thanks to multi-threading, we reduced the time needed to run all 2,000 scenarios from 12 hours to 2 hours. As a result, the team was able to answer the 'Can we deploy?' question more quickly.
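The excerpt below sketches how the runner can be pointed at a Selenium Grid hub that fans sessions out across the virtual machines. The hub address, browser set, and per-browser session limits are assumptions for illustration only.

```javascript
// wdio.conf.js (excerpt): connecting the runner to a Selenium Grid hub.
exports.config = {
    hostname: 'grid-hub.internal', // hypothetical hub on the dedicated 64GB server
    port: 4444,
    path: '/wd/hub',

    // 15 VMs x 3 browser sessions each gives roughly 45 parallel sessions.
    maxInstances: 45,
    capabilities: [
        { browserName: 'chrome', maxInstances: 15 },
        { browserName: 'firefox', maxInstances: 15 },
        { browserName: 'MicrosoftEdge', maxInstances: 15 }
    ]
};
```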
Tests were run on 6 different browsers and several versions of each. We also integrated our web automation tests with the bug tracking and team management systems, so on test completion, detected bugs were automatically logged on the BTS side, and the corresponding test cases were marked as passed or failed for the specific test run.
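As an illustration, an afterTest hook of roughly the following shape can report results and raise bugs automatically. The hook signature follows WebdriverIO's public API, while the tracker and test-management URLs, payload fields, and authentication are placeholders for whichever systems a team uses.

```javascript
// wdio.conf.js (excerpt): pushing results to the test-management tool
// and raising bugs in the tracker. All URLs and payloads are hypothetical.
const axios = require('axios');

exports.config = {
    // ...runner and browser options omitted...

    afterTest: async (test, context, { error, passed }) => {
        // Mark the corresponding test case as passed or failed for this run.
        await axios.post('https://tms.example.com/api/runs/current/results', {
            case: test.title,
            status: passed ? 'passed' : 'failed'
        });

        // If the scenario failed, log a bug on the BTS side with the error details.
        if (!passed) {
            await axios.post('https://bts.example.com/api/issues', {
                summary: `Automated test failed: ${test.title}`,
                description: error ? error.stack : 'No stack trace captured'
            });
        }
    }
};
```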
Statistics for all test runs were stored on the CI side, so we could always see the history of our builds. At the end of each run, a test report with detailed information about passed and failed scenarios was sent out.
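The excerpt below sketches how the reporting side can be wired up: JUnit XML output gives the CI server a per-build pass/fail history, and an onComplete hook sends out the end-of-run summary. The reporter options follow the public @wdio/junit-reporter interface, while the SMTP host and addresses are assumptions.

```javascript
// wdio.conf.js (excerpt): CI-friendly reporting and an end-of-run notification.
const nodemailer = require('nodemailer');

exports.config = {
    // JUnit XML lets the CI server (Jenkins/Bamboo) keep per-build test history.
    reporters: [
        'spec',
        ['junit', { outputDir: './reports/junit' }]
    ],

    // After the whole run, send out a short summary; detailed per-scenario
    // results stay attached to the CI build. Mail settings are placeholders.
    onComplete: async (exitCode) => {
        const transport = nodemailer.createTransport({ host: 'smtp.example.com', port: 25 });
        await transport.sendMail({
            from: 'qa-bot@example.com',
            to: 'team@example.com',
            subject: exitCode === 0 ? 'Regression run passed' : 'Regression run has failures',
            text: 'Detailed pass/fail information for each scenario is available on the CI build page.'
        });
    }
};
```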
Services provided
Performance testing
Web automation testing
Mobile testing
Dedicated QA team