We Heart It is a social network for inspiring images. Users collect images to share and organize them into collections.
The task was to cover 95% of the application with automated tests so the team could quickly and reliably answer the question "Can we deploy?" We needed to design a test suite architecture that would let us run tests in dozens of threads across 6 different browsers. At the same time, we needed to develop atomic scenarios to avoid collisions during parallel execution. We also needed to use the application's API to speed up repetitive actions and to create prerequisite entities directly in the database before the tests.
At the same time, there was no formal QA process on the project, so we needed to design one and familiarize the rest of the team with it.
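As an illustration of the API- and database-driven test prerequisites mentioned above, here is a minimal sketch using pytest fixtures; the endpoints, payloads, and fixture names are hypothetical, since the case study does not describe the actual framework or API:

```python
# Sketch only: endpoint paths and payloads are hypothetical stand-ins that
# illustrate preparing test data via API instead of driving the UI.
import uuid

import pytest
import requests

API_ROOT = "https://api.example.com"  # placeholder, not the real We Heart It API


@pytest.fixture
def fresh_user():
    """Create an isolated user via the API so parallel scenarios never collide."""
    username = f"autotest_{uuid.uuid4().hex[:8]}"
    resp = requests.post(f"{API_ROOT}/users", json={"username": username}, timeout=10)
    resp.raise_for_status()
    return resp.json()


@pytest.fixture
def collection_with_image(fresh_user):
    """Prepare a collection with one image as a test prerequisite, bypassing the UI."""
    resp = requests.post(
        f"{API_ROOT}/users/{fresh_user['id']}/collections",
        json={"name": "Inspiration", "images": ["https://example.com/cat.jpg"]},
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()
```

Because every scenario provisions its own isolated data, tests stay atomic and can run in dozens of threads without stepping on each other.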
We built a fairly complex environment to run the automated tests for this project. We were given a dedicated server with 64 GB of RAM, on which we set up and ran 15 virtual machines, each running three browsers in parallel, so roughly 40-45 browsers ran at the same time. This multi-threading reduced the time needed to run all 2000 scenarios from 12 hours to 2 hours and, as a result, the team was able to answer the question "Can we deploy?" much more quickly.
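As a rough sketch of how scenarios can be fanned out across many remote browsers, assuming a Selenium Grid-style hub in front of the virtual machines (the actual grid software, browser matrix, and test runner are not named in the case study):

```python
# Sketch only: the hub URL and the smoke check are illustrative assumptions.
from concurrent.futures import ThreadPoolExecutor

from selenium import webdriver
from selenium.webdriver.chrome.options import Options as ChromeOptions
from selenium.webdriver.firefox.options import Options as FirefoxOptions

HUB_URL = "http://grid.internal:4444/wd/hub"  # hypothetical hub in front of the VMs
BROWSERS = [ChromeOptions(), FirefoxOptions()]  # in practice: 6 browsers x versions


def smoke_scenario(options):
    """Open the home page in one remote browser and return the page title."""
    driver = webdriver.Remote(command_executor=HUB_URL, options=options)
    try:
        driver.get("https://weheartit.com")
        return driver.title
    finally:
        driver.quit()


# Each VM hosts a few browser sessions; the runner simply fans scenarios out in threads.
with ThreadPoolExecutor(max_workers=45) as pool:
    for title in pool.map(smoke_scenario, BROWSERS):
        print(title)
```

With this shape, growing the browser/version matrix or the worker count is a configuration change rather than a test change.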
Tests were run on 6 different browsers and several versions of each. We also integrated the web automation tests with the client's bug tracking and test management systems: when a run completes, bugs are created automatically in the bug tracking system and the corresponding test cases are marked as passed or failed for that run. Statistics for all test runs are stored on the CI side, so the history of every build is always visible. At the end of each run, a full report was sent out with detailed information about passed and failed scenarios.
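A minimal sketch of the kind of reporting hook this integration implies; both endpoints are hypothetical stand-ins, since the bug tracking and test management systems are not named in the case study:

```python
# Sketch only: URLs and payload fields are hypothetical examples of the
# "file a bug on failure, mark the case in the run" flow described above.
import requests

BTS_URL = "https://bugtracker.example.com/api/issues"
TMS_URL = "https://testmanagement.example.com/api/runs"


def report_result(run_id, case_id, passed, details=""):
    """Mark the test case in the given run and file a bug if it failed."""
    requests.post(
        f"{TMS_URL}/{run_id}/cases/{case_id}",
        json={"status": "passed" if passed else "failed"},
        timeout=10,
    ).raise_for_status()
    if not passed:
        requests.post(
            BTS_URL,
            json={"title": f"Automated scenario {case_id} failed", "description": details},
            timeout=10,
        ).raise_for_status()
```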
A team of 3 senior automated test engineers worked on the project and designed a complex architectural solution for the automated tests from scratch. From the first days, our framework was integrated into the client's continuous integration process. The number of tests grew daily, and the client saw immediate progress.
Performance Testing
After a few months of work, a portion of the team switched to performance and load testing. This was planned as a one-time effort so that the client could understand how fast and reliable the product is.
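The case study does not name the tooling used for this stage; purely as an illustration, a load scenario of the kind described could look like the following Locust sketch, where the tool choice, endpoints, and traffic mix are all assumptions:

```python
# Sketch only: Locust, the paths, and the request weights are illustrative,
# not the team's actual load testing setup.
from locust import HttpUser, between, task


class Visitor(HttpUser):
    host = "https://weheartit.com"
    wait_time = between(1, 3)  # simulated think time between requests

    @task(3)
    def browse_home(self):
        self.client.get("/")

    @task(1)
    def search_images(self):
        self.client.get("/search", params={"query": "inspiration"})
```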
Mobile Testing
Although the client had an in-house manual testing team focused on the mobile application, we helped them on a periodic basis, especially close to releases.
QA Consulting
Initially, the client hired us as quality assurance consultants to audit the existing testing processes and propose improvements. We analyzed the current approach and made a proposal; all of our recommendations were implemented.