Solebit (MimeCast) provides identification and prevention of zero-day malware and unknown threats.
When we joined the project, the in-house development team had already designed and developed automated tests. However, the tests were of little practical use: a developer had to trigger them by hand in a terminal, and the output was a single large, hard-to-read results file.
The test suite also took more than 20 hours to complete, and build history was unavailable.
The test suite architecture was unscalable, making it difficult to maintain a large number of test machines.
The task was to create a simple test runner and increase test speed. To achieve this, we needed to redesign the architecture to support cloud platform integration, make the tests much easier to run, and generate a clean report with all the necessary details.
At the same time, we were expected to propose improvements to the current testing process and perform some manual testing as well.
DeviQA created complex and detailed automated scenarios for testing REST APIs using the Faraday library. We built a software development kit application that worked on various machines, including Azure, Google Cloud, Docker, and both privileged and unprivileged LXC containers. The QA team implemented these integrations over SSH and SFTP connections. We also added dedicated scenarios for Microsoft Azure and Mac. Altogether, we designed and developed more than 25,000 tests, and each test run uploaded and processed 300,000 files.
We reduced the build time by a factor of 10 and made the tests simple to run without involving a technical person. We also created a reporting system that let the team view clean reports even when the results contained 10k rows. Finally, we configured file-processing distribution across a number of machines (Azure Cloud, Google Cloud).
Most of our work focused on automated testing of REST APIs, automated email sending via SMTP, DevOps, and configuration activities. Autotests for uploading files ran in parallel on 30 different machines, with 10 threads on each machine.
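The per-machine parallelism can be sketched with Ruby's standard Thread and Queue. The file names and the upload step below are placeholders; the real worker body would perform an SFTP put or REST upload:

```ruby
THREADS = 10
files = (1..100).map { |i| "sample_#{i}.bin" } # placeholder file names

queue = Queue.new            # thread-safe work queue
files.each { |f| queue << f }

uploaded = Queue.new         # thread-safe collector for results

workers = THREADS.times.map do
  Thread.new do
    loop do
      file = begin
        queue.pop(true)      # non-blocking pop; raises ThreadError when empty
      rescue ThreadError
        break
      end
      # Placeholder for the real upload (e.g. SFTP put or REST POST).
      uploaded << file
    end
  end
end
workers.each(&:join)

puts "uploaded #{uploaded.size} of #{files.size} files"
```

Running the same worker pool on each of the 30 machines, fed from a shared file list, yields the 30 x 10 parallelism described above.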
We also tested the UI part of the application and prepared comprehensive test documentation. These test cases covered 80% of the application, and we created per-release test suites for different customers.
We designed and developed performance, load, and stress tests and then integrated them into CI. In addition to checking the system's ability to perform under pressure, we also checked how quickly the system was up and running again after a crash.
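A crash-recovery check of this kind boils down to polling a health probe until it succeeds and timing how long that takes. In this minimal sketch the `healthy` lambda is a stand-in for a real health-endpoint call; it simulates a service that comes back after roughly 0.3 seconds:

```ruby
started_at = Process.clock_gettime(Process::CLOCK_MONOTONIC)

# Stand-in for a real health check (e.g. a GET on a /health endpoint);
# here it reports healthy once ~0.3 s have elapsed since "the crash".
healthy = lambda do
  Process.clock_gettime(Process::CLOCK_MONOTONIC) - started_at > 0.3
end

MAX_WAIT      = 5.0   # seconds before the recovery test is considered failed
POLL_INTERVAL = 0.05

recovery_time = nil
loop do
  elapsed = Process.clock_gettime(Process::CLOCK_MONOTONIC) - started_at
  if healthy.call
    recovery_time = elapsed
    break
  end
  raise 'service did not recover in time' if elapsed > MAX_WAIT
  sleep POLL_INTERVAL
end

puts format('recovered in %.2f s', recovery_time)
```

In CI, the measured recovery time can be asserted against an agreed threshold so that a regression in startup speed fails the build.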
Before starting a long-term journey with the client, we proved our expertise by performing a QA audit. We identified weak spots in the infrastructure, including inflexible machines, too few machines for testing, the need to clean up previous application versions, and long machine-creation times before builds. We proposed clean and simple ways to fix these issues.