The Value We Bring to Our Clients
Partnering with DeviQA has enabled our clients to significantly improve the efficiency of all their QA-related processes: test coverage of up to 97%, faster product releases, higher team productivity, greater customer satisfaction and loyalty, and more.
Partner With Us: See the Difference
Test coverage: 90%
Regression testing: 2.5 weeks → 4 days
Smoke testing: 7 days → 1 day
Outdated test cases → 60% more regression tests and relevant, up-to-date test cases
Test coverage: 50% → 90%
Supported localizations went untested → 27 localizations covered by QA engineers
No test automation → 1,500+ automated test scenarios created
Dev capacity: +15%
No formal QA process → Smooth, well-structured QA processes in place
No automated testing → Automated smoke, API, and UI tests on all supported browsers (see the sketch below)
Testing done by developers → Full test-case coverage of the application
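For illustration only, here is a minimal sketch of what cross-browser UI smoke automation of this kind can look like in Python with pytest and Selenium. The URL, the page check, and the browser list are placeholders rather than the client's actual stack.

import pytest
from selenium import webdriver

BROWSERS = ["chrome", "firefox"]  # placeholder set; real projects cover every supported browser

def make_driver(name):
    # Build a headless driver for the requested browser.
    if name == "chrome":
        options = webdriver.ChromeOptions()
        options.add_argument("--headless=new")
        return webdriver.Chrome(options=options)
    options = webdriver.FirefoxOptions()
    options.add_argument("-headless")
    return webdriver.Firefox(options=options)

@pytest.fixture(params=BROWSERS)
def driver(request):
    drv = make_driver(request.param)
    yield drv
    drv.quit()

def test_login_page_opens(driver):
    # Minimal smoke check: the login page loads and reports a title.
    driver.get("https://app.example.com/login")  # placeholder URL
    assert driver.title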

Test coverage: 97%
The Jira workflow did not show how many bugs remained unfixed before a release, or their priorities → Adjusted the Jira workflow and integrated TestRail with Jira to surface per-release issue statistics
No test documentation → Developed the necessary test artifacts, aligned with industry standards and easy to maintain
No automated test coverage → Automated the tests and ran them in parallel across 10 threads, cutting test run time by a factor of 7 (see the parallel-run sketch below)
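In practice this kind of parallelism is often just a runner flag (for example, pytest-xdist's pytest -n 10). Since the client's actual framework is not named here, the snippet below is only an illustrative Python sketch that fans hypothetical, independent test areas out over 10 worker threads.

import subprocess
from concurrent.futures import ThreadPoolExecutor

TEST_AREAS = [f"tests/area_{i}" for i in range(10)]  # hypothetical, independent test folders

def run_area(path):
    # Each worker thread launches one runner process for its slice of the suite.
    result = subprocess.run(["pytest", path, "-q"], capture_output=True, text=True)
    return path, result.returncode

with ThreadPoolExecutor(max_workers=10) as pool:
    for path, code in pool.map(run_area, TEST_AREAS):
        print(f"{path}: {'PASSED' if code == 0 else 'FAILED'}")
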
Regression testing time: 2x faster
No formal testing process → Regression testing now takes 2 days
1,500+ features released
5,000+ critical, blocker, and major bugs detected
7,500+ test cases written
10+ QA specialists trained on the client's side
Test coverage: 90%
An unstable, bug-ridden product → A stable, robust product
No automated tests → Automated testing introduced
A lack of test documentation → All required test documentation created and maintained
Reduced customer inquiries: 70%
Customers faced a huge number of issues → Fewer client issues and faster delivery of new, well-tested features
Low auto-test coverage of the app → Built an automated test system spanning integration, performance, acceptance, and end-to-end testing for the web and mobile parts of the application across all environments
Long-running tests → Sped up the auto tests by running them in multiple threads, simultaneously for different areas
Test time: 20h → 2h
Tests took more than 20 hours to complete → Build time reduced by a factor of 10
Developers triggered every test run by hand in a terminal → A simple way to run the tests, with no technical person involved
The team worked from a large, unwieldy results file → A reporting system that gives the team clean reports even when the results contain 10k rows (see the reporting sketch below)
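As a purely illustrative sketch of that reporting idea, the Python snippet below condenses a large JUnit-style XML results file into a short, readable summary. The file name and the "first 20 failures" cut-off are assumptions, not the client's actual tooling.

import xml.etree.ElementTree as ET
from collections import Counter

def summarize(path="results.xml"):  # placeholder path to a JUnit-style report
    counts = Counter()
    failures = []
    for case in ET.parse(path).getroot().iter("testcase"):
        if case.find("failure") is not None or case.find("error") is not None:
            counts["failed"] += 1
            failures.append(f"{case.get('classname')}.{case.get('name')}")
        elif case.find("skipped") is not None:
            counts["skipped"] += 1
        else:
            counts["passed"] += 1
    # One clean summary line instead of thousands of raw rows, plus the first few failures.
    print(f"passed={counts['passed']} failed={counts['failed']} skipped={counts['skipped']}")
    for name in failures[:20]:
        print("  FAILED:", name)

if __name__ == "__main__":
    summarize()
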
Releases per day: 4-5
1 huge release every 1-2 months → 4-5 releases per day
1 large team with an unclear scope of work → ~15 squads, each with a clear, focused work plan
0 test cases → 200+ end-to-end test cases covering the whole app
1-2 months to release a new feature → 2 weeks at most
A convoluted, complex process for urgent bug-fix releases → A clear, fast way to ship any fix
The business team could review new features only in production or with low-level test data → The business team reviews new features in a dedicated prod-like environment
No release documentation → Documentation that outlines the release process in detail
Regression testing time: 2 weeks → 2 hours
Manual testing was time-consuming and took up to 2 weeks to complete → Automated regression testing now takes 2 hours instead of 2 weeks of manual work
No formal QA process on the project → A formal QA process, designed by us and presented to the rest of the team for adoption

Test coverage: 90%
The test suite architecture could not scale and was hard to maintain with a large number of tests → Designed the test framework architecture from scratch
The tests could not be integrated with other testing and DevOps tools → Integrated the auto tests with Jenkins, TestRail, and Jira, giving the team a complete test ecosystem (see the sketch below)
Tests took many hours to complete → A test suite that runs the auto tests on 16 threads across multiple machines
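As a hedged illustration of the kind of glue code such an integration involves, the sketch below pushes one automated result into TestRail via its public add_result_for_case REST endpoint so release dashboards stay current. The host, run ID, case ID, and credentials are placeholders, and the status IDs follow TestRail's defaults (1 = Passed, 5 = Failed).

import requests

TESTRAIL_URL = "https://example.testrail.io/index.php?/api/v2"  # placeholder host
AUTH = ("qa-bot@example.com", "api-key-placeholder")            # placeholder credentials

def report_result(run_id, case_id, passed, comment=""):
    # Record one test outcome against an existing TestRail run and case.
    payload = {"status_id": 1 if passed else 5, "comment": comment}
    resp = requests.post(
        f"{TESTRAIL_URL}/add_result_for_case/{run_id}/{case_id}",
        json=payload,
        auth=AUTH,
    )
    resp.raise_for_status()

# Example call from a CI job (hypothetical IDs):
# report_result(run_id=42, case_id=101, passed=True, comment="Build #123, smoke suite")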

Test coverage: 80%
The test suite had been developed with no thought for maintenance or scalability → We conducted a comprehensive review of the existing suite, identified areas for improvement, and implemented testing best practices
A high number of issues and defects in the software → The number of issues fell to 25% of its previous level once automation was properly implemented
Slow test results and a lack of stable automated tests hindered deployments → SoftNAS gained faster test results and stable automated tests, improving its deployment capabilities
Test coverage: 90%
No QA on the project, only chaotic testing on the developers' side → A structured, formalized testing process
The customer support team did not know the product's functionality and could not explain to customers how features work → Helped the CS team understand all the subtleties of the application
Unstructured, uninformative bug reports → A bug report template covering all required attributes, including logs and attachments, for faster issue reproduction and analysis

Every version was released on schedule
A lack of QA resources → Created and fully stabilized the testing process
No ability to run regression testing before each release → The first release was completed three months after we joined the project
Prevented a huge number of blockers and crashes caused by the heavy volume of merges