Reducing regression analysis efforts
Challenges
- Test analysis could only start after the full test run had completed (4 hours wasted daily)
- All test failures had to be analyzed manually
- No visibility into the causes of test failures
- No history or trend data for test failures
- No tools to manage team workload
- Test execution reports were compiled manually (1 hour of daily effort)
Highlights
- Real-time analysis during test runs: results are available after the first job execution, saving team capacity and enabling an early reaction
- Automatic re-run of failed tests saved up to 5.5 team hours per day (a minimal sketch of the idea follows this list)
- About 20% of defects previously analyzed manually are now updated automatically through ML capabilities
- Clear visibility into the number of new/existing production defects, auto-test-related issues, and environment-related issues
- Full understanding of application quality, accurate planning of maintenance time, and transparent communication about environment instability, all based on real-time statistics
- Test execution history makes it easier to analyze the causes of test failures
- Improved task management through the ability to plan work allocation and track the tests assigned to each team member
- Real-time dashboards were tailored to the client's KPIs, giving full transparency into test execution results
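The automatic re-run of failed tests is reported above only at the outcome level. As a rough illustration of the idea (not taken from the case study), the sketch below re-runs failed tests once and separates persistent failures, which likely point to real defects, from failures that pass on retry, which likely point to flaky tests or environment issues. It assumes a Python unittest suite; ExampleTests, the simulated flaky test, and the retry helpers are hypothetical.

```python
# Illustrative sketch only: re-run failed tests once and separate
# persistent failures (likely real defects) from failures that pass
# on retry (likely flaky/environment issues). All names are hypothetical.
import unittest

ATTEMPTS = {"flaky": 0}  # tracks attempts to simulate a transient failure


class ExampleTests(unittest.TestCase):
    def test_stable(self):
        self.assertEqual(2 + 2, 4)

    def test_flaky(self):
        # Simulated environment hiccup: fails on the first attempt only.
        ATTEMPTS["flaky"] += 1
        self.assertGreater(ATTEMPTS["flaky"], 1, "transient failure")

    def test_real_defect(self):
        self.assertTrue(False, "reproducible failure")


def run_tests(suite):
    """Run a suite and return the collected TestResult."""
    result = unittest.TestResult()
    suite.run(result)
    return result


def run_with_rerun(suite):
    """Run the suite, re-run only the failed tests once, and report
    which failures persist after the retry."""
    first = run_tests(suite)
    failed = [test for test, _ in first.failures + first.errors]
    retry = run_tests(unittest.TestSuite(failed)) if failed else None
    persistent = [t for t, _ in retry.failures + retry.errors] if retry else []
    return first, failed, persistent


if __name__ == "__main__":
    suite = unittest.TestLoader().loadTestsFromTestCase(ExampleTests)
    first, failed, persistent = run_with_rerun(suite)
    print(f"tests run: {first.testsRun}, initial failures: {len(failed)}, "
          f"passed on re-run (flaky): {len(failed) - len(persistent)}, "
          f"persistent failures: {len(persistent)}")
```

Running the sketch shows one failure disappearing on retry and one persisting, which mirrors how re-run results can be used to separate environment noise from genuine defects before manual analysis begins.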