In today’s world of extreme agility, intelligent systems, SaaS, and automated testing and deployment, what is the role of humans in QA? How has the way we test software changed?
Intelligence is no longer about fighting technical problems; it is about graduating to a higher level of delivering “Do more, but do NO more”. In the context of automated testing, automation needs to become smarter and more intelligent to enable us to make decisions faster – providing valuable information about the state of the system so that we can do less, yet still meet continuously rising customer expectations.
In the Mar-Apr 2014 issue of the Tea-time with Testers ezine, T Ashok, Architect – HBT, shares his views on this subject. According to him, intelligent automation means scripts that are easy to build and maintain, scripts that are purposeful, enabling the tester to clearly identify the problem, scripts that are resilient, ensuring that the maximum number of scripts is executed in a run, and finally scripts that help analyze outcomes and suggest actions rather than merely report test results.
Read below the full article:
Is the objective of testing to find more issues, or to prevent them by being more sensitive? Is behavioral information more suitable than structural information for designing test cases? Is automated testing superior at finding issues?
There are many things in the discipline of testing that seem contrary, like the Yin and Yang, creating a constant tension as to ‘what to choose’.
This article by T Ashok, CEO of STAG Software, explores these various opposites and outlines the view that “The magic is in the middle” – that it is not a tussle, but a perfect state with the opposites balancing each other. The article was published in the June 2013 issue of the Tea Time with Testers ezine.
A New Zealand based customer in the healthcare domain embarked on a journey of migrating their Delphi-based products to Microsoft technologies. The products use specialized GUI controls that are not recognized by the popular tools. The company was keen to embark on automation right from the early stage of migration, and the budget to develop automation was tight.
We conducted a Proof-of-Concept (POC) to identify a tool that would support automation for both Delphi and VB.Net. We discovered that most popular tools were indeed not compatible with the developed product. The POC concluded that TestComplete did support both Delphi and VB.Net, with a few constraints: it was very cost-effective, though not very user-friendly. We convinced the management of our choice. The project started with us identifying the test cases that could be automated; seven modules were automated and demonstrated.
We developed a reusable Keyword-Driven Framework for the client. Both individual test case execution and batch runs were possible simply by selecting the test cases. STAG provided a detailed demo of the framework to the in-house QA team.
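The core idea of a keyword-driven framework like the one described above is that test cases are written as data – rows of keywords plus arguments – and an engine dispatches each keyword to a handler function, so testers can compose and batch-select cases without writing script code. Here is a minimal sketch of that pattern; the keyword names and handlers are hypothetical, and a real framework would drive the application through a GUI tool such as TestComplete rather than return strings.

```python
# Minimal sketch of a keyword-driven test engine (hypothetical keywords).
# Each test case is a list of (keyword, *arguments) rows; the engine looks
# up the keyword's handler, runs it, and records PASS/FAIL per step.

def open_screen(name):
    # Hypothetical handler: a real framework would drive the GUI here.
    return f"opened {name}"

def enter_text(field, value):
    return f"{field}={value}"

def verify(expected, actual):
    assert expected == actual, f"expected {expected!r}, got {actual!r}"

KEYWORDS = {
    "OpenScreen": open_screen,
    "EnterText": enter_text,
    "Verify": verify,
}

def run_test_case(steps):
    """Execute one test case; return a list of (keyword, status) results."""
    results = []
    for keyword, *args in steps:
        try:
            KEYWORDS[keyword](*args)
            results.append((keyword, "PASS"))
        except Exception:
            results.append((keyword, "FAIL"))
    return results

def run_batch(selected):
    """Batch run: execute only the selected test cases, name -> steps."""
    return {name: run_test_case(steps) for name, steps in selected.items()}

if __name__ == "__main__":
    suite = {
        "TC01_patient_lookup": [
            ("OpenScreen", "PatientSearch"),
            ("EnterText", "PatientId", "P-1001"),
            ("Verify", "P-1001", "P-1001"),
        ],
    }
    print(run_batch(suite))
```

Because the test cases are plain data, adding a case or choosing a batch is a selection over the suite dictionary, not a scripting task – which is what lets an in-house QA team extend the automation after a handover demo.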
However, some of the test cases chosen for automation were not complete. We validated the test cases, made the necessary changes, and then initiated the scripting. The automation work was divided between STAG and the customer’s team; as we automated the test cases, we guided and trained the customer’s team to automate on their own.
The result – by automating 326 test scenarios, the testing time was cut down from 80 hours to 12 hours! We saved the customer significant money on the tool, and more importantly, enabled them to release the product to market ahead of schedule!
A large global provider of BI solutions has a product suite that runs on five platforms supporting thirteen languages, with each platform suite requiring multiple machines to deliver the BI solution. The entire multi-platform suite is released on a single CD multiple times a year.
The problem that stumped them was “how to automate the final-install validation of a multi-platform distributed product”. They had automated the testing of the individual components using SilkTest, but they were challenged by “how to unify this and run it off a central console on various platforms at the same time”.
Considering that each platform combination took about a day, this required approximately two months of final installation build validation, and by the time they were done with one release, the next release was waiting! This was a relentless exercise, consuming significant QA bandwidth and time, and left the team no room for more interesting or important work.
The senior management wanted single-push-button automation – identify which platform combination to schedule next, allocate machines automatically from the server farm, install and configure automatically, fire the appropriate Silk scripts, and monitor progress, to significantly reduce time and cost by lowering the QA bandwidth involved in this effort. After deep analysis, the in-house QA team decided this was a fairly complex automation puzzle that required a specialist. This is when we were brought in.
After an intense deep-dive lasting about four weeks, we came up with a custom master-slave test infrastructure architecture that allowed a central console to schedule various jobs onto the slaves, using a custom-developed control and monitoring protocol. The solution was built using Java Swing, Perl, Expect, and adapters to handle the Silk scripts. Some parts of the solution were on the Windows platform, others on UNIX. This custom infrastructure allowed for scheduling parallel test runs, automatically allocating machines from a server farm, installing the appropriate components on the appropriate machines, configuring them, and finally monitoring the progress of validation through a web console.
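The scheduling core of such a master-slave infrastructure can be sketched in a few lines: the console holds a queue of platform-combination jobs and a pool of free machines, pairs them up as capacity allows, and reclaims a machine when its slave reports completion. This is only an illustrative model (in Python rather than the Perl/Java of the original system), and the class and method names are assumptions; the real system drove remote installs and Silk scripts over its custom protocol where this sketch merely records assignments.

```python
# Sketch of the master-slave scheduling idea: a central console queues
# platform-combination jobs, allocates free machines from the server farm,
# and returns machines to the farm when a slave reports completion.

from collections import deque

class Console:
    def __init__(self, machines):
        self.free = deque(machines)   # server farm: machines awaiting work
        self.queue = deque()          # platform combinations still to validate
        self.running = {}             # machine -> job currently executing

    def submit(self, job):
        """Queue a platform-combination job for validation."""
        self.queue.append(job)

    def schedule(self):
        """Pair queued jobs with free machines; return new assignments.

        In the real system each assignment would install and configure the
        components on the machine and fire the appropriate Silk scripts.
        """
        started = []
        while self.queue and self.free:
            machine = self.free.popleft()
            job = self.queue.popleft()
            self.running[machine] = job
            started.append((machine, job))
        return started

    def report_done(self, machine):
        """A slave reports completion; its machine rejoins the farm."""
        job = self.running.pop(machine)
        self.free.append(machine)
        return job

if __name__ == "__main__":
    console = Console(["win-01", "unix-01"])
    for job in ["Win2000/EN", "Solaris/JA", "AIX/DE"]:
        console.submit(job)
    print(console.schedule())          # two jobs start; one waits for a machine
    print(console.report_done("win-01"))
    print(console.schedule())          # freed machine picks up the waiting job
```

The parallelism comes for free from this model: the wall-clock reduction from eight weeks toward three weeks is bounded by how many machines the farm can run concurrently, not by the number of platform combinations.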
This test infrastructure enabled a significant reduction in multi-platform configuration validation: the effort dropped from eight weeks to three weeks. We enjoyed this work simply because it was boutique work fraught with quite a few challenges. We believe it was possible because we analyzed the challenging problem wearing a development hat, not a functional test automation hat.