A leading global e-commerce provider wanted to automate over 75 components of its global network services platform to strengthen its security and reliability. The task was complex, and the company felt it needed a skilled team with a lot of gray hair.
The company discussed the challenge with STAG, and we jumped at it. To win their confidence, we even offered to do a pilot to showcase our capability. We also took complete ownership of the deliverables. Did we just shoot ourselves in the foot?
Our customer is a leading worldwide provider of business-to-business EDI and supply chain integration, synchronization, and collaboration solutions. Its Indian Development Center was entrusted with making changes to the product solutions and then smoothly migrating the product from the QA environment to pre-production and subsequently to production. Any change to a product component called for full validation of the entire product suite, since a change could have multiple impacts across multiple locations and across the different components used by different users around the globe. The in-house QA team handled manual functional testing quite effectively, but the challenge at hand was to cut down the test cycle time and thereby speed up migration to pre-production and then to production. This called for superior script-writing skills in addition to regression testing of the entire product suite. STAG was entrusted with this project, and the expectations set were:
- Automate (server-side scripting) tests to perform verification and regression of both dataflow and admin-flow
- Automate (scripts for WinRunner) certain pre-determined functionalities
To handle the large number of test cases covered by each component, we formulated a smart automation strategy and ensured the automation architecture was flexible and reusable. This allowed a single script to cover an optimal number of test cases. We created around 52 verification scripts and around 28 regression scripts for the WinRunner toolset. Further, we developed over 330 server-side Perl scripts. The customer formally certified each script, and only then was it released to the QA team for use. As a UI front end to execute the automation scripts, we developed a custom test harness in Java; test scenarios were invoked from an XML file, which served as the placeholder for each Perl script's name and the parameters required to execute it.
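To illustrate the harness design described above, here is a minimal Java sketch of how such a tool might read scenarios from an XML file and assemble the Perl invocations. The element names (`scenario`, `param`), attribute (`script`), and script names are assumptions for illustration, not the customer's actual schema.

```java
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;
import org.w3c.dom.Element;
import org.w3c.dom.NodeList;
import java.io.ByteArrayInputStream;
import java.nio.charset.StandardCharsets;
import java.util.ArrayList;
import java.util.List;

public class TestHarnessSketch {

    // Parse a scenario XML document and return the Perl command lines
    // the harness would execute (e.g. via ProcessBuilder in a real tool).
    static List<String> buildCommands(String xml) throws Exception {
        Document doc = DocumentBuilderFactory.newInstance()
                .newDocumentBuilder()
                .parse(new ByteArrayInputStream(xml.getBytes(StandardCharsets.UTF_8)));
        List<String> commands = new ArrayList<>();
        NodeList scenarios = doc.getElementsByTagName("scenario");
        for (int i = 0; i < scenarios.getLength(); i++) {
            Element s = (Element) scenarios.item(i);
            // The XML holds the script name and its parameters, as in the case study.
            StringBuilder cmd = new StringBuilder("perl " + s.getAttribute("script"));
            NodeList params = s.getElementsByTagName("param");
            for (int j = 0; j < params.getLength(); j++) {
                cmd.append(' ').append(params.item(j).getTextContent());
            }
            commands.add(cmd.toString());
        }
        return commands;
    }

    public static void main(String[] args) throws Exception {
        // Hypothetical scenario file content; script names are invented.
        String xml = "<scenarios>"
                + "<scenario script=\"verify_dataflow.pl\"><param>qa</param><param>us-east</param></scenario>"
                + "<scenario script=\"regress_adminflow.pl\"><param>preprod</param></scenario>"
                + "</scenarios>";
        for (String c : buildCommands(xml)) {
            System.out.println(c);
        }
    }
}
```

Keeping the script names and parameters in XML rather than in the Java code is what lets the QA team add or reorder scenarios without recompiling the harness.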
The best part? None of the delivery team had gray hair.