Surprises are always around the corner

Our customer is an industry leader providing end-to-end solutions to automate Foreign Exchange (FX) trading for financial institutions. Their flagship product, an online FOREX trading platform, connects to trading gateways across the world using Adapters. That is no small task: we are talking millions of transactions at sub-200 ms response times.
When we were called in to develop an automation suite for one of its components, we didn't expect anything challenging. Boy, were we in for a surprise.

The Adapter is an important middleware component that links FX Inside to the Provider; each provider has its own Adapter. The Adapter's real work is to translate the homogeneous data sent by the client during trading into the heterogeneous environments of the providers, and vice versa. These Adapters have to be tested for every release of the core application. They are backend, non-UI programs, so testing their functionality requires scripts that exercise them at the API level.

The objective was to develop an automation suite that could test multiple adapters on both Simulator and Live setups. The suite had to be flexible enough to cover new adapters added in the future with minimal changes.

For that, we interacted with the developers to understand the functionality of the Adapters, and developed a framework that could automate multiple adapters and accommodate new ones in the future.
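One common way to make such a framework extensible is a driver registry: each adapter gets its own driver class behind a shared interface, so adding a new adapter means adding one class rather than changing the framework. The source does not describe the customer's implementation, so the sketch below is an illustrative pattern with hypothetical names, not their actual design.

```python
from abc import ABC, abstractmethod

class AdapterDriver(ABC):
    """Common interface every adapter-specific test driver implements."""

    @abstractmethod
    def send(self, message: str) -> None:
        """Push a trading message toward the adapter under test."""

    @abstractmethod
    def receive(self) -> str:
        """Collect the adapter's response."""

# Registry mapping an adapter name to its driver class.
DRIVERS: dict[str, type] = {}

def register(name: str):
    """Class decorator that makes a driver discoverable by name."""
    def wrap(cls):
        DRIVERS[name] = cls
        return cls
    return wrap

@register("demo")
class DemoDriver(AdapterDriver):
    """Stub driver that echoes messages; a real one would talk to a gateway."""

    def __init__(self):
        self._last = ""

    def send(self, message: str) -> None:
        self._last = message  # stand-in for the real send path

    def receive(self) -> str:
        return self._last
```

With this shape, the test suite looks up `DRIVERS[adapter_name]` at run time, so Simulator and Live runs can share the same scenario scripts.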

The team took an incremental approach to automating the Adapters, first interacting with the development and QA teams to gather the information needed to identify the scenarios common across adapters. The critical part of the automation was to develop scripts that could automatically restart the Adapters residing on a remote Linux box, send trading messages to the adapter component, receive responses by listening to the messaging broker, and parse out the necessary information.
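The restart-send-parse loop above can be sketched roughly as follows. This is a minimal illustration only: the source does not name the message format or restart mechanism, so the sketch assumes FIX-style tag=value messages (common in FX trading) and a hypothetical ssh/systemctl restart command.

```python
import subprocess

def restart_adapter(host: str, service: str) -> int:
    """Restart an adapter process on a remote Linux box over ssh.

    `service` is a hypothetical service name; the real suite would use
    the customer's own restart command.
    """
    cmd = ["ssh", host, f"sudo systemctl restart {service}"]
    return subprocess.call(cmd)

def parse_trading_message(raw: str, delimiter: str = "\x01") -> dict:
    """Parse a FIX-style tag=value message into a tag -> value dict."""
    fields = {}
    for pair in raw.split(delimiter):
        if "=" in pair:
            tag, _, value = pair.partition("=")
            fields[tag] = value
    return fields
```

In the actual suite, the parsed fields would be checked against the expected trade state for each scenario.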

The result was much better than we anticipated. Test scenarios for one Adapter that earlier took two days to execute now ran in thirty minutes across both Live and Simulator environments, which was phenomenal for the client.

STAG developed a test suite to automate tests for every adapter at the API level, bringing System testing effort down by 40%.

“Never look down” – not the best suggestion for a startup

A Talent Management company delivering end-to-end learning solutions was on a rapid growth path. The customer base was growing, and they catered to every possible segment. With international awards and mentions in every possible listing, it was a dream run. Every customer was special and high priority. The sales team filled the order books enough to keep engineering busy with customizations. Within a short period it became increasingly difficult to meet schedules, and then instances of customers reporting defects started coming in. The management smartly decided to act on the signs before things got out of hand. When you are climbing high, it is wise to check that the rest of the team is keeping up with you.

After a detailed analysis, we put down a list of things that needed attention: with no formal QA practice in place, a makeshift testing team of a few developers and product managers assessed the applications before they were released to customers; requirement documents for the products did not exist; and defects were not tracked, which eventually resulted in delayed releases to clients.

The team, applying HyBIST, hypothesized what could possibly go wrong in the product (using the HyBIST core concepts of 'Negative Thinking' and the 'EFF model') and staged those hypotheses over multiple quality levels. The test scenarios and test cases designed were unique to each quality level, since the defects to be detected at each level are different (the HyBIST Box model was applied to understand the various behaviors of each requirement, feature, and sub-feature, and hence derive the test scenarios for each). With close support from the management, we put together a net so tight that no defects slipped through.

A clear mapping of the requirements, potential defects, test scenarios, and test cases was done after completing the test design activity, to prove the adequacy of the test cases.
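Such a mapping is essentially a traceability matrix. As a rough sketch of the idea, each requirement can be linked to its hypothesized defects, the scenarios that target them, and the test cases that realize the scenarios; the names below are illustrative, not the customer's actual artifacts.

```python
# Hypothetical traceability record: requirement -> potential defects
# -> test scenarios -> test cases.
traceability = {
    "REQ-01 Course enrollment": {
        "potential_defects": ["duplicate enrollment", "lost progress"],
        "scenarios": ["S-01 enroll twice", "S-02 resume mid-course"],
        "test_cases": ["TC-001", "TC-002", "TC-003"],
    },
}

def uncovered(matrix: dict) -> list:
    """Return requirements whose mapping has no test cases yet."""
    return [req for req, m in matrix.items() if not m.get("test_cases")]
```

An empty result from `uncovered` is one simple adequacy check: every requirement has at least one test case tracing back to it.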

The robust test design ensured product quality. The percentage of high-priority defects was significantly high (65%), and they were detected in the earlier test cycles. The test scenarios and test cases proved adequate: defect escapes came down from 25% to 2%, and the regression test cycles were reduced from 30 to 12. More importantly, the schedule variance dropped back to normal.