“Roadmap to Quality” – Panel discussion at SofTec 2012 Conference

SofTec 2012, Bangalore, July 14, 2012

The panel discussion on “Roadmap to Quality” was brilliant, thanks to the cross-pollination of interesting ideas from non-software domains. Three of the four panelists were from non-software domains – Mehul@Arvind Retail, Soumen@GM and Raghavendra@Trellborg – the lone exception being Murthy from Samsung, with the discussion moderated by Ashok@STAG.

The key takeaways from the panel discussion are:

  1. Continuous monitoring helps greatly, as it is like a mirror that constantly reflects what you do. This is what Mehul@Arvind highlighted as important in his domain of the apparel/retail business. Ashok connected this to the dashboards that are becoming the vogue in our workplace, particularly in the Agile context
  2. Soumen@GM stated the importance of early-stage validation, such as simulation and behavior modelling, in the automotive industry, as the cost of a fix at a later stage is very high. The moderator connected this to “Shift Left”, the new term in our software industry – how can we move validation to earlier stage(s)?
  3. Raghav@Trellborg, a manufacturer of high-technology sealing systems, stated that understanding the final context in which a component will be used is very important to ensuring high quality. He also stated that testing is deeply integrated into the “shop floor”, i.e. daily work, and that the most important aspect of quality is not QA or QC but the underlying Quality Systems in place. How do Q systems ensure that quality is deeply entrenched in daily life? The moderator highlighted that in the software industry we have implemented such systems, but these are still at an organizational level; the need of the hour in the software industry is to institutionalize them at a personal level
  4. Finally, Murthy stated that the level of quality needed is not the same in all domains; in certain domains (like mobile) that have disruptive innovation and short life cycles, “we need just enough quality”. He highlighted the need to understand the “technical debt” we can tolerate as a driver for deciding “how much to test”

You can also read the special news on the panel discussion on Silicon India website.

Relevant topics:
a. Software testing lacking serious effort

You are only as good as your team

A semiconductor company, considered a pioneer in 4G-WiMAX, dreamt of being among the first to launch WiMAX solutions. On the verge of launching their product, the only challenge on the untrodden path was imagination.
Their QA requirements were as unique as the product being developed. They were looking for a partner who would be as spirited as they were. Could STAG prove its mettle? Could we be the team they were hoping for?

One question we are asked almost immediately after saying hello is “Do you have the domain expertise?”, and then we speak about HyBIST. That couldn’t happen this time: pioneers can’t ask for experience. Soon we were working on conformance validation (which later became the IEEE standard). Within a few weeks we understood why they were looking for someone beyond ‘I-can-provide-testing-resources-too’.

BuildBot is a system to automate the compile/test cycle required by most software projects to validate code changes. The buildbot watches a source code repository (CVS or another version control system) for interesting changes, then triggers a build with various steps (checkout, compile, test, etc.).

STAG set up a system to automate the build, compile and validation of code changes in the source code repository. The builds are run on a variety of slave machines to allow testing on different architectures, compilation against different libraries, kernel versions, etc. The results of the builds are collected and analyzed (compile succeeded/failed/had warnings, which tests passed or failed, memory footprint of generated executables, total tree size, etc.) and displayed on a central web page. The entire system was around 6000 lines of Python code.
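The heart of such a system is a runner that executes each build step in order, records pass/fail and output, and stops at the first failure. The sketch below is a minimal illustration of that idea, not the actual 6000-line system; the step names and data structures are our own for illustration.

```python
import subprocess
from dataclasses import dataclass, field

@dataclass
class StepResult:
    name: str
    succeeded: bool
    output: str

@dataclass
class BuildReport:
    results: list = field(default_factory=list)

    @property
    def passed(self):
        # The build passes only if every executed step succeeded
        return all(r.succeeded for r in self.results)

def run_pipeline(steps):
    """Run each (name, command) build step in order; stop at the first failure.

    `steps` is a list of (step_name, argv_list) pairs, e.g. the
    checkout/compile/test sequence a buildbot triggers on a change.
    """
    report = BuildReport()
    for name, cmd in steps:
        proc = subprocess.run(cmd, capture_output=True, text=True)
        result = StepResult(name, proc.returncode == 0, proc.stdout + proc.stderr)
        report.results.append(result)
        if not result.succeeded:
            break  # no point compiling/testing after a failed step
    return report
```

The per-step results collected in `BuildReport` are what a central web page would summarize across the different slave machines.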

This resulted in quick validation of code changes in the repository, leading to reduced rework time and increased productivity of the distributed development teams.

Surprises are always around the corner

An industry leader providing end-to-end solutions to automate Foreign Exchange trading, our customer provides innovative solutions for financial institutions. Their flagship product, the online FOREX trader, connects to the various Trading Gateways across the world using Adapters. That is no small task. We’re talking millions of transactions at sub 200 ms response time.
When we were called in to develop an automation suite for one of its components, we didn’t expect anything challenging. Boy, were we in for a surprise or what?

An important middleware component called the Adapter links FX Inside to the provider. Different providers have their own Adapters. The real work of the Adapter is to direct the homogeneous data sent by the client during trading into the heterogeneous environment of the provider, and vice versa. These Adapters have to be tested for every release of the core application. They are backend, non-UI programs that require scripts to be written to test the functionality at the API level.

The objective was to develop an automation suite that could be used to test multiple adapters on both simulator and live setups. The suite had to be flexible enough to test new adapters added in the future with minimal changes.

For that, we interacted with the developers to understand the functionality of the Adapters, and finally developed a framework that caters to automating multiple adapters as well as adding new adapters in the future.

The team took an incremental approach towards automating the Adapters, first interacting with the development and QA teams and gathering the necessary information, from which the common scenarios across the adapters were identified. The critical part of the automation was developing scripts that could automatically restart the Adapters residing on a remote Linux box, send trading messages to the adapter component, receive them by listening to the messaging broker, and parse the necessary information.
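The “parse the necessary information” step can be sketched as follows. The tag/value layout below follows FIX-style conventions (tag 55 = symbol, 54 = side, 38 = quantity, 44 = price), but the actual message format used by the adapters is not described in this case study, so treat the delimiter and tags as illustrative assumptions.

```python
def parse_trade_message(raw, sep="|"):
    """Parse a 'tag=value' delimited trading message into a dict.

    The pipe delimiter and the FIX-like tags used in the example
    below are assumptions for illustration; each provider's adapter
    would have its own wire format.
    """
    fields = {}
    for part in raw.strip().split(sep):
        if "=" in part:
            tag, value = part.split("=", 1)
            fields[tag] = value
    return fields
```

A verification script would call such a parser on messages received from the broker and assert on fields like symbol and quantity, e.g. `parse_trade_message("55=EUR/USD|38=1000000")["55"]` yields `"EUR/USD"`.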

The result was much better than we anticipated. The execution of the test scenarios for one Adapter, which earlier took two days, was reduced to thirty minutes for both live and simulator environments – phenomenal for the client.

STAG developed a test suite to automate tests for every adapter at the API level, bringing down system testing effort by 40%.

Smart Test Automation to check product functionality cuts test execution time, enabling faster market release

STAG Software was working on a dashboard product aimed at the mobile telecommunications industry. It was being developed on the LAMP platform, a solution stack of free, open-source software comprising Linux (operating system), Apache HTTP Server, MySQL (database software), and Perl, PHP or Python.

The major user interface (UI) component of the product, the management UI, provided facilities to configure key components and handsets, manage users (create, modify and delete), upload audio/video clips for video-on-demand and live viewing, pin channels for streaming, display the status of streaming servers, streaming sessions and assets, and generate reports for asset inventory and streaming activity.

The scope of the project and range of features dictated that the project would not only be development intensive, but post-development there would also be an equally intensive testing and debugging stage.

STAG automated the execution of a number of product feature test cases.


As some of the product features reached stability, STAG automated the execution of their test cases. Validation of UI-based features was automated using IBM Rational Functional Tester (RFT). The non-UI-based server-side features and the validation of the product installation process were automated using Perl.

RFT enabled the team to automate 400 of the 600 functionality test cases for the management UI. A data-driven framework was developed with the ability to take test-case input data from an Excel sheet. The 400+ test cases were managed with a catalog of around 40 reusable library functions and 22 main test scripts. The same test scripts could be executed on multiple browsers, i.e. Internet Explorer and Mozilla Firefox, which enabled considerable time and effort savings. Moreover, some of the libraries developed could be reused as project assets.
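The essence of a data-driven framework like this is that the test logic is written once and the data rows drive the 400+ cases. The STAG framework was built in RFT (Java) against Excel sheets; the sketch below shows the same pattern in Python against CSV text, purely to illustrate the structure – the `case_id` column and pass/fail bookkeeping are our own assumptions.

```python
import csv
import io

def run_data_driven(test_fn, csv_text):
    """Feed each data row of a CSV source to one shared test function.

    `test_fn` raises AssertionError on failure; results are collected
    per row, keyed by an assumed 'case_id' column. This mirrors how a
    data-driven RFT script iterates over rows of an Excel sheet.
    """
    results = {}
    for row in csv.DictReader(io.StringIO(csv_text)):
        try:
            test_fn(row)
            results[row["case_id"]] = "PASS"
        except AssertionError:
            results[row["case_id"]] = "FAIL"
    return results
```

Adding a new test case then means adding a data row, not a new script, which is what keeps 400+ cases manageable with only ~22 main scripts.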


Benefits of automating the test cases were:

  • Test execution effort was brought down from 17 person/machine hours to 7 machine hours
  • 42 person-days of effort were taken to design, develop and test the scripts, which was considerably shorter than anticipated
  • The testing team could focus more on other components/test cases, where manual intervention was essential
  • Cost savings
  • Faster time-to-market

This case study was published in IBM’s “The Great Mind Challenge for Business, Vol 2, 2011”. The book recognizes visionary clients who have successfully implemented IBM software (RFT) solutions to create exceptional business value.

HyBIST enables agility in understanding

A Fortune 100 healthcare company building applications for the next generation of body scanners uses many tools, including OSes, compilers, web servers, diagnostic tools, editors, SDKs, databases, networking tools, browsers, device drivers, project management tools and development libraries. The healthcare domain meant compliance with various government regulations, including the FDA’s. One such compliance requirement states that every tool used in production should be validated for ‘fitness of use’. This meant as many as 30 tools. How could one possibly test the entire range of applications before they are used? Considering the diverse range of applications, how could they have one team do it?

STAG was the chosen partner not because we had expertise in healthcare applications, but because HyBIST enables test teams to turn around rapidly. For this job, STAG put together a team with sound knowledge of HyBIST.

The team relied on one of the most important stages of the six-staged HyBIST, “Understand Expectations”: a scientific approach to the act of understanding intentions or expectations, by identifying the key elements in a requirement/specification and setting up a rapid personal process, powered by scientific concepts, to quickly understand the intentions and identify missing information. We look at each requirement, partition it into functional and non-functional aspects, and probe the key attributes to be satisfied. We use the key core concept of Landscaping, which enables us to understand the marketplace, end users, business flows, architecture and other attributes and information elements.

Once a tool is identified, the team gathers more information from the public domain. This ensures the customer demo (of around 45 minutes) is easily absorbed. During the demo, the customer also shares the key features they intend to use; this information eventually morphs into requirements. The team then explores the application for around 2 days, during which they come up with a list of good questions, clarify the missing elements and understand the intended behavior. Thus the effort spent to understand and learn the application is as little as 16 hours.

“Never look down” – not the best suggestion for a startup

A Talent Management Company delivering end-to-end learning solutions was on a rapid growth path. The customer base was growing, and they catered to every possible segment. With international awards and mentions in every possible listing, it was a dream growth. Each customer was special and of high priority. The sales team filled order books enough to keep engineering busy with customization. Within a short period, it became increasingly difficult to meet schedules, and then instances of customers reporting defects started coming in. The management smartly decided to act on the signs before things got out of hand. It is wise to check whether the rest of the team is keeping up with you when you are climbing high.

After a detailed analysis, we put down a list of things that needed attention. With no formal QA practice in place, a makeshift testing team of a few developers and product managers assessed the applications before they were released to customers. Requirement documents for their products did not exist. Defects were not tracked, which eventually resulted in delayed releases to the clients.

The team, applying HyBIST, hypothesized what could possibly go wrong in the product (using the HyBIST core concepts of ‘Negative Thinking’ and the ‘EFF model’) and staged the hypotheses over multiple quality levels. The test scenarios and test cases designed were unique to each of the quality levels formulated, as the defects to be detected at each level are unique and different (the HyBIST core concept of the Box model was applied to understand the various behaviors of each requirement/feature/sub-feature and hence derive the test scenarios for each feature/sub-feature). With close support from the management, we put together a net so tight that no defect could slip through.

A clear mapping of the requirements, potential defects, test scenarios and test cases was done after completing the test design activity, to prove the adequacy of the test cases.

The robust test design ensured product quality. The percentage of high-priority defects was significantly high (65%), and these were detected in the earlier test cycles. The test scenarios and test cases were adequate: defect escapes were brought down from 25% to 2%, and regression test cycles were reduced from 30 to 12. More importantly, the schedule variance dropped back to normal.

Rapid action team – building a team from scratch

Customer is a major technology innovator and global leader in semiconductors for wired and wireless communications. Their products enable the delivery of voice, video, data and multimedia to and throughout the home, the office and the mobile environment.

The principal in the US wanted to explore the possibility of moving core product development to their captive center in India, with the business case analyzed and approved. The challenge, however, was that they were unsure of the time required to build up engineers with domain knowledge and relevant experience. The impact of such a delay on the roadmap and the associated planned revenue was identified as a major risk. Could STAG mitigate this risk? Read on.

One manager responsible for development was relocated to India and given the responsibility of building the initial team and showing success. An offer to a full-time senior person to manage QA had been made, and they were waiting for him to join. With the market going through some turbulence, getting a person on board full-time to take over QA responsibility was taking its own time.

The management was aware that STAG had taken up a challenge in the past to arrest their defect escapes. So they threw in a new one – to build an effective QA team with the following goals:

  • Build the initial QA team across 3 different sub-groups
  • Complete knowledge transfer so that ramp-up goes as per the business plan
  • Build the complete test lab on time
  • Commit deliverables and adhere to the plan
  • Show improvement in productivity and quality over time
  • Transition the core team to be part of the customer organization if all set goals are met, and partner with them to build the temp staff required to achieve the new set of goals for the product roadmap

We identified a large team – some with knowledge of HyBIST & STEM™ but new to the domain, and the rest experienced in testing in the same industry. Both groups were then given a clear definition of the quarterly goals under focus and the STAG way of tracking and measuring customer expectations. The entire team worked closely with the QA Manager to set up the complete lab, commit release dates for some key customer releases, and deliver on time with the defined acceptable quality.

With the complete lab in place and no constraints forcing any type of test to be skipped, the team started enhancing its scope and improving test assets, thereby further increasing stakeholder confidence. Certain areas for automation were identified, and new members were added to support this initiative. With the experience of multiple releases, the team understood the dependencies and started defining the right regression scope, reducing release cycle time wherever the business situation demanded.

Typical success factors – good planning, effective tracking, timely releases with good quality, team flexibility and an attitude attuned to business impact – were seen in every subsequent release. Both Development and QA got the required approval to take the core team on board and to define the temp-staff requirement and duration to manage the rest of the releases on the roadmap. The journey continues with STAG as a QA partner: some members smoothly transitioned to the customer organization as the core team, and the additional extended-team requirement is still supported by STAG.

  1. What was thought to be a tough constraint – building the team on time – was achieved with our approach, which had a high impact on the revenue plans defined for that product line
  2. Full-time core team formation, with the extended team as contractors, is working fine
  3. Smooth transition of ownership of a product line was achieved as per plan

Staying on top more difficult than getting there

Business is good when you are alone at the top. If you are not prudent, however mighty you may be, a small nudge from somebody could pull you all the way down. Our customer is the market leader in providing learning solutions to universities and schools. Apart from innovative solutions, they proved to be business-smart by outsourcing to India not just for cost benefits, but also for quality equivalent to their own team’s.

The senior management of the organization had apprehensions about its success, owing to competency, process, training, communication and cultural differences. Did STAG manage to allay their fears? Read on.

With headquarters in Washington DC, US, the customer is the largest e-learning provider in the school and university space. Growing competition was driving the product team towards newer innovation, quicker concept-to-market and higher quality. As the development team shifted gears into an agile mode, the pressure was on QA to cut down the test cycle time and qualify the products quicker with the same effectiveness. The small team was found wanting for additional people, and aggressive deadlines convinced the management to outsource some parts of testing. We realized the initial challenge would be to reinforce the customer’s confidence in their decision to outsource. This would mean total transparency and a continuous communication channel into the team’s day-to-day activities. It was also necessary that we adopt their QA process and the terminology they use. The time available for setting up the team and ramping up was very short. This meant we would need someone to travel to the US on short notice, undergo product training, and return to transfer the knowledge to the team.

The team was chosen based on previous experience with testing web applications and previous teaching experience. Having functional experts meant resolving many of the issues internally. On returning to India after product and process training, the test lead initiated knowledge transfer to the team. The training covered various topics, including setup, features and the test process. Once the training was complete, the team started executing dry runs of the tests. Though the main objective was to understand the client’s test process, it was also the quickest way to learn the product.

The setup activity happened in close coordination with the onsite analysts and the support team, which also helped verify the installation manual. The setup involved a multi-platform, multi-database test lab and was organized within a week of the engagement.

  • Significant reduction in test execution cycle time (66%)
  • Created an excellent knowledge base, so that subsequent ramp-up of the team was done with just one week’s notice
  • The depth of testing knowledge demonstrated by the team gave confidence, and the team was allowed to update test cases. This helped reduce defect escapes to the field from 13% to 4%
  • What started with four engineers as an experiment in outsourcing led them to outsource a major part of test execution (22 engineers) within eight weeks


  • We managed short-notice deadlines (as little as 2 days) by stretching ourselves when needed
  • We executed more than 60 cycles of testing on three major releases and six minor releases (hot-fixes and service packs)
  • We helped the client to stop many critical issues from escaping
  • They seek and trust our ‘Quality’ advice in a ‘go-no-go’ situation
  • We accepted the challenge of simultaneously testing multiple versions
  • We work with their support team to isolate field defects
  • Our test lab has been flexible to the changing system requirements
  • We add and maintain their test documents
  • We are able to reset the test lab in 4 hrs to a newer configuration
  • We are able to increase the team size with only one week’s notice (whenever needed)
  • We have the ability to make a new engineer productive in 3 days!
  • We internally developed the training material to create a strong knowledge transfer
  • We have completed testing on/ahead of the schedule 95% of the cycles
  • We have brought down the test cycle time by over 60%
  • We provide status updates daily and we do a review every alternate week over a T-Con
  • We internally resolve 4 out of every 5 clarifications needed by our engineers, before we approach the client – We ‘disturb’ the customer less!

On-time release to market helped the company stay afloat

A pharmaceutical company decided to ride the IT bandwagon. They established a company to develop Enterprise Resource Planning (ERP) solutions to meet the demands of small- and medium-scale pharmaceutical, chemical and food processing industries. The challenge at hand was to build a software development organization and release the first version of the product to the market in six months. With all the functional experts it had, and a solution specially designed and developed around pharmaceutical-industry best practices, it could not fail. Or could it?

The ERP solution complies with Current Good Manufacturing Practices (cGMP) and the requirements of international regulatory bodies such as the US FDA, EDQM, TGA, MHRA, MCC, etc. Considering the tight timeline for the product release, the company preferred to jumpstart its QA process by partnering with a third-party testing organization specialized in test engineering. They expected this organization to provide the software testing expertise needed to deliver a high-quality product to market on time, while working within the constraints of the company.

The first step involved knowledge transfer from the customer. Using flow charts and use cases, we got the customer’s concurrence on our understanding of all modules and the interfaces of each module with the others. Next, we used the STEM™ Behavior Stimuli (BeST) technique to design test cases module-wise. To increase the depth of testing, we applied the boundary value analysis, equivalence class and domain-specific special-value techniques. We increased the breadth of testing by adding scenarios for different types of tests based on the requirements under focus. We also identified module-level interfaces to other modules to design end-to-end test scenarios.
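Boundary value analysis and equivalence class partitioning, mentioned above, are mechanical enough to sketch. For an integer input with a valid range [lo, hi] (say, a batch size field in the ERP), the classic versions generate the following values; the function names and the three-class partition are our own simplification for illustration.

```python
def boundary_values(lo, hi):
    """Classic boundary value analysis for an integer range [lo, hi]:
    the values just outside, on, and just inside each boundary."""
    return sorted({lo - 1, lo, lo + 1, hi - 1, hi, hi + 1})

def equivalence_classes(lo, hi):
    """One representative per equivalence class: below the range,
    inside it, and above it. Any member of a class is assumed to
    exercise the same behavior, so one representative suffices."""
    return {
        "invalid_low": lo - 1,
        "valid": (lo + hi) // 2,
        "invalid_high": hi + 1,
    }
```

For a field valid over 1..100, BVA yields the six test inputs 0, 1, 2, 99, 100, 101; equivalence partitioning adds one representative per class, so depth and breadth come from only a handful of values rather than all 100.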

We jumpstarted the customer’s QA. Institutionalizing our test engineering practices within the organization led to the on-time launch of the product, thereby boosting stakeholder confidence in the quality of the product and hence in the investment.

Perfect software to stop perfect crime

The Intelligence department of the Karnataka Police decided to implement a solution to analyze Call Detail Records (CDRs) from telecom service providers. The tool was to be deployed across the state. The solution can provide critical information about the subscribers whose CDRs are analyzed: location, geographic movement, calls to other monitored suspects, etc. This information is very critical for any case in the present times. The head of this initiative, a very IT-savvy officer, decided that the solution needed to be validated by a specialist organization if it was to be defect-free. In came STAG.

The solution is a Call Detail Record analyzer intended for law enforcement or intelligence analysts who have to, need to, want to, or are expected to work with telephone call detail records. A CDR is composed of fields that describe the exchange, i.e. the number making the call, the number receiving the call, start time, duration, end time, route taken, etc. The tool also integrates with a mapping server, enabling a visual display of the routes and locations of the suspect.
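To make the CDR fields described above concrete, here is a minimal sketch of parsing a record and relating two monitored numbers. The comma-separated column order and the field subset are assumptions for illustration; real provider formats vary and carry many more fields (end time, route, cell ID, etc.).

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class CallDetailRecord:
    caller: str       # number making the call
    callee: str       # number receiving the call
    start: datetime   # call start time
    duration_s: int   # call duration in seconds

def parse_cdr(line):
    """Parse one comma-separated CDR line: caller,callee,start,duration.

    The column order is an assumption for illustration; a real
    analyzer would handle each provider's specific export format.
    """
    caller, callee, start, duration = line.strip().split(",")
    return CallDetailRecord(
        caller, callee,
        datetime.strptime(start, "%Y-%m-%d %H:%M:%S"),
        int(duration),
    )

def calls_between(records, a, b):
    """All records linking numbers a and b, in either direction -
    the kind of link analysis used to relate monitored suspects."""
    return [r for r in records if {r.caller, r.callee} == {a, b}]
```

Validating such a tool means checking exactly this kind of field-level parsing and analysis logic against known inputs, which is where the multi-level evaluation described below started.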

The solution was being developed by a small but very inventive team from a small town. However, being a small team meant the code was self-validated. This made the customer a little jittery. They sought STAG’s services to validate the solution end-to-end.

STAG assessed the development process of the organization to understand how well the application was built. The application then had to be put through a thorough multi-level evaluation, from field-level testing to load testing. The tool inched its way slowly through these gates, and required structural modifications to clear some of them. STAG worked closely with the department during the training and saw through a successful release. The tool immediately started cracking some pending cases and is now sought after in other states.