Aesthetics in Software Testing

Software testing is often seen as just another job to be done in the software development life cycle: a clichéd activity consisting of planning, design and update of test cases, scripting and execution. Is there an element of beauty in software testing? Can we see the outputs of this activity as works of art?
Any activity we do can be seen from the viewpoints of science, engineering and art. An engineering activity typically produces utilitarian artifacts, whereas an activity done with passion and creativity produces works of art, and this goes beyond the utility value. It takes a craftsman to produce objets d'art, while it takes a good engineer to produce objects with high utility value.
An object of beauty satisfies the five senses (sight, hearing, touch, smell and taste) and touches the heart, whereas an object of utility satisfies the rational mind. So what are the elements of software testing that touch our heart?

Beauty in test cases
The typical view of test cases is one of utility – the ability to uncover defects. Is there beauty in test cases? Yes, I believe so. The element of beauty in test cases lies in their architecture – “the form and structure”.
If test cases are organized by quality levels, sub-ordered by items (features/modules), segregated by types of test, ranked by importance/priority, sub-divided into conformance (+) and robustness (−), classified by early (smoke) or late-stage evaluation, tagged by evaluation frequency, linked by optimal execution order and finally classified by execution mode (manual/automated), we get a beautiful form and structure that not only does the job well (utility) but also appeals to the sense of sight via a beautiful visualization of test cases. This is the architecture of test cases suggested by Hypothesis-Based Immersive Session Testing (HyBIST).
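As an illustration only – not the HyBIST tooling itself – here is a minimal Java sketch of how the dimensions listed above could be captured as metadata on a test case and used to order a suite; every type and field name here is an assumption.

```java
// Minimal sketch: the organizing dimensions described above, modelled as
// metadata on a test case. All names are illustrative assumptions.
import java.util.Comparator;
import java.util.List;

enum TestType { FUNCTIONAL, PERFORMANCE, SECURITY, USABILITY }
enum Polarity { CONFORMANCE_POSITIVE, ROBUSTNESS_NEGATIVE }
enum Stage { EARLY_SMOKE, LATE_STAGE }
enum ExecutionMode { MANUAL, AUTOMATED }

record TestCase(
        String id,
        int qualityLevel,        // quality level the case targets
        String item,             // feature/module it belongs to
        TestType type,           // type of test
        int priority,            // importance/priority rank
        Polarity polarity,       // conformance (+) vs robustness (-)
        Stage stage,             // early (smoke) vs late-stage evaluation
        String frequencyTag,     // e.g. "every-build", "weekly"
        int executionOrder,      // optimal execution order
        ExecutionMode mode) {    // manual or automated
}

class TestSuite {
    // Ordering by the structural dimensions gives the "form" that a
    // visualization of the test cases can then render.
    static List<TestCase> organize(List<TestCase> cases) {
        return cases.stream()
                .sorted(Comparator.comparingInt(TestCase::qualityLevel)
                        .thenComparing(TestCase::item)
                        .thenComparing(TestCase::type)
                        .thenComparingInt(TestCase::priority)
                        .thenComparingInt(TestCase::executionOrder))
                .toList();
    }
}
```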

Beauty in understanding
One of the major prerequisites for effective testing is an understanding of the product and the end user’s expectations. Viewed from a typical utility perspective, this translates into an understanding of the various features and intended attributes. To me, the aesthetics of understanding is the ability to visualize the software in terms of its internal structure, its environment and the way end users use the software. It is about ultimately distilling the complexity into a simple singularity – to get the WOW moment where suddenly everything becomes very clear. It is about building a clear and simple map of the various types of users, the corresponding use cases and technical features, the usage profile, the underlying architecture and behavior flows, the myriad internal connections and the nuances of the deployment environment. It is about building a beautiful mental mind map of what is to be tested.

Beauty in the act of evaluation
Typically, testing is seen as stimulating the software externally and making inferences of correctness from the observations. Are there possibly beautiful ways to assess correctness? Is it possible to instrument probes that will self-assess the correctness? Can we create observation points that allow us to see deeper into the system? Viewing the act of evaluation from the aesthetic viewpoint can result in more creative ways to assess the correctness of behavior.
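As one hedged illustration of what such a probe might look like, here is a minimal Java sketch of an internal observation point that self-assesses an invariant and reports the result; the class, the invariant and the logging choice are all hypothetical.

```java
// Hypothetical sketch of an in-code "probe": an observation point that
// self-assesses an invariant instead of relying purely on external
// stimulation and observation.
import java.util.logging.Logger;

final class Probe {
    private static final Logger LOG = Logger.getLogger("test.probe");

    // Record whether an internal invariant holds at an interesting point.
    static void check(String name, boolean invariantHolds, String detail) {
        if (invariantHolds) {
            LOG.fine(() -> "PROBE OK   : " + name);
        } else {
            LOG.warning(() -> "PROBE FAIL : " + name + " - " + detail);
        }
    }
}

class SettlementService {
    void settle(double debit, double credit) {
        // ... business logic would run here ...
        // Self-assessment: the books must balance after settlement.
        Probe.check("ledger-balanced", Math.abs(debit - credit) < 0.01,
                "debit=" + debit + " credit=" + credit);
    }
}
```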

Beauty in team composition
Is there aesthetics in the team structure/composition? Viewing the team as a collection of interesting people – specialists, architects, problem solvers, sloggers, firefighters, sticklers for discipline, geeks and so on – allows us to see the beauty in the power of the team. It is not just about a team that gets the job done; it is about the “rush” we get from a structure that makes us feel gung-ho, that we can do anything.

Beauty in reporting/metrics
As professionals, we collect various metrics to aid rational decision-making. This can indeed be a fairly mundane activity. Where is the aesthetics in this? If we can get extreme clarity on the aspects that we want to observe, and this allows us to make good decisions quickly, then I think this is beautiful. It involves two aspects – what we collect and how we present it. Creative visualization metaphors can make the presentation of the aspects of quality beautiful. Look at the two pictures below; both of them represent the growth of a baby.

The one on the left shows the growth of a baby using the dreary engineering graph, whereas the one on the right shows the growing baby over time. Can we similarly show the growth of our baby (the software) using creative visualization metaphors?

Beauty in test artifacts
We generate various test artifacts – test plan, test cases, reports and so on. What would make reading these a pleasure? Aesthetics here relates to the layout and organization, formatting, grammar, spelling, clarity and terseness. These aesthetic aspects are probably expected by the consumers of these artifacts today.

Beauty in the process
The test process is the most clinical and boring aspect. Beauty is the last thing that comes to mind with respect to process. The aesthetic aspect, as I see it here, is about being disciplined yet creative, detailed yet nimble. To me it is about devising a process that flexes and evolves in complete harmony with the external environment. It is hard to describe this in words; it can only be seen in the mind’s eye!

Beauty in automation and test data
Finally, on the aspect of test tooling, it is about the beautiful code that we produce to test other code. The beauty here is in the simplicity of the code, ease of understanding, modifiability, architecture and cute workarounds to overcome tool/technology limitations.
Last but not least, aesthetics in test data is about having meaningful, real-life data sets rather than gibberish.
Beauty, they say, lies in the eye of the beholder. It takes a penchant for craftsmanship, driven by passion, to not just do a job but to produce objets d'art that appeal to the senses. As in any other discipline, this is very personal. As a community, let us go beyond the utilitarian aspects of our job and produce beautiful things.

The tale of two doctors

In a big city lived Joe, a typical urban yuppie. He was always focused on a great tomorrow. He worked very hard, partied furiously and lived a fast life.

Life was a blast, until his body decided to act up. On this Sunday morning, he woke up panting, unable to breathe, body drenched in sweat, with a dull pain in his chest. The previous evening was a blast, a celebration party thrown for his best buddy getting engaged. After an evening spent at bowling, they hit the pubs, closing each one, until they could not find one open.

A typical Sunday morning would commence at noon; today, as he was rudely jolted out of his reverie, the bright LED clock showed 7:00. He could not move his arms; it seemed to take a tremendous effort to reach out for the bottle of Evian on the table near his bed. He had read about old-age diseases getting younger in these modern times and had dismissed them brashly, a reflection of his supreme yuppie confidence. For once he faltered, seriously worried that he could become one of those stories. All these years he had thought of God as a fashion statement, but today he genuinely wished to believe that God exists. For the next few seconds, which seemed like an eternity, his mind rapidly flew back over the past years and the constant abuse he had heaped on his body. For once he prayed dearly that he would do the right things, if he were excused this time. The clock glowed 7:02, and he realized it had been the longest two minutes of his life.

At 8:55 a.m. he was at the reception of GoodLife hospital for a 9:00 a.m. appointment with Dr Robert Black Sr., a senior and very experienced cardiologist. He was soon ushered in, and was face to face with a severe yet friendly gentleman, a few orderly strands of golden hair on his shiny head and a piercing pair of eyes. “Mr Joe, would you please tell me your problem?” said Dr. Black. Joe described in detail his travails upon his rude awakening. An old-school doctor, Dr Black believed in a detailed physical examination rather than fancy modern equipment. “Lie down on the bed and relax, Mr Joe.” He took his stethoscope, placed it on Joe’s chest and listened carefully. “Breathe in and out deeply now,” said the doctor as he continued to move his stethoscope over various parts of the chest. His sharp eyes showed no emotion as he went about his job confidently. The young nurse standing next to the doctor was petite and beautiful. She dispassionately took out the sphygmomanometer, wrapped the elastic band around Joe’s arm, pumped it up and watched the mercury bobbing up and down, while her other hand measured the pulse. After a minute, she looked at Dr Black and said 140/120 and 93, in a husky voice.

“Mr Joe, have you been feeling very tired at the end of day lately?”
“Yes” he said and added “It is the busy time of the year at work, a string of late nights.”
“What kind of work do you do, Mr Joe?”
“I work in the software industry. We are in midst of building a cool application for mobile phones”
“Oh I see, you are the software guy. My nephew is in the software business too and is always racing against time.”
“I guess you must be tied to the desk most of the time. Do you exercise?”
“Well doctor, the days are long and busy, and I catch up on my sleep over the weekend. I try to work out in the gym over the weekends, but it is challenging.”
“I see that you are a smoker. Do you drink? And are you a vegetarian?”
“Well doctor, I do drink, and I am not a vegetarian.”

Dr Black was one of the most famous cardiologists in the country. He was a master at diagnosis and believed in the scientific and systematic study of symptoms and their connections. He placed his gold-rimmed glasses on the table, rubbed his finger on his chin, leaned back in his cozy leather chair, and his piercing eyes looked straight at Joe as he said, “Mr Joe, you have an issue with the blood supply to the heart muscles; there seems to be an advanced arterial block. It is necessary that you undergo angioplasty, a procedure to relieve the constriction, very quickly – within a fortnight.”

Joe sat still, staring at the statement “The most amazing non-stop machine. Take care.” written on a poster with a picture of the heart, displayed prominently on the wall behind the doctor.

Dr Black, who was used to these reactions, jolted Joe out of his reverie. “Mr Joe, you need to quit smoking and go vegetarian. You are young, the angioplasty procedure has a high success rate and you should be back to an active life quickly.” Dr Black believed in conveying news straight, and expected patients to face reality and act on the problem. “Mr Joe, you would need to be in the hospital for 3-4 days, so do decide on the date quickly. As I mentioned before, it is important that you act on this within a fortnight. It is a painless procedure and should be fairly straightforward in your case. Mr Joe, do you have any questions?”

“No doctor, I have none,” replied Joe mechanically, his ability to think numbed by the turn of events. As he exited the consulting room, the receptionist, a cheerful and bubbly woman in her twenties, whispered “Have a good day, Mr Joe” with a beautiful smile, a genuine one. It had its effect, and for a moment he felt cheerful and returned the smile, a little weakly though.

As soon as he was outside the hospital, his hand went mechanically to his shirt pocket containing the cigarette packet. “Smoking kills” said the packet loudly, and he threw the packet by the wayside.

David, his roommate had just woken up as Joe returned to his apartment. “Hi, had breakfast? Got some muffins and bagels, want some?” said David. “No thanks” mumbled Joe.

After a few minutes David understood the reason for the strange behavior of his best friend. “Come on man, let us get a second opinion right away.” David was one who never lost his cool, and his level-headed thinking in tough situations had helped his friends many a time.

At 10:50 they were at ValleyTech hospital for a consultation with Dr James White. He had been referred by David’s boss, who had undergone a heart bypass a few months earlier at ValleyTech.

At 11:05 Joe was called in. “Good morning Joe. Please sit down. I have read your case sheet and have a few questions. Do you have any recent ECGs?”

“No doctor” said Joe.

“I know your company has a yearly health check plan for all, as our hospital administers it. So have you not taken it this year?”

“No doctor, the last few months have been very hectic and I have not had my yearly checkup yet,” said Joe.

Dr White was a modern doctor who relied on technology for diagnosis and treatment. A young cardiologist, he believed in seeing the ‘internals’ before the scalpel touched the body. He was amazed at the advancements in radiology and made it a point to recommend a few pictures to be taken before he touched a patient.

Unlike Dr Black, who believed in the power of external examination, Dr White believed in looking at the internals for diagnosis and treatment. Dr White had immense faith in scans and lab tests, and preferred analyzing reports to spending time examining patients.

“Please get the ECG done now. I would like to see the report first. Thank you.”

Joe went to the diagnostic lab in the adjoining building. When his turn came, he went inside and removed his shirt; the technician smeared jelly on his chest and proceeded to stick colorful leads at various points. In a few seconds, the needle was dancing, drawing patterns on the strip of paper. The technician looked at the squiggles on the paper with a bored expression, and after a few minutes decided the machine had had its share of fun and switched it off. He tore off the roll of paper, scanned it intently, and then proceeded to fold it and insert it into a cover. Joe was curious to know what the squiggles meant and asked, “Is it normal?”
“It seems fine, except for a small spike here,” replied the technician. He had seen hundreds of such squiggles and knew exactly what was normal, but he was no doctor to interpret any abnormalities.

Joe stared at the report; the squiggles held a secret that Joe was scared about. “Hey, let’s go meet the doc now,” said David, breaking Joe’s train of negative thoughts.

“Show me the report Joe,” said Dr White.
Dr White held the strip of paper and rapidly scrolled it forward and then backwards.
“Were you treated for any heart related issue when you were young?” asked Dr White.
“No” said Joe, scared to ask questions.
“Joe, there is a slight aberration in the ECG; it may be nothing to worry about. To confirm this, I recommend that you get the 128-slice beating-heart scan. This is the most advanced technology for the diagnosis of heart-related ailments available in the world, and we are the only hospital in this city to have it. It gives a clear picture of the beating heart and enables clear diagnosis. Also get a chest X-ray. I will be available until 1:00 PM, so get it done right away and then see me.” He wrote down the lab request and handed it over to Joe.

“Hello, I need to get the ‘128-slice beating heart scan’ done. How much does it cost?” Joe asked the grim-looking gentleman at the cash counter.
He was shocked at the cost of the hi-tech scan; it seemed to have enough digits to max out his credit card. Joe looked at David, conveying in his look, “They are milking us”.

Joe realized that the second opinion was going to be expensive and needed to think about this before he went on a diagnostic spree.

Joe was a professional software tester who diagnosed software for defects. He decided to chat with David over a cup of coffee to decide whether to go ahead with the expensive scan. David went to get the coffee while Joe sat down at the corner table, gazing at the birds fluttering over the little pond outside.

During this mindless gaze, staring at the chirpy birds, it suddenly flashed on him: the parallels between his profession and the doctors’. In his job, he used black box techniques that required him to examine the system externally and find defects. He relied on a deep understanding of the intended behavior and observation of the actual behavior (“symptoms”) to design and refine his test cases. He had at times looked at internal information like architecture, technology and code structure to design test cases that were adept at catching issues related to structure.

His colleagues always used terms like “black box testing” and “white box testing” and associated these with the system and unit levels respectively. Now he realized the general misconception that unit testing was white box testing and system testing was black box testing. His train of thought was interrupted by hot coffee spilling on his shoulder, followed by the shrill sound of glass breaking. “I am really sorry, hope you are ok,” said the elderly woman who had tripped over the protruding leg of the gentleman at the neighboring table and spilt the hot cup of espresso she was carrying. “I am fine, let me help you,” said Joe as he helped pick up the glass shards.

He realized that certain types of defects were better caught via “internal examination” (white box test techniques), while others were best caught via “external examination” (black box test techniques). He now understood that both of these test techniques were required at all levels to uncover defects effectively and efficiently.
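To make Joe’s realization concrete, here is a small illustrative sketch (assuming JUnit 5 and a made-up unit, not anything from the story): the same unit gets an “external examination” with a test derived from its stated behaviour, and an “internal examination” with a test aimed at a branch that only the code structure reveals.

```java
// Illustrative only: one unit, examined both ways.
import static org.junit.jupiter.api.Assertions.assertEquals;
import org.junit.jupiter.api.Test;

class Discount {
    // Stated behaviour: 10% off for amounts above 1000.
    // Internal detail: the discount is silently capped at 500.
    static double apply(double amount) {
        double discount = amount > 1000 ? amount * 0.10 : 0.0;
        if (discount > 500) {        // a branch the spec alone does not suggest
            discount = 500;
        }
        return amount - discount;
    }
}

class DiscountTest {
    @Test  // black box: derived from the stated behaviour
    void discountAppliedAboveThreshold() {
        assertEquals(1800.0, Discount.apply(2000.0), 0.001);
    }

    @Test  // white box: derived from the internal cap branch
    void discountIsCappedAtFiveHundred() {
        assertEquals(9500.0, Discount.apply(10000.0), 0.001);
    }
}
```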

Suddenly the diagnostic approaches followed by Dr Robert Black and Dr James White became clear. As soon as David laid the steaming cup of Cafe Latte on the table, Joe had made his decision. He was not withdrawn or worried. The confidence was back, and he would not let the ECG scroll spoil his fun.

The article was published in the April issue of “Tea-time with Testers” – an ezine on Software Testing.

Rapid action team – building a team from scratch

The customer is a major technology innovator and global leader in semiconductors for wired and wireless communications. Their products enable the delivery of voice, video, data and multimedia to and throughout the home, the office and the mobile environment.

The principal in the US wanted to explore the possibility of moving core product development to their captive center in India, with the business case analyzed and approved. The challenge, however, was that they were unsure of the time required to build a team of engineers with domain knowledge and relevant experience. The impact of such a delay on the roadmap and the associated planned revenue was identified as a major risk. Could STAG mitigate this risk?

One manager responsible for development was relocated to India and given the responsibility of building the initial team and showing success. An offer for a full-time senior person to manage QA had been made, and they were waiting for him to join. With the market going through some turbulence, getting a full-time person on board to take over QA responsibility was taking its own time.

The management was aware that STAG had taken up a challenge in the past to arrest their defect escapes. So they threw in a new one – to build an effective QA team with the following goals:

  • Build the initial QA team across 3 different sub-groups
  • Complete knowledge transfer, with ramp-up time going as per the business plan
  • Build the complete test lab on time
  • Commit deliverables and adhere to the plan
  • Show improvement in productivity and quality over time
  • Transition the core team to be part of the customer organization if all the set goals are met, and partner with them to build the temporary staff required to achieve the new set of goals for the product roadmap

We identified a large team – some with knowledge of HyBIST & STEM™ but new to the domain, and the rest experienced in testing from the same industry. Both groups were then given a clear definition of the quarterly goals under focus and the STAG way of tracking them, ensuring we measured up to customer expectations. The entire team worked closely with the QA manager to set up the complete lab, commit release dates for some key customer releases, and deliver on time with the defined acceptable quality.

With the complete lab in place and no constraints forcing them to skip any type of test, the team started enhancing its scope and improving test assets, thereby further increasing stakeholders’ confidence. Certain areas for automation were identified and new members were added to support this initiative. With the experience of multiple releases, the team understood the dependencies, started defining the right regression scope, and reduced release cycle time wherever the business situation demanded.

Typical success factors like good planning, effective tracking, timely releases with good quality, team flexibility and attitude towards business impact were seen in every subsequent release. Both development and QA got the required approval to take the core team on board and to define the temporary staff requirement and duration to manage the rest of the releases in the roadmap. The journey continues with STAG as a QA partner: some members smoothly transitioned to the customer organization as the core team, and the additional extended-team requirement is still supported by STAG.

  1. What were thought to be tough constraints – to build the team on time – were met with our approach, which had a high impact on the revenue plans defined for that product line
  2. Full-time core team formation, with an extended team of contractors, is working fine
  3. Smooth transition of ownership of a product line was achieved as per plan

Staying on top more difficult than getting there

Business is good when you are alone at the top. If you are not prudent, however mighty you may be, a small nudge from somebody could pull you all the way down. Our customer is the market leader in providing learning solutions to universities and schools. Apart from innovative solutions, they proved to be business-smart by outsourcing to India not just for cost benefits, but also for quality equivalent to that of their own team.

Senior management of the organization had apprehensions about its success, owing to differences in competency, process, training, communication and culture. Did STAG manage to allay their fears? Read on.

Headquartered in Washington DC, US, the customer is the largest e-learning provider in the school and university space. Growing competition was driving the product team towards newer innovation, quicker concept-to-market and higher quality. As the development team shifted gears into an agile mode, the pressure was on QA to cut down the test cycle time and qualify the products quicker with the same effectiveness. The small team was found wanting for additional people. Aggressive deadlines convinced the management to outsource some parts of testing. We realized the initial challenge would be to reinforce the customer’s confidence in their decision to outsource. This would mean total transparency and a continuous communication channel to know the team’s day-to-day activities. It was also necessary that we adopt their QA process and the terminology they used. The time available for setting up the team and ramping up was very short. This meant we would need someone to travel to the US on short notice, undergo product training, and come back and transfer the knowledge to the team.

The team was chosen based on previous experience with testing web applications and previous teaching experience. Having functional experts meant many of the issues could be resolved internally. On returning to India after product and process training, the test lead initiated knowledge transfer to the team. The training covered various topics including setup, features and the test process. Once the training was complete, the team started executing dry runs of the tests. Though the main objective was to understand the client’s test process, it was also the quickest way to learn the product.

The setup activity happened in close coordination with the onsite analysts and support team. This also helped verify the installation manual. The setup involved a multi-platform, multi-database test lab and was organized within a week of the engagement.

  • Significant reduction in test execution cycle time (66%)
  • Created an excellent knowledge base, so that subsequent ramp-up of the team was done with just one week’s notice
  • The depth of testing knowledge demonstrated by the team gave confidence, and the team was allowed to update test cases. This helped reduce defect escapes to the field from 13% to 4%
  • What started with four engineers as an experiment in outsourcing helped them outsource a major part of test execution (22 engineers) in eight weeks’ time

Achievements

  • We managed short-notice deadlines (as little as 2 days) by stretching ourselves when needed
  • We executed more than 60 cycles of testing on three major releases and six minor releases (hot-fixes and service packs)
  • We helped the client to stop many critical issues from escaping
  • They seek and trust our ‘Quality’ advice in a ‘go-no-go’ situation
  • We accepted the challenge of simultaneously testing multiple versions
  • We work with their support team to isolate field defects
  • Our test lab has been flexible to the changing system requirements
  • We add and maintain their test documents
  • We are able to reset the test lab in 4 hrs to a newer configuration
  • We are able to increase the team size with only one week’s notice (whenever needed)
  • We have the ability to make a new engineer productive in 3 days!
  • We internally developed the training material to create a strong knowledge transfer
  • We have completed testing on or ahead of schedule in 95% of the cycles
  • We have brought down the test cycle time by over 60%
  • We provide status updates daily and we do a review every alternate week over a t-con
  • We internally resolve 4 out of every 5 clarifications needed by our engineers before we approach the client – we ‘disturb’ the customer less!

On-time release to market helped the company stay afloat

A pharmaceutical company decided to ride the IT bandwagon. They established a company to develop Enterprise Resource Planning (ERP) solutions to meet the demands of small and medium-scale pharmaceutical, chemical and food processing industries. The challenge at hand was to build a software development organization and release the first version of the product to the market in six months. With all the functional experts it had, and a solution specially designed and developed around pharmaceutical-industry-specific best practices, it could not fail. Or could it?

The ERP solution complies with Current Good Manufacturing Practices (cGMP) and the requirements of international regulatory bodies such as the US FDA, EDQM, TGA, MHRA, MCC, etc. Considering the tight timeline for the product release, the company preferred to jumpstart its QA process by partnering with a third-party testing organization specialized in test engineering. They expected this organization to provide the required software testing expertise to deliver a high-quality product to market on time, while working within the constraints of the company.

The first step involved knowledge transfer from the customer. Using flow charts and use cases, we got the customer’s concurrence on our understanding of all the modules and of each module’s interfaces with the others. Next, we used the STEM™ Behavior Stimuli (BeST) technique to design test cases module-wise. To increase the depth of testing we applied the boundary value analysis, equivalence class and domain-specific special value techniques. We also increased the breadth of testing by adding scenarios for the different types of tests based on the requirements under focus, and identified module-level interfaces to other modules to design end-to-end test scenarios.
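As a simple illustration of how the boundary value analysis and equivalence class techniques translate into concrete inputs, here is a Java sketch for a hypothetical “batch quantity” field with an assumed valid range of 1 to 10,000; the field and the range are illustrative, not the customer’s actual rules.

```java
// Illustrative sketch: deriving test inputs for an assumed field
// "batch quantity" whose valid range is taken to be 1..10,000.
import java.util.List;

class BatchQuantityTestDesign {
    static final int MIN = 1;
    static final int MAX = 10_000;

    // Equivalence classes: below the range, inside it, above it.
    static List<Integer> equivalenceClassRepresentatives() {
        return List.of(-5, 500, 20_000);
    }

    // Boundary values: just outside, on, and just inside each boundary.
    static List<Integer> boundaryValues() {
        return List.of(MIN - 1, MIN, MIN + 1, MAX - 1, MAX, MAX + 1);
    }

    static boolean isValid(int quantity) {
        return quantity >= MIN && quantity <= MAX;
    }

    public static void main(String[] args) {
        boundaryValues().forEach(q ->
                System.out.println("quantity=" + q + " expected-valid=" + isValid(q)));
    }
}
```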

We jumpstarted the customer’s QA. Institutionalizing our test engineering practices within the organization led to an on-time launch of the product, thereby boosting stakeholder confidence in the quality of the product and hence in the investment.

Perfect software to stop perfect crime

The Intelligence Department of the Karnataka Police decided to implement a solution to analyze the Call Detail Records (CDRs) obtained from telecom service providers. The tool was to be deployed across the state. The solution provides critical information about the subscribers whose CDRs are analyzed – location, geographic movement, calls to other monitored suspects, etc. This information is very critical for any case in the present times. The head of this initiative, a very IT-savvy officer, decided that the solution needed to be validated by a specialist organization if it was to be defect free. In came STAG.

The solution is a Call Detail Record analyzer intended for law enforcement or intelligence analysts who have to, need to, want to, or are expected to work with telephone call detail records. A CDR is composed of fields that describe the exchange, i.e. the number making the call, the number receiving the call, start time, duration, end time, route taken, etc. The tool also integrates with a mapping server, enabling a visual display of the routes and locations of the suspect.
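For illustration, the CDR fields listed above could be modelled roughly as the Java record below; the field names, types and the consistency check are assumptions, not the format used by any service provider or by the actual tool.

```java
// Rough sketch of a CDR as a data type, based only on the fields
// mentioned above. Names and types are illustrative assumptions.
import java.time.Duration;
import java.time.Instant;

record CallDetailRecord(
        String callingNumber,   // number making the call
        String calledNumber,    // number receiving the call
        Instant startTime,
        Duration duration,
        Instant endTime,
        String routeTaken,      // route the call traversed
        String cellId) {        // supports location and movement analysis

    // A simple field-level check the analyzer's tests could verify.
    boolean isConsistent() {
        return startTime.plus(duration).equals(endTime);
    }
}
```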

The solution was being developed by a small but very inventive team from a small town. However, being a small team meant the code was self-validated. This made the customer a little jittery. They sought STAG’s services to validate the solution end-to-end.

STAG assessed the development process of the organization to understand how well the application was built. The application then had to be put through a thorough multi-level evaluation, from field-level validation to load testing. The tool inched its way slowly through these gates, and required structural modifications to clear some of them. STAG worked closely with the department during the training and saw it through to a successful release. The tool immediately started cracking some pending cases and is now sought after in other states.

The software Hara-kiri

The Japanese subsidiary of a global system integrator was required to outsource a part of its projects to the subsidiary in a ‘friendly’ country due to political reasons. However, this ‘global’ delivery model had its banes: the pieces of code did not necessarily integrate as expected.

After burning their hands a couple of times and re-coding, a noticeable pattern emerged. Since freezing the outsourcing was not an option, they enforced delivery of unit test cases along with the results, hoping to improve the quality of the delivery. A whole lot of test cases arrived, and the code passed them all. This is where we entered the scene.

When this large system integrator, with the third largest market share in Japan, faced quality issues in the code delivered by their outsourcing partner, they asked for STAG’s involvement. We decided the best way was to assess the vast set of unit test cases and reports that came along with the delivery. The assessment was to be done by comparing the available artifacts (test cases, Data Definition Language (DDL), screen transitions and bean specifications) with those defined per HyBIST. The assets were assessed to understand:

  1. Quality of the test cases
  2. Test Completeness
  3. Test Coverage
  4. Comparison with Ideal Unit Testing

For good unit testing, the unit should be validated from both an external view (using black-box testing techniques) and an internal/structural view (using white-box testing techniques). In this case, all the test cases provided were designed using black-box techniques per the specification, and not using the code structure.

The results astounded the client. Apart from issues like poor test data, incomplete steps and insufficient negative tests, the tests were found to have been designed applying only black-box techniques, i.e. the structural aspects were not evaluated at all. The results were used to confront the partner and renegotiate all future engagement contracts and deliverables.

Back to the future >> preparing for an avalanche

When a bank implements major solutions, you need to watch like a hawk. The smallest glitch can set off an avalanche. When we were asked to validate the performance of an integrated financing solution for a leading commercial bank, we assumed it was like any other project. This wasn’t the case. The challenge thrown at us was to ensure the system was future-proof for 3 years! From our experience, we knew scripting and simulating a large user load was the easier part. Banks run on data and documentation, and this product was intended to cater to the agri-commodity business of the bank. We foresaw an avalanche of data thundering down!

The product is intended to enable financing for farmers for the commodity they have produced. The bank offers loans against the commodity being stored in warehouses. With a focus on commodity finance, the solution encompasses the various modules of commercial operations, right from sourcing of the account, operations, monitoring and control, recovery management and audit, through to closure on repayment. Each process that is initiated has to go through an approval workflow, and most processes have separate initiation and approval stages before they are complete!

Based on this understanding, and after some initial discussions with the bank, a detailed operational profile was derived. Over 40 scenarios were identified for the test, with a concurrency of 600 users.

The plan was to conduct the load test with 3 different combinations, such that the defined peak concurrency of each module was achieved during one of the combinations.

Considering that the key requirement was to conduct the test simulating 3 years of usage of the system, the critical success factor was test data creation. Hence the huge volume of test data had to be created before doing the actual test. The system was heavily loaded with data – 2,000 users, 5,000 borrowers, 300 warehouses (100 government, 200 private/godown warehouses), 44,000 loans, 50,000 liquidations, 10 image uploads per borrower and per warehouse creation, and so on…

Scripts were developed to populate the required test data in the system to replicate three years of usage. The first step was to create 2,000 users in the system. User creation meant creating further data for every user – the required role, and the branch for which the user was to be created. After creating the users, we started creating the warehouses and the borrowers required for the test. The next major activity was loan bookings and liquidations: 40,000 loan bookings and 50,000 liquidation records were created by running JMeter scripts.
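The seeding itself was done with JMeter scripts; purely to illustrate the kind of bulk creation involved, here is a simplified Java sketch that books loans against a hypothetical endpoint – the URL, payload and the way bookings are spread across borrowers and warehouses are all assumptions.

```java
// Illustrative sketch only: bulk-creating loan bookings against a
// hypothetical REST endpoint to approximate years of system usage.
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class LoanDataSeeder {
    public static void main(String[] args) throws Exception {
        HttpClient client = HttpClient.newHttpClient();
        int loanBookings = 40_000;                  // roughly three years of bookings

        for (int i = 1; i <= loanBookings; i++) {
            String payload = String.format(
                    "{\"borrowerId\": %d, \"warehouseId\": %d, \"amount\": %d}",
                    (i % 5_000) + 1,                // spread across 5,000 borrowers
                    (i % 300) + 1,                  // and 300 warehouses
                    100_000 + (i % 50) * 1_000);    // vary the loan amount
            HttpRequest request = HttpRequest.newBuilder()
                    .uri(URI.create("http://test-env.example/loans/book"))  // hypothetical URL
                    .header("Content-Type", "application/json")
                    .POST(HttpRequest.BodyPublishers.ofString(payload))
                    .build();
            HttpResponse<String> response =
                    client.send(request, HttpResponse.BodyHandlers.ofString());
            if (response.statusCode() != 200) {
                System.err.println("Booking " + i + " failed: " + response.statusCode());
            }
        }
    }
}
```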

The interesting part was the set of functional issues that surfaced during data creation. The customer couldn’t have been happier; the product was supposed to have already been tested thoroughly for functionality. Once these were fixed, we were prepared for the next set of performance-related issues. Steadily, one step at a time, we ensured the avalanche would not occur for the next three years.

HyBIST implementation benefits more than just testing

We have been talking and advocating on various platforms about how the scientific approach of HyBIST and the method, STEM, deliver key business value propositions. This time we thought it would be prudent to share our experience of implementing these in projects and to convey the results, as well as the interesting benefits they bring, while delivering clean software to our customers.

HyBIST was applied on various projects executed on diverse technologies, in a variety of domains, across different phases of the product life cycle. The people involved ranged from those with no experience to those with 5 years of experience.

We observed that HyBIST can be plugged into any stage or situation of a project for a specific need, and one can quickly see the results and get the benefits desired at that stage.

Our experience and analysis showed varied benefits: rapid reduction in ramp-up time, creation of assets for learning, a consistent way of delivering quality results even with less experienced team members, increased test coverage, scientific estimation of test effort, optimization of regression test effort/time/cost, and selection of the right, minimal set of test cases for automation and hence a reduction in its development effort/time/cost.

Following are the key metrics and the results/benefits achieved in some of the projects where HyBIST was implemented –

Project 1:

Domain: SAAS / Bidding Software

Technology – Java, Apache (Web server), Jboss (App server), Oracle 9i with cluster server on Linux

Lines of Code – 308786

Project Duration – 4 months

D1, D2 and D4 were done almost in parallel due to time constraints for this complete application that was developed from scratch.

  • D1 – Business Value Understanding (Total effort of 180 person hours)
    • 3 persons with 3+ years of experience were involved (none had prior experience in this particular domain)
    • 4 main modules with 25 features listed.
    • Landscaping, Viewpoints, Use cases, Interaction matrix (IM) were done.
    • D1 evolved and was developed by asking a lot of questions of the design/dev team.
  • D2 – Defect Hypothesis (Total effort of 48 person hours)
    • 3 persons with 3+ years of experience were involved.
    • 255 potential defects were listed.
  • D4 – Test Design (Total effort of 1080 person hours)
    • 3 persons with 3+ years of experience were involved.
    • Applied decision tables (DT) for designing test scenarios.
    • In total, 10,750 test cases were designed and documented.
    • Of these, 7,468 (69%) were positive test cases and 3,282 (31%) were negative test cases.
    • Requirement Traceability Matrix (RTM) and Fault Traceability Matrix (FTM) were prepared.
  • D8 – Test Execution (Total effort of 3240 person hours)
    • 9 persons were involved in test execution and bug reporting/bug fixes verification (3 persons with 3+ yrs experience and 6 persons with 2+ yrs experience).
    • 12 builds were tested in 3 iterations and 4 cycles.
    • In total, 2,500 bugs were logged, of which 500 were of high severity.

Key benefits:

  • No bugs were found in UAT.
  • All change requests raised by the QA team were accepted by the customer and the dev team.
  • The interaction matrix was very useful for selecting test cases for regression testing, and also for selecting the right, minimal set of test cases for automating sanity testing.
  • Since regression testing windows were short, around 2 to 3 days, the interaction matrix was quite useful for doing optimal and effective regression testing.
  • The method, structure and templates (IM, DT, RTM, FTM, test case, reporting) used and developed in this project are being used as a reference model for other projects at this customer.

Project 2:

A web service with 5 features that had frequent enhancements and bug fixes (Maintenance)

Technology – Java, Apache Web Server

Project Duration – 4 weeks

  • D1 – Business Value Understanding (Effort of 6 hours)

Mind mapping of the product and also the impact of other services & usage on this service

  • D2 – Defect Hypothesis (Effort of 5 hours)

Listed 118 potential defects

Key Benefits:

  • Preparation of the D1 document brought the ramp-up time for new members (developers/testers) to understand the product down from the earlier 16 hours to 4 hours.
  • Any member added to this team was productive from day one and could start testing in any regression cycle for enhancements and bug fixes.
  • Listing potential defects enabled us to add test cases that were missing from the existing test case set.

Project 3:

Domain – E-learning

Technology – ASP.Net, IIS, SQL Server, Windows

Validation of a new feature added to the product

Duration – 2 weeks

  • D1 – Business Value Understanding (Effort of 5 hours)

Understood the feature by asking questions and interacting with development team over emails/conf calls

  • D2 – Defect Hypothesis (Effort of 2 hours)

Listed 130 Potential defects by thinking from various perspectives

  • D4 – Test Design (Effort of 16 hours)

Designed and documented 129 test cases

  • D8 – Test Execution (Effort for test execution – 626 person hours; effort for bug reporting/bug-fix verification – 144 person hours)

Executed test cases by performing 2 cycles of testing and 2 regression cycles

8 new test cases were added while executing the test cases

31 bugs were found in test execution, of which 23 were of high severity. 29 of the bugs could be linked to potential defects visualized and listed earlier. 2 of the bugs found were not linked to any documented test case.

Key Benefits:

  • Arrived at a consistent way of understanding the feature and designing test cases for new features irrespective of the experience of the team member involved

Project 4:

Domain – Video Streaming

Technology – C++, PHP, Apache, MySQL, Linux

An evolving new product in very initial cycles of development/testing

Duration – 4 weeks

People Involved – 2 Fresh test engineers (No previous work experience but trained in HyBIST/STEM)

  • D1 – Business Value Understanding (Effort of 32 hours)

The understanding of the product in the form of listing features/sub features, Landscaping, Critical quality attributes, Usage environment/Use cases, by questioning

  • D2 – Defect Hypothesis (Effort of 40 hours)

Listing of over 150 potential defects

Key Benefits:

  • 2 fresh engineers could understand and comprehend the product features, the business flow and its usage in a scientific manner, and also document it. They could also think about and visualize possible defects, enabling them to come up with the test cases needed to identify and eliminate defects.
  • The process of the fresh engineers doing D1 and D2 generated a lot of useful questions that gave the senior engineers better thinking, understanding and different perspectives on the product behavior. This helped them design more interesting test cases to capture defects during test execution.
  • The assets created as D1 and D2 are helping the other members of the team quickly ramp up on the product features and get a detailed understanding in 50% less time.

Project 5:

Domain – Telecom protocol

3GPP TS 25.322 V9.1.0 Standards

Estimate the effort for the complete test design of the RLC protocol by going through the existing very high-level test specifications and designing test cases for 2 sample functions

Duration – 3 weeks

None of the persons involved had any previous experience in testing protocol stacks.

  • D1 – Business Value Understanding (Effort of 40 person hours)

Went through the generic RLC standard and, in particular, understood the 2 functions: sequence number check and single-side re-establishment in AM mode

Prepared flow charts with data/message flows between different layers

Prepared box model illustrating various inputs, actions, outputs and external parameters

  • D2 – Defect Hypothesis (Effort of 16 person hours)

Listed 28 generic potential defects and the defect types

  • D4 – Test Design (Effort of 48 person hours)

Prepared 2 input tables and 2 decision tables

Designed 26 test scenarios (6 positive, 20 negative) and 44 test cases (6 positive, 38 negative)

  • Performed gap analysis of missing test cases in the customer’s test specification document for the 2 functions (Effort of 12 person hours)
  • Estimated time and effort for the complete RLC test case design, based on the above data (Effort of 4 person hours)

Key benefits:

  • Performed gap analysis in the existing high level test specs
  • 20 times more test cases were designed for the 2 functions covered
  • 86% of the test cases added were negative type
  • Test cases developed were detailed and covered various combinations of inputs, parameters and intended/unintended behaviors
  • The test cases developed were suitable for easy conversion to test scripts using any tool
  • Performed estimation for RLC test design covering 22 functions
  • 256 Test scenarios and 1056 test cases to be designed with effort of 446 person hours for RLC

Project 6:

Domain – Retail

Validate railway booking software on point of sale device

Technology – Java

Duration – 4 weeks

  • D1 – Business value understanding (Effort of 16 person hours)

Documented software overview, features list, use cases list, features interaction matrix, value prioritization and cleanliness criteria

  • D2 – Defect Hypothesis (Effort of 18 person hours)

Listed 20 potential defects by applying negative thinking and 54 potential defects by applying the Error-Fault-Failure model. The potential defects were categorized into 46 defect types and mapped to the features listed.

  • D3 – Test Strategy (Effort of 6 person hours)

Based on the listed potential defect types, arrived at the test types, levels of quality and test design techniques needed as part of test strategy/planning: quality level 1 (input validation and GUI validation), quality level 2 (feature correctness), quality level 3 (stated quality attributes) and quality level 4 (use-case correctness).

  • D4 – Test Design (Effort of 24 person hours)

Designed and documented 30 test scenarios (15 positive, 15 negative) and 268 test cases (197 positive, 71 negative) for quality level 1, 70 test scenarios (21 positive, 49 negative) and 123 test cases (55 positive, 68 negative) for quality level 2 and 8 test scenarios for quality level 4. Created box models and decision tables to arrive at test scenarios

Prepared requirement traceability and fault traceability matrices

  • D8 – Test Execution (Effort of 32 person hours)

Out of the 293 test cases designed, 271 were executed and 22 could not be executed.

52 defects (27 high, 12 medium, 13 low) and 8 suggestions were logged.

Quality level 1 – 23 defects (2 high, 11 medium, 10 low) and 2 suggestions

Quality level 2 – 27 defects (25 high, 2 low) and 6 suggestions

Quality level 4 – 2 defects (2 medium)

Key Benefits:

  • Complete validation of the product was performed successfully by one senior engineer guiding 2 fresh test engineers with no previous work experience; none of them had experience in this particular domain
  • All the suggestions logged were accepted and valued

Project 7:

Domain – Mobile gaming

Technology – Java, Symbian

Duration – 3 weeks

  • D1 – Business Value Understanding (Effort of 16 person hours)

Achieved product understanding by documenting software overview, technology, environment of usage, features list, use cases list, mapping of use cases to features, features interaction matrix, value prioritization and cleanliness criteria.

  • D2 – Defect Hypothesis (Effort of 16 person hours)

Listed 96 potential defects by categorizing issues related to installation, download, invoking application, connectivity, input validation, Search, subscription, authorization, configuration, control, dependency, pause/resumption, performance, memory.

Mapped the features to the potential defects

  • D3 – Test Strategy (Effort of 4 hours)

Based on the listed potential defect types, arrived at 4 levels of quality (game initialization and invoking correctness, game subscription correctness, game download correctness, dependency correctness). The different test types were mapped to the 4 quality levels.

  • D4 – Test Design (Effort of 20 person hours)

Box models and decision tables were created

Designed and documented 37 test scenarios and 66 test cases

Key Benefits:

  • Complete validation of the product was performed successfully by one senior engineer guiding 2 fresh test engineers with no previous work experience; none of them had experience in this particular domain
  • The assets created here became a useful reference for understanding and validating other projects in mobile gaming software

Guy Fawkes – Beautiful fireworks, not a blast!

We had an interesting challenge posed to us by a large UK-based government health organization: to assess whether their large health-related e-learning portal would indeed support 20,000 concurrent users (they have 800K registered users) and deliver good performance. There was a cost constraint, and hence we decided to use the open-source tool JMeter.

The open-source toolset has its own idiosyncrasies – a maximum 1 GB heap size, support for only a few thousand users per machine, and a nasty habit of generating large logs! To simulate the load of initially 20,000 and later 37,000 concurrent users, we had to use close to 40 load generators and synchronize them.

We identified the usage patterns and then created the load profile scientifically using the STEM core concept of “Operational Profiling”. We generated the scripts, identified the data requirements, populated the data and set up synchronized load generators. During this process we also discovered interesting client-side scripting, which we flattened into our scripts. Now we were ready to rock and roll.
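As a rough illustration of what an operational profile boils down to, the small Java sketch below distributes a target concurrency across scenarios in proportion to their usage share; the scenario names and percentages are assumptions, not the portal's actual profile.

```java
// Illustrative sketch: turning assumed usage shares into a load distribution.
import java.util.LinkedHashMap;
import java.util.Map;

public class OperationalProfile {
    public static void main(String[] args) {
        int totalConcurrentUsers = 20_000;

        Map<String, Double> usageShare = new LinkedHashMap<>();
        usageShare.put("browse-course-catalogue", 0.40);
        usageShare.put("play-elearning-module",   0.30);
        usageShare.put("search",                  0.20);
        usageShare.put("update-profile",          0.10);

        // Allocate virtual users per scenario in proportion to usage.
        usageShare.forEach((scenario, share) -> {
            long users = Math.round(totalConcurrentUsers * share);
            System.out.printf("%-26s %6d virtual users%n", scenario, users);
        });
    }
}
```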

When we turned on the load generators, sparks flew and the system spewed out enormous logs – 3-6 million lines, approximately 400-600 MB! We wrote a special utility to rapidly search for the needle in the haystack. We found database deadlocks, fat content and heavy client-side logic. The system monitors were also off the chart, and the bandwidth choked!
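The utility we wrote is not reproduced here; the Java sketch below merely illustrates the idea of streaming a very large log and flagging lines that match suspicious patterns, with the patterns and default file name as assumptions.

```java
// Illustrative sketch: scan a multi-hundred-MB log for suspicious lines.
import java.io.BufferedReader;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.regex.Pattern;

public class LogScanner {
    private static final Pattern SUSPECT =
            Pattern.compile("deadlock|timed out|outofmemory|sqlexception",
                    Pattern.CASE_INSENSITIVE);

    public static void main(String[] args) throws Exception {
        Path log = Path.of(args.length > 0 ? args[0] : "server.log");
        long lineNo = 0;
        try (BufferedReader reader = Files.newBufferedReader(log, StandardCharsets.UTF_8)) {
            String line;
            while ((line = reader.readLine()) != null) {
                lineNo++;
                if (SUSPECT.matcher(line).find()) {
                    System.out.println(lineNo + ": " + line);
                }
            }
        }
    }
}
```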

Working closely with the development team, we helped them identify bottlenecks. This resulted in query, content and client-side logic optimization. Now the system monitors were under control and the deployed bandwidth was good enough to support the 20,000 concurrent user load with good performance. To support higher loads in the future, the system was checked with nearly twice this load, and the additional resources needed to support it were identified.

The five weeks that we spent on this were great! (Hmmm – tough times over at last!)