Do you work well under pressure? Experience Report of WT15

Date: 21st Nov 2009
Testers: Ajay Balamurugadas, Shakthi Bhuvaneshwaran, Deepak Dhananjay, Karan Indra, Tejas Shah, Sunil Gatadi, Jaswinder Nagi, Veerendra, Poulami Ghosh, Sireesha Kolluri
Testing Session: 3pm – 4pm IST
Discussion Session: 4pm – 5pm IST
Application: http://sourceforge.net/projects/freekeygenera/
A small application: a very quick password generator.
Mission: Your Test Manager has asked for a detailed test plan as to how you will test this particular application.
This session had a surprise twist, and no tester knew about it until it occurred.

So there we were, ten testers in all, ready to take on our new mission.
A lot of questions were raised:
> Which format should we use? Is any sample format available?
> What are the requirements? Which OS must be used for testing?
> How many of the testers started testing the application?
> How many testers started noting down their observations?
> What OS and environment are to be used?

Some of the testers were stuck because they wanted requirements before they could create a test plan.
Twenty minutes had passed and THERE WAS A CHANGE IN PLAN.

Test Manager: “The client does not want the Test Plan. Instead, he wants to know how the application would be tested on a particular OS.” And there were just forty minutes left to complete the task.

Some testers were so busy designing the test plan that they did not pay attention to the change in plan.

How important is it to be aware of the changes that take place around you?

Some testers were not aware of the new developments in the chat room and kept designing a test plan that was of no use to the client.

One by one testers left.
It is sad to see testers abandon tough testing sessions.

Karan and Sireesha were busy preparing a detailed test plan, as they emphasized the process.
Some of the areas covered included Contents, Risks, Time needed, Testers involved, etc.

Some more testers left without any intimation.

It was left to Deepak, Shakthi, Karan and Ajay (facilitator) to start the discussion.

Karan listed out his Test Plan contents: Testing Objective, Test Items, Features to be tested, Features not to be tested, Testing Approach, Testing Tasks, Pass/Fail Criteria, Entry and Exit Criteria for testing, Suspension and Resumption Criteria, Test Environment, Schedule, Staffing and responsibilities, Risks and mitigation plan, Test deliverables.

The questions posed to Karan included:
> Why would the client bother about the Testing Objective, Testing Approach, Pass/Fail criteria, Entry and Exit Criteria, Suspension and Resumption Criteria, Test Environment, Schedule, Staffing and responsibilities, Risks and Mitigation plan and Test Deliverables?
> How was the experience of the whole exercise?
> Why did the client reject the test plan?

As the mission was to know “How the application would be tested on a particular OS”, some points definitely could have been skipped.

Karan was extremely happy with this task. He was at ease throughout the testing session. According to him, this was an exact replica of his organization’s day-to-day activities. Karan understood why only parts of his test report were accepted by the client 🙂

Shakthi was ready to share his experiences. It was very surprising to learn that Shakthi had not even run the exe available on the website. He had tested the website instead: the speed of downloading the exe on different internet connections and in different browsers. None of this was related to the mission.

Shakthi realized his mistake pretty soon and blamed multitasking as the main culprit: he was testing from his office.

Deepak came closest to meeting the client’s requirements. As the client wanted to know how the application would be tested on a particular OS, he had listed out his test ideas and quality criteria.
Product elements were also mentioned.

> Structure: installed folders, logs, code
> Functions: keyboard tour, menu tour, mouse tour
> Data: valid, invalid, min, max, zero, default, negative, large, small (see the sketch after this list)
> Platform: the particular OS
> Operations: scenario of creating a password
> Capability: test the functions
> Reliability: generate a key
> Usability: how easy is it to create a key?
> Security: is any login or password required?
> Scalability: next and previous versions
> Performance: how slow or fast is it with maximum digits, minimum digits, no digits, characters or special characters as input?
> Supportability: is support available, including after release?
> Function testing: different functions
> Stress testing: huge input
> Claims testing: help about the product on the internet
> User testing: creating a key
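
The Data ideas above (valid, invalid, min, max, zero, negative, large) translate directly into boundary-value cases. Here is a minimal sketch of that translation in Python, written against a hypothetical generate_key(length) wrapper; the real tool is a downloadable GUI executable, so the code below is purely an illustration of the heuristic, not a test of the actual application.

    import random
    import string

    import pytest


    def generate_key(length):
        """Hypothetical stand-in for the key generator.

        The real application is a GUI executable, so this function exists
        only to illustrate how the boundary-value ideas could be automated.
        """
        if not isinstance(length, int) or length <= 0:
            raise ValueError("length must be a positive integer")
        chars = string.ascii_letters + string.digits
        return "".join(random.choice(chars) for _ in range(length))


    # Data heuristic from the session: valid, min, max, zero, negative, large.
    @pytest.mark.parametrize("length, should_succeed", [
        (8, True),      # a typical valid length
        (1, True),      # minimum
        (64, True),     # assumed maximum for this sketch
        (0, False),     # zero
        (-5, False),    # negative
        (10**5, True),  # very large input, may expose performance problems
    ])
    def test_key_length_boundaries(length, should_succeed):
        if should_succeed:
            assert len(generate_key(length)) == length
        else:
            with pytest.raises(ValueError):
                generate_key(length)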

Though the points above covered the most important parts of the mission, many of them were not appropriate for the task at hand.

Some points, like scalability, stress testing, and performance testing with character or special-character input, were simply not possible: there was no visible way to input data into the application.
Deepak was briefed about the importance of designing tests to suit and achieve the mission.

The customer is least bothered about tests that would not be applicable to his application.
Knowing which test to use is more important than knowing different tests.

Ajay highlighted the importance of using the ALLPAIRS Test Case Generation Tool by James Bach: http://www.satisfice.com/tools.shtml.
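
For readers who have not used pairwise generation, the idea can be shown with a short, self-contained Python sketch. This is not the ALLPAIRS tool itself, just a greedy illustration of the concept: instead of running every combination of parameters, pick a much smaller set of test cases in which every pair of values still appears together at least once. The OS, key-length and character-set values below are invented for the example.

    from itertools import combinations, product


    def pairwise_cases(params):
        """Greedy 2-way (pairwise) test case generator.

        params maps a parameter name to its list of values. The result is a
        list of test cases (dicts) in which every pair of values drawn from
        any two parameters appears together at least once.
        """
        names = list(params)
        idx_pairs = list(combinations(range(len(names)), 2))
        # Every value pair that still needs to appear in some test case.
        uncovered = {((i, va), (j, vb))
                     for i, j in idx_pairs
                     for va, vb in product(params[names[i]], params[names[j]])}
        cases = []
        while uncovered:
            # Brute-force greedy step: choose the full combination that covers
            # the most still-uncovered pairs (fine for small parameter sets).
            best = max(product(*(params[n] for n in names)),
                       key=lambda combo: sum(
                           ((i, combo[i]), (j, combo[j])) in uncovered
                           for i, j in idx_pairs))
            cases.append(dict(zip(names, best)))
            uncovered -= {((i, best[i]), (j, best[j])) for i, j in idx_pairs}
        return cases


    if __name__ == "__main__":
        # Invented example values, not taken from the session.
        params = {
            "OS": ["Windows XP", "Windows Vista", "Windows 7"],
            "Key length": ["min", "default", "max"],
            "Character set": ["letters", "digits", "mixed"],
        }
        cases = pairwise_cases(params)
        print(f"{len(cases)} pairwise cases instead of 27 exhaustive combinations:")
        for case in cases:
            print(case)

With three parameters of three values each, the 27 exhaustive combinations typically collapse to roughly ten pairwise cases; the downloadable tool automates the same idea.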

With the satisfaction of completing the session and the hope of participating in the next one, the testers left with their share of learning. See you in the next testing session.

About the Author