Wednesday, November 17, 2010

Samples - How to provide interview feedback - Candidates on Testing

Selected Candidates for next round:

Good concepts of Test Scenario and Test Case derivation from Use Cases. Has a good understanding of Test Coverage and Test Reporting. Has concepts of different kinds of testing. Has high-level awareness of the different sections of a Test Plan.
Has good domain knowledge of Financial - Trading. Shows interest and puts in effort to gain knowledge of the domain he works in.
Has experience with QTP and has clear concepts about whatever he has done so far.
Did not get much chance to work on SQL and hence lacks SQL skills.


Has a fair idea of the Test Plan
Has exposure to test case derivation and the Bug Life cycle
Has concepts of Functional, Regression, Unit and Integration testing
Is aware of Test Suite setup and Test data creation
Has scripting knowledge (VBScript)
Confident about what he has done; otherwise clearly says he does not know; does not try to bluff
Has a good positive attitude and willingness to learn
Current role is writing code to test


Has good concepts of Test Scenario and Test Case derivation; well aware of Test Reporting and Test Coverage.
Has good high-level ideas and awareness of the different sections of a Test Plan, with 2.5 yrs of experience
Does not have exposure to creating the Test Environment, but has worked on VMware tools and has clear concepts of them.
Beginner-level concepts of QTP automation.
Fair knowledge of SQL


Has 1 year of experience in development (Oracle 7.3, Developer 200) and 4 years of experience in testing
Has a fair understanding of the Test Plan; used to prepare the draft version of the Test Plan
Is familiar with Usability, Regression, Functional and Smoke testing
Has a good understanding of the Bug life cycle
Has basic concepts of SQL databases and is fair at writing SQL queries.
Has exposure to black-box testing but no exposure to white-box testing
Is familiar with the V model of testing methodology.
Has a fair understanding of preparing a Test Summary Report.
Good in automation with the WATIR tool using the Ruby language


1. Good testing concepts; has exposure to Functional, Regression and GUI testing; has the concept of performance testing but did not get a chance to work on it
2. Has good exposure to deriving test cases and test scenarios; has not been involved in Test Plan preparation but is aware of the sections of a Test Plan
3. Good communication skills - has the ability to express and explain clearly
4. Has knowledge of Status Reporting
5. DBMS - has basic knowledge of SQL query writing
6. Has worked on scripting and automation using WinRunner and QTP
7. Has analytical and problem-solving skills


Has very good knowledge of testing methodologies and testing concepts (with 3 yrs of experience); has good concepts of Test Plan preparation and deriving test cases
Has good concepts of Bug Reporting and the Bug Life cycle
Has average knowledge of Status Reporting
Knowledge of SQL queries - Poor
QTP Automation - knows Record and Playback - no scripting knowledge
Communication - Good - has the ability to express clearly


1. Well aware of different SDLCs, e.g. Waterfall and Agile.
2. Excellent communication skills
3. Excellent attitude - always looks for how customer expectations can be met and exceeded.
4. Very eager to learn new technologies and tools
5. Testing concepts are very clear, and technically good.
6. Very good interpersonal skills; can be a very good team player.
7. Has exposure to SQL databases.
8. Has a positive attitude.


During the course of the interview I felt he has a good positive attitude and an eagerness to learn new things. Given a situation, he shows flexibility by saying that he has no problem acquiring new skills and working in a field where his expertise does not lie. He tries to keep himself aware of areas even when he has not had a chance to work in them; for example, though he was not involved in Test Plan writing, he is quite aware of the different sections of a Test Plan. Otherwise he is clear on general testing concepts, the Bug life cycle, status reporting and technical knowledge.
Overall I feel that with his positive attitude and eagerness to learn, and with minimal guidance, he can quickly pick up the skills where he is a little weak.


Rejected Candidates:

Has no exposure to preparation of a Test Plan or Test bed setup
Could not derive effective test cases given a scenario. Does not have the concepts of test case prioritization and testing coverage.
Has average knowledge of different types of testing. Does not have clear concepts.
Test Report knowledge is very limited
Automation knowledge is below average. Seems to be only theoretical knowledge, not practical hands-on experience
No knowledge of SQL databases
Communication - Average


Does not have much awareness of Test Planning or Test data creation, though both are mentioned in his resume
Could not derive good test cases when given a scenario
Does not have good troubleshooting and analytical skills.
Has average knowledge of Functional and Regression testing
Does not have the basics of SQL clear
Limited exposure to automation - using the iTKO Lisa tool; does not have exposure to QTP though it is mentioned in his resume
Does not answer the questions asked to the point
Observed that many things were mentioned in the resume of which he has no exposure or awareness, e.g. Test Planning, Test Data Creation, Performance testing, QTP, etc.
Communication - Average


No exposure to or awareness of the test plan
Average knowledge of test case derivation - mostly theoretical; could not answer properly when given a scenario to derive from
Has average knowledge of Functional and Regression testing, and has no concept of Unit and Integration testing although he claims he has worked on Unit testing
Average knowledge of the Bug life cycle
Poor on SQL databases
Test Report knowledge is also very limited
Seems to be not very flexible
Does not know about the concepts of prioritizing test cases, the traceability matrix, the necessity of a configuration tool, etc.
Communication - Good


Average knowledge of different types of testing. Could not properly explain functional, system, sanity and smoke testing concepts.
Poor SQL knowledge; could not explain why primary and foreign keys are required. Also could not answer simple SQL queries
No exposure to Test Plan preparation or Test bed setup
The candidate does not answer to the point; even when he is unaware of the answer, he still says something that was not asked at all
Communication - Average


Limited exposure to preparing a test plan, test cases and a test bed
Has average knowledge of Functional, Load and Stress testing
Could not answer some scenario-based questions up to the mark
No knowledge of SQL databases
Test Report knowledge is also very limited
Communication - Ok


Has awareness of the different sections of a Test Plan
Has a fair understanding of the V Model
Has exposure to Functional and Regression testing and test case derivation
No exposure to test environment setup
Has a basic understanding of SQL databases but is not very familiar with query writing
Has experience with QTP 9.2 and QC 9.2
Communication - Average


Limited awareness of test plan preparation and test case derivation
Average knowledge of test case derivation, testing coverage, prioritization of testing and Test Reporting
Average knowledge of the Bug life cycle
Does not have clear concepts of the different kinds of testing and where each occurs in the testing life cycle
Could not explain any testing methodology properly
No exposure to setting up the test environment
Limited knowledge of automation using Rational Robot
No knowledge of SQL databases
Communication - Average


Has average awareness of Test Planning
Has good exposure to Functional and Regression testing. Could derive test cases when given a scenario.
Has average knowledge of Test coverage and Test Reporting
Has good knowledge of the Bug life cycle.
SQL knowledge is not up to the mark.
Does not have much exposure to automation. Has undergone training on QTP but has no hands-on experience.
Communication - Good


No exposure to or awareness of the test plan
Average knowledge of test case derivation - mostly theoretical; could not answer properly when given a scenario to derive from
Has average knowledge of Functional and Regression testing, and has no concept of Unit and Integration testing although he claims he has worked on Unit testing
Average knowledge of the Bug life cycle
Could not answer some scenario-based questions up to the mark
Fair knowledge of SQL databases
Test Report knowledge is also very limited
Communication - Ok
Hence I can say the candidate is below average on QA concepts and will not be suitable for the position.


The candidate is good at Manual Testing but lacks skills in Automation and SQL. Moreover, the candidate expects a lead role managing 4-5 members, and also mentioned that currently he is more into assigning tasks to junior members. As we are looking for an individual-contributor role with an interest in hands-on work, the candidate will not be suitable for our team.


Does not know the difference between Functional and Regression testing - carries out the same set of tests against all the builds - answered that the first run is called functional and from the second run onwards it is called regression testing
No concept of configuration management tools - answered that they are used only for security/authorization purposes
SQL databases - no exposure to SQL queries; did not even attempt one
Poor knowledge of Windows - does not know how to kill a process, has no concept of the Registry, does not know how to find out the IP of the machine
Role is only test case writing and execution - no exposure to test environment creation [a separate team deploys the build and passes the test environment to the test team]
Could not answer some scenario-based questions:
What will you do if you raise a bug and the dev team says it is not a valid bug?
What would you do if time does not permit executing all the test cases?
Test Report knowledge is limited to bug reporting only
No direct interaction with developers - the Lead was doing all the interaction.


Strong points:
The candidate has good exposure to different kinds of testing, e.g. functional, regression and a little performance testing; though he has not carried it out in a very structured manner, he has good concepts.
He has a good understanding of basic testing concepts, e.g. deriving test scenarios/test cases, test case prioritization, test coverage and the Bug life cycle.

Weak points:
SQL exposure is very minimal.
Did not get any chance to work on Automation. Explored QTP out of his own interest.

Soft skills:
Communication - ok. The candidate seems to be very energetic and has worked in a close-knit project group (dev and QE) in a start-up company. Also seemed eager to learn new things.

Conclusion: I felt the candidate can be moulded and would be flexible and eager to acquire new skills as the project demands, as he is early in his career.


Has exposure to deriving test cases. No exposure to deriving test scenarios and no involvement in Test planning (with 4 yrs of experience)
Has average knowledge of different kinds of testing
Has a good understanding of Bug Reporting and the Bug Life cycle
Has average knowledge of Status Reporting
Poor knowledge of SQL queries
Has average knowledge of QTP automation
Communication – Good – has ability to express clearly


Has experience working in an onsite-offshore model. Has experience transitioning work from onsite to offshore: being at onsite, taking all the knowledge transfer, and planning and preparing the transition estimation
Has experience with QTP, including demonstrating the automation of a sub-product as a PoC to the customer
Has hands-on experience with WinRunner and leading a team in automation
Has experience in Performance testing using WAS
Has experience in Manual Testing (Integration and Regression testing)
Has experience in the Insurance, Education and internal-applications domains
Is ISTQB certified (Advanced - Test Manager)


1. Has good hands-on exposure to both Manual and Automation testing. Thinks out of the box and comes up with ideas. Open to both kinds of testing (manual and automation). Also has some experience in Performance Testing; has worked on tools such as WAS and LoadRunner.
2. Has a good understanding of different types of testing.
3. Has good experience with QTP, the .NET Framework and VB on the automation side, and can guide the team on the automation front.
4. Answers to the point and clearly says when he does not know.
5. Has experience in status reporting and interacting with clients. Keeps the client's perspective in mind.
6. Weak area - did not get satisfactory answers on Test Planning, Estimation, Task identification and Scheduling.

Friday, November 12, 2010

Migrating the test cases from unmanageable to manageable

For a large project, it is normally found that the test cases grow to a large extent over time, especially release after release. However, with the tight schedule of ongoing project delivery, the test cases are often not maintained properly. Then one fine morning you discover that the test cases have become unmanageable. It is now cumbersome and drains a lot of effort to train a new joiner, especially when test scripts are not maintained with enough comments, when the purpose of writing a test case has not been captured in it, or when the functionality the test case is trying to test is not mentioned clearly. The test cases were just added and added during releases based on the requirements, without any thought of maintaining them for the future...

Hence it now becomes difficult to identify and pick some test cases out of the lot for the purpose of testing/regressing functionality xyz for Release xxxx (say).

Sounds familiar in your project too?

So managing the test cases becomes an important task to make a testing team more effective.

The problem seems more severe when you discover that higher management is not willing to deploy resources for this kind of activity, which was not done correctly in the past, and you have no choice but to take up this test case cleanup (which is itself a major effort) along with your critical project delivery, without hampering the delivery a bit. After all, you cannot jeopardize your ongoing project delivery at any cost. Can you :-)

Here I have cited a few approaches you can try when your project is in a similar condition and you want to take up the task of cleaning and properly managing the test cases from then on.

1. Prioritize test cases

First of all, identify the test cases as priority 1, 2 and 3 if this is not already done.
It is not feasible to take up all the test cases in one go along with your regular project work and clean them up. So the best way is to prioritize them and take them up phase-wise.

Suppose there are 1000 test cases: identify 200-odd test cases as priority 1, 400 as priority 2 and 400 as priority 3. Prioritizing is the first and foremost task.

Priority 1 test cases are test cases which test the very basic functionality/core functionality of the product.

e.g. for mobile handset testing, priority 1 test cases should include:
whether you can make a call / receive a call, send and receive messages, save a number,
dial a number and call, pick a number from the saved list and call, etc.

2. Categorize test cases

Also categorize your test cases into different folders, such as

Negative testing
Stress testing
Load Testing
Functional testing
Usability testing

If for a particular release the customer is expecting only an enhanced look and feel of the product, then you don't need to rack your brains searching through the Stress/Load testing test cases at all...
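Steps 1 and 2 work together: once each test case carries a priority and a category tag, picking the relevant subset for a release is a simple filter. A minimal sketch in Python (the test case names, priorities and categories here are all hypothetical, not from any real suite):

```python
from dataclasses import dataclass

@dataclass
class TestCase:
    name: str
    priority: int   # 1 = core functionality, 2, 3
    category: str   # e.g. "functional", "stress", "usability"

# A hypothetical sample suite.
suite = [
    TestCase("make_call", 1, "functional"),
    TestCase("receive_sms", 1, "functional"),
    TestCase("battery_drain_under_load", 3, "stress"),
    TestCase("menu_look_and_feel", 2, "usability"),
]

def select(suite, priority=None, category=None):
    """Pick only the test cases relevant to the current release."""
    return [tc for tc in suite
            if (priority is None or tc.priority <= priority)
            and (category is None or tc.category == category)]

# For a look-and-feel release, only usability cases are relevant.
ui_cases = select(suite, category="usability")
```

For a quick smoke pass you would call `select(suite, priority=1)` instead; the same tags drive both phase-wise cleanup and release-time selection.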

3. Brainstorming and finalization of the standards/approach

Give your team members one week. Ask them to come up with ideas on how to approach cleaning the test cases and maintaining them better in future.

You could also charge them up by linking this activity with rewards, e.g. the person who adds the most value gets a 1000/- voucher, etc. [Do not make the mistake of deciding whose contribution is best by yourself. Ask your team to decide. The best option would be deciding by a voting system.]

Once all the ideas are submitted, brainstorm on them as a team. Book a conference room for 3-4 hours and brainstorm exhaustively. Take up each idea, look at its applicability, pros and cons, etc., and finalize the approach. With those finalized ideas, set a guideline on the approaches you will take while cleaning/managing the test cases in future.

Modify 2-3 test cases according to those guidelines. Now send the guidelines and the 2-3 modified test cases for review. The review should be done by everyone from the junior-most engineer to the senior-most manager (from inside or outside the team - whomever you think can add value). Most importantly, give it to a new joiner and see whether he/she can understand the test cases on his/her own, i.e. whether they are self-explanatory (elaborated enough and capturing enough info).

Receive your review comments. I am sure you will need to do a lot of follow-up to get them!

Once you receive the review comments, modify the guidelines accordingly. Now set the baseline for the guideline.

4. Set the guideline as the Bible for your team from now on. (Whenever a new test case is added, this Bible should be followed.)

5. Take up the whole effort as a project, phase-wise

Take up the whole effort of migrating the test cases from unmanageable to manageable as a complete project.

Divide the project effort into 3 phases. You can take up the priority 1 test cases in phase 1 and try to implement the standards/guidelines on them first. That is, try putting in a small investment and see how it pays off.

6. Resource utilization (keeping in mind the ongoing release activities)

You can try this model of utilizing the resources.

You can deploy 1-2 full-time resources on this test case management project, depending upon the bandwidth and the criticality of your ongoing project delivery activities (basically, whatever makes sense given your ongoing delivery condition).

The rest of the resources, who are completely deployed on ongoing project delivery, can be given a long-term deadline, e.g. 5 test cases to be modified per person every two months. At a later point you may find that a significant amount of the effort has been absorbed alongside the ongoing projects, and you will also have inculcated a sense of responsibility for maintaining the test cases properly in all the team members.

7. Educating new joiners

Modify 1-2 test cases with maximum comments/info.
Explain the flow of the code to make them understand, and record your explanation.
Give the recorded file to the new joiners and see whether they face any difficulty in understanding the test cases on their own.

Thus you can save significant time for the existing team members, who no longer need to repeatedly explain the test cases every time a new member joins the team.

Also record how your test cases are segregated and how you are managing them, so that the new joiner need not ask the existing team members for any kind of info.


8. Few ideas (subject to applicability to your project)

a. Capture as much info as necessary in the test script:
who has written it
for what purpose
CQ ID
Requirement ID
functionality - what it is going to test, etc.

b. Create reusable modules/scripts - managing them will be easier, and writing new scripts will be easier too

c. Version control might help; you can go back to previous version and can run if the necessity arises

d. Always keep an eye on how you are developing the test scripts/test cases and how you can reuse the already existing scripts in future

e. Give proper access control for maintaining the test cases.
who will have read permission
who will have read/write permission
who can add new test cases etc etc.

f. Discard irrelevant/invalid test cases

g. Look back after each release and clean the system. Transfer test cases that are no longer relevant to an obsolete folder

h. Do a cost benefit analysis before diving into changing the whole suite.

i. Map requirements to test cases
You will be able to see at a glance how many test cases exist for a particular requirement. Also, based on the requirement, you can pick up the relevant test cases
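Such a requirement-to-test-case map doubles as a coverage check: any requirement with an empty list is a gap. A minimal sketch (all requirement and test case IDs here are hypothetical):

```python
# Hypothetical requirement -> test case traceability map.
traceability = {
    "REQ-101": ["TC-001", "TC-002", "TC-007"],
    "REQ-102": ["TC-003"],
    "REQ-103": [],  # a gap: no test case covers this requirement yet
}

def coverage_report(matrix):
    """Count test cases per requirement and flag uncovered ones."""
    counts = {req: len(tcs) for req, tcs in matrix.items()}
    uncovered = [req for req, tcs in matrix.items() if not tcs]
    return counts, uncovered

counts, uncovered = coverage_report(traceability)
```

When a release touches only REQ-102, the map immediately tells you which test cases to run, and the `uncovered` list tells you where new test cases are needed.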

j. Keep executed Test cases release wise in Quality Center - Test Lab Tab.

k. Collaborate with developers to keep user interface changes to a minimum, in order to minimize the rework effort of test case/script changes (of course, without compromising quality)

l. Last but not least, put a process in place for adding new test cases. Say one person is in charge of reviewing new test cases, i.e. a newly added test case needs to be reviewed and accepted by that person before it gets added into Quality Center. This will ensure the standards/guidelines are followed correctly until the process is streamlined among all the team members.

Else, who knows, you might end up doing the same kind of cleaning activity again in the near future!

Wednesday, August 11, 2010

QA/Testing Team’s Goal

A: stands for Objectives
B: stands for Measurement Criteria


1. Work Delivery and Process Compliance

A: Maintaining productivity
B: x% increase in productivity for the whole year
A: Delivering on or before the scheduled date with high quality.
B: # of instances of not meeting the schedule; Quality: less than x% Sev1 & Sev2 defects from TS/customer.
A: Adherence to the processes laid down in the team
B: # of instances of not following them
A: Timely escalation of Project specific issues.
B: # of instances done
A: Quality of Test Plan and Test Cases -
B: # of defects found in Test Plan & Test Cases.
A: Test Case writing
B: Number of test cases written per week
A: Test case execution
B: Number of test cases Executed per week
A: Defect Validation
B: Number of defects validated per week
A: Proactively add test cases wherever a gap is found.
B: # of instances done
A: Updating test cases at the end of every day after execution
B: # of instances found not updated
A: 100% coverage of requirements in Test cases
B: # of instances found not done
A: Number of defects filed - Quality Quotient (# of Sev1 * 4 + # of Sev2 * 3 + # of Sev3 * 2 + # of Sev4 * 1)
B: Rating obtained
A: Invalid defects should be few
B: Of the raised defects, not more than x% should be invalid (Duplicate, Not a Bug, User Misunderstanding/Working as Designed)
A: Clarity of the defects logged, which should include a clear description, steps to reproduce and appropriately filled other relevant details.
B: Queries raised by the Dev team
A: Timely review of the docs.
B: Number of instances of not giving review comments
A: Status reports should reach on time without follow-up
B: # of instances that required follow-up
A: Action item needs to be tracked and completed in stipulated time
B: # of instances not done
A: Shows flexibility to work (willing to work on different workstream, different kind of testing)
B: # of instances done
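The Quality Quotient weighting above (Sev1 * 4 + Sev2 * 3 + Sev3 * 2 + Sev4 * 1) can be sketched as a small calculation; the defect counts in the example are illustrative only:

```python
# Weights taken from the Quality Quotient goal: Sev1*4 + Sev2*3 + Sev3*2 + Sev4*1.
WEIGHTS = {1: 4, 2: 3, 3: 2, 4: 1}

def quality_quotient(defects_by_severity):
    """defects_by_severity maps severity (1-4) to the number of defects filed."""
    return sum(WEIGHTS[sev] * count for sev, count in defects_by_severity.items())

# e.g. 2 Sev1, 5 Sev2, 10 Sev3, 4 Sev4 -> 2*4 + 5*3 + 10*2 + 4*1 = 47
score = quality_quotient({1: 2, 2: 5, 3: 10, 4: 4})
```

The weighting rewards finding severe defects: one Sev1 counts as much as four Sev4s.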

2. Test Environment Creation

A: Take backup once the test environment is ready
B: Time taken to backup the environment
A: Ability to restore the backed-up environment at any point of time
B: Time taken to restore the environment
A: Contributes to the intranet, and the information is up to date
B: Number of contributions and quality of the contributions
A: Optimal use of software and hardware resources
B: Report on effective use of resources

3. Product Knowledge

A: Ability to simulate the production environment as closely as possible with the available software and hardware resources
B: Time taken to set up the environment
A: Ability to set up the end to end environment for testing and deploy the components
B: Time taken to set up the environment
A: Ability to diagnose customer issues and thus help the customer support people
B: # of times helped to diagnose customer issues
A: Ability to draw a comparison of the product's features against the competitor's product along the same lines
B: Depth of value added
A: Document the steps of installation with screenshots, along with a document on how to configure and how to overcome the problems faced during installation and configuration, and thus create a knowledge base
B: # of documents prepared
A: Ability to make judgments on the adequacy of testing coverage and the quality of defects
B: Reports on testing coverage
A: Ability to prioritize the test cases and identify blocking issues that must be fixed before the product goes live
B: Reports on this and # of instances done
A: Ability to optimize decisions pertaining to test execution based on the areas needing focus and the available timelines, identifying areas needing improved test coverage
B: # of instances done
A: Ability to strategize the test automation
B: # of times taken the initiative to automate
A: Keep track of one's own gain in product knowledge. It must increase every quarter
B: Progressive increase in product knowledge

4. Domain Knowledge

A: Ability to write test scenarios in line with the business workflows
B: # of instances done
A: Read online resources and come up with presentations
B: # of instances done
A: Ability to suggest new functionality which will help the end user
B: Suggestions made vs accepted
A: Be aware of upcoming functionality/technology on the domain front
B: Shares knowledge among team members
A: Ability to understand the entire business workflow, not only the functional knowledge of the products owned by the team
B: Clarifying queries
A: Ensure that the QA team analyzes the defects filed by customers in previous releases, makes sure the same scenarios are converted into test cases for the upcoming release, and gains domain knowledge of what the customers actually expect from the product.
B: # of instances done

5. Communication & Team Player

A: Communications should be timely, clear, concise and readily understood by intended recipient
B: No further clarification required
A: Clear and concise status reporting
B: No further clarification required
A: Timely escalation of issues / problems
B: # instances done
A: Maintaining good relationship with team members
B: Complaints from team members
A: Proactively helping the team members/cross team members in Project related activities.
B: Feedback needs to be taken from the cross team members
A: Actively participates in meetings and contributes.
B: # of instances done
A: Not disclosing confidential information (salary, appraisal rating, etc.)
B: # of instances done
A: Extending help to other teams in need, at team/organization level.
B: Feedback from others
A: Number of presentations given within the team/organization
B: # of instances done
A: Knowledge sharing and mentoring new joiners
B: Feedback needs to be taken from the person mentored

6. Training and Knowledge Sharing

A: Giving detailed trainings to new joiners by the scheduled date.
B: Feedback Mechanism
A: Training on tool/Technology
B: Feedback Mechanism
A: Attend trainings as per the identified training need
B: Shows increase in productivity/understanding on the related fields after attending the training
A: Participates in knowledge sharing session in team level
B: Feedback Mechanism

7. Ownership and Commitment

A: Showing a sense of ownership, takes complete ownership of the tasks given
B: # of instances done
A: Has a can-do attitude; if something cannot be done in a certain way, suggests alternatives
B: Suggestions made
A: Comes out with constructive ideas that can be achieved while waiting for a dependent task to complete, thus adding value by utilizing the time effectively
B: # of instances done
A: Proactively acts on tasks, informs about leave plans in advance, takes responsibility to educate the team members to handle things in case of his/her absence
B: # of instances not done

8. Innovation

A: Exploring innovative ways to accomplish work. Comes up with innovations/improvement areas
B: Number of suggestions made vs accepted
A: Helps to define new process and putting the process in place. Tries to improve the process efficiency
B: Number of suggestions given
A: Publish white papers on areas of interest
B: # instances done

9. Client Management

A: Client satisfaction
B: Accolades from client
A: Client communications and relationship
B: Bonding with client
A: Client negotiation
B: Negotiating effectively with client for win-win relationship
A: Awareness of client roadmap and strategy
B: Project planning and risk planning

Wednesday, July 7, 2010

Unusual ways to find bugs quickly

Have an informal chat with developers and try to find out:
1. What are the functionalities/functional areas they are struggling to code?
2. What are the functionalities for which the coding effort has been shared by more than one developer?
3. What are the last pieces of functionality they are stretching themselves to finish off within the deadline?
4. What are the functionalities where the requirements are not that clear, so that the developers are having more discussions with the product manager to know the expected behaviour and write the code accordingly?
5. What are the functionalities they coded after coming back from a long vacation?
6. What are the functionalities that were assigned to Developer A and later transferred to Developer B at short notice?
7. What are the functionalities coded by a developer who is actively looking for a job outside?
8. What are the functionalities coded by a developer who has already resigned?
9. What are the functionalities coded by a developer who is technically not very strong or is disorganized in nature?
10. What are the functionalities coded by a developer who is not focused enough (who goes for a cup of coffee or tea a number of times a day / is always on chat or social networking sites)?

Monday, June 28, 2010

Notes on daily activity list by a QA person


Here is a sample daily activity list/status for a new member (in a leadership or managerial position) of a QA team who has joined a new organization. Though it might not make complete sense to others (as it refers to pointers for a particular project/product, and the background is not given here), it should give a snapshot of how we can keep track of our daily activities. This is very helpful in the year-end/quarterly appraisal cycle, and is a must for a feel-good factor about your accomplishments.

05/06
Got an overview of the project from Mr. XX
05/07
Sent out a mail to Mr. Boss about some basic queries - scope of work, goals, communication interface, guidance, etc
Going through Technical architecture docs
Sent out a mail to the Onsite manager, letting him know my work plan
05/08
Took an interview along with Mr XX
Had a discussion with the Dev manager and got a very high-level understanding of the workstream
05/09
Attended Client presentation
05/12
Screened a resume and gave the feedback (Rejected) to Mr Boss
Preparing a PPT on the domain knowledge
05/13
Preparing a PPT on the domain knowledge
05/15
Attended Video conference with Onsite managers - counterpart
05/19
Interacted with another professional services QA team lead and got to know about QA for professional services
Sent out QA metrics. Added a couple of them to the existing list
05/20
Gave the presentation on domain knowledge
Interacted with dev team members and got an overview of Product MC
05/21
Had a walkthrough of Product MC, MR and the Application patch testing process by Mr XXX
Sent the analysis of product XAT defects to the onsite counterpart
Sent out the QA charter to Mr Boss
05/22
Attended Induction Program by the organization
05/23
Attended Induction Program by the organization
05/26
Sent a short para on Engineering QA processes/expertise/tools to the organization QA head
Working on Metrics finalization
Took a face-to-face interview - 1 team member
05/27
Leave
05/28
Discussed with Mr XX about the monitoring and tracking of the projects
Sent the QA metric Plan to Mr Boss (exercise time frame, priority to diff kinds of testing)
Received direction on Goal planning; started working on it
05/29
Sent out the modified QA charter
Goal Planning - working
05/30
Attended product architecture presentation by Dev team
Attended Governance presentation at (organizaton level for managers)
Took two interviews. Provided feedback comments
Coordinated with onsite manager to get two time slots for client interviews of the selected candidates
Completed Goal planning from a team as well as an individual perspective and sent to Mr Boss
06/02
Took two interviews (internal candidates) and provided comments
Came up with an RRF description for the QA position (TDD method) - took inputs from Dev manager and Mr Boss - and sent it to onsite manager
Goal planning - not received the feedback - followed up with Mr Boss
06/03
Sent out StabilityAnalysis matrix to Mr Boss
Goal planning - discussion with Mr Boss
Coordinated with onsite QA managers for time slot for a candidate
06/04
Attended product installation process session by Mr xx
Goal setting - cleaning up goals and making them realistic
06/05
Provided interview feedback comments for a candidate to onsite manager
Screened resumes for testing position (TDD method)
Sent the Goal planning - cleaned up and made more realistic - finalized from my side
06/06
Provided comments to Mr Boss on the automation test effort estimation by the professional services team
Helped team members and gave inputs for filling up the Goal sheet
06/09
Discussed with Mr Boss for each point - Goal planning
Meeting with Mr Boss and Mr XX - testing update
Gave comments for a candidate's profile - test position(TDD method)
New team member joined today
Fine-tuned the final goal sheet as applicable to each team member
Sent goal sheet to onsite manager for review
06/10
Created a document on Testing approach to take on a project from EDC for product line MR
Took two face-to-face interviews and provided feedback
06/11
Working on Test plan creation for MR Rel 52 patch, asked by onsite manager
Attended Security patch triaging meeting with onsite team members
06/12
Working on Test plan creation for MR Rel 52 patch, asked by onsite manager
Attended Bi weekly status meeting
06/13
Working on Test plan creation for MR Rel 52 patch, asked by onsite manager
Sent the test plan to Mr Boss
Sent the test plan to onsite manager
Had Goal planning discussion with onsite managers
06/16
Meeting with professional services QA folks; captured skill sets, current tasks and aspirations, and sent to onsite manager.
06/17
Product installation - attended the demo
Had a discussion with QA director
Discussed with onsite manager about the open items in Test plan
Going to have access to Test Director, Clear Quest, and the onsite server lab.
06/18
Followed up on team member working to get TD access
coordinated with Onsite manager for TD and CQ access
Sent the modified Goal sheet to onsite manager incorporating the review comments
06/19
PTO
06/20
Goal sheet sent to team, gave guidance to fill it up. Followed up to get the filled-up form from all of them
Prepared a checklist - resource utilization chart to make the project plan ready
Took one interview - gave feedback. Rejected.
06/23
Had one-on-one discussions with team members A, B, C
attended Testing Council Brown Bag Session and QA Automation Dashboard training
attended call with onsite coordinator about the setup of MR server
06/24
Had a discussion with professional services QA lead about engaging QA folks on Engg QA work.
Sent out a plan for the next 2 weeks on how I am going to engage team members
Goal sheet - completed for all the existing team members
Attended call with onsite coordinator - discussed resource utilization, info about MR rel 52 patch
06/25
Gave team members inputs to start -
Call with onsite coordinator - asked for functional testing info, QA utilization
Asked team member C to work on open source tool
06/26
attended Quality Center training - 2 hrs
Dev manager explained about the defects on MR Patch 62
Identified few gaps
Call with onsite coordinator
Attended Bi weekly status meeting
Took one interview
06/27
Quality Center training - 2 hrs
MC Brown bag session by Dev team lead - 1.5 hours
Completed identifying the gaps in testing by dev team and re-estimated the timelines.
Attended call with onsite coordinator - Review Gap analysis and MR Coverage
Took one interview - QA position for TDD method
06/30
MR Patch testing - coordinated team members and dev team
Could not access the Multibox environment; team member spent time on local boxes and others on understanding the product.
Verified and approved timesheets for the team members
QA-Dev team hand-off - document prepared by Mr Boss - review comments given
Call with onsite coordinator - MR testing progress

After one month......

07/29
Attended Corporate Banking Training
Prepared plan and sent to Onsite manager
Oversee R61 XAT testing
Sent to Mr. Boss: R61 XAT Estimation, Plan and PIN doc.
Asked onsite coordinator for Release Note - R61 testing
Sent out update on Steering committee meeting inputs to Mr. Boss
Took one interview and sent out feedback.
Sent Test Closure report to Onsite manager and Mr Boss - R51 Xat testing.
Sent status of R61 Xat testing to Onsite manager.
Sent a mail to onsite trainer - status on CB knowledge gain
07/30
Attended CB Training
Discussed with Mr. Boss about the open requisition (one more position) for QA Team
Sent a mail to onsite developer - about the plan for fixing reopened defects of R51 0
Sent out a mail to onsite manager about QA position (TDD method)
Modified the R61 Xat testing plan, as one resource will be going back to PS, and sent to Onsite manager.
Sent a mail to onsite manager about the resource utilization .
Sent status of R61 Xat testing to onsite manager.
Sent a mail to onsite manager about the identified resource for automation

07/31
Attended CB Training
Validated a resume for the suitability of TDD position.
Attended All Hands meeting
Asked onsite doc person about the Release note R61 Xat testing.
Coordinated with onsite person about the automation tool installation.
Asked onsite manager about a license for the tool.
Sent status of R61Xat testing to Onsite Manager.
Sent to Onsite manager about the confusion on the R61 Xat defect list. QA has one list and Dev has another.

08/01
Attended CB Training
Followed up with Developer about the comments on test cases on R61 Defects
Sent out everybody's goal sheet

08/04
Attended CB Training
Inputs to 2 team members - CB and following up
Inputs to team member - R61 Xat testing - following up - MR defects - likely to come.
Following up with 1 team member - R61 integration testing
Sent a mail to Onsite member - Subversion access
CQ access for 1 team member - sent a mail to onsite coordinator
Bill - Existing test cases for corporate Banking
Prepared test closure report for R61 Xat testing
Call with onsite manager

08/05
Attended CB training
Peg release defect analysis - 2 team members
Followed up with subversion access - Sent a mail to onsite manager
Xat upcoming defects assigned to 1 team member
Placed request for QC for 2 team members
Requested developer for backend-level testing session - CB training
Asked onsite person for R31 and R41 Test environment
Making sure all the access details - CQ, Subversion, etc for all the team members
Analysis of peg patch defects - observation sent to onsite manager

08/06
Appreciated 1 team member on R61 patch testing well ahead of the schedule
2 team members - High level test scenarios prepared
Shortlisted few resumes
Asked onsite team members to send us the sample input files for test
Coordinating with Onsite member for QC login
Took one interview and sent out feedback
Analysis of the defects peg patch - update - sent to onsite manager
Subversion - still not accessible - sent out a mail
Call with onsite manager
08/07
Making sure all the access details - CQ, Subversion, own login id - are in place for all the team members
Validated a resume
Updated bi weekly status update call and sent out ppt
Coordinated with onsite manager for a time slot for interviewing a candidate
Sharepoint site for uploading documents not accessible - sent out a mail to onsite coordinator
Prepared a plan for R31 and R41 Xat defect validation
Team member's test scenario validated and gave a go-ahead signal
Asked onsite person for Test environment details for peg patch testing
Sent onsite manager the PTO plan for the team
Took one interview - sent out the feedback
Perf rating for the team - excel sheet is in progress.

08/08
Sent out to Mr Boss - the perf rating of the team members/comments
Sent out to Onsite manager- estimation and plan for Peg patch testing
Sent out to Onsite manager - progress on R31 and R41 Xat defect validation status
Oversee patch testing and KT progress

Saturday, May 29, 2010

Are you a good manager? 100 questions to evaluate yourself.

Are you a good manager? 100 questions to evaluate yourself.

Communication:[Listening,Two-Way Feedback,Written Communication,Oral Communication,Oral Presentation,Vision/Goal Setting]

1. Do you listen to your team members patiently? Empathetically?
2. Do you send them a signal that you are actually listening, e.g. saying yes, ok, nodding your head, etc.?
3. Do you reword and ask them whether you have understood correctly, to assure them that you have listened and understood the problem carefully?
4. Do you give them an opportunity to express their concerns?
5. Do you carry out periodic 1:1s, ask them questions to get their feelings about what went well/badly, and capture their inputs on any areas of concern?

6. While taking a decision do you ask them "What do you feel?" and get their feedback?
7. Do you ask your team member's opinion for any approach?

8. Are you prompt in sending emails?
9. While communicating, do you make sure the thought is logically built?
10. Do you judge when written communication is required over oral communication and act accordingly?
11. Are your status reports easy to understand, and do they capture every bit of information?
12. Are you prompt in providing comments when your expertise is sought?
13. Do you conduct focused QA meetings?
14. Do you share the objective of an upcoming meeting with your team members?
15. Do you call stand-up meetings with the team/sub-team as and when required?
16. Do you coordinate with other stakeholders when your team members require any info?
17. Do you attend status calls when your presence is required?
18. Do you make a note of the important things you are going to say, organize the ideas, practice mentally beforehand, and speak with enthusiasm and confidence before your team members?

19. Do you give any presentations (technical/soft skill/report etc)?

20. Do you share the big picture with your team members?
21. Do you share the context and objective of the testing instead of just giving instructions on what to do?
22. Do you involve your team members in estimating effort?
23. Do you contribute towards value-add activities in the team apart from routine tasks?
24. Do you explain the best that can be done when the team is subject to constraints of different kinds?
25. Do you measure individual development plans and share the roadmap for achieving one's own goals?



Task Management [Planning/Organising,Delegation,Administrative Control,Performance Evaluation,Performance Management, Recognising/Rewarding]

26. Do you do any ground work/home work before presenting the information to the team?
27. Do you involve your team members for the decision making process?
28. Do you participate in brainstorming session when the area is completely new?
29. Do you share the data (schedule etc) with the team members (whether they are comfortable in executing and completing the tasks) before publishing it?
30. Do you sometimes ask for volunteers for a task that needs to be executed instead of directly allocating the resource?
31. Do you spend time in understanding the technical complexities, domain knowledge of the product and extend your technical help, domain knowledge to the team members?
32. Are you prompt in giving comments for documents written by your team members and your superiors?
33. Do you guide team members how they can prepare themselves when a new task is under the pipeline and utilize their time to best?
34. Do you co-ordinate with diff stake holders of the project and get the necessary information for the team members?
35. Do you keep the maturity of the team in mind and always look for opportunities to improve upon it? Do you capture data to establish that?
36. Are your status reports accurate and detailed?
37. Do you effectively utilize your resources?
38. Do you forecast for required resources based on the available information on the project (when you are handling multiple projects)?
39. Do you manage the leaves of the team members effectively? Availability of alternate resources to ensure smooth progress of the project? Plan for transitioning of required info to another person if someone goes for leave etc?
40. Do you formally/informally chat with the dev team/product management team to proactively get to know what might come next to QA's plate?
41. Do you follow up with the other teams to make sure your entry criteria are successfully validated to start the task without delay?
42. Do you try to find out how you can improve the efficiency of the team (by adopting any tool etc by applying proper cost benefit analysis)?

43. Do you encourage the team members to brainstorm among themselves and let you know the decision derived?
44. Do you sometimes give the team members the liberty to divide the tasks among themselves?
45. Do you sometimes give the team members the opportunity to directly communicate with the different stakeholders?
46. Do you see the opportunity to delegate enough responsibilities to your team members and empower them?
47. Do you apply judgement in deciding when to offload and delegate tasks to your team members?

48. Do you monitor the progress closely?
49. Do you try to find a plan-B task that a team member can execute when execution does not go as per the prepared plan, thus minimizing the risk?

50. Do you monitor the performance of the team members periodically and share with them their positive and improvement areas, areas of concern, etc?
51. Are you open to take suggestion from the team members?
52. Do you give timely feedback to the team members?
53. Do you focus on/highlight their strengths rather than their shortcomings/faults?

54. Do you encourage your team members and applaud their effort?
55. Do you always have time for your team members when they need to talk to you?
56. Do you always provide positive reinforcement to the team, which will boost their productivity?

57. Do you recognize your team members?
58. Do you nominate them for any award?
59. Do you appreciate your team members / applaud in the team meeting for their achievement?



Interpersonal [Team Development, Interpersonal Sensitivity, Conflict Management, Coaching, Leadership/Influence, Employee Involvement]

60. Do you make sure the person whom you are allocating the task has the necessary competency to execute that?
61. Do you encourage the team members to communicate and collaborate with fellow team members?
62. Do you encourage them to come up with their own ideas?
63. Do you always make them as part of project team and make them feel responsible for project success/failure?
64. Do you check the stability of the team members and take necessary action to address it?
65. Do you maintain a skill matrix in your team to deploy the team members effectively when a new task arrives?
66. Do you regularly conduct 1:1s with the team members, collect what went well, what they could do better, any concerns and their aspirations, and derive action items from there and execute them?
67. Do you identify, for each of the team members, their strong and their improvement areas?
68. Do you allocate some time for training for your team members depending on both project needs and individual interest?

69. Do you show them care and respect?
70. Do you show empathy for their personal problem?
72. Do you ever criticize/condemn your team members?
73. Do you make your team members feel important?
74. Do you smile enough and promote a healthy culture in your team?

75. Do you collect and derive a conclusion from the collected data for managing a conflict? Do you ever go by mere perception?

76. Do you measure what is your contribution towards coaching/grooming the team members?
77. Do you openly admit your shortcomings/lack of knowledge to your team? Do you help them find the right reference from your network of people in case you are not able to directly help them on a technical issue?
78. Do you make them self sufficient incrementally?

79. How do you influence the team members - by doing it yourself first, being the first to change?
80. Do you set an example by actually doing things yourself?
81. Do you discourage any kind of blame game?
82. Do you appreciate the effort of a team member who truly deserves it?
83. Are you always fair in approach?
84. Do you motivate your team members with an occasional motivational talk in the team meeting?
85. Do you ever take a share of a team member's achievement?
86. Do you respect team member's opinion/ideas? Do you ever say they are wrong?
87. Do you ever admit your mistake in front of the team?

88. Do you seek their feedback after putting your ideas?


Problem Solving [Problem Analysis, Decisiveness/Judgement]

89. Do you always try to understand the root cause of the problem and try to attack the root of the problem while solving it?
90. Do you ever try to attack the person instead of the problem?
91. Do you apply the solution and gain an experience out of it?
92. Do you say what you want to say without spoiling the relationship?

93. Do you always apply judgement before imposing tasks on the team members?
94. Do you take prompt decisions before the situation gets out of control?
95. Do you gather all the relevant information before taking a decision?
96. While taking a decision, do you list out all the possible solutions? Do you narrow them down based on pros and cons?
97. Do you think about the consequences as well while applying a solution?
98. Do you see from others' angles and judge whether a solution is applicable?
99. Are you comfortable handling project/people/operational issues?
100. Do you help team members to take any decision instead of giving direct orders?

Tuesday, April 6, 2010

Roles and Responsibilities of TechLead in a Testing Team

Roles and Responsibilities of TechLead in a Testing Team:
1. Be a technology mentor
2. Advanced knowledge in New technology
3. Advanced knowledge of various tools used in the project (testing tools, defect tracking tools, performance testing tools, Automation tools etc)
4. Understands the customer need and the production env, and guides team members about new support (e.g. Browser compatibility testing, Upgrade testing, operating system, Database versions etc)
5. Educates QE team members on technology areas, and coordinates with PM and dev to extract the required information for the QE team
6. Ability in Setting up Test env and troubleshooting and resolving issues faced by team members
7. Expertise in analysing market trends, available solutions, emerging customer needs etc.
8. Advanced knowledge of the related domains, and an ability to understand the domain from an end-user perspective and derive test scenarios/test cases from that angle
9. Ability in contributing to Test process improvements
10. Ability in taking ownership of Products/Modules/Major Features and making a go/no-go decision
11. Ability in coaching others on products through training and workshops
12. Ability in making a comparative study of the features/functionalities available in different products in the market and suggesting enhancements to the Product Management team, thus contributing to the product roadmap.
13. Advanced knowledge of operating systems, databases, web servers, IIS etc
14. Knowledge of CM procedures and build process - branching, release cycle, release numbering
15. Expertise in troubleshooting various tools used in the project
16. Expertise in leveraging capabilities and efficiently using third party tools to optimize testing
17. Mastery in evaluating and recommending the right tools to be used

Monday, March 29, 2010

Test Metrics - Description and Definition

Severity wise Defect distribution % :
Description: This report shows severity wise defect distribution
Definition: Severity 1 defect % = (Total number of Severity 1 defects/Total number of defects) *100 and so on

Status wise Defect distribution % :
Description: This report shows State wise defect distribution
Definition: Closed Defect % = (Total number of Closed Defects /Total number of defects )*100 and so on for other status of the Defects
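As a minimal sketch of the two distribution metrics above (assuming defect records are available as simple dicts; the `severity` and `status` field names and the sample data are hypothetical, for illustration only):

```python
from collections import Counter

def distribution_pct(defects, field):
    """Percentage distribution of defects over one field (e.g. severity
    or status): (count for each value / total defects) * 100."""
    counts = Counter(d[field] for d in defects)
    total = len(defects)
    return {value: count * 100.0 / total for value, count in counts.items()}

# Hypothetical defect list for illustration only.
defects = [
    {"severity": 1, "status": "Closed"},
    {"severity": 2, "status": "Open"},
    {"severity": 1, "status": "Closed"},
    {"severity": 3, "status": "Reopen"},
]

print(distribution_pct(defects, "severity"))  # severity 1 -> 50.0, etc.
print(distribution_pct(defects, "status"))    # Closed -> 50.0, etc.
```

The same helper covers both the severity-wise and status-wise reports, since each is just a percentage breakdown over a different defect attribute.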

Regression Defect % :
Description:This shows the ability to keep the product right while fixing defect
Definition: Regression % = ( no of regression bugs )*100 / Total no of bugs

Reopen defect %:
Description: Percentage of reopened bugs against total bugs
Definition: Reopen bugs % = (reopened bugs) * 100/ total bugs

Invalid / Duplicate defect %:
Description:Percentage of invalid and duplicate bugs against total bugs
Definition: Invalid & Duplicate bugs % = (invalid bugs + Duplicate bugs) * 100/ total bugs

Defect distribution %:
Description: This metric shows a defect distribution per different kind of testing eg Functional testing, Regression testing, Usability testing, Documentation, Installation/Uninstallation Testing etc
Definition: % of defects out of Functional testing = (Number of defects produced by functional testing/Total number of defects) *100

Root Cause Analysis on Customer found Defect and % distribution:
Description: This report shows a root cause analysis and percentage distribution of the customer found defects
Definition: Distribute it % wise as (Test case missing), (Platform not supported), (Not clear Documentation), (Not clear requirement), (Platform not covered) etc

Cumulative defects fixed per week:
Description: This shows the cumulative volume of defects that get fixed on a weekly basis. An input to the testing team to plan ahead for the validation effort
Definition: Sum of all defects fixed in the week + sum of all defects fixed till the previous week.

Defect Validation %:
Description: This report shows (Closed+Reopen) vs Fixed defects
Definition: (No of validated (closed + reopen) defects/No of fixed defects) *100

Defect Validation Rate:
Description:No of defects validated per week. This is a measure of QA productivity in terms of validating the fixed defects
Definition: No of defects validated per week per person = (total no. of validated defects) *40/(Total hours to validate the defects * no. of resources)

Cumulative defects closed per week:
Description: This shows the cumulative volume of defects that get closed on a weekly basis.
Definition: Sum of all defects closed in the week + sum of all defects closed till the previous week.

Bug Trend and Forecasting:
Description: This shows the inclination of uncovering a bug based on the past data and thus extrapolate it to forecast a value for future
Definition: E.g. Past 3 weeks moving average of weekly number of bugs
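The three-week moving average in the definition above can be sketched as follows (the function name and sample history are illustrative assumptions, not part of the original):

```python
def moving_average_forecast(weekly_bugs, window=3):
    """Forecast next week's bug count as the moving average of the
    last `window` weeks, per the trend definition above."""
    if len(weekly_bugs) < window:
        raise ValueError("not enough history for the chosen window")
    return sum(weekly_bugs[-window:]) / window

# Hypothetical weekly bug counts, oldest first.
history = [12, 9, 10, 8]
print(moving_average_forecast(history))  # (9 + 10 + 8) / 3 = 9.0
```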

Defect ageing report:
Description: This shows how many bugs are still open on different time period slots.
Definition: No of open bugs less than 1 month old, 1-3 months old, more than 3 months old, etc. as applicable

Defect Density:
Description: This shows the number of bugs found per person-month of dev effort
Definition: (Total no. of bugs) * 160/(Total man hours of planned dev effort)

Defect Removal Efficiency (DRE):
Description: This shows the efficiency of QA in finding bugs against the bugs found by customers
Definition: DRE = (Pre release Sev 1&2 bugs) * 100/ (Pre release sev 1&2 bugs + post Release sev 1&2 bugs)
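A direct transcription of the DRE formula above (the function name and the sample counts are illustrative):

```python
def defect_removal_efficiency(pre_release_sev12, post_release_sev12):
    """DRE = pre-release Sev 1&2 bugs * 100 /
             (pre-release + post-release Sev 1&2 bugs)."""
    total = pre_release_sev12 + post_release_sev12
    return pre_release_sev12 * 100.0 / total

# Hypothetical: QA found 90 Sev 1&2 bugs pre-release, customers found 10.
print(defect_removal_efficiency(90, 10))  # 90.0 -> QA caught 90% of Sev 1&2 bugs
```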

Test Coverage %:
Description:This report shows the extent to which testing covers the product’s complete functionality
Definition: (Total number of Test cases executed/Total number of Test cases available across all the platforms )*100

Test design per person week:
Description: No of test cases written per week, classified into size- or complexity-based categories. This is a measure of QA productivity during the TC definition stage.
Definition: TCs created per week per person = (total no. of TCs created) *40/(Total hours spent writing TCs * no. of resources)

Test Case effectiveness:
Description: This metric shows an indication of the effectiveness of the test cases.
Definition: (Number of Test cases that produced defects/Total number of test cases) *100

Status wise Test Case execution distribution:
Description: This metric gives an indication of status of Test case execution
Definition: Passed Test Cases % = (No of Passed Test cases/Total Number of Test cases)*100, and so on for Failed, Blocked and Not Yet Executed Test cases

Cumulative test execution per week:
Description: This shows the cumulative no. of test cases that get executed on a weekly basis.
Definition: Sum of all test cases executed in the week + sum of all test cases executed till the previous week.

Test execution per person week:
Description: No of Test cases or Test steps executed per week. This is a measure of QA productivity during the Testing phases.
Definition: TCs executed per week per person = (total no. of TCs executed) *40/(Total hours to execute TCs * no. of resources)
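Several of the productivity metrics above (test design, test execution, defect validation, automation rate) share the same normalization: units done * 40 / (total hours * no. of resources). A minimal sketch, with an illustrative function name and sample numbers:

```python
def per_person_week(units_done, total_hours, resources, week_hours=40):
    """Normalize throughput (TCs written/executed/automated, defects
    validated) to a per-person, per-40-hour-week rate, matching the
    definitions above: units * 40 / (total hours * no. of resources)."""
    return units_done * week_hours / (total_hours * resources)

# Hypothetical: 2 testers execute 120 test cases in 60 hours total.
print(per_person_week(120, 60, 2))  # 40.0 test cases per person-week
```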

Review Comments on Test Plan:
Description: This shows the quality of the Test Plan
Definition: No of review comments received per Test Plan or comments can be distributed as High, medium, low

Review Comments on Test cases:
Description: This shows the quality of the Test Cases
Definition: No of review comments received per Test Case or comments can be distributed as High, medium, low

Automated Test Cases %:
Description: This report shows Automated vs total Test Cases
Definition: (No of automated Test case/Total no of Test cases to be automated)*100 %

Automation of TC rate:
Description:No of Test cases automated per week. This is a measure of QA productivity in terms of automating the test cases
Definition: TCs automated per week per person = (total no. of TCs automated) *40/(Total hours to automate TCs * no. of resources)

Resource Utilization:
Description: This shows how the time of the QA resources has been utilized.
Definition: (Time reported on productive QA task) / Standard time available for the period

Effort variance:
Description: This is the deviation in the actual effort as a percentage of the estimated planned effort.
Definition: (Actual effort hours - Planned effort hours) / Planned effort

Schedule variance:
Description: This is the deviation in the actual completion date from the planned completion date as captured in the project schedule.
Definition: Differential duration / Planned duration
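The two variance metrics above, expressed as percentages of plan (function names and sample figures are illustrative assumptions):

```python
def effort_variance_pct(actual_hours, planned_hours):
    """Effort variance as % of plan: (actual - planned) / planned * 100."""
    return (actual_hours - planned_hours) * 100.0 / planned_hours

def schedule_variance_pct(actual_duration, planned_duration):
    """Schedule variance as % of plan: differential duration / planned
    duration * 100 (durations in any consistent unit, e.g. days)."""
    return (actual_duration - planned_duration) * 100.0 / planned_duration

print(effort_variance_pct(110, 100))   # 10.0 -> 10% over the effort estimate
print(schedule_variance_pct(22, 20))   # 10.0 -> slipped by 10% of planned duration
```

A positive value means overrun; a negative value means the task finished under plan.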

Thursday, March 25, 2010

Qualities for a Software Tester Part - 2

Continued from Qualities for a Software Tester Part - 1 .....

Here I have compiled some of the qualities required for a software tester. (Part 2)

C. Domain Knowledge
1. Understanding of the domain and using it for TS/TC writing
Beginner - Basic understanding of domain related to own area(module/feature) of work
Intermediate - Ability to write test scenarios in line with domain
Expert -
Understanding of the entire domain and expertise in using it to write test scenarios /Test Cases
Ability to review test scenarios in line with domain
Ability to guide others on the domain knowledge and help them write test scenario/TC
Ability to judge the quality of a defect and impact from the domain knowledge

D. Product knowledge
1. Understanding of different product lines
Beginner -
Basic understanding of the product/application flow
Intermediate -
Ability to write test cases aligned to product requirements
Ability to prioritize the test cases
Ability to review TC in line with product flow
Ability to identify the stable and unstable area of the product
Ability to identify the area which needs more coverage
Expert -
Ability to provide technical expertise on specific products
Understanding of end to end product line
Ability to judge the quality of a defect and impact from the product knowledge
Ability to ensure adequacy of test coverage with respect to functionality of product
Ability to develop test strategy based on impact analysis due to defect fixes
Ability to demonstrate the product feature to customer
Ability to propose new feature

E. Automation
Beginner -
Ability to use Test Automation tools to create test scripts, based on manual test steps
Ability to execute automated test cases
Intermediate -
Ability to automate new test cases, maintaining/executing existing regression suites
Ability to analyze automation execution result
Expert -
Ability to use Test Automation tools to create a framework for automated testing
Ability to identify the stable areas for automation
Ability to sequence test scripts for dependency
Ability to troubleshoot issues related to automation suites
Ability to prepare Automation strategy
Ability to evaluate and recommend automation tool for use

F. Tools and technology
1. ClearQuest, Quality Center, VersionOne, Subversion, WatiN and NUnit, concepts of web applications, Agile methodology, RUP concepts, etc.

G. Communication
1. with peers, with cross team members, with onsite team members, with project stakeholders
Beginner -
Ability to attend calls
Ability to provide timely and accurate analysis of findings
Ability to discuss doubts with other team
Intermediate -
Ability to work within a team environment with good communication skills
Ability to report any problems as quickly and accurately as possible
Ability to co-ordinate with onsite for issue resolution.
Ability to attend the regular client call and discuss the weekly status with the client.
Expert -
Ability to attend appropriate development project meetings with developers and end users, to gather information for future testing projects and to fully understand the scope of the project.
Ability to communicate effectively with developers and other stakeholders in the organization.
Ability to participate in status meetings with the offshore and onsite teams and update the status to the concerned parties on time
Ability to communicate the test project status (detailed, and measurable) to the team members, to the customer, and to management
Ability to understand the business objectives and provide potential solutions



Thursday, March 11, 2010

Test Deliverable(s)

Here is the list of Test Deliverables for an Existing Project/New Project/Support Project

Existing Projects – Projects that have been delivered to the market but are still receiving new features and fixes.
Following are the deliverables from the QA team as applicable to a particular project
1. Test plans (new feature)
2. Test cases
3. Modification of the existing Test cases
4. Test case execution result
5. Reports on Bugs
6. Automated test suite
7. Modification of existing automated tests
8. Automated Test case execution result
9. Back up of Test Environment
10. Test Metrics
11. Post Mortem Report and Lesson Learned document

Interim Deliverables:
1. Status Report
2. Review of Test Plan and Test Cases

The QA team should be involved right from requirement review discussions to functional design document review meetings to get a better understanding of the product functionality. The Test Plan and Test Cases would be reviewed by the internal QA team and the concerned Dev team

Broad testing tasks would include Task updation in the iteration/release backlog -> Test Plan preparation -> Test cases preparation -> Modification of the Existing Test cases -> Environment setup -> Automate test suite -> Modification of existing automated tests -> Test case execution -> Report Bugs -> Demonstration to client at the end of each iteration -> Post Mortem report and Lesson Learned document

New Projects – Any application that has not been delivered to the market place yet.

Following are the deliverables from the QA team as applicable to a particular project
1. Test plans (new feature)
2. Test cases
3. Modification of the Existing Test cases
4. Test case execution result
5. Reports on Bugs
6. Automated test suite
7. Modification of existing automated tests
8. Automated Test case execution result
9. Back up of Test Environment
10. Test Metrics
11. Post Mortem Report and Lessons Learned document

Interim Deliverables:
1. Status Report
2. Review of Test Plan and Test Cases

The QA team should be involved from the requirement review discussion through the functional design document review to gain a better understanding of the product functionality. The Test Plan and Test Cases are reviewed by the internal QA team and the concerned development team.

Broad testing tasks would include: Task updates in the iteration/release backlog -> Test Plan preparation -> Test case preparation -> Modification of existing test cases -> Environment setup -> Test suite automation -> Modification of existing automated tests -> Test case execution -> Bug reporting -> Demonstration to the client at the end of each iteration -> Post Mortem report


Support – Applications that are in a production environment but are only changing based on reported defects.

Following are the deliverables from the QA team as applicable to a particular project

1. Modification of the Existing Test cases / Add test cases
2. Test case execution result
3. Modification of existing automated tests

Broad testing tasks would include: Test Plan preparation -> Test case preparation -> Modification of existing test cases -> Environment setup -> Defect reproduction -> Test suite automation -> Modification of existing automated tests -> Test case execution -> Bug reporting -> Adding the test cases to the correct repository

Testing Activities in a Test Process following RUP + Agile methodology

Here is a list of Test Activities to be carried out in a Test Process following RUP+Agile Methodology.

Background: A new testing team will carry out the testing of a product, say ABC 6.0, whereas the previous release, ABC 5.0, was tested by another team, and only very basic information has been passed from the previous team to the current team.

Inception Phase
• High level Test Strategy will be prepared
• Level-0 estimate from QE side will be provided

Elaboration Phase
• Test Plan will be prepared
• Detailed estimate for Elaboration phase from QE side will be provided
• Base test cases for ABC 5.0 will be made ready
• Base test automation scripts for ABC 5.0 will be made ready
• Functional Testing
o High-level functional test scenarios will be created with reference to the Use Case document. Test scenarios based on workflows, covering one or more requirements, will be created
o Test cases will be derived from the test scenarios iteration-wise, as and when the design document and UI screenshots for a particular feature become available
o The base setup of the test environment for ABC 5.0 will be made ready
• Performance Testing
To ensure valid comparisons of ABC 6.0 performance against its baseline, the performance tests will first be executed on ABC 5.0 to refresh past performance results, since the application is expected to handle the client's user and data volumes; alternatively, the testing methodology used for ABC 5.0 will be matched so that those tests can serve as a baseline.
o Analysis of the ABC 6.0 requirements will be carried out
o The Performance Test Plan will be prepared
o The test environment for ABC 5.0 will be prepared
o Test scripts will be prepared
o Baseline testing for ABC 5.0 will be carried out, and the parameters for comparison will be defined and finalized
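The baseline comparison described above can be sketched in code. This is only an illustrative sketch: the parameter names, figures, and 10% tolerance are assumptions, not part of any actual ABC test harness.

```python
# Sketch of a baseline comparison for performance results.
# Parameter names and the tolerance are illustrative assumptions.

BASELINE_5_0 = {"avg_response_ms": 420.0, "throughput_tps": 150.0}

def compare_to_baseline(current, baseline, tolerance=0.10):
    """Flag any parameter that degrades by more than `tolerance` (10%)."""
    regressions = {}
    for name, base_value in baseline.items():
        cur_value = current[name]
        # For response time, higher is worse; for throughput, lower is worse.
        if name.endswith("_ms"):
            change = (cur_value - base_value) / base_value
        else:
            change = (base_value - cur_value) / base_value
        if change > tolerance:
            regressions[name] = round(change * 100, 1)  # percent degradation
    return regressions

abc_6_0_run = {"avg_response_ms": 480.0, "throughput_tps": 149.0}
print(compare_to_baseline(abc_6_0_run, BASELINE_5_0))
```

The point of finalizing the comparison parameters during Elaboration is exactly this: once the parameter list and tolerances are fixed, each Construction iteration can run the same mechanical check against the ABC 5.0 baseline.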

Construction Phase
• Test Plan will be updated/modified with execution plan.
• Detailed estimate for Construction phase from QE side will be provided
• Functional Testing
Assuming that all the new requirements that are part of ABC 6.0 will be delivered to QE across a number of iterations, the following activities will be carried out in each iteration, Iteration 1 to N, as defined by the team for the short releases.
o Test Case writing will be completed. Test cases would be organized as Smoke / Priority 1 / Regression test cases
o A traceability matrix from Requirement to Use Case to Test Case will be created
o In each iteration, all authored test cases will be sent to the stakeholders or their representatives for review
o The test environment for iteration-wise testing will be set up by deploying the build provided by the development team on top of ABC 5.0
o Functional test cases will be executed to validate whether each requirement is implemented correctly
o Test scripts for automation will be prepared. Focus will be on the continuous creation of automated tests covering all functional and regression test cases
o Automated test scripts will be executed
o After testing the new features, regression testing is planned at the end of each iteration to ensure that defects fixed during the current iteration did not regress the product
o Defects will be reported as part of Functional and Regression Testing
o All fixed defects will be verified in the new build before new feature testing begins. Sev1 and Sev2 defects must be planned to be fixed in the current iteration
o Test cases and test scripts will be modified as required
o Test Plan will be reviewed at the end of each iteration to investigate incomplete tasks and re-estimate the timelines, if required
o Feature-related documentation (User Manual, Online Help, Installation Guide, Configuration, Read Me file, Labeling and Packaging) will be validated
o A project meeting will be held every day where the entire team sits together and discusses the progress of the project and any issues blocking it
o At the end of each iteration, produce Iteration Summary report with metrics to measure progress like
• No. of Test cases run at each iteration
• The running and tested feature metrics
• Effort Planned Vs Actual
• No. of Defects Vs functionality
• Schedule Variance
• Total No. of Test cases executed / Passed / Failed/ Blocked / Not Run
o There will be a Post Iteration Review after each iteration, and the feedback will be taken as input for the next iteration plan.
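
The iteration summary metrics listed above (pass/fail counts, effort and schedule variance) can be computed mechanically. The sketch below is illustrative only; the field names and sample figures are hypothetical.

```python
# Illustrative computation of the iteration summary metrics listed above.
# Status values, effort figures, and day counts are hypothetical.

def iteration_summary(results, planned_effort, actual_effort,
                      planned_days, actual_days):
    """results: dict mapping test-case id -> status string."""
    statuses = list(results.values())
    counts = {s: statuses.count(s)
              for s in ("Passed", "Failed", "Blocked", "Not Run")}
    executed = counts["Passed"] + counts["Failed"]
    return {
        "total": len(results),
        "executed": executed,
        **counts,
        "effort_variance_pct": round(
            (actual_effort - planned_effort) / planned_effort * 100, 1),
        "schedule_variance_pct": round(
            (actual_days - planned_days) / planned_days * 100, 1),
    }

results = {"TC-1": "Passed", "TC-2": "Passed", "TC-3": "Failed",
           "TC-4": "Blocked", "TC-5": "Not Run"}
print(iteration_summary(results, planned_effort=80, actual_effort=92,
                        planned_days=10, actual_days=11))
```

Producing the same figures in the same shape each iteration is what makes the Post Iteration Review comparisons meaningful.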
• Performance Testing
Performance testing will be carried out on an iterative basis, as and when functional testing for a particular feature is completed, to ensure that system performance does not degrade.
o Test environment setup for ABC 6.0 and deployment of the build
o Performance test execution
o Analysis, monitoring, and a comparison report on the performance parameters will be produced in each iteration

Transition Phase
• Test Plan will be updated/modified with further details
• Detailed estimate for Transition phase from QE side will be provided
• Functional Testing
Hardening/Regression iterations are planned to make sure the system is stable and that fixes have no adverse effect on the system
o In the final iteration, the QE team will test all new requirements and one complete flow as part of regression testing
o During the final iteration, Regression test cases will be executed on test environment. Automated Regression test suite will be run as part of this.
o Validation of the defects fixed in this phase will be carried out.
o The release note will be validated. The release note mentions the requirement items coded, defects fixed, known issues, and any other critical information required for testing
o At the end of the iteration, produce Iteration Summary report with metrics to measure progress like
• No. of Regression Test cases run at each iteration
• Effort Planned Vs Actual
• No. of Defects Vs functionality
• Schedule Variance
• Total No. of Test cases executed / Passed / Failed/ Blocked / Not Run
• Integration Testing
Integration testing will be carried out in the integration lab to ensure the proper integration of all components of the product.
Performance Testing
Performance testing will be carried out after the regression testing is completed.
o Clean test environment setup for ABC 6.0
o Performance test execution
o Analysis, monitoring, and a comparison report on the performance parameters will be produced at the end of testing

Monday, March 8, 2010

Qualities for a Software Tester Part - 1

Here I have compiled some of the qualities required for a software tester. (Part 1)

A. QA Concepts
1. Requirement Analysis
Beginner- Ability to analyze business requirement and understand it
Intermediate - Ability to analyze the Business requirement and derive test requirements
Expert - Ability to seek further information to fill the gaps. Ability to understand the business objectives

2. Test Strategy
Beginner - Ability to understand and adhere to test strategy while testing
Intermediate - Ability to understand the Test Strategy and the objective behind it. Ability to give some inputs towards test strategy derivation
Expert - Ability to write Test Strategy

3. Test Planning
Beginner - Ability to understand the Test Plan and work accordingly
Intermediate - Ability to fill in a few sections of the Test Plan. Ability to derive a Test Plan using past project data
Expert - Ability to write Test Plan without any heuristic data

4. Test Scenario
Beginner - Ability to understand the Test scenario and derive test cases
Ability to document the Test Scenario if the Test Scenario has been described orally
Intermediate - Ability to derive test scenario from Use case/Business requirement
Expert - Ability to derive test scenarios with maximum coverage of the features. Ability to support, guide, and provide direction to the testing team to generate quality test case documents

5. Test Case
Beginner -
Ability to write test cases from Test Scenario/Use Cases
Ability to follow written instructions to execute test cases on specific areas of product functionality and check outcomes against expected results
Intermediate -
Ability to write test cases with complete details, aligned with the test strategy
Ability to prioritize the test cases
Ability to create test cases for various features/modules of the product
Expert -
Ability to write test cases covering many aspects of a feature, which can surface more bugs
Ability to optimize the number of test cases
Ability to identify the areas which need more test coverage
Ability to write test cases in line with a specification to test a piece of functionality, identify and investigate ambiguities in tests and gain agreement from Product management team on proposed solutions
Support, guide and provide directions to the testing team to generate quality test case documents

6. Review Test Scenario/Test Case
Beginner - Ability to review individual test case/test scenario
Intermediate - Ability to review test cases/test scenarios from the feature/module coverage perspective
Expert - Ability to review test cases/test scenarios from a product and domain knowledge perspective. Ability to ensure proper reviews are done on the documents from the team

7. Traceability and Test Coverage
Beginner - Ability to map Req to TC with guidance
Intermediate - Ability to map Req to TC and derive reports
Expert -
Ability to map Req to TC and find out the gaps and identify the weak areas for coverage
Ability to ensure that the test cases have optimal coverage on the requirements – perform a test coverage analysis
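The coverage analysis an expert performs here can be sketched as a simple requirement-to-test-case mapping check. The requirement and test-case ids below are made up for illustration; a real traceability matrix would come from the project's test management tool.

```python
# Sketch of a requirement-to-test-case traceability check.
# Requirement and test-case ids are hypothetical.

def coverage_gaps(trace):
    """trace: dict mapping requirement id -> list of test-case ids.
    Returns requirements with no coverage, and those with thin
    (single test case) coverage."""
    uncovered = [req for req, tcs in trace.items() if not tcs]
    weak = [req for req, tcs in trace.items() if len(tcs) == 1]
    return uncovered, weak

trace = {
    "REQ-1": ["TC-101", "TC-102"],
    "REQ-2": ["TC-103"],
    "REQ-3": [],
}
uncovered, weak = coverage_gaps(trace)
print(uncovered)  # requirements with no mapped test cases
print(weak)       # requirements covered by only one test case
```

The expert-level skill is not the mechanics of the check but deciding what to do about the gaps it reveals: which weak areas genuinely need more coverage and which are acceptably low-risk.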

8. Doc Review
Beginner - Ability to review the application flow of the user guide
Intermediate - Ability of reviewing technical and user documentation
Expert - Ability to review from the end user perspective

9. Defect raising and Defect Tracking
Beginner -
Ability to identify a defect
Ability to find defect
Ability to replicate a defect
Ability to report the defect clearly
Ability to validate a defect
Intermediate -
Ability to prepare bug report
Ability to identify a blocker defect
Ability to attend bug review meetings
Expert -
Ability to participate in bug scrub
Ability to utilize the analytical skills in determining the root cause of problems
Ability to analyze the bug reports for decision making
Ability to analyze the likely causes of, and trends in, defects during testing to identify problem areas, reduce errors, and suggest further improvements in the software development process
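The defect-trend analysis described above can be illustrated with a small sketch that groups bug reports by module to surface problem areas. The module names, severities, and threshold below are assumptions for illustration, not data from any real project.

```python
# Sketch of the kind of defect-trend analysis an expert tester might do:
# grouping bug reports by module to spot problem areas.
from collections import Counter

def problem_areas(defects, threshold=2):
    """defects: list of (module, severity) tuples.
    Returns modules whose defect count meets the threshold, most buggy first."""
    counts = Counter(module for module, _ in defects)
    return [(m, n) for m, n in counts.most_common() if n >= threshold]

defects = [("Payments", "Sev1"), ("Payments", "Sev2"), ("Payments", "Sev3"),
           ("Reports", "Sev2"), ("Login", "Sev2"), ("Reports", "Sev3")]
print(problem_areas(defects))
```

A clustering like this is the starting point for root-cause discussion in a bug scrub: a module accumulating defects across severities often points to a design or requirement problem rather than isolated coding slips.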

10. Different kinds of testing (Functional, Regression, Smoke, Sanity, Integration, Performance, Load, User Interface, Database, Security, Black-box and White-box testing)
Beginner -
Should have basic knowledge of the different kinds of testing
Ability to execute planned testing tasks with guidance
Ability to report test result
Intermediate -
Ability to execute various kinds of testing
Ability to monitor test execution keeping the deadline in mind
Expert -
Ability to execute test cases to test a complex application or component
Ability to prioritize the test cases while executing
Ability to guide others on testing concepts

11. Test Env Setup
Beginner -
Ability to set up the test environment with guidance
Ability to deploy the build
Intermediate -
Ability to set up the test environment with minimal guidance
Ability to troubleshoot independently and identify the problem
Expert -
Ability to set up the test environment from scratch
Ability to derive the test environment creation requirements
Ability to identify hardware and operating system requirements from functional specifications

B. Test Management
1. Scope of work, Scope of Deliverables
Beginner - Ability to understand the scope of work
Intermediate -
Ability to negotiate the scope of work (SOW)
Ability to derive the deliverables
Expert -
Ability to cope with changing requirements
Ability to adapt to the changing needs of a project
Ability to handle out-of-scope requests

2. Test Effort Estimation, Task identification, Scheduling, Resource Allocation/Identification, Setting Milestone
Beginner -
Ability to come up with Test effort estimation with prior project data
Intermediate -
Ability to allocate work to the team based on the expertise
Ability to prioritize and work within tight timescales in order to meet deadlines
Expert -
Ability to come up with effort estimation without prior project data
Ability to transfer the estimation to a schedule
Ability to use MS Project to create and track test schedules
Ability to identify the resources needed and the associated skills
Ability to deliver the different testing milestones on time
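
Transferring an estimate to a schedule, as described above, can be sketched roughly in code. All productivity figures and the contingency buffer below are hypothetical assumptions; real estimation would draw on the prior-project data the text mentions.

```python
# Minimal sketch of turning a test-case count into an effort estimate and a
# calendar schedule, using prior-project productivity data. All figures are
# illustrative assumptions.
import math

def estimate_effort(num_test_cases, cases_per_person_day, buffer_pct=20):
    """Effort in person-days, padded with a contingency buffer."""
    base = num_test_cases / cases_per_person_day
    return base * (1 + buffer_pct / 100)

def to_schedule_days(effort_person_days, team_size):
    """Transfer the effort estimate onto a calendar schedule."""
    return math.ceil(effort_person_days / team_size)

effort = estimate_effort(num_test_cases=300, cases_per_person_day=25)
print(effort)                                # person-days including buffer
print(to_schedule_days(effort, team_size=3)) # calendar days for the team
```

The expert-level distinction drawn above is being able to pick defensible values for the productivity rate and buffer without heuristic data from a prior project.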

3. Tracking progress
Beginner -
Basic knowledge of tracking tool
Intermediate -
Ability to track deliverables against the plan (using Microsoft Project / VersionOne) through observations, weekly meetings, and reports
Expert -
Ability to review project plans and determine short and medium-term work plans

4. Issue Tracker, Risk and Mitigation
Beginner -
Ability to identify the risk related to own task
Knowledge of the escalation method
Ability to escalate issues in a timely manner
Intermediate -
Ability to communicate the issues faced effectively
Ability to perform risk analysis when required
Expert -
Ability to escalate issues about project requirements (software, hardware, resources) to higher management
Ability to identify the risks across test phases

5. Applying corrective action
Beginner -
Ability to understand the feedback mechanism/review mechanism in place and act accordingly
Intermediate -
Ability to ensure proper review and feedback mechanism is in place
Expert -
Ability to meet deadlines by constantly applying corrective action
Ability to take ownership of, plan, and drive projects from inception to completion

6. Test metrics, Test Reporting/Status report, Test Closure - exit of project, Conducting Post mortem
Beginner -
Basic knowledge of tracking tool
Ability to produce reports, and collate reports from other testers
Ability to derive the metrics
Intermediate -
Ability to track overall progress
Ability to determine which metrics can be extracted to define, measure, and track quality goals for the project
Expert -
Ability to find gaps in coverage, opportunities and risks
Ability to make efficiency improvements and reliably manage projects
Ability to communicate the state of the software: what has been successfully tested, what has not passed, and what hasn't been tested at all (due to time constraints, functionality unavailability, etc.)
Ability to say "no" when quality is not acceptable for release, supported by facts
Ability to provide weekly status reports, and to plan own time and that of other testers to ensure objectives are completed and the schedule is delivered within the specified project timescales
Ability to prepare / update the metrics dashboard at the end of a phase or at the completion of the project
Ability to conduct the post mortem meeting and turn the feedback gathered there into action items

7. Resource Utilization, Team development, Developing backup resource, Training need
Beginner -
Ability to organize brainstorm sessions on a regular basis
Ability to explain any decision taken so that the team understands the reasoning behind it
Ability to continuously monitor and mentor Testing team members
Ability to coach other Test Engineers on any technical issues within own area of understanding
Intermediate -
Ability to adopt the best practices
Ability to organize trainings (external as well as internal)
Ability to work in group to share competencies
Ability to mentor and prepare induction plans for new members joining the team
Ability to provide direction to the team members
Expert -
Ability to discuss with QA engineers and help them arrive at the idea themselves instead of presenting it directly
Ability to look forward to the products in development and planned for development, to foresee what new expertise and training is needed by the group
Ability to hire people based on current and future testing needs
Ability to assign tasks to all Testing Team members and ensure that all of them have sufficient work in the project

8. Team Building
Beginner -
Ability to promote cooperation between Software and Test/QA Engineers
Ability to run meetings and keep them focused
Intermediate -
Assign roles and responsibilities to test team members
Expert -
Ability to build a testing team of professionals with appropriate skills, attitudes and motivation
Ability to maintain the enthusiasm of the team and promote a positive atmosphere
Ability to promote teamwork to increase productivity

9. Knowledge of configuration management - build, branching, development process, scope of using any tool for effectiveness, Building knowledge management
Beginner -
Basic Knowledge of CM procedures
Basic knowledge of the software development process
Intermediate -
Ability to take decision to procure Software Testing tools for the organization
Ability to leverage the capabilities of third-party tools and use them efficiently to optimize the testing environment
Expert -
Ability to conduct studies on various testing methodologies and tools and provide recommendations on the same
Ability to ensure the content and structure of all Testing documents / artifacts is documented and maintained.

10. Implementing test process
Beginner -
Ability to adhere to the company's Quality processes and procedures
Intermediate -
Ability to ensure effective test process improvement feedback by providing information back to process improvement groups
Ability to promote improvements in QA processes
Expert -
Ability to identify and implement process improvements for test efficiency
Ability to document, implement, monitor, and enforce all processes for testing as per standards defined by the organization.

11. Recruitment
Beginner -
Basic judgment skills to hire the right people
Intermediate -
Basic judgment skills to hire the right people
Expert -
Support the recruitment team for interviews

In Part 2, I will cover Domain knowledge, Product knowledge, Automation, Tools and Technology, and Communication.