Wednesday, November 17, 2010

Samples - How to provide interview feedback for testing candidates

Selected Candidates for next round:

Has good concepts of Test Scenario and Test Case derivation from use cases. Has a good understanding of Test Coverage and Test Reporting. Understands the different kinds of testing. Has high-level awareness of the different sections of a Test Plan.
Has good domain knowledge of Financial - Trading. Shows interest and puts in effort to gain knowledge of the domain he works in.
Has experience with QTP and has clear concepts of whatever he has done so far.
Did not get much chance to work on SQL and hence lacks SQL skills.


Has a fair idea of Test Plans
Has exposure to test case derivation and the Bug Life Cycle
Understands Functional, Regression, Unit and Integration testing
Is aware of Test Suite setup and Test data creation
Has scripting knowledge (VBScript)
Is confident about what he has done; otherwise clearly says he does not know; does not try to bluff
Has a good positive attitude and willingness to learn
Current role is writing code to test


Has good concepts of Test Scenario and Test Case derivation; well aware of Test Reporting and Test Coverage.
Has good high-level ideas and awareness of the different sections of a Test Plan, with 2.5 years of experience
Does not have exposure to creating a Test Environment, but has worked on VMware tools and has clear concepts of them.
Beginner-level concepts of QTP automation.
Fair knowledge of SQL


Has 1 year of experience in development (Oracle 7.3, Developer 2000) and 4 years of experience in testing
Has a fair understanding of Test Plans; used to prepare the draft version of the Test Plan
Is familiar with Usability, Regression, Functional and Smoke testing
Has a good understanding of the Bug life cycle
Has basic concepts of SQL databases and is fair at writing SQL queries.
Has exposure to black-box testing but no exposure to white-box testing
Is familiar with the V model testing methodology.
Has a fair understanding of preparing a Test Summary Report.
Good at automation with the WATIR tool, using the Ruby language


1. Good testing concepts; has exposure to Functional, Regression and GUI testing; understands performance testing but did not get a chance to work on it
2. Has good exposure to deriving test cases and test scenarios; has not been involved in Test Plan preparation but is aware of the sections of a Test Plan
3. Good communication skills - has the ability to express and explain clearly
4. Has knowledge of Status Reporting
5. DBMS - has basic knowledge of SQL query writing
6. Has worked on scripting and automation using WinRunner and QTP
7. Has analytical and problem-solving skills


Has very good knowledge of testing methodologies and testing concepts (with 3 years of experience); has a good concept of Test Plan preparation and deriving test cases
Has a good concept of Bug Reporting and the Bug Life cycle
Has average knowledge of Status Reporting
Knowledge of SQL queries - poor
QTP Automation - knows Record and Playback - no scripting knowledge
Communication - good - has the ability to express clearly


1. Well aware of different SDLCs, e.g. Waterfall and Agile.
2. Excellent communication skills
3. Excellent attitude - always looks for how customer expectations can be met and exceeded.
4. Very eager to learn new technologies and tools
5. Testing concepts are very clear and technically sound.
6. Very good interpersonal skills; can be a very good team player.
7. Has exposure to SQL databases.
8. Has a positive attitude.


During the course of the interview I felt he has a good positive attitude and eagerness to learn new things. Given a situation, he shows flexibility by saying that he has no problem acquiring new skills and working in a field where his expertise does not lie. He tries to keep himself aware of areas even when he has not had a chance to work in them; for example, though he was not involved in Test Plan writing, he is quite aware of the different sections of a Test Plan. Otherwise he is clear on general testing concepts, the Bug life cycle, status reporting and technical knowledge.
Overall I feel that, with his positive attitude, eagerness to learn and minimal guidance, he can quickly pick up the skills where he is a little weak.


Rejected Candidates:

Has no exposure to Test Plan preparation or Test bed setup
Could not derive effective test cases given a scenario. Does not understand Test case prioritization or Testing coverage.
Has average knowledge of the different types of testing. Does not have clear concepts.
Test Report knowledge is very minimal
Automation knowledge is below average. Seems to be only theoretical knowledge, not practical/hands-on experience
No knowledge of SQL databases
Communication - average


Does not have much awareness of Test Planning or Test data creation, though they are mentioned in his resume
Could not derive good test cases when given a scenario
Does not have good troubleshooting and analytical skills.
Has average knowledge of Functional and Regression testing
Is not clear on the basics of SQL
Limited exposure to Automation - using the tool iTKO Lisa; has no exposure to QTP though it is mentioned in his resume
Does not answer the questions asked to the point
Observed that many things mentioned in the resume are areas where he has no exposure or awareness, e.g. Test Planning, Test Data Creation, Performance testing, QTP, etc.
Communication - average


No exposure to or awareness of Test Plans
Average knowledge of test case derivation - mostly theoretical - could not answer properly when given a scenario to derive from
Has average knowledge of Functional and Regression testing and no concept of Unit and Integration testing, but claims he has worked on Unit testing
Average knowledge of the Bug life cycle
Poor on SQL databases
Test Report knowledge is also very limited
Seems to be not very flexible
Does not know about prioritizing test cases, the traceability matrix, the necessity of a configuration tool, etc.
Communication - good


Average knowledge of the different types of testing. Could not properly explain functional, system, sanity and smoke testing concepts.
Poor SQL knowledge; could not explain why primary and foreign keys are required. Also could not answer a simple SQL query
No exposure to Test Plan preparation or Test bed setup
Candidate does not answer to the point; even when he does not know the answer, he still says something that was not asked for at all
Communication - average


Limited exposure to preparing Test Plans, Test cases and Test Beds
Has average knowledge of Functional, Load and Stress testing
Could not answer some scenario-based questions up to the mark
No knowledge of SQL databases
Test Report knowledge is also very minimal
Communication - OK


Is aware of the different sections of a Test Plan
Has a fair understanding of the V Model
Has exposure to Functional and Regression testing and Test case derivation
No exposure to Test environment setup
Has a basic understanding of SQL databases but is not very familiar with query writing
Has experience with QTP 9.2 and QC 9.2
Communication - average


Limited awareness of Test Plan preparation and Test case derivation
Average knowledge of test case derivation, testing coverage, test prioritization and Test Reporting
Average knowledge of the Bug life cycle
Does not have a clear concept of the different kinds of testing and where they occur in the testing life cycle
Could not explain any testing methodology properly
No exposure to setting up the test environment
Limited knowledge of Automation using Rational Robot
No knowledge of SQL databases
Communication - average


Has average awareness of Test Planning
Has good exposure to Functional and Regression testing. Could derive test cases when given a scenario.
Has average knowledge of Test coverage and Test Reporting
Has good knowledge of the Bug life cycle.
SQL knowledge is not up to the mark.
Does not have much exposure to Automation. Has undergone training on QTP but no hands-on experience.
Communication - good


No exposure to or awareness of Test Plans
Average knowledge of test case derivation - mostly theoretical - could not answer properly when given a scenario to derive from
Has average knowledge of Functional and Regression testing and no concept of Unit and Integration testing, but claims he has worked on Unit testing
Average knowledge of the Bug life cycle
Could not answer some scenario-based questions up to the mark
Fair knowledge of SQL databases
Test Report knowledge is also very limited
Communication - OK
Hence I can say the candidate is below average on QA concepts and will not be suitable for the position.


The candidate is good at Manual Testing but lacks skills in Automation and SQL. Moreover, the candidate expects a lead role managing 4-5 members, and also mentioned that he is currently more into assigning tasks to junior members. As we are looking for an individual contributor role with an interest in hands-on work, the candidate will not be suitable for our team.


Does not know the difference between Functional and Regression testing - carries out the same set of tests against all the builds - answered that it is called functional the first time and regression from the second time onwards
No concept of configuration management tools - answered that they are used only for security/authorization purposes
SQL databases - no exposure to SQL queries; did not even attempt one
Poor knowledge of Windows - does not know how to kill a process, has no concept of the Registry, does not know how to find the IP of the machine
Role is only test case writing and execution - no exposure to test environment creation [a separate team deploys the build and passes the test environment to the test team]
Could not answer some scenario-based questions:
what will you do if you raise a bug and the dev team says it is not a valid bug
what would you do if time does not permit executing all the test cases
Test Report knowledge is limited to bug reporting only
No direct interaction with developers - the Lead was doing all the interaction.


Strong points:
The candidate has good exposure to different kinds of testing, e.g. functional, regression and a little performance testing; though he has not carried them out in a very structured manner, he has good concepts.
He has a good understanding of basic testing concepts, e.g. deriving test scenarios/test cases, Test case prioritization, Test coverage, the Bug life cycle.

Weak points:
SQL exposure is very minimal.
Did not get any chance to work on Automation. Explored QTP out of his own interest.

Soft skills:
Communication - OK. The candidate seems to be very energetic and worked in a close-knit project group (dev and QE) at a start-up company. Also seemed eager to learn new things.

Conclusion: I felt the candidate can be moulded and would be flexible and eager to acquire new skills as the project demands, as he is early in his career.


Has exposure to deriving test cases. No exposure to deriving test scenarios and no involvement in Test Planning (with 4 years of experience)
Has average knowledge of the different kinds of testing
Has a good understanding of Bug Reporting and the Bug Life cycle
Has average knowledge of Status Reporting
Poor knowledge of SQL queries
Has average knowledge of QTP automation
Communication - good - has the ability to express clearly


Has experience working in an onsite-offshore model. Has experience transitioning work from onsite to offshore: being onsite, taking all the knowledge transfer, and planning and preparing the transition estimation
Has experience with QTP, including demonstrating the automation of a subproduct as a PoC to the customer
Has hands-on experience with WinRunner and has led a team in automation
Has experience with Performance testing using WAS
Has experience with Manual Testing (Integration and Regression testing)
Has domain experience in Insurance, Education and internal applications
Is ISTQB certified (Advanced - Test Manager)


1. Has good hands-on exposure to both Manual and Automation testing. Thinks out of the box and comes up with ideas. Open to both kinds of testing (manual and automation). Also has some experience in Performance Testing, having worked on tools such as WAS and LoadRunner.
2. Has a good understanding of the different types of testing.
3. Has good experience with QTP, the .NET Framework and VB on the automation side, and can guide the team on the automation front.
4. Answers to the point and clearly says when he does not know.
5. Has experience with Status Reporting and interacting with the client. Keeps the client's perspective in mind.
6. Weak area - did not give satisfactory answers on Test Planning, Estimation, Task identification and Scheduling.

Friday, November 12, 2010

Migrating the test cases from unmanageable to manageable

For a large project, it is common over time for the test cases to grow to a large extent, especially release after release; however, with the tight schedule of ongoing project delivery, the test cases are often not maintained properly. Then one fine morning you discover that the test cases have become unmanageable. It is now cumbersome and drains a lot of effort to train a new joiner, especially when test scripts are not maintained with enough comments, when the purpose of writing a test case has not been captured in the test case, or when the functionality a test case is trying to test is not mentioned clearly. The test cases were just added and added, release after release, based on the requirements, without any thought of maintaining them for the future.

Hence it becomes difficult to identify and pick some test cases from the lot for the purpose of testing/regressing functionality xyz for release xxxx (say).

Sound familiar in your project too?

So managing the test cases becomes an important task in making a testing team more effective.

The problem seems more severe when you discover that higher management is not willing to deploy resources for this kind of activity, which was not done correctly in the past, and you have no choice but to take up this task of cleaning the test cases (a major effort in itself) alongside your critical project delivery without hampering it a bit. After all, you cannot jeopardize your ongoing project delivery at any cost. Can you? :-)

Here I have cited a few approaches you can try once your project is in a similar condition and you want to take up the task of cleaning up the test cases and properly managing them from then on.

1. Prioritize test cases

First of all, identify the test cases as priority 1, 2 and 3, if this is not already done.
It is not feasible to take up all the test cases in one go, along with your regular project work, and clean them up. The best way is to prioritize them and take them up in phases.

Suppose there are 1000 test cases. Identify, say, 200-odd test cases as priority 1, 400 as priority 2 and 400 as priority 3. Prioritizing is the first and foremost task.

Priority 1 test cases are those that test the very basic/core functionality of the product.

e.g. for mobile handset testing, priority 1 test cases should include:
whether you can make and receive a call, send and receive a message, save a number,
dial a number and call, pick a number from the saved list and call, etc.
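The prioritization step can be sketched as a simple grouping pass. A minimal sketch in Python; the test-case names and their priority tags below are hypothetical placeholders, not from the post:

```python
from collections import defaultdict

def partition_by_priority(test_cases):
    """Group (name, priority) pairs into buckets keyed by priority."""
    buckets = defaultdict(list)
    for name, priority in test_cases:
        buckets[priority].append(name)
    return dict(buckets)

# Hypothetical handset test cases, each tagged with a priority.
cases = [
    ("make_call", 1),
    ("receive_call", 1),
    ("send_message", 1),
    ("stress_1000_contacts", 2),
    ("change_wallpaper", 3),
]
buckets = partition_by_priority(cases)
print(buckets[1])  # the core-functionality cases to take up in phase 1
```

Once the buckets exist, `buckets[1]` is exactly the set you take up first, and the phase-wise plan falls out of the grouping.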

2. Categorize test cases

Also categorize your test cases into different folders, such as:

Negative testing
Stress testing
Load Testing
Functional testing
Usability testing

If for a particular release the customer is expecting only an enhanced look and feel of the product, then you don't need to break your head searching the Stress/Load testing test cases at all.
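With the folders in place, picking the relevant cases for a release becomes a one-line filter. A sketch, assuming a hypothetical inventory (the category names match the list above; the case names are invented):

```python
# Hypothetical test-case inventory, keyed by category folder.
suite = {
    "Functional": ["login_ok", "transfer_funds"],
    "Usability": ["font_size", "color_contrast"],
    "Stress": ["10k_concurrent_logins"],
    "Load": ["steady_500_users"],
    "Negative": ["login_bad_password"],
}

def cases_for_release(suite, relevant_categories):
    """Return only the test cases in the categories this release touches."""
    return [case for cat in relevant_categories for case in suite.get(cat, [])]

# A look-and-feel-only release pulls Usability (plus Functional) cases,
# leaving Stress/Load untouched.
selected = cases_for_release(suite, ["Usability", "Functional"])
print(selected)
```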

3. Brainstorming and finalization of the standards/approach

Give your team members one week. Ask them to come up with ideas on how to approach cleaning the test cases and maintaining them better in the future.

You could also charge them up by linking this activity to rewards, e.g. the person who adds the maximum value gets a 1000/- voucher. [Do not make the mistake of deciding whose contribution is best yourself. Ask your team to decide; the best option would be a voting system.]

Once all the ideas are submitted, brainstorm on them as a team. Book a conference room for 3-4 hours and brainstorm exhaustively. Take up each idea, assess its applicability, pros and cons, etc., and finalize the approach. With those finalized ideas, set a guideline on the approaches you will take while cleaning/managing the test cases in the future.

Modify 2-3 test cases according to those guidelines. Now send the guidelines and the 2-3 modified test cases for review. Review should be done by everyone from the junior-most engineer to the senior-most manager (from inside or outside the team - whoever you think can add value). Most importantly, give them to a new joiner and see whether he/she can understand them unaided, i.e. whether the test cases are self-explanatory (elaborate enough and capturing enough information).

Collect your review comments. I am sure you will need to do a lot of follow-up to get them!

Once you receive the review comments, modify the guidelines accordingly. Now baseline the guidelines.

4. Set the guidelines as the Bible for your team from now on. (Whenever a new test case is added, this Bible should be followed.)

5. Take up the whole effort as a project, in phases

Take up the whole effort of migrating the test cases from unmanageable to manageable as a complete project.

Divide the project effort into 3 phases. You can take up the priority 1 test cases in phase 1 and try to implement the standards/guidelines on them first. That is, try putting in a small investment and see how it pays off.

6. Resource utilization (keeping in mind the ongoing release activities)

You can try this model of utilizing the resources.

You can deploy 1-2 full-time resources on this test case management project, depending on the bandwidth and the criticality of your ongoing project delivery activities (basically, whatever makes sense given your ongoing delivery situation).

The rest of the resources, who are fully deployed on ongoing project delivery, can be given a long-term target, e.g. 5 test cases to be modified per person every two months. Later you may find that a significant portion of the effort has been absorbed alongside the ongoing projects, and you will also have inculcated a sense of responsibility for maintaining the test cases properly in all the team members.
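A back-of-the-envelope calculation shows how long such a quota takes to absorb a backlog. The team size and backlog figures below are assumptions for illustration (the post does not give a team size); only the "5 cases per person per two months" quota comes from the text:

```python
import math

def months_to_clean(backlog, team_size, cases_per_person, period_months):
    """Months needed to absorb `backlog` test cases when each of
    `team_size` people modifies `cases_per_person` cases per period."""
    per_month = team_size * cases_per_person / period_months
    return math.ceil(backlog / per_month)

# Assumed: 8 engineers, each modifying 5 cases every 2 months,
# applied to the 400 priority 2 cases from the earlier example.
print(months_to_clean(400, 8, 5, 2))  # → 20
```

Running the numbers like this before committing to the quota tells you whether the "absorbed" effort finishes in a realistic timeframe or needs the dedicated 1-2 resources as well.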

7. Educating new joiners

Modify 1-2 test cases with maximum comments/info.
Explain the flow of the code to make them understand, and record your explanation.
Give the recorded file to new joiners and see whether they face any difficulty in understanding it on their own.

Thus you can save significant time for the existing team members, who would otherwise repeatedly explain the test cases every time a new member joins the team.

Also record how your test cases are segregated and how you manage them, so that a new joiner need not ask the existing team members for any information.


8. Few ideas (subject to applicability to your project)

a. Capture as much info as necessary in the test script:
who wrote it
for what purpose
CQ id
Requirement ID
functionality - what it is going to test, etc.

b. Create reusable modules/scripts - managing them will be easier, and writing new scripts will be easier too

c. Version control might help; you can go back to a previous version and run it if the necessity arises

d. Always keep an eye on how you are developing the test scripts/test cases and how you can reuse the already existing scripts in future

e. Set up proper access control for maintaining the test cases:
who will have read permission
who will have read/write permission
who can add new test cases, etc.

f. Discard irrelevant/invalid test cases

g. Look back after each release and clean the system. Transfer no-longer-relevant test cases to an obsolete folder

h. Do a cost-benefit analysis before diving into changing the whole suite.

i. Map requirements to test cases
You will be able to see at a glance how many test cases exist for a particular requirement. Also, based on the requirement, you can pick up the relevant test cases

j. Keep executed test cases release-wise in the Quality Center Test Lab tab.

k. Collaborate with developers to keep user interface changes to a minimum, in order to minimize the rework effort of test case/script changes (without compromising quality, of course)

l. Last but not least, put a process in place for adding new test cases. Say one person is in charge of reviewing new test cases, i.e. a newly added test case needs to be reviewed and accepted by that person before it is added into Quality Center. This will ensure the standards/guidelines are followed correctly until the process is internalized by all the team members.
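Ideas (a) and (i) together amount to keeping a small traceability index: each script records which requirements it covers, and inverting that record gives the requirement-to-test-case map. A sketch with invented IDs (the `TC-*`/`REQ-*` names are hypothetical, not from the post):

```python
from collections import defaultdict

# Per-script metadata as in idea (a): purpose plus covered Requirement IDs.
test_cases = {
    "TC-001": {"requirements": ["REQ-10"], "purpose": "login with valid user"},
    "TC-002": {"requirements": ["REQ-10", "REQ-11"], "purpose": "login lockout"},
    "TC-003": {"requirements": ["REQ-12"], "purpose": "export report"},
}

def traceability_matrix(test_cases):
    """Invert the mapping: requirement ID -> test cases covering it."""
    matrix = defaultdict(list)
    for tc_id, meta in test_cases.items():
        for req in meta["requirements"]:
            matrix[req].append(tc_id)
    return dict(matrix)

matrix = traceability_matrix(test_cases)
print(matrix["REQ-10"])  # the test cases to rerun when REQ-10 changes
```

When a requirement changes for a release, one lookup gives you exactly the test cases to pick up, which is the "at a glance" benefit idea (i) describes.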

Otherwise, who knows, you might end up doing the same kind of cleaning activity again in the near future!