Tuesday, December 27, 2005

QA != Testing: the mother of all testing-related debates is here ....

Whatever QA is, it is not testing – Cem Kaner
Could you want a better or more authentic source of opinion on this topic? Maybe not. It is time to open our eyes. Let us face it: testing has grown big enough to be separated from its big brother, QA. My sole aim in writing this post is to clear up the misconceptions caused by the wrong terminology being used all over the place.

My versions of differences:

QA: An entity that observes the activities of software engineering (while they are being performed, or at the end of some logical milestone) and verifies that all documented or known rules and procedures are adhered to. This entity does not "touch" the work product being produced; it watches the process.

Testing: Something that gets its hands dirty: it works with the product, uses it, abuses it, checks all the relevant documents and so on. It engages with the work product much like the development team that produced it, but from an evaluation angle.

These definitions are my versions, and I have attempted, to the best of my ability, to represent the mass opinion about these terms. I could be wrong … I am ready for a debate.

Problems with using the word "QA" for testing:

1. Testing cannot assure quality. Testing only measures, or attempts to measure, it. Quality has to be built in from the beginning, not tested in later.
2. Assuring quality is the responsibility of everybody in the team. How can the testing team alone be entrusted with that job? In a true sense, it is management that is the real QA.
3. If it is strongly believed that testing is responsible for ensuring quality, then others can say, "It is the testing team's job to ensure quality, not ours."
4. It sets up wrong expectations about the role of testing.

Notable posts that discuss this topic



Saturday, December 24, 2005

AJAX - Making web experience more interactive ..

AJAX - a technique for bringing the desktop software experience to web users. With AJAX it is possible to do things like editing data in the client browser and seeing the update without the web server resending the whole page. Be it editing photos in Yahoo photo albums or using Google Maps - it is AJAX at work. AJAX is not a technology in itself, but a term that refers to the use of a group of technologies together.

AJAX stands for "Asynchronous JavaScript and XML". JavaScript is the default scripting language for web pages and is used for most client-side data handling, rendering and other local processing. When combined with XML and the browser's ability to make HTTP requests in the background (the XMLHttpRequest object), JavaScript can exchange data with the server without reloading the page.

AJAX is often considered a threat to Macromedia's Flash technology. Flash is a popular technology for rendering dynamic, multimedia and video-based content on web pages.

Companies like Microsoft, AOL and Amazon are already off the blocks in deploying this technology to spice up their web sites, and there is an increasing trend among web designers to embrace it. AJAX applications feel almost as if they reside on the user's machine, rather than across the Internet on a server.

Read more about it here --

Happy Christmas and New Year 2006 ...


Friday, December 23, 2005

Risk based Testing ....some points to ponder

Risk based testing - I can see lots of eyes rolling and eyebrows rising at this term. The likes of project managers and delivery managers simply love to hate the word "risk". My boss, Manjunath Hebbar, the other day gave this beautiful definition of risk: "A risk is a piece of information, as of today (the present), about tomorrow's (future) problem." In contrast, he added, an issue is a "problem in hand". Well, that discussion with him was about project management, not testing. I am not competent enough to talk about project management, so let me be happy with what I am good at - testing.

Risk based testing is all about testing the risks in the object you are testing. That is easy to say. In reality, risk based testing involves:

1. A methodical approach for identifying risks
2. Design tests to explore the risks - to learn more about the identified risks - as quickly as possible
3. Precipitate the failures resulting from the risks - execute the tests - as quickly as possible
4. Explore other possible risks (unknown at the beginning) - as quickly as possible
5. Repeat steps 1-4 until all risks evaporate to a reasonable degree OR you run out of time/budget.
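The heart of this loop is ordering work by risk. Here is a minimal Python sketch of that idea: score each product area by risk exposure (likelihood times impact) and test the riskiest areas first. The area names and the numbers are invented purely for illustration; real risk identification needs the methodical approach described above.

```python
# Hypothetical product areas with made-up likelihood/impact ratings (1-5).
areas = [
    {"name": "report export", "likelihood": 2, "impact": 3},
    {"name": "login",         "likelihood": 4, "impact": 5},
    {"name": "help pages",    "likelihood": 1, "impact": 1},
]

def risk_score(area):
    """Classic risk exposure: likelihood of failure times impact of failure."""
    return area["likelihood"] * area["impact"]

def order_by_risk(area_list):
    """Return areas sorted so the riskiest are tested first."""
    return sorted(area_list, key=risk_score, reverse=True)

test_order = [a["name"] for a in order_by_risk(areas)]
print(test_order)  # riskiest area first
```

With these sample numbers, "login" (4 x 5 = 20) gets tested before "report export" (6) and "help pages" (1); as testing teaches you more about each risk, you re-score and re-sort, which is exactly the repeat in step 5.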

If you claim to have mastered the risk based testing technique, check that you have the following:

1. A proven method of identifying the risks in the product domain - most proven methods have a strong theoretical or empirical background (mathematical/statistical or similar). A method you can rely on.
2. A test design methodology mature enough to design tests that explore and precipitate these risks
3. An exit criterion to know when you are "reasonably done"
4. Knowledge of the "risks" associated with this risk based approach to testing (ironic, isn't it?)

There seem to be two broad categories of Risk based testing methodologies

1. Exhaustive risk modeling and analysis - approaches with involved mathematical and statistical models, on the lines of those used in "reliability engineering"

2. Heuristic risk based testing - a rather simpler one. James Bach has written an excellent article on this topic. As described in the article, this is an approach that can be adopted (in combination with other "formal" techniques) when you face a resource and time crunch. He cautions that any heuristic is fallible, provisional and merely plausible: it explores its way toward a solution rather than guaranteeing one.

I happened to discuss this topic with others in the Google group on software testing, here. There are some very interesting and thought-provoking ideas from Andy Tinkham in that thread.

Read the following lines carefully before you finish reading this post, and tell me what you think …
1. Testing is mostly about finding important problems that threaten the value of the product - issues that bug someone who matters in the context
2. A risk is a problem that might happen - the higher the probability of the risk, the higher the probability of failure.
3. Should not all testing be risk based? Why would one want to test where there is no risk? When would non-risk-based testing be appropriate? …


Monday, December 12, 2005

Group Test Manager - Role Description

I happened to get a mail from a headhunter asking whether I was interested in a "Group Test Manager" position at a reputed company. When I looked at the job description, what surprised me was that the word "testing" did not appear even once in the profile of a job that requires managing a group of test managers. I am not sure whether this "profile acrobatics" came from a newbie headhunter or from an HR trainee who was given the task of drafting the job requirements for a group test manager.
What is the problem here? What if the word "testing" does not appear? What is the big deal if I confuse "testing" with "quality assurance" or "SQA" or "production management" or "software process facilitation"? The danger of such a job posting is that it is a lose-lose game, as there will be an expectation mismatch on both sides. It would be good if expectations were settled before the person starts the job. If that does not happen, chances are that either the company or the candidate will start looking for alternatives within months of starting the new job. Think of the cost of recruitment, training, relocation and the other indirect costs involved in bringing a person onboard, especially at such a senior level. If, at the outset, the hiring manager and HR fail to put the right effort into drafting the job requirements and a matching title, there will be dark clouds over the future.
Notice that in this job description the prerequisites (excellent communication, interpersonal and analytical skills) are mentioned last, under a "needless to mention" category. It is unfortunate that these essential requirements for the job are taken for granted - is this the case?

Position - Group test manager
Time line - very urgent
Experience 9 to 14 years
Degree in Computer Science or related field or equivalent work experience and
Minimum of 8 years of related experience in SQA and/or Production Management required, or a combination of equivalent training and related work experience, with experience managing technical teams.
Good software engineering process facilitation skills.
Excellent problem resolution, judgment, and decision making skills required.
Strong mentoring, team building, and counseling skills on software engineering and career issues.
Excellent written and oral communication skills required.
Willingness to travel internationally on a regular basis (at least twice a year)
Nice to have:
Overseas experience
Work experience in offshore and outsourced environment
Full SDLC experience
ITSM or Six-Sigma knowledge
Needless to mention: Excellent Communication, Interpersonal, Analytical Skills are pre-requisites.

Here is another example of vague and ambiguous job description. This is from very reputed internet MNC.

Need QA engineers with 2-6 years of experience in the following

- Experience in Linux, Solaris and Unix
- Experience in White box
- Experience in Siebel and QTP


May God bless the company that requires people with the above-mentioned skills, the headhunter who hunts for those people, the interviewer who conducts the interviews and, finally, the poor candidate applying for this job (like someone who walks into an automobile spare-parts shop asking for medicines).

Finally, if you are somebody who is currently drafting job descriptions for your test positions and need help in reviewing and fine-tuning them, feel free to contact me at shrinik@gmail.com for my services in this regard.


Saturday, December 10, 2005

10 Classic Reasons why Test Automation projects Fail ...

Yes, this is the title of the paper that I will be talking about at the STeP IN Summit 2006 International Conference on Software Testing at Bangalore, INDIA.

Follow this link for the conference details


Be there ....


www.crazytestingideas.com ?

Hold on!!! This is not the name of my new website, nor is it another site I want you to rush off and have a look at. Then what the heck is it? Read on ....

At times, when you as a tester were frustrated because you could not find bugs, you might have done something funky, totally out of the box, outrageously creative or totally unimaginable - and it led you to some interesting bugs. When was the last time you jumped out of your chair and yelled, "Eureka! I found a BUG - look at this!"?

Something on the lines of the shoe testing that James Bach mentions in his Rapid Testing class?

One such thing I tried was similar to shoe testing. I was testing an application screen that lists items whose number and status were constantly changing - something on the lines of the flight arrival and departure status on a giant airport display screen. There was a "Refresh" button. I used a pen cap and a clip to hold the Enter key down so that it stayed pressed; before doing that, I started monitors such as Filemon and Task Manager's Performance tab (CPU usage and memory), and left for lunch.

When I came back an hour or so later, the application had died with some weird message. It looked like a memory leak issue. Nobody would have imagined that the application would be used in this way, and through it I learned something interesting about the application - a very unusual type of testing, but sometimes a powerful one. A developer might react by saying, "No one would ever do this" or "This is not a typical user scenario". I say: "Look, this weird test exposed behavior of the application that nobody knew about until today. I leave it up to you whether you want to fix it or not."

Have you come across such crazy test ideas? Do share –

How about hosting a website - http://www.crazytestingideas.com/ ? Volunteers, please? Or a venture capitalist to fund this new company that sells crazy testing ideas on the web :-):-)


Tuesday, December 06, 2005

Vision of an expert Tester?

Have you heard any skilled tester claim this vision: "I can test anything, under any conditions and in any time frame"? I have. Read another of these "James gems" articles. Very cleverly, he supports this tough goal with qualifiers. What would you do as an expert tester if asked to test a nuclear power plant in 30 minutes? How would you react to a request to test IBM's Deep Blue chess software in 2 hours? James gives a few hints in this article on how to handle such seemingly impossible testing goals.

BTW, come to think of it, all testers should slowly march towards developing the ability to "test anything, anywhere, in any time frame" - yes, with your own qualifiers. It is a tough goal, but worth pursuing, in my opinion.

Thank you, James, for setting all skilled testers on this journey towards the MOON - every tester is invited.


Sunday, November 27, 2005

Testing ideas for Webservices ...

I happened to stumble on this link on Mike Kelly's blog, where he describes a method of using JUnit and XMLUnit to test web services. An idea worth copying ....
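For readers without a Java/JUnit setup, the core of the XMLUnit idea - asserting that an actual web service response matches an expected XML document while ignoring formatting noise - can be sketched with the Python standard library. This is my rough analogue, not Mike Kelly's actual code, and the sample documents are invented.

```python
import xml.etree.ElementTree as ET

def canonical(elem):
    """Reduce an element to a comparable structure: tag, sorted
    attributes, stripped text, and children (recursively)."""
    return (
        elem.tag,
        sorted(elem.attrib.items()),
        (elem.text or "").strip(),
        [canonical(child) for child in elem],
    )

def xml_equal(expected, actual):
    """True if the two XML documents are structurally identical,
    ignoring whitespace and attribute order."""
    return canonical(ET.fromstring(expected)) == canonical(ET.fromstring(actual))

expected = "<order id='42'><item>book</item></order>"
actual = """
<order id="42">
  <item>book</item>
</order>"""
print(xml_equal(expected, actual))  # True, despite the formatting differences
```

In a real web service test, `actual` would be the response body fetched from the service endpoint; the comparison logic stays the same.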


Interestingly, this post is about "automation". Read on ...


why skilled testers do not like scripted tests that much ...

Jonathan Kohl has posted an interesting article on "scripted procedural test scripts". In the post, Jonathan takes us through a story line that turns the idea on the developers: how about giving developers a step-by-step, clear and detailed set of instructions, written long before development begins? Would it work? The development manager who heard this said, "I would fire a developer who needed that amount of hand-holding". A developer decides what is best and does it, based on good judgment and skill.

Yes, exactly - why can we not apply that to testing? As a skilled tester, would you like to be constrained by procedural tests? Would you like the hand-holding? Notice the keywords Jonathan mentions in the post - tester's skill, heuristics, seeking information when it is not readily available, judgment, the mission of testing. They are very powerful.

He closes with a call to skilled testers. Do you identify yourself with the skilled-tester community? Are you somebody who thinks that testing is about "critical thinking", and are you interested in improving and nourishing that skill? If yes, read the books, articles and blogs of James Bach, Cem Kaner and Michael Bolton. Here are their websites for your reference ...

James Bach - http://www.satisfice.com/
Cem Kaner - http://www.kaner.com/ and http://www.testingeducation.com/
Michael Bolton - http://www.developsense.com/. And don't forget to check out Michael's articles on stickyminds.com. They are like a big bang on "rapid testing" and critical thinking ....


Testing is not a mechanical Activity ....

I was reading this article posted on Cem Kaner's site. I wish to stress the following statement made in the article:

"Testing is a cognitive Activity - not a mechanical one"

Here is the meaning of the term "cognition" from the web:

Cognition: 1. The mental process of knowing, including aspects such as awareness, perception, reasoning, and judgment. 2. That which comes to be known, as through perception, reasoning, or intuition; knowledge.

Note the keywords: perception, intuition, reasoning, knowledge, awareness, judgment - all of these represent "testing". True testing reflects these basic qualities of HUMAN INTELLIGENCE.

In today's world of software, with its focus on "processes" and "standards", we all try to reduce the practice of testing to a mechanical activity - be it test planning, test case design or, especially, execution (automation). Given a chance, the whole non-testing world (and some even within testing groups) would replace all smart, thinking testers with "nicely programmed robots" - writing test plans and test cases, executing them, logging bugs, making reports, attending meetings too? They follow processes, do things 100% predictably all the time and don't crib about "burnout" - all at the push of a button.

Such is the craze about, and the lack of understanding of, testing in the current industry.

Remember, "real testing" is a cognitive activity and it is very HUMAN - don't try to mechanize it. Today's software systems have become so complex that mechanized testing loses its effectiveness. The next time someone talks about "standardization" or about creating some kind of robot, point them to this article.

Be a tester, a thinker and a human (no pun intended for non-testers - they are human too).


Friday, November 25, 2005

LUC - what is a Least-privileged User Account?

Today's Tip is related to "Access control and security":

Most of us who test applications on the Windows platform use an account that has administrative privileges on the machine from which we run the tests. This means that the application under test has access to "everything on the computer". Try running the application you are testing after logging in to Windows with a non-admin account. You might see serious issues - the application may not launch, it may give weird messages, and a host of other problems may appear that you would never notice while working with an account that has admin privileges on the machine.

Why is it important to test (at some point in a test cycle) while logged in with a non-admin account?

As a golden rule of access control and security, a program should demand only the account privileges that are just sufficient to execute its required functionality, nothing more. Developers typically work with admin accounts, write code that assumes admin-level access and put that code on a production box. But the moment somebody with a non-admin account uses the application, some part of it may break. A tester should explore the ways in which the application uses the access model and investigate wherever things look suspicious.

What's the problem with a developer having administrator privileges on her local machine?

The problem is that developing software in that kind of environment lets Least-privileged User Account (LUA) bugs arise in the earliest stages of a project and grow and compound from there.

The following is an excerpt from the book Computer Security: Art and Science by Matt Bishop, ISBN 0-201-44099-7, copyright 2003.

Definition 13-1. The Principle of Least Privilege states that a subject should be given only those privileges needed for it to complete its task. If a subject does not need an access right, the subject should not have that right. Further, the function of the subject (as opposed to its identity) should control the assignment of rights. If a specific action requires that a subject's access rights be augmented, those extra rights should be relinquished immediately upon completion of the action.
This is the analogue of the "need to know" rule: if the subject does not need access to an object to perform its task, it should not have the right to access that object. More precisely, if a subject needs to append to an object, but not to alter the information already contained in the object, it should be given append rights and not write rights. In practice, most systems do not have the needed granularity of privileges and permissions to apply this principle precisely. The designers of security mechanisms then apply this principle as best they can. In such systems, the consequences of security problems are often more severe than the consequences on systems which adhere to this principle. This principle requires that processes should be confined to as small a protection domain as possible.
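The tester's side of this principle can be exercised in an automated check without any OS-specific account setup. Here is a minimal, hedged Python sketch: save_settings() is a hypothetical stand-in for the feature under test, and the denied write access that a non-admin account would experience is simulated with a mock rather than a real locked-down account.

```python
from unittest import mock

def save_settings(path, data):
    """Hypothetical feature under test: returns True on success,
    False (instead of crashing) when the OS denies access."""
    try:
        with open(path, "w") as f:
            f.write(data)
        return True
    except PermissionError:
        return False

# Simulate running under a least-privileged account: every attempt
# to open a file for writing is denied by the "OS".
with mock.patch("builtins.open", side_effect=PermissionError("access denied")):
    result = save_settings("settings.ini", "theme=dark")

print("handled access denial gracefully:", result is False)
```

In a real test cycle you would complement a simulation like this with at least one pass using an actual non-admin account, since only the real OS exposes which paths, registry keys and services the application wrongly assumes it can touch.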

Read this article on Least Privilege Access http://www.windowsecurity.com/articles/Implementing-Principle-Least-Privilege.html and this https://buildsecurityin.us-cert.gov/portal/article/knowledge/principles/least-privilege.xml

Try your LUC (luck?) next time by logging in as a user without admin permissions on the machine, and see if you can notice something suspicious. Don't forget to send me a mail about the bugs you catch ...


Friday, November 18, 2005

Sanitising your Resume...

I was going through a set of resumes for a test engineer position. Following are a few things that I find really unappealing and that kill my interest. I suggest to my fellow test professionals (and others in general): watch out for these irritants, and if you are convinced that what I am saying makes sense, clean up your resume TODAY ....

1. Open your resume in Word and search (Ctrl+F) for words like "involved", "participated" etc., and delete the sentences containing them. The recruiter is not interested in everything you were involved in or participated in; he or she would like to see what you achieved by doing it. Here is a way to rate your resume: give it one negative mark every time you encounter such a word. How much did your resume score? Now do you understand why you are not getting enough interview calls?

2. I will not press very hard for this one, but it is better if you do it: get rid of phrases like "was responsible for" or any variant of "responsibility". What attracts a recruiter is an action word - "Achieved zero downtime for the systems I managed" versus "I was responsible for maintaining systems and ensuring that downtime was low". Notice the power of action. You deliver the same message, but in a power-packed way, and that catches the eyes of those who "matter" in getting you a new "dream" job. It is very important to load your resume with these power-packed action words, lots of them, especially in the first 1-2 pages.

3. This is the most useless part of a resume, if present: paragraphs about the application you tested, with names, versions, modules and detailed functionality. It looks like a copy-paste from the functional specification or SRS (System Requirements Specification) of the software product you tested. Watch out - sometimes this might land you in legal trouble, with your employer dragging you to court for leaking strategic product information to the public via your resume. It is a big TURN-OFF for the reader, especially a recruiter who processes thousands of resumes in a day.

Don’t forget the thumb rule: one page of resume for every 2 years of experience. So a person with 8-10 years of experience should not have a resume that exceeds 5 pages. Short and crisp is better and easier to read.
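The Ctrl+F scoring exercise in point 1 is easy to automate. A small Python sketch follows; the weak-word list and the one-negative-mark-per-hit scoring are just this post's heuristic, nothing more.

```python
import re

# Words this post penalizes; \w* also catches variants like
# "involvement" or "responsibilities".
WEAK_WORDS = ["involved", "participated", "responsible"]

def resume_score(text):
    """Return a score of -1 per weak-word occurrence (0 is best)."""
    hits = 0
    for word in WEAK_WORDS:
        hits += len(re.findall(r"\b" + word + r"\w*", text, flags=re.IGNORECASE))
    return -hits

sample = "Involved in testing. Participated in triage. Was responsible for builds."
print(resume_score(sample))  # -3
```

Run it over your own resume text and see how far below zero you land; every hit is a sentence worth rewriting around what you achieved.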

Enough? Open your resume and sanitize it TODAY.
May God bless you with job offers and interview calls pouring all the way !!!

Update: 21 Nov 2007 -- I referred this post to someone and at the same time happened to read another related but useful post on the same topic ...



Thursday, October 27, 2005

Getting Buggy ... Some gyan about Bugs ...

This post is in response to a few typical interview questions posted in various user groups. I typically don't jump in and answer them, but in some cases I just cannot remain silent - some questions and answers make me speak up, as if they beg for answers. Here is one such occasion, and here is how I responded ....

1. What is the difference between a bug and a defect? Many definitions float around; there is no simple, universally accepted definition for these terms. Used informally, "defect" and "bug" mean the same thing: some unwanted, unexpected behavior that bugs somebody who matters. This definition of a bug does not change with the phase of the SDLC. A bug is a bug is a bug. The same holds good for "defect". I quote Cem Kaner, James Bach and Michael Bolton in this connection. Believe me, they say the same thing - few would dare question their knowledge of the testing field. As Michael Bolton puts it: "I say that you may define "defect" in any way that you like, as long as the person that you're speaking with or writing to understands your definition."

Michael Bolton in a Google group post --

A bug is something that threatens the value of the product, or, if you like, a bug is something that bugs someone who matters. Both of these definitions come from James Bach. Your definition may differ. "We" depends on the context of the project. On a typical project, someone (the project manager) has the authority to determine whether something (a bug, failure, fault, defect, and symptom) is serious enough to merit attention. In my context, an intermittent problem is a bug if the project manager says it's a bug. James also wrote an article on intermittence in his blog; try http://blackbox.cs.fit.edu/blog/james

The following is an excerpt from Cem Kaner's blog. Note that, according to him, use of the word "defect" in a more formal context carries "legal implications": if there is a defect in software, an end user can sue the producer of the software. He recommends the more informal word "bug".

Quote: Cem Kaner --

I have two objections to the use of the word defect.

(a) First, in use, the word "defect" is ambiguous. For example, as a matter of law, a product is dangerously defective if it behaves in a way that would be unexpected by a reasonable user and that behavior results in injury. This is a failure-level definition of "defect." Rather than trying to impose precision on a term that is going to remain ambiguous despite IEEE's best efforts, our technical language should allow for the ambiguity.

(b) Second, the use of the word "defect" has legal implications. While some people advocate that we should use the word "defect" to refer to "bugs", a bug-tracking database that contains frequent assertions of the form "X is a defect" may severely and unnecessarily damage the defendant software developer/publisher in court. In a suit based on an allegation that a product is defective (such as a breach of warranty suit, or a personal injury suit), the plaintiff must prove that the product is defective. If a problem with the program is labeled "defect" in the bug tracking system, that label is likely to convince the jury that the bug is a defect, even if a more thorough legal analysis would not result in classification of that particular problem as "defect" in the meaning of the legal system.

We should be cautious in the use of the word "defect", recognize that this word will be interpreted in multiple ways by technical and non-technical people, and recognize that a company's use of the word in its engineering documents might unreasonably expose that company to legal liability.

Unquote Cem Kaner.

2. Bug severity v/s priority - who assigns them: When developers have plenty of time and the bug arrival rate lags behind the fixing rate, nobody really cares about "priority" and, to some extent, "severity". Both are filtering mechanisms for selecting a few bugs from the whole lot so that only the important ones are fixed first. Severity is a way of grading bugs from the "bug impact" point of view, so that the tester can influence the fix: "this one is more serious and needs to be fixed first". After all, as very few people realize, the real value of a tester lies in getting a bug fixed, not simply in logging it. Severity can be (and is) assigned by the tester and can be modified by the test lead if there is a real need. Thereafter it is in the developers' court. Developers use a rating called "priority" to pick the top bugs to fix. So priority is set by the dev lead in consultation with the program/project manager; sometimes even the client gets involved. This cannot happen without buy-in from the test lead. What I am describing is the IDEAL situation, and in a mature test organization it happens. I have seen it happening (and been a party to it) in companies like Microsoft. In Microsoft (and in many other organizations) they use a process (a ceremony, in the Agile world) called "bug triage", where the dev lead, test lead and PM sit across the table with the list of bugs and deliberate on severity and priority. More often than not, the discussion is oriented more towards "priority" than "severity". The bug triage meeting is a formal platform for changing severity and priority levels.
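The filtering role of these two ratings can be sketched in a few lines of Python. The bug list and the 1-is-highest convention for both ratings are invented for illustration; real triage, as described above, is a conversation, not a sort key.

```python
# Invented bug records; for both ratings, 1 = highest urgency/impact.
bugs = [
    {"id": 101, "severity": 2, "priority": 1},
    {"id": 102, "severity": 1, "priority": 3},
    {"id": 103, "severity": 1, "priority": 1},
    {"id": 104, "severity": 3, "priority": 2},
]

def triage(bug_list):
    """Order bugs for fixing: priority first (the developers' filter),
    severity as the tie-breaker (the testers' filter)."""
    return sorted(bug_list, key=lambda b: (b["priority"], b["severity"]))

fix_order = [b["id"] for b in triage(bugs)]
print(fix_order)
```

With this sample data, bug 103 (priority 1, severity 1) is fixed first and bug 102 (priority 3) last even though its severity is 1, which is exactly the tension a triage meeting exists to argue about.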

In most of the test groups I have seen, the severity rating is unfortunately used as a measure of the performance of a developer or tester - as in "this tester has logged 10 severity-1 bugs" or "the module developed by you had the maximum number of severity-1 bugs". But that is another big topic of debate.

Last but not least: all those who wanted to know about bugs but did not know whom to ask should read "Bug Advocacy" by Cem Kaner - the bible on bug management. You will never have doubts about bugs for the rest of your life.


Buggy all the way, isn't it? I love to talk about bugs and the way they are managed ...

Ideas and views are welcome...


Wednesday, October 19, 2005

Testability: A way to enhance your product's Utility Value

Testability, to me, is a feature of the product that makes it testable (a testing synonym for "usable") in multiple contexts and ways. A mature testing process would always vouch for building testability into the product right at the design stage, when developers are more willing and accommodating towards "requests" from testers.

Typically, only bugs make developers listen to testers. Another important benefit of testability is that it makes test automation easier and more efficient. Testers doing automation need not spend hours and hours creating custom code to verify a test that could have been implemented as a test hook - say, a log file, a registry entry, a database record or status bar text. Ask for a test hook.

Just to give an example, I was looking at an automation scenario where the test was to kick off a batch file, wait for it to complete and then, based on the result of the batch file, proceed with the automation script. The problem the team faced was how to make the script wait for the batch file to finish. One option given by a tool vendor was to use a wait command with a hard-coded wait period.

Here is where the idea of a testability feature struck me. I simply asked the developer to add a feature/task of creating an empty text file or registry entry to mark the end of batch processing. The developer happily included that feature, which made our "wait" task easier and more efficient than a dumb wait(100) kind of command. Another example of a test hook is a command-line interface or API for product features that are typically driven through the GUI. This eases automation script maintenance when the GUI is unstable or changing.
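The sentinel-file hook can be sketched as follows. This is a generic Python illustration, not the team's actual script: a background thread stands in for the batch process, and the marker file name is made up. The point is that the automation polls for the completion marker instead of sleeping for a fixed, hard-coded period.

```python
import os
import tempfile
import threading
import time

def wait_for_sentinel(path, timeout=10.0, poll=0.05):
    """Poll until `path` exists; return True if it appeared within
    `timeout` seconds, False otherwise."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        if os.path.exists(path):
            return True
        time.sleep(poll)
    return False

with tempfile.TemporaryDirectory() as d:
    sentinel = os.path.join(d, "batch_done.txt")

    def fake_batch_job():
        time.sleep(0.2)              # simulated batch work
        open(sentinel, "w").close()  # the testability hook: mark completion

    threading.Thread(target=fake_batch_job).start()
    finished = wait_for_sentinel(sentinel)
    print("batch finished:", finished)
```

Compared to a blind wait(100), this returns the moment the batch job is done and still fails fast (with a clear timeout) when it never finishes.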

One more example. We were planning to automate a feature where the user fills out a form with lots of data which is eventually submitted to the web server as an XML file. We wanted to simulate load on the web server by submitting a large number of such forms in a given span of time. When we asked the developer for help, he said that such a feature was not available. We then asked the development team for an API that would take the path to an XML file as a parameter and submit the contents of the file to the web server, simulating a form submission from the client end. This made the whole automation job easy. Later, this testability feature was extended with many interesting test hooks that make the job of testing easy.

Michael Hunter talks about testability on his blog - it makes for good reading.

Monday, October 17, 2005

Resume v/s Ad

Here are some points about resume writing that I collected from various sources. I am not sure about the last point: people (me included) do include their personal contact information in their resumes so that they are "contactable".

I have been observing that people write resumes that are typically 8-10 pages long. As a hiring manager, I mostly see the first 2-3 pages; if the profile has not attracted me by then, I will not read any further. My thumb rule is one page for every 2 years of experience. I have also observed that people write about their current and past employers at length - often about half a page per employer. This is a waste, and it makes your resume less readable and less attractive. Sell yourself in the resume, not your current or past employers.

The final point: treat your resume as an ad that you put on TV to market yourself in the job market, and decide what should go into such an ad. Do you think people sit and watch ads that are an hour long and lack focus?

Read on...

Resume Writing: Seven errors common to an average resume

  • Too wordy. A résumé should be one page in length (one side only), or two pages at the most. A résumé is primarily an introduction - in the same way an advertisement is primarily an introduction - and should be under conscious control every inch of the way. Basic outline: Position Desired; Summary of Qualifications; Education; Skills; and, Employment.
  • Contains salary requirement. This is a big mistake. If you list a salary requirement it may well appear, to someone who has yet to appreciate your real value, to be too high or too low, and you may never get the chance to explain or elaborate. The thing to do is first make a favorable impression, and evoke some corporate response. There will always be time later to negotiate your salary - after the company decides it likes you and wants you and you're in some kind of bargaining position. It may be that their offer will not require negotiation.
  • "Me-oriented". Excessive use of the words "me" or "I", and prominent use of phrases such as "I seek" or "my objective", are to be avoided. Employers want to know what you can do for them. You must lead off with and elaborate on your benefit to the employer; play up to what you think are the employer's objectives.
  • Assumes too much reader comprehension. This takes the form of listing and explaining numerous accomplishments, courses taken, etc., not necessarily related to your position objective.
  • Contains unnecessary and confusing information. (Different from being too wordy). You must be specific. Everything in your résumé should support and point to a single skill/expertise. In advertising, the simplest ad is best. No ad, no matter how high-powered, can sell several concepts at once. Neither can a résumé.
  • Stiff, formal language. Don't be flip, but make it readable. Aim for your audience and the people you want to impress. In short, communicate.
  • Includes personal information. Do not include any personal information. Name, home address, and home phone number - that's it.

Friday, October 14, 2005

Advice for budding software test professionals ...

Here are a few points from the post that I made to the QTP Yahoo group, in response to a query about FAQ/interview questions on QTP. My main motivation for posting this is that most of the new entrants in this field do not know where to invest time and effort to learn the field of testing. Often they end up reading some FAQ or typical interview questions posted on some site and think that they have arrived. Software testing today suffers from a lack of education and awareness about "what it takes to be a software tester" and "how to successfully carry out and add value to the overall process of software development".

Read on .....

Here is my advice to all aspiring QTP or test engineers and professionals. These are lessons I learnt personally, and they are useful for any software professional who is serious about testing.

1. Do not look for shortcuts to learning and gaining knowledge. Have a long-term plan to get good mileage in this profession. FAQs and the like are good only for getting the top line. To succeed in the interview you will have to earn it from the inside: invest honestly in studying, and the fruits will follow. Banking on FAQs and interview questions may get you the job but will not keep you there.

2. Most important for a tester is to understand: What makes a good tester? How is he/she different from a developer? What value does a tester bring to the table? How do you find talent in testing and nurture it? How is testing different from QA or any flavor of process (CMM, Six Sigma), etc.?

3. Invest in sharpening your problem-solving and "thinking out of the box" abilities. Read good stuff on testing. Participate in conferences, discuss with test professionals in other companies, participate in activities in SPIN, etc. Solve puzzles (jigsaw or Shakuntala Devi). Never stop learning.

4. Sharpen your technology skills. Know about "how the web works", DNS, networking, protocols, XML, web services, cryptography, databases, data warehousing, UNIX commands, the fundamentals of J2EE, .NET, system administration - the list is endless. Today testers are expected to know the basics. I take a lot of interviews for various positions, and most people do not have these basics. It is difficult to survive in this world of testing banking only on "automation tool" knowledge.

5. Learn programming languages like C#, Java and scripting languages like Perl, Python, Unix shell, etc. This will increase your utility value. Developers and PMs will respect you.

6. Improve your communication skills - take an English class. Improve your vocabulary. Read, read, and read.
Most of the people I have seen ignore this important skill. They cannot write a paragraph on their own without spelling and grammatical mistakes. Make it a habit to learn a new word a day.

7. Read and write on blogs (Google to find out what a blog is, if you don't know already). Here are a few blogs that I suggest for every test professional.

http://www.testingeducation.org/ - Cem Kaner's free testing courses
http://www.testing.com/cgi-bin/blog - Brian Marick's site
http://blackbox.cs.fit.edu/blog/james/ - James Bach - highly respected visionary in testing
http://blogs.msdn.com/micahel/ - Microsoft's famous Braidy Tester
http://www.developsense.com/blog.html - Michael Bolton - testing in plain English
http://www.kohl.ca/blog/ - Jonathan Kohl
http://www.io.com/~wazmo/blog/ - Bret Pettichord - test automation guru

My own blog -
http://blogs.msdn.com/shrinik (my Microsoft blog - closed since I left that company)

Last but not least, be a person with a positive outlook on life. Believe in yourself; otherwise nobody else will believe in you.

All the best. Let us build the next generation of the test professionals' community and change the way the world does testing today.

Tuesday, August 09, 2005

Definitions of software testing - tracing the history

As I read more and more on testing and explore it, one thing always fascinates me - the way in which "testing" is defined. Compare this one from Glenford Myers, "The process of executing a program or system with the intent of finding errors", with this one from James Bach: "Testing is questioning a product in order to evaluate it".

It has been quite a journey. From a predominantly "bug-finding" approach to an "evaluation-based, information-gathering" approach - a clear shift in thinking. In my view, these are not mere "theoretical" definitions meant to be memorized and used in job interviews. They paint a picture of the thoughts, directions and perceptions about "testing" at their respective times. They are indicative of the levels of knowledge and trends prevalent in each period.

I am currently collecting these definitions (with source and time period) to present a chronological study and account of how software testing has evolved since the days of Glenford Myers (book: "The Art of Software Testing", 1979) and Bill Hetzel (book: "Program Test Methods", 1973), through the days of Boris Beizer, Cem Kaner, James Bach, Bret Pettichord, Brian Marick and other visionaries in testing. Help me by sending interesting definitions of testing along with the source and time period - I will consolidate and post them on this blog.

Critique this one – “A software engineering discipline positioned strategically in SDLC to help developer, in all possible ways, to ship better quality software”.

I know, words like "help", "SDLC" and "strategic" in this definition are open for debate. The theme here is to "help the developer" - the poor guy who is nicely sandwiched between the "tester" and the "project manager" (with conflicting interests, most of the time), and whom we expect to deliver "defect-free" software on time, within budget - all the time!

Monday, August 08, 2005

Jerry Weinberg ....

I was led to Jerry Weinberg's site by James Bach. I had heard of Jerry Weinberg in most software-testing-related discussions, with people often quoting him, but was too lazy to Google and find his site. James' blog post (titled "How to investigate intermittent problems") came in handy for locating it. I just finished reading some stuff from his site. No wonder he is such a big hit in the software community. I especially enjoyed this humor piece on the wisdom of dogs .....

“A German Shepherd Dog went to a telegraph office, took out a blank form and wrote: 'Woof, woof woof woof, Woof woof woof woof woof.'

The clerk looked at the paper and politely told the dog, 'There are only nine words here. You could send another 'woof' for the same price.'

'But that wouldn't make any sense at all,' replied the dog."

Thursday, July 21, 2005

Do a "Windows + M" when outlook is active window .....

I have noticed this behavior (it seems like a bug) with Outlook 2003.

When the Outlook main window is active, press Windows key + M (minimize all). Instead of the Outlook window getting minimized, it loses focus and hence does not get minimized. This does not happen with any other Office 2003 application. I am using Windows XP. BTW, if I right-click on the taskbar and select "Show the Desktop", the Outlook window gets minimized correctly. It seems that the piece of code that handles an active Outlook window for the "Show the Desktop" function is missing for the "Windows key + M" function. A severity 3 or 4 type of bug?

Anybody else observed this?

Wednesday, July 20, 2005

Blink Testing and power of Human brain ...

James Bach has posted an excellent piece on what he calls the "blink test" - which is about exploiting the human brain's power to process huge amounts of data in seconds and find patterns (good or bad). Software testing is mostly about thinking and questioning. One who is curious and tries to go beyond the obvious will become a successful tester. As human beings we are greatly blessed with the power of thinking; our brain is capable of storing and processing data at "supersonic speed" and giving amazing results.

In the post, James presents a movie that demonstrates, with a sample application, how one can perform a blink test by exploiting the power of our brain. He gives a few examples of using this kind of testing (or investigation), like paging through a long file super rapidly. On the face of it, processing so much data in such a small interval of time may sound futile. But believe me, as James demonstrates in the example, it really works. Can you imagine running about 700-odd tests (watch the movie to see how this was made possible) in about 5-8 seconds and finding bugs? Efficient and quick testing, right?

Great post James.

Friday, June 17, 2005

Another classic definition of Testing ....

After reading chapter 5 (on test automation) of the all-time favorite book on testing, "Lessons Learned in Software Testing", and an interesting thread of discussion on the "software testing" Yahoo group, let me re-define "software testing", or its objective, as follows:

"Software testing is a questioning process that is aimed at getting information about the software product under test"

Based on what kind of information one wants about the application under test, testing takes different shapes - hence we have different types and levels of testing.

This definition is very simple yet very powerful.

Thursday, June 16, 2005

Interviews, failures and being yourself ….

Continuing my tips on interviews, here are some very interesting observations about the job seeker in an interview. Louise of Blue Sky Resumes makes important points about interviews -

  • It is not a test where the job seeker has to pass or the employer fails/rejects a candidate.
  • It is not about giving "right" answers and impressing the interviewer, but about "being yourself" and giving answers straight from your heart.

The post points out that a job obtained by some kind of "faking" will eventually be a disaster for both job seeker and employer, as fundamentally there was a "misfit".

I see this post carrying a distinct point of view about the job interview - it discourages the approach of "impressing the interviewer" and makes a point of looking for a match between the job requirements and the interests and capabilities of the prospective candidate.

Louise signs off the post with these words: "If they don't choose you, chances are it wasn't the right fit anyway". This is a practical way to fight the blues of "rejection". I personally have gone through a few such occasions and was rejected for some jobs when I was in desperate need of a job. Now when I look at those situations in retrospect, I am happy that they rejected me. There was a "misfit" in those situations.

Gretchen, a senior technical recruiter at Microsoft, echoes along similar lines.

On a totally different note - don't forget to check out this post from Monster on "exit interviews".

Friday, May 06, 2005

Ports and IP addresses

These are the two things that come to my mind when I think about security testing/hacking.
I tend to believe that all security and hacking revolves around getting information about these two things. Right?
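As an illustration of that information gathering, the classic first step is a TCP connect scan: try each port and note which ones accept a connection. A minimal Python sketch (for illustration only - scan only hosts you are authorized to test):

```python
import socket


def scan_ports(host, ports, timeout=0.5):
    """Minimal TCP connect scan: report which ports accept a connection.

    connect_ex returns 0 on success instead of raising, which keeps
    the loop simple; closed or filtered ports just return an error code.
    """
    open_ports = []
    for port in ports:
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
            s.settimeout(timeout)
            if s.connect_ex((host, port)) == 0:
                open_ports.append(port)
    return open_ports
```

Real scanners like nmap add stealthier probe types, service fingerprinting and OS detection on top of this same idea.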

Honeypots - what are they .....

These days I am reading like crazy on security-related topics. I am also collecting lots of hacking tools. Hacking? Still a long way to go. I often dream about becoming a white-hat hacker.

I like hacking because it is close to testing - it is about exploring unknown territory. Hackers (white hat) have the same passion, enthusiasm and curiosity as a seasoned software tester. Testers go for bugs while hackers seek vulnerabilities. So there are lots of similarities.

Well, while I was reading on security I came across this site that talks about honeypots.

What is a honeypot? To put it simply, "it is a specially and intentionally produced piece of software vulnerability that is left open for hackers to attack".
In the article, the author identifies two types of honeypots - production and research. This classification is based purely on what you can do with the honeypot.

Typical usages are monitoring possible attacks or research.
In another interesting case, honeypots are used by organizations that outsource security assessment or penetration testing. The skill of the agency, company or individual doing such testing/assessment is indicated by the speed with which they discover the honeypot. Failure to find the honeypot may even terminate their assignment in some cases.

Don't forget to check out this site for a detailed discussion of honeypots.

I am on to reading another security testing topic - keep coming back, I shall post more often on security testing on this site...


Monday, May 02, 2005

Another quotable quote ...

Here is another quotable quote ....

Testers make informed decisions possible because they think
critically about software. That's big-time fun, and a serious privilege.

Friday, April 29, 2005

Randomness in life ....

Randomness in life ....? Randomness sometimes teases you back in life so predictably that you start suspecting it. Many times I have seen things happen - like running into the same person at some place again and again (coincidence?), or missing a bus/train regularly - things that ought to be random behaving like non-random. Maybe that is life.

Steven Levy on MSNBC, while talking about the iPod's "shuffle" feature, says his iPod, which is supposed to play tunes randomly, plays them in some specific order. He quotes Temple University professor John Allen Paulos, an expert in applying mathematical theory to everyday life, who says that in truly random events like tossing a coin it is quite common to get 6 heads in a row. He ends the post with the following lines, which are truly amazing ...

"Life may indeed be random, and the iPod probably is, too. But we humans will always provide our own narratives and patterns to bring chaos under control. The fault, if there is any, lies not in shuffle but in ourselves."

SD Times survey : Quality is hot so is agile development .....

Read on. Quality/Testing is a real hot topic these days

Quotable quotes ...

I read these in one of the posts in the "software testing" Yahoo group. Really quotable ....

"Strategy is the art of making use of time and space. I am less concerned about the latter than the former. Space we can recover, lost time never." - Napoleon Bonaparte

"We are a service organization whose job is to reduce damaging uncertainty about the perceived state of the product."- Bret Pettichord


Tuesday, April 26, 2005

Dev - Test Relation ship

"Testing is simply a non-deterministic task the outcome of which is somewhat unpredictable" - this is how Michael Hunter describes it in one of his posts. The dev-test relationship has always been my favorite topic. I have been fortunate all these years to work with "understanding" developers who think that dev and test complement each other. As I hear stories about devs feeling testers are "necessary evils", I feel that most of this is due to historical reasons, or to the way some organizations look at testers. Without enough training or career paths, people are pushed into testing - that is why we have lots of people called testers who got that title because they failed in other areas, mainly development. Michael further says, about a true dev-test relationship based on mutual trust and respect:

"Every relationship is founded on trust - or the lack thereof - and this one is no different. Remember that you're not just fighting stereotypes about testers ("Testers can't code." "How hard can it be to find bugs?") but also about developers ("Developers write bugs into their code just to make my life miserable!" "How hard can it be to write bug-free software?") "

A collaborative dev-test relationship is the sign of a good project team ...


Wednesday, April 20, 2005

useful info on software lifecycle, requirements management et. al

I know it is bad - just pasting some links for info - but I cannot help it.

Here is some really good stuff on software lifecycles and requirements management, plus some project management material. I stumbled on this site.



Friday, April 15, 2005

Sun's new programming language - Fortress

As I juggle joining formalities and other initial stuff at iGate - my new company - I am back to blogging. Squeezing some time out, I started reading some of my favorite blogs - one of them happens to be Brian Marick's. In one of his posts, Brian talks about Guy Steele, one of the leading figures of this new programming language initiative from Sun.

Look at this person's bio - awesome. Reading some of his quotes, I am really getting drawn towards this topic of writing language specifications. As Brian points out in his post, state tables are a great tool for representing state machines.

Another interesting thing about Guy, as pointed out in Brian's post: "he has a huge shower, in which he spends about twelve hours a day. I don't absolutely know that, but I deduce it from the time I heard him say he only gets good ideas in the shower."

As somebody rightly said, "Successful people don't do different things; they do things differently".

This language, BTW, is touted to be better than Java - another revolution in the offing? Watch out for Sun .....


Wednesday, February 16, 2005

What would you choose?

As a tester, you often need to choose between delivery (customer satisfaction and quality) and "process compliance". If given the opportunity, which one would you choose, and why?


Tuesday, February 15, 2005

Top 5 categories of prime focus areas for successful IVV business

I happened to discuss IVV (independent verification and validation) with one of my friends, and thought of the following things that are important for a successful IVV business. This is one growing area that more and more IT service companies are getting into.

1. Process
a. Engagement
b. Contract
c. Model for Test estimation
d. Internal process for test execution
e. Test Automation

2. Quality
a. Metrics - process and tools for measurement
b. how IVV offering quantifies the quality
c. Test escapes - how to deal with them

3. People
a. hiring and team building
b. Training and retaining people with specific skills
c. Sound strategy on avoiding people burn out
d. Management support
e. Strong and visionary leadership

4. Knowledge management and IP
a. KnowledgeBase for capturing bugs, patterns, Execution history.
b. IP for test techniques and procedures.

5. Marketing
a. Branding of IVV offerings.
b. Wide portfolio consisting of Functional, Domain, performance, security, compliance testing etc.

I am planning to add more to this post as I work on it.