"Some birds aren't meant to be caged, their feathers are just too bright"- Morgan Freeman, Shawshank Redemption. This blog is from one such bird who couldn't be caged by organizations who mandate scripted software testing. Pradeep Soundararajan welcomes you to this blog and wishes you a good time here and even otherwise.

Thursday, January 28, 2010

Black Viper Testing Technique


The following conversation is copied from LinkedIn's Software Testing & Quality Assurance group, in which I write a lot.


Senior Software Engineer - Quality Assurance from somewhere responds:

Never heard about it? I googled the term and found one helpful link: http://www.phadkeassociates.com/index_files/robusttesting.htm but Wikipedia doesn't have any results for it. As per me, it's not as concrete as other testing techniques.


and then XXX, PMP, Associate QA Manager from some_organization

Robust testing means the degree to which a software system or component can function correctly in the presence of invalid inputs or stressful environmental conditions

Sr Software QA Engg from elsewhere responds


Is it something like the stability Testing? I never heard of this terminology.


Pradeep Soundararajan: Independent Software Tester with an experience of 7 million and 4 hundred mistakes in testing responds

Same as Black Viper testing.

Senior Software Engineer - Qualty Assurance responds // Note the spelling of Quality as per the Quality Assurance Engineer. No offense. All humans are fallible. //

Black Viper Testing ... now what kind of testing is it?

Pradeep Soundararajan: Independent Software Tester with an experience of 7 million and 4 hundred mistakes in testing responds

@ Senior Software Engineer - Qualty Assurance,

What? You don't know Black Viper testing? I think if you had read about the Selphar box of techniques for testers, you wouldn't have asked about the Black Viper testing technique.


Senior Software Engineer - Quality Assurance again responds

@ Pradeep

Seriously no. Never got a chance to read the same.
Can you please pass over any link for the same?

Pradeep Soundararajan: Independent Software Tester with an experience of 7 million and 4 hundred mistakes in testing again says something

@ Senior Software Engineer - Qualty Assurance

Cool. Time to update you that none of the terms I used exists or is meaningful. I just made them up.

The lesson is: don't get fooled by terms people make up on the fly. Instead, focus on your skills as a tester and learn to ask the question, "What do you mean by that?"

I see terms like "Guerrilla testing" and "Smart monkey testing" mostly being asked about in interviews by dumb testers who usually succeed in intimidating candidates into believing such things exist.


Those candidates who couldn't give convincing answers to such questions look for answers after the interview and start asking questions in every forum they can get their hands on. Of course, we humans don't like to say "I don't know" (just like me) and hence try to give any answer we think might make sense.

The candidates believe the answers provided by someone who appears to be experienced are true. When those candidates become interviewers, they are tempted to ask the same question, "What is guerrilla testing?", to a new generation of candidates.

That solves the puzzle of where these great terminologies come from.

Plus:

In an Orkut software testing group, long ago (just about a year back), I saw someone asking the difference between the Monkey and Guerrilla testing techniques (which, as per their claim, had been asked to them in an interview).

Someone responded to it: "If you do aggressive monkey testing, it is called Guerrilla testing."

(If that didn't frustrate you, the response to the above will)

Here was the response: "Thanks!"

_ huh _

I am back to implementing the Black Viper testing technique with a hissing sound. Be careful: don't go near anyone when they are using the Black Viper testing technique; it could be poisonous :))

26 comments:

Santhosh Tuppad said...

@Pradeep Soundararajan,
I love this blog post. Black Viper - Hiiiiiiiiissssssssssss... I hope this post is read by interviewers or anyone who invents their own terminologies which don't make sense.

Thanks,
Santhosh Shivanand Tuppad

Santosh Shukla said...

Hi Pradeep,

That was really cool and very true.

I was once interviewed by one of the largest investment banks in Gurgaon.

I was asked "what is noun-adverb testing?"

Good for me that I never cared for it and still don't know the answer.
And good for the testing community that I did not become a part of spreading one more bad interview question.
Bad for me that I said "I don't know" and did not ask him the answer.

Santosh Shukla, http://proudtester.blogspot.com

Pradeep Soundararajan said...

@ Santosh Shukla,

Noun and Verb testing from Vipul Kocher is this:
http://www.puretesting.com/Noun-and-Verb-Technique.pdf

Maybe you and I already do this kind of testing; however, to expect a candidate being interviewed to know things by the exact name doesn't make sense.

Jassi said...

Thanks Pradeep for actually educating us; it is really necessary to be aware & alert. I loved this post. Simple and POWERFULLLLLLLLLLLL........

Cheers,

Jassi

Pradeep Soundararajan said...

@Jassi,

Don't get misguided by what I write. Think critically, and maybe you could help me learn if I am wrong.

However, hissssssss!

Flights of Fantasy said...

hissssssssssss hisssssssssssssss

hey Pradeep, nice blog.....
but interviewers should not forget that some testing terms are company-specific, and such a term could be referred to by different names elsewhere.

Santosh Shukla said...

@Pradeep,

Thanks for enlightening me.

Good that I raised this question at least now.

As you rightly said, we use some techniques but are not aware of their standard names. However, I have done this kind of testing, though not completely. Vipul's explanation really impressed me.

Santosh
http://proudtester.blogspot.com

Jassi said...

Thanks @Pradeep, now my grey cells are thinking :)

Cheers,

Jassi

Mgr. Anna Havlíčková (dříve Anna Borovcová) said...

Nobody should say "I do some kind of testing" without explaining what he means by it. For example, when he writes in a test plan that he will do stress testing, he should write what he means by it.

But it is useful to have a list of different techniques.
1) You can give it to somebody who is learning to be a tester, because if he brainstormed the techniques himself, there would be even more different names for one testing technique.
It helps establish a naming convention.
2) If you have a list of descriptions of techniques, you do not need to think over descriptions whenever you need one quickly, like for a test plan or presentation.
I would rather think over testing exercises than write descriptions.
If it works, reuse it.

Kashif Ali Habib said...

hi pradeep,

it's a really nice post, you have touched on an interesting topic, and I totally agree.

keep it up :)

Jaishri said...

HAhahahahah... Awesome... I liked this post a lotttt.... \m/

Neha Thakur said...

Good one Pradeep..I enjoyed it thoroughly...May god bless you:)

Unknown said...

Thanks for sharing this conversation, Pradeep.
Awesome post and a much required one !!!

I have seen fellow testers carry pages of definitions about "which testing does what", and be paranoid about being correct in "remembering" them.

I happen to remember one such question, and for the benefit of the testing community (to beware), the question was "what is horizontal and vertical performance testing?".

The question came from a guy with 15+ years of experience managing a big testing group (which includes testers from different companies).

I should admit that I did get intimidated for a moment on hearing this question, and if faced with it, a relatively young tester could have his confidence broken and be misguided.

Hope this post will save at least some.

Cheers

Suresh said...

It is not so easy to teach interviewers. Instead, youngsters can learn to give a reply like "I am eager to understand the terminology to know whether I have done such kind of testing".

Pradeep Soundararajan said...

@Suresh,

Wow! You said a very cool thing. I hope there are more people like you out there.

Diego said...

Good Post...

1) We put names to things to improve communication (shorten sentences); we shouldn't do it to confuse people, put ourselves in a higher status, or intimidate test job candidates (though that is funny ;-))

2) Most of the terms coined for different types of testing are relatively new; we don't care to find out whether a term is already known, and we develop them in our internal co-worker environment.

3) Due to 1) and 2), when we mention testing "types" we should always try to explain what we are referring to (or at least provide a source link to the complete category listing where it is explained).

Neverhteless, "types" and taxonomies do have some benefits when doing a task, they allow us to split the universe in (equivalence) partitions of things to make sure that later we don't forget any "type" of testing and we cover them all.
The problem with types is that anything, including testing activities, can be classified under different points of view, or "dimensions" (set of types): according to the objective pursued during the testing, according to the knowledge of the underlying product, according to the level of integration with the rest of the system, etc. we will have a different set of orthogonal types for each one of them. And we always forget that there are different classifications for each one of this orthogonally different attributes and that we shall honor that when we use or invent new test "types", assuring that the type name corresponds with one value of the corresponding dimension (e.g.: if the type dimension is "the objective pursued with tesing", then all the types names should be defined in those terms).

Descartes found that many years ago, each point (test case) has a value (type) in any dimension that is present: a test case can be a "unit testing" done using "white box" techniques and be also a "regression" test to be runned on each source versioned repository check-in. Furthermore, tests can change their type during its lifecycle. What is was a test to check product maturity, can be later a test to do regression.

Some examples (BTW, these samples consider the testing-type terms according to the SWEBoK: http://www.computer.org/portal/web/swebok/html/ch5#Ref2):

"Are you doing withe box testing?"

"No, I'm doing integration testing"

Wrong! You can do white box testing (knowing how the thing was coded) while doing integration testing between different product components.

"There are different testing: Unit Testing, Functional Testing, Integration Testing, System Testing and Acceptance Testing"

Wrong! Functional and Acceptance Testing are from the "what's your objective when testing" classification, while unit, integration and system testing are from the "what level of integration are you testing at" classification, and they should not be mixed up. You can perfectly well do functional testing in both integration and system testing, for instance.

Rahul Verma said...

@Pradeep,

Nice post!

Because of the sheer number of terms that exist in the testing world, at times it is very difficult to decide which terms one should know and which one need not.

Another angle to the topic you have started: many a time, testers that I have interviewed do not even know what ECP (equivalence class partitioning) is. Now ECP is just another term, but one would certainly expect a tester to know it. If, as an interviewer, you start elaborating on ECP and explain its meaning, it would be equivalent to giving away the answer to the question.

Similarly, load/stress/soak etc. are widely used terms. There's also a possibility (and I have encountered this on many occasions) that these terms are used differently by different organisations/people. That doesn't make the terms themselves meaningless, as long as people in communication have the same understanding.

"Robustness" is an attribute of the system tested under the umbrella of performance testing. We as testers usually form terms around what we test for. So, testing for robustness becomes "robustness testing", and probably this is the reason you heard about "robust testing". "Robust testing" as such may not have any meaning at all, apart from the person(s) who were using this in their context, and were the source of this term. If we start drawing meanings, it could mean anything - one I already drew, other could be "testing carried out taking care of all dependencies and variables", another could be used in the context of automation, where the testing code itself should be robust. So, it would certainly be very difficult to answer "What is.." or "How would you do ..." questions for such terms.

Another example is "False positive testing", essentially meaning testing for false positives. Without a context in place, this term may not mean anything, but the moment you ask someone from the security industry, even if he hasn't heard the term before, he would be able to explain the purpose of this form of testing. This means, contextual terms that are not widely used can also be meaningful.

Even famous authors float terms, with official definitions and reasoning, that do not become widely used. One example is "Self Satisfaction Testing (SST)" by an author of a performance testing book. Now, this term is very funny, and if we start drawing meanings, there could be many offensive interpretations as well :-). Until we relate it to "Smoke Testing", which in turn should be associated with the context of performance testing, test environment validation etc., we wouldn't reach the meaning the author intended. Now, does that make the term meaningless? For me - "Yes". For all - "Probably no". I am sure many of the author's peers or his testing group must already be using this term (and in turn asking questions about it in interviews :-))

Terms should be used to make conversation precise, not complicated and confusing. So, this becomes more of a responsibility of the people who are communicating, rather than a matter of blaming the term as such.

Various discussions happen around "This is not an official testing term". To this day I do not know what "official" means in this context. I feel pretty comfortable using terms as I like, hearing new terms and trying to understand their meaning, as long as the conversation progresses and work gets done.

Regards,
Rahul Verma

Anonymous said...

That's a very nice post, and thanks for spreading awareness against such bad practice (hoping such bad practice exists).
To add my experience, in an interview I was given a problem and asked to give an out-of-the-box solution for it :)
--Dhanasekar S

Pradeep Soundararajan said...

@Dhanasekar,

I have a problem with people using the phrase "out of the box thinking" because, because, because I don't think they know how to develop that.

I think they are probably referring to lateral thinking - which is very much a developable skill - and I suggest Edward de Bono's Lateral Thinking.

Anonymous said...

Thanks Pradeep, for letting me know about Edward de Bono's Lateral Thinking. A new learning. I initially thought this was another Black Viper kind of terminology ;)
--Dhanasekar

Abhijeet Rasal said...

Amazing piece of writing! I wish monkeys and guerrillas were given the credit... :) Sincere thanks, Pradeep. This should enlighten fellow testers who don't think and reason, and who accept the terminologies, intimidated by some wise men. ;)

Michael M. Butler said...

Pradeep, it seems that the true nature of Black Viper testing is that it tests the listener. How Eastern. :)

Unknown said...

These days, you just put any word before "testing" and it becomes a form of testing, and if you post it as an interview question on a site, many testers will readily believe it is an authentic form of testing. In an interview, I was asked if I knew black-box, white-box and grey-box testing. I promptly said yes. But the interviewer didn't want to know that. He asked me about yellow-box testing, green-box testing, pink-box testing, red-box testing and a few more colors! He said these forms exist. What was he thinking? Does any of this have relevance to the real testing work we testers do?

Pradeep Soundararajan said...

@Amrita,

I just hope the Black Viper really bites interviewers who ask those colorful box questions.

A request that I have: when you interview people, put them to the test rather than asking these questions. It starts with you.

Anonymous said...

Mr. Pradeep, I am a new person in software testing. What is your advice for me, what should I focus on, and how should I manage my career so that I can become an expert and, on the other hand, help other people become one? Thank you, and I really liked your post; it's so realistic.

Jesper L. Ottosen said...

I see you have not heard of the term snow plow testing... ;)