Tuesday, November 29, 2011

Test Automation Magic Bullet

Ah yes, the Myth of the test automation magic bullet.  You've got too many test cases to run and not enough time? Let's just automate them all!  That'll do it!  Every test automator has probably experienced this scenario before, and probably more than once.

We have all heard the "Infinite Monkey Theorem" or some variation on it. In the case of test automation, it is kind of the opposite. My "Test Automation Theorem" goes like this:

A single test automator can automate an infinite number of
test cases of any complexity in any specified time period.

That about sums up management's typical understanding of test automation.  So, given the Test Automation Theorem, we can easily derive the Magic Bullet of Test Automation that solves all timeline and workload issues when it comes to testing.

Here is an actual example:

The company has decided to migrate to a new piece of software, and it is super critical to the business (aren't they all?).  The schedule is getting pushed back due to unforeseen (and obviously unplanned-for - more on project planning in another rant) development issues and delays.  The test team has created over 7,800 test cases, with 2,700 identified as urgent or high priority. Senior management declares that the test team will need to automate all 2,700 urgent/high tests and that will take care of any schedule problems.  So let it be written; so let it be done!

Yes! The test automation magic bullet will fix it all and we have full management support to release the magic!  Ah, yo, um, Bob, um, flip that magic bullet switch over there and get this done. Sweet! That was easy.

Seriously, this was my situation recently. No joke.  Luckily, my manager has been down this path with me before and so we decided we needed to manage expectations and educate some people.

So let's break the situation down and see the (obvious to some) problem with the magic bullet approach. Here are the challenges:
  • Product has not gone through a test cycle and all tests are, of course, GUI based.
  • New product means no test automation infrastructure in place; all has to be built from the ground up.
  • How big an automation team do you have, and on what timeline?
  • You can never automate every functional test.

Hitting the Moving Target:
If you haven't tried to create test automation, especially GUI automation, for a version 1.0 product, let me tell you: you will be trying to hit a moving target.  The interface is likely to change many times during the development cycle, and you will end up rewriting more than half of any automation code you create.  You will end up stressed out, overworked, and, more than likely, hating your job.

You can search around and find plenty of articles saying that automated testing at the GUI level is fragile (and it is, though many of the issues can be mitigated if the automation is designed well), but the GUI of a brand new application is the most unstable, ever-changing beast you can encounter.  I have read many articles stressing automation at the API level to test business logic rather than at the fragile GUI level. In a perfect world, that would be great, but testing at the API level typically requires cooperation from the application developers that is often hard to come by.  The application developers are already at full workload getting the "must have" product out the door, and if automation requirements weren't factored into the initial product planning (yeah, right), they just don't have time to "fix your automation problems".

It can be really hard to establish a good relationship between a test or test automation team and the development team, especially when your company starts outsourcing more than half of the development work.  The outsourced team is trying to meet their own timelines, and test automation just doesn't fit into their workload.  Chalk it up to shortsightedness in both budget and planning.
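When I say "designed well," I mean something like the sketch below. It's a bare-bones, hypothetical example (the login page, the locators, and the Selenium/Java stack are all made up for illustration; they are not from the project I'm describing) of keeping every locator in one page class, so that when the 1.0 GUI inevitably changes, you edit one file instead of every test that touches that screen.

import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;

// Hypothetical page class: every locator for the login screen lives here.
// When developers rename a field or rework the layout, this is the only
// file that changes; the tests keep calling loginAs() and never notice.
public class LoginPage {

    private static final By USERNAME = By.id("username");
    private static final By PASSWORD = By.id("password");
    private static final By SIGN_IN  = By.id("signInButton");

    private final WebDriver driver;

    public LoginPage(WebDriver driver) {
        this.driver = driver;
    }

    // Tests work at the business-action level, not the locator level.
    public void loginAs(String user, String pass) {
        driver.findElement(USERNAME).sendKeys(user);
        driver.findElement(PASSWORD).sendKeys(pass);
        driver.findElement(SIGN_IN).click();
    }
}

It won't make a brand new GUI stable, but it does keep the rework from multiplying across every script.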

The "Hidden" Work of Building New App Automation:
Even when you already have test automation in place, you can't always leverage the work you've done. In this case, the new application to automate was created in PowerBuilder (yuck!).  All our previous automation efforts are in desktop Java Swing apps or web interfaces.  Looks like we have to start from (almost) scratch on this one.  That will sure make things easy </sarcasm>.

Now, even if this new application were web or Java, there is still a lot of start-up work to be done.  We have to build an object repository of some sort, depending on the automation tool of choice and how this has been handled before.  We have to decide the best approach for automation development (data driven, keyword, hybrid, etc.), and that will undoubtedly require learning how the application works and what data the test cases need.  In our environment, the test automation team is separate from the manual test teams, so we don't have any application knowledge by default (sort of), but this application is all new, so everyone is learning it. If your automators are part of the manual test team, then they have to learn the application anyway, but they also have to write test cases, test plans, etc., which eats their time; it's a trade-off that I can't say is hands down better one way or the other.

So after you have your strategy, objects, and application knowledge down, you have to start building up function libraries and such for the automation.  What??? You thought we were just going to Record and Playback the tests?  Really? Have you ever tried to maintain test automation created like that? You still want to do it that way? GTFO!!!
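To make that "hidden" work a little more concrete, here is a bare-bones sketch of the keyword-driven idea (the keywords, class names, and data rows are invented for illustration, and it's plain Java just to show the shape, not our actual framework): test steps become data rows, and a small dispatcher hands them off to a shared function library instead of each test carrying its own pile of recorded clicks.

import java.util.Arrays;
import java.util.List;

// Hypothetical keyword-driven skeleton: the test logic lives in a shared
// function library that is written once and maintained in one place.
public class KeywordRunner {

    // Stand-in for the reusable function library built on top of the
    // object repository; real implementations would drive the app under test.
    static class FunctionLibrary {
        void login(String user, String pass)       { System.out.println("login as " + user); }
        void openOrder(String orderId)             { System.out.println("open order " + orderId); }
        void verifyField(String field, String exp) { System.out.println("verify " + field + " = " + exp); }
    }

    private final FunctionLibrary lib = new FunctionLibrary();

    // One data row = one step: a keyword plus its arguments.
    void runStep(List<String> step) {
        switch (step.get(0)) {
            case "login":       lib.login(step.get(1), step.get(2));       break;
            case "openOrder":   lib.openOrder(step.get(1));                break;
            case "verifyField": lib.verifyField(step.get(1), step.get(2)); break;
            default: throw new IllegalArgumentException("Unknown keyword: " + step.get(0));
        }
    }

    public static void main(String[] args) {
        KeywordRunner runner = new KeywordRunner();
        // In practice these rows come from a spreadsheet or database,
        // which is what makes the approach data driven.
        runner.runStep(Arrays.asList("login", "jsmith", "secret"));
        runner.runStep(Arrays.asList("openOrder", "PO-1001"));
        runner.runStep(Arrays.asList("verifyField", "Status", "Open"));
    }
}

That function library and the object repository behind it are exactly the start-up work that never shows up in the "just automate it" estimate.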

The Other Magic Bullet: Outsourcing:

So the Test Automation Theorem isn't working out for you? You must have done something to mess up the equation.  Oh well, just use the other magic bullet: outsourcing.  With outsourcing you can get a shit ton of test automators for the price of a can of Coke and a bag of chips. That'll fix your problem.  Oh boy...

Outsourcing, in my experience, produces sloppy, inefficient, and impossible-to-maintain automation code.  The outsourced "automators" (Ok, let's call them what they really are: Record and Playback Drones) slap together just enough code to run the test, then repeat the process as fast as they can to push their "number of tests automated" count as high as possible, as fast as possible, to show progress, meet deadlines, and get paid. They won't be around next test cycle to deal with the mess, but you will.  Good luck with that.
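If you have never had to inherit that mess, here is a made-up before/after (Java and Selenium chosen purely for illustration; the "recorded" snippet is a caricature, but only barely) showing a recorded step next to the same step routed through a shared, maintained helper.

// What record-and-playback typically leaves behind, copy-pasted into every script:
//
//   Thread.sleep(15000);  // hope 15 seconds is enough
//   driver.findElement(By.xpath("/html/body/div[3]/table/tbody/tr[7]/td[2]/input")).click();
//
// The version below waits on a real condition and goes through one shared,
// named locator, so the next test cycle doesn't start with an archaeology dig.

import java.time.Duration;
import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.support.ui.ExpectedConditions;
import org.openqa.selenium.support.ui.WebDriverWait;

public class OrderActions {

    private static final By SUBMIT_ORDER = By.id("submitOrder"); // one place to fix

    public static void submitOrder(WebDriver driver) {
        new WebDriverWait(driver, Duration.ofSeconds(15))
                .until(ExpectedConditions.elementToBeClickable(SUBMIT_ORDER))
                .click();
    }
}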

Saturday, November 26, 2011

The Lost Art of Resumé Writing

Whatever happened to resumé writing guidelines? You know: keep it concise, 1-2 pages, to the point, with only relevant information?  Evidently that is all out the window now with the latest breed of contract vendors.  Resumés are now novels: 8-10 pages of long-winded paragraphs with no useful information and copy/pasted, duplicated responsibilities.  Spell checking? Grammar checking? Bahahahahaha.

We have been trying to hire people for test automation for over a year (off and on), so I have had the, um, pleasure of poring through stacks of resumés.  Back in the day, I would have tossed every resumé that I get now straight into the trash can for wasting my time. This really surprises me with all the unemployment out there, but I guess I have to put up with it since that is all that gets submitted.  Here are my major peeves:
  • Too long with lots of unimportant information
  • Buzzword compliant to get search hits
  • Spelling and grammar optional

The Resumé is Too Damn Long:
Let me break this down for you.  I don't read past the 3rd page unless you have a shit ton of experience with short stints at a lot of different companies.  Ok, I lied: I DON'T READ PAST THE THIRD PAGE, period. All the rest is wasted ink and paper.  Be real; you aren't applying for CEO.

And who the hell came up with this new thing of writing a 10-line paragraph describing the company's general business objectives before listing your job responsibilities?  Do you know what I am talking about?  Here is an actual example:


Merck, West Point, PA                                                            Jan’10 to Present

Merck is a global research-driven pharmaceutical company dedicated to putting patients first. Merck discovers, develops, manufactures and markets vaccines and medicines in over 20 therapeutic categories. The project consists of design, development and implementation of portal -Merck Product Services, to provide services for its clientele.  The portal was developed using a Vignette Portal that was integrated with Apache and Tomcat servers. The portal includes custom applications such as Registration, eHealth Seminars- an online seminar event scheduler, Shopping Cart – an online application that physicians use to request online, prescription samples and medical journals, Customer Service modules, Message Board.  The Portal provides group based access permissions to various user groups, custom styles, grids navigation trees and secondary pages.

Responsibilities:

- Watch paint dry
- Blah blah

WTF is that??!?  Was the resumé not long enough already?  What does that have to do with what kind of experience you have? Who came up with this shit?  Someone who doesn't have to read 20-30 resumés a day, that's for sure.  I don't give a rat's ass what the company does; I want to know what YOU did. And all these resumés are already reviewed by the contract vendor, and they allow (or promote) this shit.

And I haven't even gotten into the other stuff, like Summary/Expertise lists a full page long and the absurdity of the Technical Skills list.  Yeah, that's next...


Lies, Damn Lies, and Resumés:
Here's the deal: STOP LYING!  In your technical skills, quit putting things in just to get search hits.  Selenium is the current buzzword that is a must for all test automation resumés these days, but looking at your work history, you never list Selenium ANYWHERE, so you have never used it!  STOP LYING!  When I get you on the phone for the initial job screen, I'm going to know you are just a lying bastard.  We always phone screen to weed out the liars and idiots, so you won't get an interview anyway, and I will be sure to bitch at your vendor about the lies.

So let's break down a real example of a Technical Skills list:


TECHNICAL SKILLS:
OS:                         MS-Dos, MS-Windows 9x/95/98/2K/NT/XP pro, UNIX, Linux 6.x/7.x/8.x
Languages:              FORTRAN, Pascal, C, C++, Java, EJB, Visual C#, Visual Basic, Dot Net  
                               Framework, Lingo, Lisp/Prolog
Databases:              Cobol, FoxBASE, Fox Pro, SQL, PL/SQL, T/SQL, Microsoft Access, ORACLE, 
                               DB2, MS SQL Server Management Studio, SSIS Packages, WebLogic, IIS.
Testing Tools:          Quicktest Professional, Loadwinner, Winrunner, TestDirector. 
Web:                       Html, Dhtml, Java Script, Shell Scripting, ASP, PHP, XML, XSL/XSLT, WAP
ERP Tools:              SAP-ABAP
GUI Tools:               Visual Basic 5/6, Swing, Developer 2000, Forms 6i, Report 6i
Other Applications:   MS-Office, Visual Studio, Dream weaver, Hyper Studio, UNIX tools,
                               Visual Café, Photoshop, Director8, Adobe InDesign.


First, let me congratulate this person for not just throwing Selenium in the list for fun, but, WOW, this is what I am talking about.  We start with OS.  MS-DOS, really?  I mean really?  No, you haven't.  I used to use MS-DOS, and running a couple of commands in the Windows CMD prompt is NOT using MS-DOS. Is this resumé from 1990 or what?  Then they list every Windows version ever made. Why?  Then UNIX. Ok, that is passable, but saying Solaris or HP-UX or something would be more specific.  Now for Linux: WTF is Linux 6.x/7.x/8.x supposed to mean?  You do know Linux is the kernel and, as of this writing, the latest stable version is 3.1.2.  There are a bunch of distributions out there: Debian, Ubuntu, Slackware, RedHat, Fedora, SuSE, etc.  So WTF is Linux 6.x/7.x/8.x???  Just proclaiming our ignorance, are we?

Now for Languages.  Are you seriously trying to tell me you program in all of these? I thought not, or you would be applying for a software developer's job and not a test job. What you probably meant to say is that you have tested some applications that were written in these languages, with a few of them being whatever you used in college for your programming class. FORTRAN? Really?

Let's speed this up.  When did IIS become a database?  Isn't WebLogic by Oracle, which you already listed?  Why does anyone list MS Office? Are you applying for an administrative assistant position?  MS Office should be a given these days for what little you will actually use it for.  As for the other applications listed here, I just have to ask: why?

Also, in the Summary/Expertise section, I just love the ones that proclaim to be Experts in just about everything. NO, YOU ARE NOT! Get over yourself.  I kid you not, I once read a resumé where the applicant stated he was an expert in every one of the 20+ areas he listed.  I threw the resumé in the trash without even finishing it.  But, after going through and rejecting about 30 other people, this resumé was submitted again, so we actually did a phone screen on this person.  Yes, that is as far as he got. Once on the phone, it was fairly easy to tell he wasn't much of an expert at anything but lying.


Proofreading Is So Passé:
You do know that your resumé represents you to your prospective employer, right?  I just love the resumés that state they have excellent communication skills but then proceed to have a shit ton of grammar and spelling errors.  I know that for 99% of the resumés I see nowadays, English is not the writer's first language, but come on.  They are all written in MS Word, which has both a spelling checker and a grammar checker.  They could at least start with that, then get someone to proofread the thing.  Geez.