So I guess it is time for me to reveal one of my worst-kept secrets: yes, I like to shop. For me it is not the shopping that is the problem, it is the absence of it. The last time I gave in to my addiction, I started to think about the way I approach my day of shopping pleasure. My conclusion is that, by and large, I take the same exploratory approach to shopping as I do to testing.
A little while ago I was asked whether I knew of any “Industry Best Practice Benchmarks” for “Defect Resolution Time SLAs”. Some context – this is an incredibly large project with multiple development teams, multiple applications and one very large testing team. The idea behind the SLA, I think, was an attempt to manage bug-fixing time. They wanted to create awareness, so that bugs would be escalated to top management whenever the SLA was not met. That way, a project that was way over budget and on extremely tight deadlines could be “project managed” successfully, through the fast fixing and retesting of a massive number of bugs.
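To make the mechanics concrete, here is a minimal sketch of what such an SLA escalation check amounts to. Everything in it is hypothetical – the severity tiers, the time limits and the defect records are invented for illustration, not taken from the project described above:

```python
from datetime import datetime, timedelta

# Hypothetical SLA: severity level -> maximum allowed resolution time.
SLA = {
    "critical": timedelta(days=1),
    "major": timedelta(days=3),
    "minor": timedelta(days=10),
}

def breaches(defects, now):
    """Return ids of defects resolved (or still open) past their SLA window."""
    late = []
    for d in defects:
        deadline = d["opened"] + SLA[d["severity"]]
        # An unresolved defect is judged against the current time.
        effective = d.get("resolved") or now
        if effective > deadline:
            late.append(d["id"])
    return late

now = datetime(2010, 10, 20)
defects = [
    # Still open two days after being raised as critical: breaches the 1-day SLA.
    {"id": "BUG-1", "severity": "critical", "opened": datetime(2010, 10, 18)},
    # Major defect fixed within its 3-day window: fine.
    {"id": "BUG-2", "severity": "major", "opened": datetime(2010, 10, 18),
     "resolved": datetime(2010, 10, 19)},
]
print(breaches(defects, now))  # → ['BUG-1']
```

Of course, the hard part is not the arithmetic but deciding whether escalating every late bug to top management actually speeds anything up.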
I have some questions for you testers out there who like using test cases. To be clear, I’m talking about detailed, stepwise instructions written in order to guide testing and be executed manually.
I have never enjoyed using them and I want to know why you do. Really. I don’t get why test cases are ‘all that’. If there’s anyone out there who loves using test cases, please clue me in as to why. I don’t get it. Help me understand.
If you ever tried SBTM (and if you haven’t – do try it), you might have had the sensation I often get during debriefs: “oooohhh!! I should also have tested this … and that! and…”
Even after nearly two hours of concentrated testing, all that is needed to get more inspiration is to tell others what you did – after a little recess.
I think I will call this iterative thinking, in case no-one else has coughed up a name for it – or even that name for it.
It often shows up during debriefs, which are almost put on hold while a few testers grab the nearest computer to show, tell and discuss – sometimes actually restarting a session to refine the information they’ve got.
I don’t think we value iterative thinking enough. We tend to value just the ideas we had. I think it would be valuable to allow ourselves to re-invent our ideas, though many would see this as a waste of time: “No need to do that, we already have the ideas we need”. But maybe we can still improve things, and get even better ideas, once we revisit the chain of thoughts that led us to the first idea. By then we will have absorbed the first idea, so it is no longer really new.
All it would take is to allow for recesses and repeat meetings whenever we need to come up with new ideas, analyze things or get inspiration. Exchange the outcome of the first, short meeting with others. Take whatever inspiration arises to the next meeting, held somewhere else. Move around, play with the idea. Refine it.
Would that be so hard?
Rummaging through some documents on my laptop the other day, I stumbled across an old presentation made by a “process consultant” who shall remain nameless for his own sake. I hadn’t paid much attention to it before, but I opened it and read the executive summary just for fun, and what I read scared me a bit. Here, in a nutshell, are a few of the friendly suggestions for you to consider.
1. Don’t spend time developing your own processes, “steal” standard processes and use standard solutions.
Yeah, imagine what would happen if you left it up to the people working in the organization to exercise judgment and responsibility over how work should be structured. Or if you were to involve them in the change process.
Ticking off test cases is regarded by many as good documentation that the test case was executed and the feature shown to work.
While there are quite a few problems with this ticking-off process in itself – often the only options are a ‘pass’ or a ‘fail’ (a vendor I once talked to at a conference expo claimed: what other results could there be?) – I am more worried about the value of this as documentation of the test.
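As a sketch of the vendor’s question answered literally, here is a small result model with more outcomes than pass/fail, plus a notes field. The outcome names and the example record are my own invention, not any particular tool’s vocabulary:

```python
from enum import Enum
from dataclasses import dataclass

class Outcome(Enum):
    PASS = "pass"
    FAIL = "fail"
    BLOCKED = "blocked"            # could not run: environment or dependency broken
    SKIPPED = "skipped"            # deliberately not run this cycle
    INCONCLUSIVE = "inconclusive"  # ran, but no clear verdict

@dataclass
class TestResult:
    case_id: str
    outcome: Outcome
    notes: str  # what was actually observed – arguably the real documentation

r = TestResult(
    "TC-042",
    Outcome.INCONCLUSIVE,
    "Totals matched, but the report stalled noticeably under load; needs a follow-up session.",
)
print(r.outcome.value)  # → inconclusive
```

Even this richer status list documents very little by itself; it is the notes, not the tick, that tell a reader what testing actually happened.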
Posted by marilyn | Posted in Main | Posted on 20-10-2010
Over the years of working at various client sites (mostly large financial institutions), I have come across a number of attempted implementations of the TPI model (Test Process Improvement) or similar attempts to reach a new level of “maturity” of “Quality Assurance”.
Now I am not saying that everything in these models is rubbish, but in my experience of seeing people trying to implement the model, there does seem to be a lot of the “Best Practice” wording thrown around and not enough consideration given to context.
Do we as software testers need to have programming skills? This question is not new. It has been discussed back and forth for some years now. I don’t know if we as a community are heading toward any sort of consensus on this, but I’m going to throw my opinion out there all the same.
The short answer is ‘No, we don’t need to, but it sure can help’. There are plenty of reasons I can think of to have at least a rudimentary ability to cut code, and no real arguments against (if you can think of any, I’d love to hear them).
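For a flavour of what “rudimentary ability to cut code” buys a tester, here is the kind of throwaway helper I have in mind – a few lines that enumerate boundary-value candidates for an input field instead of typing them out by hand. The function and the 1..100 range are made up for illustration:

```python
def boundary_values(low, high):
    """Classic boundary-value candidates around an integer range [low, high]."""
    # A set removes duplicates when the range is very narrow (e.g. low == high).
    return sorted({low - 1, low, low + 1, high - 1, high, high + 1})

# e.g. a field documented to accept 1..100
print(boundary_values(1, 100))  # → [0, 1, 2, 99, 100, 101]
```

Nothing here requires being a developer; it requires just enough code to stop doing by hand what a machine does faster and without typos.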
New hire: Um, can I talk to you?
Manager: Sure, what about?
NH: I think I misled you during my interview.
Manager: What do you mean?
NH: In the interview I was trying to show the breadth of my knowledge, but I think I may have given the impression that I know everything. I don’t. In fact, I need to ask you to give me some time to do some studying. There are some things I need to know for this job.
Manager: I see. What is the problem? Do you need to attend some expensive training?
NH: No. I’ll probably get some books and do some research on the web. Also, I’ll need to do a deep dive on a part of our product.
Manager: OK. Um, well, get the books expensed. What else do you need?
NH: Time. I, well, might, possibly, well, need to do this during work…
Manager: Excuse me? Are you saying you will be learning on the job?
NH: Yeah, it might work out that way…
Manager: (laughing) Well don’t let this happen too often or you might just end up knowing everything!
Software Config Management has been, in my experience, one of those nebulous practices that defies definition until you realize that you need it. It’s probably something you can get away without having if your setup is small enough and your developers, testers and system technomancers all communicate clearly, effectively and frequently (I’ll just wait for the snickering up the back to subside, shall I?). Dedicated config management personnel seem to happen to a company when it reaches a certain headcount rather than a certain system complexity. I sometimes wonder if that isn’t a bit like shutting the proverbial gate as the hoofbeats fade into the distance.