[Testing] Testing Resources Of Niftiness
Mel Chua
mel at laptop.org
Thu Nov 20 21:08:07 EST 2008
One last thing - I know some members of our test community are trying to
learn more about testing (I'm always learning, myself!) and have said
that they'd appreciate occasional pointers to neat resources, articles,
posts, and papers that apply to testing at OLPC.
Therefore, I'm starting TRON: Testing Resources of Niftiness. I'll start
sharing my favorite OLPC-ish testing resource after each meeting.
(Complaints, comments, suggestions, and your own pointers to niftiness
are very welcome!)
Since this is the first one and some of the people here are new to
testing, I'll start with a pointer to a great self-study intro course on
software testing by Cem Kaner and James Bach. This has a lot of the
basic background and vocabulary that might be helpful to pick up, and
focuses on black-box testing (external testing, without looking at the
code - the kind we've done here so far).
http://www.testingeducation.org/BBST/
Now...
My favorite article this week is actually a blog post: Can your kid beat
you in testing?
(http://software-testing-zone.blogspot.com/2008/09/can-your-kid-beat-you-in-testing.html)
The connection to OLPC should be pretty obvious. ;-) We've spoken about
wanting to get test feedback from kids before; how can we facilitate
test sessions with them that take advantage of the respective strengths
of kids and adult testers? (Paired testing - one child per tester?)
Anyway, some food for thought.
I also can't resist some testing humor, with all the talk about metrics
this week.
Enjoy!
-Mel
---------
The software engineering community has placed a great deal of emphasis
on metrics and their use in software development. The following metrics
are probably among the most valuable for a software project:
The Pizza Metric
How: Count the number of pizza boxes in the lab.
What: Measures the amount of schedule under-estimation. If people are
spending enough after-hours time working on the project that they need
to have meals delivered to the office, then there has obviously been a
mis-estimation somewhere.
The Aspirin Metric
How: Maintain a centrally located aspirin bottle for use by the team. At
the beginning and end of each month, count the number of aspirin
remaining in the bottle.
What: Measures stress suffered by the team during the project. This most
likely indicates poor project design in the early phases, which causes
over-expenditure of effort later on. In the early phases, high aspirin
usage probably indicates that the product's goals or other parameters
were poorly defined.
The Beer Metric
How: Invite the team to a beer bash each Friday. Record the total bar bill.
What: Closely related to the Aspirin Metric, the Beer Metric measures
the frustration level of the team. Among other things, this may indicate
that the technical challenge is more difficult than anticipated.
The Creeping Feature Metric
How: Count the number of features added to the project after the design
has been signed off, but that were not requested by any requirements
definition.
What: This measures schedule slack. If the team has time to add features
that are not necessary, then there was too much time allocated to a
schedule task.
The "Duck!" Metric
How: This one is tricky, but a likely metric would be to count the
number of engineers who leave the room when a marketing person enters.
This is only valid after a requirements document has been finalized.
What: Measures the completeness of initial requirements. If too many
requirements changes are made after the product has been designed, then
the engineering team will be wary of marketing, for fear of receiving
yet another change to a design which met all initial specifications.