The Devil Is In The Details

    Ever notice how dashboards never tell the whole story? Don’t you want to know what’s causing that spike in the chart?

    Example:

    A testing dashboard shows the number of test cases by status (passed/failed/not executed/etc.). It may even show the number of test cases by status per module/functionality of the application under test (AUT). That's all well and good, but I would wonder how many defects are tied to the test cases that didn't have a status of passed. Wouldn't you?

    A wise man once told me that the devil is in the details. He is right (at least I think so). If you can't see the details, how do you know what needs to be made top priority? How do you know that a defect originally marked as Minor should really be marked as Critical because it's blocking a high number of test cases?
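
    To make that concrete, here's a minimal sketch in Python. The data model, the names, and the threshold of 10 are assumptions for illustration, not any real tool's API; the point is simply that a severity label by itself hides how many test cases a defect is holding up:

        # A minimal sketch: escalate a defect's effective severity when it
        # blocks a high number of test cases. The record layout and the
        # threshold are assumptions, not any real defect tracker's API.

        defects = [
            {"id": "DEF-101", "severity": "Minor", "blocked_tests": ["TC-1", "TC-2"]},
            {"id": "DEF-102", "severity": "Minor",
             "blocked_tests": [f"TC-{i}" for i in range(3, 20)]},  # blocks 17 tests
        ]

        ESCALATION_THRESHOLD = 10  # assumed cutoff; tune per project

        def effective_severity(defect: dict) -> str:
            """Treat a defect as Critical if it blocks too many test cases."""
            if len(defect["blocked_tests"]) >= ESCALATION_THRESHOLD:
                return "Critical"
            return defect["severity"]

        for d in defects:
            print(d["id"], d["severity"], "->", effective_severity(d))

    In practice you wouldn't get blocked_tests handed to you; you'd derive it from the links between defects and test cases in your test-management tool.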


    Before you create/publish your next dashboard/metric, sit back and ask yourself this: does it really show the details needed to give the full picture?

    A dashboard should allow the user to see not only a rollup view of the AUT but also the details behind the numbers. Below are some example questions a testing dashboard should answer to provide the big picture (a rollup-and-drill-down sketch follows this list):

    • What modules/functionality were tested?
    • What is the status of each module/functionality?
    • What is the execution status by day/week?
    • How many defects are tied to the test cases?
    • What is the severity of the defects associated with the test cases?
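
    Here's a hedged sketch of that rollup-plus-drill-down idea (the flat record layout is hypothetical; a real test-management export will differ): status counts per module for the big picture, plus the defect impact sitting behind the non-passing tests:

        from collections import Counter, defaultdict

        # Hypothetical flat export of test results; real tools will differ.
        results = [
            {"module": "Login",    "test": "TC-1", "status": "Passed",       "defects": []},
            {"module": "Login",    "test": "TC-2", "status": "Failed",       "defects": ["DEF-101"]},
            {"module": "Checkout", "test": "TC-3", "status": "Failed",       "defects": ["DEF-102"]},
            {"module": "Checkout", "test": "TC-4", "status": "Not Executed", "defects": ["DEF-102"]},
        ]

        # Rollup: status counts per module -- the big-picture chart.
        rollup = defaultdict(Counter)
        for r in results:
            rollup[r["module"]][r["status"]] += 1

        # Drill-down: how many non-passing test cases each defect touches --
        # the detail behind the numbers.
        defect_impact = Counter()
        for r in results:
            if r["status"] != "Passed":
                defect_impact.update(r["defects"])

        for module, counts in sorted(rollup.items()):
            print(module, dict(counts))
        print("Defect impact:", dict(defect_impact))

    The rollup alone would tell you Checkout has two non-passing tests; the drill-down tells you a single defect (DEF-102) is behind both of them.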

    Do your dashboards/metrics tell you the whole story?

