Eight to Late

Sensemaking and Analytics for Organizations

The case of the missed requirement


It would have been a couple of weeks after the kit tracking system was released that Therese called Mike to report the problem.

“How’re you going, Mike?” she asked, and without waiting to hear his reply, continued, “I’m at a site doing kit allocations and I can’t find the screen that will let me allocate sub-kits.”

“What’s a sub-kit?” Mike was flummoxed; it was the first time he’d heard the term. It hadn’t come up during any of the analysis sessions, tests, or any of the countless conversations he’d had with end-users during development.

“Well, we occasionally have to break open a kit and allocate different parts of it to different sites,” said Therese. “When this happens, we need to keep track of which site has which part.”

“Sorry Therese, but this never came up during any of the requirements sessions, so there is no screen.”

“What do I do? I have to record this somehow.” She was upset, and understandably so.

“Look,” said Mike, “could you make a note of the sub-kit allocations on paper – or better yet, in Excel?”

“Yeah, I could do that if I have to.”

“Great. Just be sure to record the kit identifier and which part of the kit is allocated to which site. We’ll have a chat about the sub-kit allocation process when you are back from your site visit. Once I understand the process, I should be able to have it programmed in a couple of days. When will you be back?”

“Tomorrow,” said Therese.

“OK, I’ll book something for tomorrow afternoon.”

The conversation concluded with the usual pleasantries.

After Mike hung up, he wondered how they could have missed such an evidently important requirement. The application had been developed in close consultation with users. The requirements sessions had involved more than half the user community. How had the users forgotten to mention such an important requirement and, more importantly, how had he and the other analyst not thought to ask, “Are kits ever divided up between sites?”

Mike and Therese had their chat the next day. As it turned out, Mike’s off-the-cuff estimate was off by a long way. It took him over a week to add the sub-kit functionality, and another day or so to import all the data that users had entered in Excel (and on paper!) whilst the screens were being built.

The missing requirement turned out to be a pretty expensive omission.


The story of Therese and Mike may ring true for those involved in software development. Gathering requirements is an error-prone process: users forget to mention things, and analysts don’t always ask the right questions. This is one reason why iterative development is superior to big design up front (BDUF) approaches: the former offers many more opportunities for interaction between users and analysts, and hence many more chances to catch those elusive requirements.

Yet, although Mike had used a joint development approach, with plenty of interaction between users and developers, this important requirement had been overlooked.

Further, as Mike’s experience shows, fixing issues caused by missing requirements can be expensive.

Why is this so? To offer an answer, I can do no better than to quote from Robert Glass’s book, Facts and Fallacies of Software Engineering.

Fact 25 in the book goes: Missing requirements are the hardest requirements errors to correct.

In his discussion of the above, Glass has this to say:

Why are missing requirements so devastating to problem solution? Because each requirement contributes to the level of difficulty of solving a problem, and the interaction among all those requirements quickly escalates the complexity of the problem’s solution. The omission of one requirement may balloon into failing to consider a whole host of problems in designing a solution.

Of course, by definition, missing requirements are hard to test for. Glass continues:

Why are missing requirements hard to detect and correct? Because the most basic portion of the error removal process in software is requirements-driven. We define test cases to verify that each requirement in the problem solution has been satisfied. If a requirement is not present, it will not appear in the specification and, therefore, will not be checked during any of the specification-driven reviews or inspections; further there will be no test cases built to verify its satisfaction. Thus the most basic error removal approaches will fail to detect its absence.
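Glass’s point can be made concrete with a small sketch. The code below (all names are hypothetical, invented purely for illustration) models a test suite that is derived entirely from the written specification – the standard requirements-driven approach he describes. Because each test traces back to a stated requirement, a requirement that was never stated simply produces no test, and no run of the suite can ever flag its absence:

```python
# A minimal sketch of requirements-driven testing, using made-up names.
# Test cases are generated *from* the written spec, so an unstated
# requirement -- like sub-kit allocation in the story -- yields no test.

# The written specification: one entry per stated requirement.
specification = {
    "allocate_kit": "A kit can be allocated to a single site",
    "track_kit": "The site holding each kit can be looked up",
}

def generate_test_cases(spec):
    """Derive the test suite purely from the stated requirements."""
    return sorted(spec.keys())

test_suite = generate_test_cases(specification)
print(test_suite)  # the two stated requirements, and nothing else

# The requirement the users never mentioned:
unstated = "allocate_sub_kit"
print(unstated in test_suite)  # False -- no test will ever flag its absence
```

The sketch is trivial by design: the interesting part is what it cannot do. No matter how thorough the generated suite is, it only exercises what the specification contains, which is exactly why errors of omission slip through reviews, inspections, and testing alike.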

As a corollary to the above fact, Glass states that:

The most persistent software errors – those that escape the testing process and persist into the production version of the software – are errors of omitted logic. Missing requirements result in omitted logic.

In his research, Glass found that 30% of persistent errors were errors of omitted logic! It is pretty clear why these errors persist – because it is difficult to test for something that isn’t there. In the story above, the error would have remained undetected until someone needed to allocate sub-kits – something not done very often. This is probably why Therese and other users forgot to mention it. Why the analysts didn’t ask is another question: it is their job to ask questions that will catch such elusive requirements. And before Mike reads this and cries foul, I should admit that I was the other analyst on the project, and I have absolutely no defence to offer.

Written by K

October 17, 2009 at 2:42 pm

5 Responses


  1. I wonder if a requirements traceability process that mapped each “object” – in this case a kit – to the verbs and nouns associated with that object would have revealed this missing requirement?

    The requirements engineer could have asked – maybe – “what else can we do with kits, besides ship them?”

    This approach is a systems engineering paradigm, where the terminal nodes of the product breakdown tree have attributes – nouns and verbs – and drive the conversation about their behavior or sub-behaviors, dependencies, interactions, interfaces and all those things around “objects.”

    Glen B. Alleman

    October 20, 2009 at 11:44 am

  2. Glen,

    Thanks for your comment. You’re absolutely right – any process that systematises requirements gathering will help. In the story described above the right questions should have been asked and a more systematic approach used. However, even with a structured approach, there is a (small but significant) possibility that things will be missed – typically these would be unusual scenarios which users may forget to mention, despite prompting.

    Another point worth noting is that requirements traceability can be difficult to implement because of the requirements explosion problem. Glass discusses this point in some detail in his book.

    October 20, 2009 at 8:11 pm

  3. Hello there,

    I am Yousuf Siddiqui and I provide coaching to leaders and organizations.
    I refer to my work as Business Performance Coaching. I increase my clients’ effectiveness by thinking through their most burning issues with them and creating plans to get tangible, measurable and specific results.

    I want to connect with you and follow you on your blog. I hope to learn from you and exchange thoughts with you.

Yousuf Siddiqui

    October 21, 2009 at 6:52 am

  4. Hey there – great post! We’ve dealt with this kind of issue by using the BABOK idea of enterprise analysis along with daily observation of users. As a matter of fact, each daily observation session has incredulous BAs shaking their heads: “Wow – she never TOLD us she did that.”

    Michiko Diby

    October 22, 2009 at 10:05 am

  5. Michiko,

    Thanks, that’s an excellent point: if one has the resources, on-the-job observation is a very good way to surface hidden requirements.

    October 22, 2009 at 7:46 pm
