Eight to Late

Sherlock Holmes and the case of the failed projects

…as narrated by Dr. John H. Watson, M.D.

Foreword

Of all the problems which have been submitted to my friend, Mr. Sherlock Holmes, for consideration during the years of our friendship, there is one that stands out for the sheer simplicity of its resolution. I have (until now) been loath to disclose details of the case as I felt the resolution to be so trivial as not to merit mention.

So why bring it up after all these years?

Truth be told, I am increasingly of the mind that Holmes’ diagnosis in the Case of the Failed Projects (as I have chosen to call this narrative), though absolutely correct, has been widely ignored. Indeed, the writings of Lord Standish and others from the business press have convinced me that the real lesson from his diagnosis is yet to be learnt by those who really matter: executives and managers.

As Holmes might have said, this is symptomatic of a larger malaise: that of a widespread ignorance of elementary logic and causality.

A final word before I get into the story. As most readers know, my friend is better known for his work on criminal cases. The present case, though far more mundane in its details, is in my opinion perhaps his most important because of its remarkable implications. The story has, I believe, been told at least once before but, like all such narratives, its effect is much less striking when set forth en bloc in a single half-column of print than when the facts slowly emerge before one’s own eyes.

So, without further ado, here is the tale…

The narrative

Holmes was going through a lean patch that spring, and it seemed that the only cases that came his way had to do with pilfered pets or suspicious spouses. Such work, if one can call it that, held little allure for him.

He was fed up to the point that he was contemplating a foray into management consulting. Indeed, he was certain he could do as well as, if not better than, the likes of Baron McKinsey and Lord Gartner (who seemed to be doing well enough). Moreover, his success with the case of the terminated PMO had given him some credibility in management circles. As it turned out, it was that very case that led Mr. Bryant (not his real name) to invite us to his office that April morning.

As you may have surmised, Holmes accepted the invitation with alacrity.

The basic facts of the issue, as related by Bryant, were simple enough: his organization, which I shall call Big Enterprise, was suffering from an unduly high rate of project failure. I do not recall the exact number but offhand, it was around 70%.

Yes, that’s right: 7 out of every 10 projects that Big Enterprise undertook were over budget, late or failed to fulfil business expectations!

Shocking, you say… yet entirely consistent with the figures presented by Lord Standish and others.

Upon hearing the facts and figures, Holmes asked the obvious question about what Big Enterprise had done to figure out why the failure rate was so high.

“I was coming to that,” said Bryant. “Typically, after every project we hold a post-mortem. The PMO (which, as you know, I manage) requires this. As a result, we have a pretty comprehensive record of ‘things that went well’ on our projects and things that didn’t. We analysed the data from failed projects and found that there were three main reasons for failure: lack of adequate user input, incomplete or changing user requirements, and inadequate executive support.”

“…but these aren’t the root causes,” said Holmes.

“You’re right, they aren’t,” said Bryant, somewhat surprised at Holmes’ interjection. “Indeed, we did an exhaustive analysis of each of the projects and even interviewed some of the key team members. We concluded that the root cause of the failures was inadequate governance on the PMO’s part.”

“I don’t understand. Hadn’t you established governance processes before these problems arose? That is, after all, the raison d’être of a PMO…”

“Yes we had, but our diagnosis implied those processes weren’t working. They needed to be tightened up.”

“I see,” said Holmes shortly. “I’ll return to that in due course. Please do go on and tell me what you did to address the issue of poor…or inadequate governance, as you put it.”

“Yes, so we put in place processes to address these problems. Specifically, we took the following actions. For the lack of user input, we recommended getting a sign-off from business managers as to how much time their people would commit to the project. For the second issue – incomplete or changing requirements – we recommended that in the short term, more attention be paid to initial requirement gathering, and that this be supported by a stricter change management regime. In the longer term, we recommended that the organization look into the possibility of implementing Agile approaches. For the third point, lack of executive support, we suggested that the problem be presented to the management board and CEO, requesting that they reinforce the importance of supporting project work to senior and middle management.”

Done with his explanation, he looked at the two of us to check if we needed any clarification. “Does this make sense?” he enquired, after a brief pause.

Holmes shook his head. “No, Mr. Bryant, the actions don’t make sense at all. When faced with problems, the kneejerk reaction is to resort to more control. I submit that your focus on control misled you.”

“Misled? What do you mean?”

“Well, it didn’t work, did it? Projects in Big Enterprise continue to fail, which is why we are having this meeting today. The reason your prescription did not work is that you misdiagnosed the issue. The problem is not governance, but something deeper.”

Bryant wore a thoughtful expression as he attempted to digest this. “I do not understand, Mr. Holmes,” he said after a brief pause. “Why don’t you just tell me what the problem is and how I can fix it? Management is breathing down my neck and I have to do something about it soon.”

“To be honest, the diagnosis is obvious, and I am rather surprised you missed it,” said Holmes, “I shall give you a hint: it is bigger, much bigger, than the PMO and its governance processes.”

“I’m lost, Mr. Holmes.  I have thought about it long enough but have not been able to come up with anything. You will have to tell me,” said Bryant with a tone that conveyed both irritation and desperation.

“It is elementary, Mr. Bryant: when one has eliminated the other causes, whatever remains, however improbable, must be the truth. Your prior actions have all but established that the problem is not the PMO, but something bigger. So let me ask a simple question: what is the PMO a part of?”

“That’s obvious,” said Bryant, “it’s the organization, of course.”

“Exactly, Mr. Bryant: the problem lies in Big Enterprise’s organisational structures, rules and norms. It’s the entire system that’s the problem, not the PMO per se.”

Bryant looked at him dubiously. “I do not understand how the three points I made earlier – inadequate user involvement, changing requirements and lack of executive sponsorship – are due to Big Enterprise’s structures, rules and norms.”

“It’s obvious,” said Holmes, as he proceeded to elaborate on how the lack of input was a consequence of users having to juggle their involvement in projects with their regular responsibilities. Changes in scope and incomplete requirements were but a manifestation of the fact that users’ regular work pressures permitted only limited opportunities for interaction with the project team – and that it was impossible to gather all requirements, or build trust, through such infrequent interactions. As for the lack of executive sponsorship, that was simply a reflection of the fact that executives could not stay focused on a small number of tasks because many things competed for their attention…and these often changed from day to day. This resulted in a reactive management style rather than a proactive or interactive one. Each of these issues was an organizational problem well beyond the PMO’s remit.

“I see,” said Bryant, somewhat overwhelmed as he realized the magnitude of the problem, “…but this is so much bigger than me. How do I even begin to address it?”

“Well, you are the Head of the PMO, aren’t you?  It behooves you to explain this to your management.”

“I can’t do that!” exclaimed Bryant. “I could lose my job for stating these sorts of things, Mr. Holmes – however true they may be. Moreover, I would need incontrovertible evidence…facts demonstrating exactly how each failure was a consequence of organizational structures and norms, and was therefore out of the PMO’s control.”

Holmes chuckled sardonically. “I don’t think facts or ‘incontrovertible proof’ will help you, Mr. Bryant. Whatever you say would be refuted using specious arguments…or simply laughed off. In the end, I don’t know what to tell you except that it is a matter for your conscience; you must do as you see fit.”

We left it at that; there wasn’t much else to say. I felt sorry for Bryant. He had come to Holmes for a solution, only to find that solving the problem might involve unacceptable sacrifices.

We bid him farewell, leaving him to ponder his difficult choices.

—-

Afterword

Shortly after our meeting with him, I heard that Bryant had left Big Enterprise. I don’t know what prompted his departure, but I can’t help but wonder if our conversation and his subsequent actions had something to do with it.

…and I think it is pretty clear why Lord Standish and others of his ilk still bemoan the unduly high rate of project failure.

 Notes

  1. Sherlock Holmes aficionados may have noted that the foreword to this story bears some resemblance to the first paragraph of the Conan Doyle classic, The Adventure of the Engineer’s Thumb.
  2. See my post entitled Symptoms not causes, a systems perspective on project failure for a more detailed version of the argument outlined in this story.
  3. For insight into the vexed question of governance, check out this post by Paul Culmsee and the book I co-authored with him.

Written by K

April 15, 2014 at 8:28 pm

Six heresies for business intelligence

What is business intelligence?

I recently asked a few acquaintances to answer this question without referring to that great single point of truth in the cloud.  They duly came up with a variety of  responses ranging from data warehousing and the names of specific business intelligence tools to particular functions such as reporting or decision support.

After receiving their responses, I did what I asked my respondents not to: I googled the term.  Here are a few samples of what I found:

According to CIO magazine, Business intelligence is an umbrella term that refers to a variety of software applications used to analyze an organization’s raw data.

Wikipedia, on the other hand, tells us that BI is a set of theories, methodologies, architectures, and technologies that transform raw data into meaningful and useful information for business purposes.

Finally, Webopedia tells us that BI [refers to] the tools and systems that play a key role in the strategic planning process of the corporation.

What’s interesting about the above responses and definitions is that they focus largely on processes and methodologies or tools and techniques. Now, without downplaying the importance of either, I think that many of the problems of business intelligence practice come from taking a perspective that is overly focused on methodology and technique. In this post, I attempt to broaden this perspective by making some potentially controversial statements – or heresies – that challenge this view. My aim is not so much to criticize current practice as to encourage – or provoke – business intelligence professionals to take a closer look at some of the assumptions that underlie their practices.

The heresies

Without further ado, here are my six heresies for business intelligence practice (in no particular order).

A single point of truth is a mirage

Many organisations embark on ambitious programs to build enterprise data warehouses – unified data repositories that serve as a single source of truth for all business-relevant data.  Leaving aside the technical  and business issues associated with establishing definitive data sources and harmonizing data, there is the more fundamental question of what is meant by truth.

The most commonly accepted notion of truth is that information (or data in a particular context) is true if it describes something as it actually is. A major issue with this viewpoint is that data (or information) can never fully describe a real-world object or event. For example, when a sales rep records a customer call, he or she notes down only what is required by the customer management system. Other data that may well be more important is not captured or is relegated to a “Notes” or “Comments” field that is rarely if ever searched or accessed. Indeed, data represents only a fraction of the truth, however one chooses to define it – more on this below.

Some might say that it is naïve to expect our databases to capture all aspects of reality, and that what is needed is a broad consensus between all relevant stakeholders as to what constitutes the truth. The problem with this is that such a consensus is often achieved by means that are not democratic. For example, a KPI definition chosen by a manager may be hotly contested by an employee. Nevertheless, the employee has to accept it because that is the way (many) organisations work. Another significant issue is that the notion of relevant stakeholders is itself problematic because it is often difficult to come up with clear criteria by which to define relevance.

There are other ways to approach the notion of truth: for example, one might say that a piece of data is true as long as it is practically useful to deem it so. Such a viewpoint, though common, is flawed because utility is in the eye of the beholder: a sales manager may think it useful to believe a particular KPI whereas a sales rep might disagree (particularly if the KPI portrays the rep in a bad light!).

These varied interpretations of what constitutes truth have implications for the notion of a single point of truth. For one, the various interpretations are incommensurate – they cannot be judged by the same standard. Further, different people may interpret the same piece of data differently. This is something that BI professionals have likely come across – say, when attempting to come up with a harmonized definition for a customer record.
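
To make this concrete, here is a minimal sketch of the point about harmonized customer definitions. The data, field names and thresholds below are invented for illustration: two departments apply equally reasonable definitions of an “active customer” to the same transaction history and arrive at different answers – different “truths” from a single data source.

```python
from datetime import date

# Hypothetical transaction history: (customer_id, purchase_date).
# All identifiers, dates and thresholds below are made up for the sketch.
transactions = [
    ("C001", date(2014, 1, 5)), ("C001", date(2014, 2, 20)), ("C001", date(2014, 3, 15)),
    ("C002", date(2013, 6, 1)), ("C002", date(2013, 7, 9)),  ("C002", date(2013, 8, 2)),
    ("C003", date(2014, 3, 28)),
]
as_of = date(2014, 4, 1)

def active_by_recency(txns, cutoff_days=90):
    """One department's definition: active = purchased within the last 90 days."""
    return {cust for cust, d in txns if (as_of - d).days <= cutoff_days}

def active_by_frequency(txns, min_purchases=3, window_days=365):
    """Another department's definition: active = 3+ purchases in the last year."""
    recent = [cust for cust, d in txns if (as_of - d).days <= window_days]
    return {cust for cust in set(recent) if recent.count(cust) >= min_purchases}

# Same data, two defensible definitions, two different sets of "active" customers.
print("Active (recency):  ", sorted(active_by_recency(transactions)))
print("Active (frequency):", sorted(active_by_frequency(transactions)))
```

Neither definition is wrong; they simply answer different questions. A “single point of truth” has to privilege one of them, and that choice is as much organisational and political as it is technical.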

In short: the notion of a single point of truth is problematic because there is a great deal of ambiguity about what constitutes a truth.

There is no such thing as raw data

In his book, Memory Practices in the Sciences, Geoffrey Bowker wrote, “Raw data is both an oxymoron and a bad idea; to the contrary, data should be cooked with care.”  I love this quote because it tells a great truth (!) about so-called “raw” data.

To elaborate: raw data is never unprocessed. Firstly, the data collector always makes a choice as to what data will be collected and what will not, so in this sense data already has meaning imposed on it. Secondly, and perhaps more importantly, the method of collection affects the data. For example, responses to a survey depend on how the questions are framed and how the survey itself is carried out (anonymous, face-to-face etc.). This is also true for more “objective” data such as costs and expenses, where the actual numbers depend on the specific accounting practices used in the organization. So, raw data is an oxymoron because data is never raw and, as Bowker tells us, we need to ensure that the filters we apply and the methods of collection we use are such that the resulting data is “cooked with care.”

In short: data is never raw, it is always “cooked.”

There are no best practices for business intelligence, only appropriate ones

Many software shops and consultancies devise frameworks and methodologies for business intelligence which they claim are based on best or proven practices. However, those who swallow that line and attempt to implement the practices often find that the results obtained are far from best.

I have discussed the shortcomings of best practices in a general context in an earlier article, and (at greater length) in my book. A problem with best practice approaches is that they assume a universal yardstick of what is best. As a corollary, this also suggests that practices can be transplanted from one organization to another in a wholesale manner, without extensive customisation. This overlooks the fact that organisations are unique, and what works in one may not work in another.

A deeper issue is that much of the knowledge pertaining to best practices is tacit – that is, it cannot be codified in written form. Indeed, what differentiates good business intelligence developers or architects from great ones is not what they learnt from a textbook (or in a training course), but how they actually practice their craft – the things they do instinctively and would find hard to put into words.

So, instead of looking to import best practices from your favourite vendor, it is better to focus on understanding what goes on in your environment. A critical examination of your environment and processes will reveal opportunities for improvement. These incremental improvements will cumulatively add up to your very own, customized “best practices.”

In short: develop your own business intelligence best practices rather than copying those peddled by “experts.”

Business intelligence does not support strategic decision-making

One of the stated aims of business intelligence systems is to support better business decision making in organisations (see the Wikipedia article, for example). It is true that business intelligence systems are perfectly adequate – even indispensable – for certain decision-making situations. Examples include financial reporting (when done right!) and other operational reporting (inventory, logistics etc.). These generally tend to be routine situations with clear-cut decision criteria and well-defined processes – i.e. decisions that can be programmed.

In contrast, decisions pertaining to strategic matters cannot be programmed. Examples of such decisions include dealing with an uncertain business environment, responding to a new competitor and so on. The reason such decisions cannot be programmed is that they depend on a host of factors other than data and are generally made in situations that are ambiguous. Typically, people use deliberative methods – i.e. methods based on argumentation – to arrive at decisions on such matters. The sad fact is that all the major business intelligence tools on the market lack support for deliberative decision-making. Check out this post for more on what can be done about this.

In short: business intelligence does not support strategic decision-making.

Big data is not the panacea it is trumpeted to be

One of the more recent trends in business intelligence is the move towards analyzing increasingly large, diverse, rapidly changing datasets – what goes under the umbrella term big data.  Analysing these datasets entails the use of new technologies (e.g. Hadoop and NoSQL)  as well as statistical techniques that are not familiar to many mainstream business intelligence professionals.

Much has been claimed for big data; in fact, one might say too much.  In this article Tim Harford (aka the Undercover Economist) summarises the four main claims of “big data cheerleaders” as follows (the four phrases below are quoted directly from the article):

  1. Data analysis produces uncannily accurate results.
  2. Every single data point can be captured, making old statistical sampling techniques obsolete.
  3. It is passé to fret about what causes what, because statistical correlation tells us what we need to know.
  4. Scientific or statistical models aren’t needed.

The problem, as Harford points out, is that all of these claims are incorrect.

Firstly, the accuracy of the results that come out of a big data analysis depends critically on how the analysis is formulated. However, even analyses based on well-founded assumptions can get it wrong, as is illustrated in this article about Google Flu Trends.

Secondly, it is pretty obvious that it is impossible to capture every single data point (also relevant here is the discussion on raw data above – i.e. how data is selected for inclusion).

The third claim is simply absurd. The fact is that detecting a correlation is not the same as understanding what is going on – a point made rather nicely by Dilbert. Enough said, I think.

Fourthly, the claim that scientific or statistical models aren’t needed is simply ill-informed. As any big data practitioner will tell you, big data analysis relies on statistics. Moreover, as mentioned earlier, a correlation-based understanding is no understanding at all –  it cannot be reliably extrapolated to related situations without the help of hypotheses and (possibly tentative)  models of how the phenomenon under study works.
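
As a small illustration of why correlation alone does not extrapolate, here is a sketch using synthetic data (my own toy example, not drawn from Harford’s article): a process that saturates is observed only during its near-linear growth phase, the correlation is very strong, a line is fitted – and the fit goes badly wrong once it is extrapolated beyond the observed window.

```python
import numpy as np

rng = np.random.default_rng(42)

def true_process(t, cap=100.0, rate=0.4, midpoint=20.0):
    """The 'real' mechanism (unknown to the analyst): logistic growth that saturates at cap."""
    return cap / (1.0 + np.exp(-rate * (t - midpoint)))

# Observe only the rising, near-linear phase of the process.
t_obs = np.arange(10, 22)
y_obs = true_process(t_obs) + rng.normal(0, 1.0, t_obs.size)

# Model-free analysis: the correlation is very strong, so a line is fitted...
r = np.corrcoef(t_obs, y_obs)[0, 1]
slope, intercept = np.polyfit(t_obs, y_obs, 1)
print(f"correlation in the observed window: r = {r:.3f}")

# ...but extrapolating the line well past the observed window overshoots badly,
# because the line knows nothing about the saturation built into the mechanism.
t_future = 40
print(f"linear extrapolation at t = {t_future}: {slope * t_future + intercept:.1f}")
print(f"actual value at t = {t_future}:         {true_process(t_future):.1f}")
```

A hypothesis about the underlying mechanism (here, that growth eventually saturates) is exactly what turns a correlation into something that can be safely extrapolated.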

Finally, as Danah Boyd and Kate Crawford point out in this paper, big data changes what it means to know something…and it is highly debatable whether these changes are for the better. See the paper for more on this point. (Acknowledgement: the title of this post is inspired by the title of the Boyd-Crawford paper.)

In short:  business intelligence practitioners should not uncritically accept the pronouncements of big data evangelists and vendors.

Business intelligence has ethical implications

This heresy applies to much more than business intelligence: any human activity that affects other people has an ethical dimension. Many IT professionals tend to overlook this facet of their work because they are unaware of it – and sometimes prefer to remain so. The fact is that the decisions business intelligence professionals make with respect to usability, display, testing etc. have a potential impact on the people who use their applications. The impact may be as trivial as having to click one button or filter too many before they get their report, or as significant as a data error that leads to a poor business decision.

In short: business intelligence professionals ought to consider how their artefacts and applications affect their users.

In closing

This brings me to the end of my heresies for business intelligence. I suspect there will be a few practitioners who agree with me and (possibly many) others who don’t…and some of the latter may even find specific statements provocative. If so, I consider my job done, for my intent was to get business intelligence practitioners to question a few unquestioned tenets of their profession.

Written by K

April 3, 2014 at 9:29 pm

The essence of entrepreneurship

Introduction

In keeping with the standard connotation of the term, Wikipedia defines entrepreneurship as the “process of identifying and starting a business venture, sourcing and organizing the required resources and taking both the risks and rewards associated with the venture.” We are all familiar with stories of successful entrepreneurs; indeed, how can we not be – magazines and books are filled with anecdotes and case studies of entrepreneurial folks whose example we are urged to follow…the inventors of a certain search engine being particularly favoured role models.

Yet, after we are done digesting the rhetoric of gurus and ghostwriters, we seem to be none the wiser. The stories, as entertaining as they are, fail to capture the essence of entrepreneurship.

There is a good reason for this: entrepreneurship is not a process, as Wikipedia (and books/gurus) would have us believe. Rather, it is about developing sensitivities towards anomalies or disharmonies in our day-to-day lives and then attempting to do something about them. This post, which is based on portions of a brilliant book entitled Disclosing New Worlds, is an attempt to elaborate on this point. The book is written by an unusual set of authors, including a philosopher and an entrepreneur, so it is not surprising that it offers a completely fresh perspective on the topic.

Before I dive into it, a few words about how this article is organized: I begin with some background material that is necessary in order to understand the main arguments in the book. Despite my best efforts, this section is rather long and somewhat involved (I’d appreciate any feedback and/or suggestions for improvement). Following that, I present the authors’ critique of conventional views of entrepreneurship and discuss why they are inadequate.  I then (finally!) get to the main topic: a discussion of the essence of entrepreneurship, illustrating some of the key points through a concrete, though somewhat unusual example.

Background: Heidegger, rationalism and postmodernism

The central thesis of the book is based on the philosophy of Martin Heidegger, in particular his thoughts on how we perceive, encounter and deal with the world. For this reason, I will spend some time discussing Heidegger’s philosophy as it pertains to the discussion of entrepreneurship presented in the book.

The best way to understand Heidegger’s perspective is to contrast it with the two dominant worldviews of our times: the scientific-rational (or Cartesian) worldview that forms the basis of scientific thinking, and the postmodern view, which emphasizes the role of human choice and radical change. I elaborate on the differences between the Heideggerian worldview on the one hand and Cartesianism and postmodernism on the other. I focus mainly on the Cartesian worldview as it is by far the more dominant of the two, and will discuss postmodernism only briefly towards the end of this section.

A Cartesian observer perceives the world as being composed of things and processes that can be observed and analysed in an objective manner. To be sure, the importance of such a mode of thinking cannot be overstated; it is, after all, what makes science and technology possible. However, and this is a key point, such a view does not come naturally to humans. As Heidegger noted, our actual day-to-day interactions with the world are not objective: we see tables, desks or computers not as objects to be analysed, but as things to be used in a natural way – i.e. without conscious thought. Heidegger coined the term ready-to-hand to denote this non-objective, natural way in which we deal with the world.

Heidegger claimed that we encounter things in the world primarily as being ready-to-hand rather than as objects in their own right. We take an objective attitude towards them only when they break down – i.e. when they stop functioning as we expect them to. For example, I become consciously aware of my computer as a computer only when it starts to malfunction. In other words, it is only when the computer stops being ready-to-hand that I see it as an object to be examined in its own right. When it is functioning correctly, however, it is simply a tool that I use without conscious awareness that it is a computer. It is in this sense that the rational-scientific way of viewing the world is not a natural one. Indeed, the rational-scientific mode of thinking completely misses this natural way in which we encounter the world.

Although the foregoing might sound a bit “out there”, it is important to note that Heidegger’s philosophy is primarily practical, for it deals with the day-to-day aspects of life. Indeed, our daily lives consist of a number of relatively self-contained worlds – home, work, friends – each with its own set of practices, i.e. things that we do within them in a natural way. A key Heideggerian concept in this connection is that of a disclosive space – a set of interrelated practices and ready-to-hand objects that define a particular aspect of our lives. For example, a disclosive space for a writer might include his or her equipment (computer, desk etc.) and the practices (writing habits, rituals etc.) that he or she follows when writing.

I’ll use the example of writers to illustrate another important point. Different writers may have different ways of working – each of these defines a different disclosive space, although all writers engage in essentially the same activity (i.e. writing). The differences between similar disclosive spaces amount to differences in what Heidegger called style. Different writers have different working styles (not to be confused with their writing styles), as do different scientists, bakers, or even IT managers. A style is the way in which our practices within a disclosive space hang together as a whole – that is, it is the way in which we perform our tasks at work or when writing, doing science, baking or even when managing people, projects and processes. This is pretty much in line with the way in which we use the word “style” when we say, “that is (or is not) my style” or simply, “that’s (not) me.”

Another important aspect of Heideggerian thought is the notion of authenticity (see this article for a very readable discussion of authenticity in online interactions). According to Heidegger, being authentic means to act in a way that is true to oneself. This amounts to acting in a way that is consistent with what one really thinks or believes. Among other things, being authentic implies a deep awareness of who one is and what one stands for.  Indeed, authenticity (or the lack of it) is reflected in one’s style (Reminder: style is what defines differences between similar disclosive spaces). Authenticity is inconsistent with a rational scientific worldview because it necessarily implies that one acts in an engaged and involved way – the polar opposite of the detached, dispassionate attitude that is valued by rationalism.

From the foregoing, it should be clear that Heidegger emphasizes the “involvedness” with which we engage in our day to day activities, at least, when we are immersed in what we are doing. It is impossible to be objective when one is totally involved with what one is doing. This is completely antithetical to the scientific rational view in which we are supposed to maintain a detached, dispassionate view of the world.

An important corollary of the above is that the scientific-rational view sees the world in an ahistorical (or non-historical) way – i.e. one does not consider one’s actions as being part of an ongoing story.  Such an attitude can only result in partial knowledge, for to know things as they really are, one must understand their antecedents. Consider, for example, our current attitude to natural resources:  we see them as objects being available for uncontrolled exploitation rather than as non-renewable products of a (historical) process of evolution that ought to be used in a sustainable way.  Such a mindset is common to most rational-scientific thinking – history and social consequences are considered to be sideshows that have at best a peripheral relevance to the matter at hand. The dangers of such thinking are becoming increasingly apparent.

The postmodern worldview is at the other end of the spectrum from the rational one. Postmodernism originally developed as a challenge to commonly accepted worldviews such as the scientific-rational one, as well as those rooted in cultural traditions. Postmodernism tells us that the scientific worldview does not have universal applicability, and that other modes of thinking (humanism, religion) may be more appropriate in certain domains. Apart from choice, the notion of radical change is central to postmodernism. So, although postmodernism is opposed to the rational worldview, it shares with it a lack of due consideration of history because it advocates a discontinuous break with the past.

Before going on, it is worth summarizing the key messages of this section. In contrast to the Cartesian and postmodern views, Heidegger tells us that we experience the world (and its contents) in a ready-to-hand manner; that is, we encounter them not as objects to be analysed (as the rational view would have us believe) or to be interpreted as we please (as the postmodernists tell us), but as natural aspects of our day-to-day world.  Heidegger emphasized that our identities arise largely from the way we encounter and deal with these aspects of our lives. Different people deal with the same situation in different ways – and each of these ways constitutes a style.  As we shall see later, entrepreneurship is a certain style of encountering the world. However, before doing so, let us look at some conventional interpretations of entrepreneurship and see why they are deeply mistaken.

Conventional treatments of entrepreneurship

The authors critique the three major mainstream strands of thought on entrepreneurship:

  1. The theoretical approach
  2. The empirical approach
  3. The virtue-based (or devotional) approach

The theoretical approach is championed by writers such as Peter Drucker, who seek to build theoretical models of entrepreneurship. As he wrote in his classic, Innovation and Entrepreneurship, “Every practice rests on theory.” It is easy to see that this claim is mistaken by noting that there are many everyday practices that do not rest on theory – riding a bicycle, for example. In the case of entrepreneurship the gap between practice and theory is even wider because there is no well-defined process for entrepreneurship. In his book, Drucker claimed that entrepreneurship can be boiled down to a purposeful search for the “symptoms that indicate opportunities for innovation” and to “know and apply the principles of successful innovation” to these opportunities.

The problem with this viewpoint is that the “symptoms that indicate opportunities” are never obvious. At this very instant there are likely to be many such “symptoms” that we cannot see simply because we are not attuned to them. Some of these might be picked up by people who are sensitive to such anomalies…and a small fraction of those who sense these anomalies might care enough to develop a concrete vision to do something about them. This is not a process in the usual sense of the word; it is a deeply personal journey that even the entrepreneur who experiences it would have difficulty articulating.

The empirical viewpoint is championed by those who believe that the “skill” of entrepreneurship is best learnt by studying examples of successful entrepreneurs. This approach consists of analysing a wide variety of case studies through which one develops an understanding of “different types” of entrepreneurship.  Indeed, the whole point of case-study based learning is that it is supposed to be a substitute for real world experience – a sort of short-cut to wisdom. The flaw with this logic is easy to see – reading detailed biographies of, say, Barack Obama or Stephen Hawking will not help one internalize the qualities that make a successful politician or physicist.

The virtue-based approach takes the view that successful entrepreneurs have certain qualities or virtues that make them sensitive to potential entrepreneurial opportunities. George Gilder, a proponent of this view, suggests that the virtues of the successful entrepreneur are giving (philanthropy), humility and commitment. The problem with this view is again easy to see: there are many non-entrepreneurs who have these virtues and, perhaps more important, there are a great many entrepreneurs who have none of them. Nevertheless, the virtue-based approach is possibly closer to the mark because it highlights the importance of second-order practices – that is, practices that change the way we look at the world. Indeed, as we shall see next, entrepreneurship is a second-order practice.

History making – the essence of entrepreneurship

The concept of a disclosive space discussed above is the key to understanding what entrepreneurship is. As a reminder, a disclosive space is a set of interrelated practices and objects that define a certain aspect of our lives – for example, driving a car, gardening etc.

When we act within a disclosive space, we are in effect disclosing (or making apparent) an aspect of our lives. These disclosures are usually unsurprising because we act in customary or expected ways. For example, when we see someone driving or gardening, we have a pretty good idea of what they are doing without having to be told what they are up to. Their actions more or less explain themselves because they correspond to normal or well-accepted ways in which humans act. However, the fact that such practices are seen as normal does not mean that they cannot be changed or improved – it is only that most of us do not see any scope for improvement.

This brings us to the crux of the argument: an entrepreneur is someone who sees scope for changing customary practices in a novel way. Moreover, since such changes completely transform the style of a disclosive space, in effect they disclose new worlds. Put another way, an entrepreneur is someone who sees anomalies in our customary ways of disclosing. He or she then holds on to those anomalies and attempts to fix or transform them by making changes in customary practices. Indeed, this is precisely what that much overused, overhyped and misunderstood term, innovation, is all about. Quoting from the book:

The kind of thinking that leads to innovation requires an openness to anomalies in life. It requires an interest in holding on to these anomalies in one’s daily life and in seeing clearly how the anomalies look under different conditions. If people do this in an enterprise….then they cannot see their lives and the …space in which they work as being settled…If one is living in the natural settled way of doing things then things happen as they should.  The unordinary will appear unnatural and monstrous, not a truth worthy of preservation or [more important] a focus for reorganizing one’s life.

It should also be clear, now, that an entrepreneur must have a good sense of history – to understand what changes he or she wants to bring about and why, an entrepreneur must have a deep understanding of the current situation and its antecedents.  Moreover, since such a person transforms established practices, in a more or less radical fashion, he or she is actually making history.

The book describes different ways in which historical disclosing can occur. These have applicability not only to entrepreneurship but also to the social and political sphere. However, for reasons of space I will not go into this. Instead, I will close this article with an example that illustrates the points I have made about entrepreneurship.

An example

In the early 1900s, a clerk at a patent office in Bern wrote a number of landmark papers that transformed physics and our understanding of the world. Indeed, Albert Einstein is a perfect example of entrepreneurship in action. I will focus on just one of his contributions to physics – the special theory of relativity – and show how the way in which he arrived at this theory embodies the points I have made in the previous section (note: I’ve glossed over some technical details below; the discussion is involved enough as it is!):

  1. The pre-Einsteinian worldview was based on classical mechanics, which came out of the work of Newton and others. When Einstein proposed his theory of special relativity, classical mechanics had been around for more than 200 years, and had been successfully used to solve many scientific and engineering problems.
  2. One of the consequences of classical mechanics is that the speed of any object depends on the state of motion of the person who is observing it. An example will help make this cryptic statement clearer:  two trains travelling at the same speed in the same direction are motionless with respect to each other – i.e. to an observer located on one of the trains, the other train will appear to be motionless. However, an observer located on the ground will see both trains as moving.
  3. The work of James Clerk Maxwell on electromagnetic theory in the 1860s predicted that the speed of light in a vacuum is a constant – approximately 300,000 km/sec. Experiments showed that the speed of light was indeed this value, regardless of the state of motion of the observer.
  4. There is a contradiction between (2) and (3): if Maxwell’s theory were consistent with classical mechanics, the speed of light ought to depend on the speed of the observer. However, although many experiments were devised to detect such a dependence, none was ever found. (The short sketch following this list makes the contradiction explicit.)
  5. Einstein realized that either Newton or Maxwell had to be wrong. He held on to this anomaly for a long time, pondering the best way to resolve it. He finally surmised (for reasons I won’t go into here) that the fault lay with classical mechanics rather than Maxwell’s electromagnetic theory. Very simply, he made the bold guess that classical mechanics is wrong at speeds close to that of light.  In effect, Einstein resolved the anomaly by “fixing up” classical mechanics in such a way as to make it consistent with electromagnetic theory. The special theory of relativity is basically the resolution of this anomaly.  Indeed, most major scientific advances are made through the resolution of such anomalies.
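
The following back-of-the-envelope sketch (my own illustration, not part of the original argument) makes the anomaly and its resolution concrete: under the classical rule velocities simply add, so light emitted from a fast-moving source should travel faster than c; under the velocity-addition rule implied by special relativity, the composed velocity never exceeds c, which is exactly what the experiments found.

```python
C = 299_792_458.0  # speed of light in a vacuum, metres per second

def galilean_addition(u, v):
    """Classical mechanics: velocities simply add."""
    return u + v

def relativistic_addition(u, v, c=C):
    """Special relativity: (u + v) / (1 + u*v/c^2) never exceeds c."""
    return (u + v) / (1.0 + (u * v) / c**2)

# Light emitted forwards from a source moving at half the speed of light:
u = 0.5 * C  # speed of the source relative to a ground observer
v = C        # speed of the light beam relative to the source

print(f"Galilean prediction:     {galilean_addition(u, v) / C:.2f} c")      # 1.50 c
print(f"Relativistic prediction: {relativistic_addition(u, v) / C:.2f} c")  # 1.00 c
```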

In brief, then, the special theory of relativity:

  • Resolved a key anomaly of late 19th century physics in a completely novel way.
  • Disclosed a new world – literally!

In developing the theory, Einstein displayed a unique style of doing physics – for example, since it is impossible to travel at speeds close to that of light, he devised thought experiments to work out the consequences of travelling at such speeds. He also displayed a deep sense of the history of the problem that he was working on: without a thorough understanding of the work of Newton, Maxwell and others, it would have been impossible for him to develop his theory.

In short, Einstein is the quintessential entrepreneur because he made history by disclosing a new world.

Conclusion

Entrepreneurs are those who care deeply about anomalies and have the ability to hold on to and think about them over extended periods of time. In doing so they sometimes resolve the anomalies that worry them, and are then recognized as entrepreneurs. However, there are many who struggle without success, and they are no less entrepreneurial than those who succeed. Such people, whether successful or not, necessarily possess a deep sense of the history of the problem they attempt to address. Indeed, this must be so, for in resolving the anomaly they care about, they write another chapter of that history.

Written by K

March 20, 2014 at 6:35 am

The architect and the apparition – a business fable

Sean yawned as he powered down his computer and stretched out in his chair. It was nearly 3 am and he had just finished proofreading his presentation for later that day. He didn’t remember ever being this tired; a great deal of effort had been expended over the last three months but it had been worth it. Now, finally, he was done.

He gazed down at the beautifully bound document on his desk with a fondness that lesser mortals might bestow on their progeny.

“That’s a fine looking document you have there,” said an oddly familiar voice from right behind his chair.

“Wha..,” squeaked Sean, shooting out of his chair,  upending his coffee mug in the process.

He grabbed a couple of tissues and dabbed ineffectually at the coffee stain that was spreading rapidly across the front of his brand new chinos. “Damn,” he cursed as he looked up to find himself face-to-face with someone who looked just like him – right down to the Ralph Lauren shirt and chinos (minus the unseemly stain).

“Pardon me,” sputtered the apparition, giving in to a fit of laughter. “That’s the funniest thing I’ve seen in a long time, a scene worthy of a million YouTube hits. You should’ve seen yourself jump out of the chair and…”

“Pardon my rudeness, but who the f**k are you?” interrupted Sean, a tad testily. Who did this guy think he was anyway?  (Lest you get the wrong idea, Sean didn’t normally use expletives, but he reckoned the situation merited it.)

“Don’t swear at me,” said the double, “because I am you…well, your conscience actually. But, in the end I’m still you.”

“Bah,” replied Sean. He figured this had to be a prank hatched by one of his workmates. “Tell me which one of my smartarse colleagues put you up to this?” he demanded, “Let me guess; it is either Mal or Liz.”

“You don’t believe me, do you? No one put me up to this. Well actually, if anyone did, it was you!”

“That’s nonsense,” spat Sean, his blood pressure rising a notch, “I have no idea who you are.”

“Ah, now we get to the nub of the matter,” said the apparition. “You have no idea who I am, and that is precisely why I’m here: to remind you that I exist and that you should listen to me from time to time. I usually start to bother you when you’re about to do something stupid or unethical.”

“Me? Stupid? Unethical?  I have no idea what you’re on about,” contested Sean.

“It appears I need to spell it out for you. Well, here’s the executive summary: I think you need to revise that document you’ve been working on. I’m your conscience, and I think I can help.”

“I… don’t… need… your… help,” said Sean enunciating each word exaggeratedly for emphasis, “you probably do not know this, but I have completed the biggest and most ambitious design I’ve ever done:  a comprehensive systems architecture for Big Enterprise. I’m confident of what I have proposed because it is backed by solid research and industry best practice.”

“I know what you have done,” said the doppelganger, “I’m your conscience, after all.” He paused to clear his throat. “And I’m sure you believe what you have written,” he continued, “but that doesn’t make it right.”

“It is impeccably researched! You know, I’ve cited over 800 references, yeah eight hundred,” said Sean with pride. “That’s over two references per page, and most of these are to works written by acknowledged experts  in the field.”

“I do not doubt your knowledge or diligence, my friend,” said the doppelganger with a smile, “what I worry about is your judgement.”

Sean was ready to blow a fuse, but was also confused (intrigued?) by the double’s choice of words. “Judgement?” he blurted. “WTF do you mean by ‘judgement’?” He picked up the tome and waved it in front of the doppelganger imperiously…but then spoilt the effect by almost spraining his wrist in the process. He put the book down hurriedly, saying, “This is an objective evaluation; the facts speak for themselves.”

“Do they?” queried the apparition. “Sure, you’ve collected all this information and fashioned it into a coherent report. However, your recommendations, which appear to be based on facts, are in truth based on unverifiable assumptions, even opinions.”

“That’s nonsense,” dismissed Sean. “You haven’t even read the report, so you’re in no position to judge.”

“I have. I’m your conscience, remember?”

“Pah!”

“OK, so tell me what you did and how you did it,” said the apparition evenly.

Sean held forth for a few minutes, describing how he researched various frameworks, read case studies about them and then performed an evaluation based on criteria recommended by experts.

“I concede that you truly believe you are right, but the sad fact is that you probably aren’t,” said the double, “and worse, you refuse to entertain that possibility.”

“That’s hogwash! If you’re so sure then prove it,” countered Sean.

“Hmmm, you are thicker than I thought; let me see if I can get my point across in a different way,” said the double. “You’re doing something that will influence the future of technology in your organisation for a long time to come. That is an immense responsibility…”

“I’m aware of that, thank you,” interrupted Sean, raising his voice. He’d had enough of this presumptuous, insulting clown.

“If you say so,” said the doppelganger, “but, to be honest, I sense no doubts and see no caveats in your report.”

“That’s because I have none! I believe in and stand by what I have done,” shouted Sean.

“I have no doubt that you believe in what you have done. The question is, do others, will others?”

“I’m not stupid,” said Sean, “I’ve kept my managers and other key stakeholders in the loop throughout. They know what my recommendations are, and they are good with them.”

“How many stakeholders, and where are they located?”

“Over ten important stakeholders, senior managers, all of them, and all seated right here in the corporate head office,” said Sean. He made to pick up the tome again, but a twinge in his wrist reminded him that it might not be wise to do so. “Let me tell you that the feedback I have from them is that this is a fantastic piece of work,” he continued, emphasizing his point by rapping on the wrist-spraining tome with his knuckles. “So please go away and leave me to finish up my work.”

“Yeah, I’ll go, it seems you have no need of me,” said the double, “but allow me a couple of questions before I go. I am your conscience after all!”

“Ok, what is it?” said Sean impatiently. He couldn’t wait to see the back of the guy.

“You’re working in a multinational, right? But you’ve spoken to a few stakeholders, all of whom are sitting right here, in this very building. Have you travelled around and spoken with staff in other countries – say in Asia and Europe – and gotten to know their problems before proposing your all-embracing architecture?”

“Look,” said Sean, “it is impossible to talk to everyone, so I have done the best I can: I have proposed a design that adheres to best practices, and that means my design is fundamentally sound. Moreover, the steering committee has reviewed it, and has indicated that it will be approved.”

“I have no doubt that it will be approved,” said the apparition patiently, “the question is: what then? Think about it – you have proposed an architecture for your enterprise without consulting the most important stakeholders: the people who will actually live with it and work with it. Would you have an architect build your house that way? And how would you react to one who insisted on doing things his or her way because it is “best practice” to do so?”

“That’s a completely inappropriate comparison,” said Sean.

“No it isn’t, and you know it too,” said the doppelganger. “But look, I’ve nothing more to add. I’ve said what I wanted to say. Besides, I’m sure you’re keen to see the back of me…most people are.”

…and pfft…just like that, the apparition vanished, leaving a bemused architect and a rapidly drying coffee stain in its wake.

Written by K

March 6, 2014 at 7:30 pm

Rituals in information system design and development

Introduction

Information system development is generally viewed as a rational process involving steps such as planning, requirements gathering, design etc. However, since it often involves many people, it is only natural that the process will have social and political dimensions as well.

The rational elements of the development process focus on matters such as analysis, coding and adherence to guidelines. The socio-political aspects, on the other hand, are about things such as differences of opinion, conflict, organisational turf wars and so on. The interesting thing, however, is that elements that appear to be rational are sometimes subverted to achieve political ends. Shorn of their original intent, they become rituals that are performed for symbolic reasons rather than rational ones. In this post I discuss rituals in system development, drawing on a paper by Daniel Robey and Lynne Markus entitled Rituals in Information System Design.

Background

According to the authors, labelling the process of system design and development as rational implies that the  process can be set out and explained in a logical way. Moreover,  it also implies that the system being designed has clear goals that can be defined upfront, and that the implemented system will be used in the manner intended by the designers. On the other hand, a political perspective would emphasise the differences between various stakeholder groups (e.g. users, sponsors and developers) and how each group uses the process in ways that benefit them, sometimes to the detriment of others.

In the paper the authors discuss how  the following two elements of the system development process are consistent with both views summarised above.

  1. System development lifecycle
  2. Techniques for user involvement

I’ll look at each of these in turn in the next two sections, emphasising their rational features.

Development lifecycle

The basic steps of a system development lifecycle, common to all methodologies, are:

  1. Inception
  2. Requirements gathering  / analysis
  3. Specification
  4. Design
  5. Programming
  6. Testing
  7. Training
  8. Rollout

Waterfall methodologies run through each of the above once whereas Iterative/Incremental methods loop through (a subset of) them as many times as needed.

It is easy to see that the lifecycle has a rational basis – specification depends on requirements and can therefore be done only after requirements have been gathered and analysed; programming can only proceed after design is completed, and so on. It all sounds very logical and rational. Moreover, for most mid-size or large teams, each of the above activities is carried out by different individuals – business analysts, architects/designers, programmers, testers, trainers and operations staff. So the advantage of following a formal development cycle is that it makes it easier to plan and coordinate large development efforts, at least in principle.

Techniques for user involvement

It is a truism that the success of a system depends critically on the level of user interest and engagement it generates. User involvement in different phases of system development  is therefore seen as a key to generating and maintaining user engagement.  Some of the common techniques to solicit user involvement include:

  1. Requirements analysis: Direct interaction with users is necessary in order to get a good understanding of their expectations from the system.  Another benefit is that it gives the project team an early opportunity to gain user engagement.
  2. Steering committees: Typically such committees are composed of key stakeholders from each group that  is affected by the system. Although some question the utility of steering committees, it is true that committees that consist of high ranking executives can help in driving user engagement.
  3. Prototyping: This involves creating a working model that demonstrates a subset of the full functionality of the system. The great advantage of this method of user involvement is that it gives users an opportunity to provide feedback early in the development lifecycle.

Again, it is easy to see that the above techniques have a rational basis: the logic being  that  involving  users early in the development process  helps them become familiar with the system, thus improving the chances that they will be willing, even enthusiastic adopters of the system when it is rolled out.

The political players

Politics is inevitable in any social system that has  stakeholder groups with differing interests.  In the case of system development, two important stakeholder groups are users and developers.  Among other things, the two groups differ in:

  1. Cognitive style: developers tend to be analytical/logical types while users come from a broad spectrum of cognitive types. Yes, this is a generalisation, but it is largely true.
  2. Position in organisation: in a corporate environment, business users generally outrank technical staff.
  3. Affiliations: users and developers belong to different organisational units and therefore have differing loyalties.
  4. Incentives: Typically, members of the two groups have different goals. The developers may be measured by the success of the rollout whereas users may be judged by their proficiency on the new system and the resulting gains in productivity.

These differences lead to differences in the ways the two groups perceive processes or events. For example, a developer may see a specification as a blueprint for design whereas a user might see it as a bureaucratic document that locks them into choices they are ill-equipped to make. Such differences in perception make it far from obvious that the different parties can converge on the common worldview that is assumed by the rational perspective. Indeed, in such situations it isn’t clear at all what constitutes the “common interest”, and it is precisely such differences that lead to the ritualisation of aspects of the systems development process.

Ritualisation of rational processes

We now look at how the differences in perspectives can lead to a situation where processes that are intended to be rational end up becoming rituals.

Let’s begin with an example that occurs at the inception phase of a system development project: the formulation of a business case. The stated intent of a business case is to make a rational argument as to why a particular system should be built. Ideally it should be created jointly by the business and technology departments. In practice, however, it frequently happens that one of the two parties is given primary responsibility for it. As the two parties are not equally represented, the business case ends up becoming a political document: instead of presenting a balanced case, it presents a distorted view that focuses on one party’s needs. When this happens, the business case becomes symbol rather than substance – in other words, a ritual.

Another example is the handover process between developers and users (or operations, for that matter). The process is intended to ensure that the system does indeed function as promised in the scope document. Sometimes, though, both parties attempt to safeguard their own interests: developers may pressure users to sign off whereas users may delay signing off because they want to check the system ever more thoroughly. In such situations the handover process serves as a forum for both parties to argue their positions rather than as a means to move the project to a close. Once again, the process is shorn of its original intent and meaning, and is thus ritualised.

Even steering committees can end up being ritualised.  For example, when a committee consists of senior executives from different divisions, it can happen that each member will attempt to safeguard the interests of his or her fief.  Committee meetings then become forums to bicker rather than to provide direction to the project.  In other words, they become symbolic events that achieve little of substance.

Discussion

The main conclusion from the above argument is that information system design and implementation is both a rational and a political process. As a consequence, many of the processes associated with it turn out to be more like rituals: they symbolise rationality but are not actually rational at all.

That said, it should be noted that rituals have an important function: they give the whole process of systems development a veneer of rationality whilst allowing for the political manoeuvring that is inevitable in large projects. As the authors put it:

Rituals in systems development function to maintain the appearance of rationality in systems development and in organisational decision making. Regardless of whether it actually produces rational outcomes or not, systems development must symbolize rationality and signify that the actions taken are not arbitrary but rather acceptable within the organisation’s ideology. As such, rituals help provide meaning to the actions taken within an organisation

And I feel compelled to add:  even if the actions taken are completely irrational and arbitrary…

Summary…and a speculation

In my experience, the central message of the paper rings true:  systems development and design, like many other organisational processes and procedures, are often hijacked by different parties to suit their own ends.  In such situations, processes are reduced to rituals that maintain a facade of rationality whilst providing cover for politicking and other not-so-rational actions.

Finally, it is interesting to note that the problem of ritualisation is a rather general one: many allegedly rational processes in organisations are more symbol than substance.  Examples of other processes that are prone to ritualisation include performance management, project management and planning. This hints at a deeper issue, one that I think has its origins in modern management’s penchant for overly prescriptive, formulaic approaches to managing organisations and initiatives. That, however, remains a speculation and a topic for another time…

Written by K

February 25, 2014 at 9:08 pm

The consultant’s dilemma – a business fable

with 11 comments

It felt like a homecoming. That characteristic university smell  (books,  spearmint gum and a hint of cologne) permeated the hallway. It brought back memories of his student days: the cut and thrust of classroom debates, all-nighters before exams and near-all-nighters at Harry’s Bar on the weekends. He was amazed at how evocative that smell was.

Rich checked the directory near the noticeboard and found that the prof was still in the same shoe-box office he had occupied ten years ago. He headed down the hallway wondering why the best teachers seemed to get the least desirable offices. Perhaps it was inevitable in a university system that rated grantsmanship over teaching.

It was good of the prof to see him at short notice. He had taken a chance really, calling on impulse because he had a few hours to kill before his flight home. There was too much travel in this job, but he couldn’t complain: he knew what he was getting into when he signed up.  No, his problem was deeper. He no longer believed in what he did. The advice he gave and the impressive, highly polished reports  he wrote for clients were useless…no, worse, they were dangerous.

He knew he was at a crossroads. Maybe, just maybe, the prof would be able to point him in the right direction.

Nevertheless, he was assailed by doubt as he approached the prof’s office. He didn’t have any right to burden the prof with his problems …he could still call and make an excuse for not showing up. Should he leave?

He shook his head. No, now that he was here he might as well at least say hello. He knocked on the door.

“Come in,” said the familiar voice.

He went in.

“Ah, Rich, it is good to see you after all these years. You’re looking well,” said the prof, getting up and shaking his hand warmly.

After a brief exchange of pleasantries, he asked Rich to take a seat.

“Just give me a minute, I’m down to the last paper in this pile,” said the prof, gesturing at a heap of term papers in front of him. “If I don’t do it now, I never will.”

“Take your time, prof,” said Rich, as he sat down.

Rich cast his eye over the bookshelf behind the prof’s desk.  The titles on the shelf reflected the prof’s main interest: twentieth century philosophy. A title by  Habermas caught his eye.

Habermas!

Rich recalled a class in which the prof had talked about Habermas’ work on communicative rationality and its utility in making sense of ambiguous issues in management. It was in that lecture that the prof had introduced them to the evocative term that captured ambiguity in management (and other fields) so well, wicked problems.

There were many things the prof spoke of, but ambiguity and uncertainty were his overarching themes. His lectures stood in stark contrast to those of his more illustrious peers: the prof dealt with reality in all its messiness; the other guys lived in a fantasy world in which their neat models worked and things went according to plan.

Rich had learnt  from the prof that philosophy was not an arcane subject, but one that held important lessons for everyone (including hotshot managers!). Much of what he learnt in that single term of philosophy had stayed with him. Indeed, it was what had brought him back to the prof’s door after all these years.

“All done,” said the prof, putting his pen down and flicking the marked paper into the pile in front of him.  He looked up at Rich: “Tell you what, let’s go to the café. The air-conditioning there is so much better,” he added, somewhat apologetically.

As they walked out of the prof’s office, Rich couldn’t help but wonder why the prof stuck around in a place where he remained unrecognized and unappreciated.

The café was busy. Though it was only mid-afternoon, the crowd was  already in Friday evening mode. Rich and the prof ordered their coffees and found a spot at the quieter end of the cafe.

After some small talk, the prof looked at him and said, “Pardon my saying so, Rich, but you seem preoccupied. Is there something you want to talk about?”

“Yes, there is…well, there was, but I’m not so sure now.”

“You might as well ask,” said the prof. “My time is not billable….unlike yours.” His face crinkled into a smile that said, no offence intended.

“Well, as I mentioned when I called you this morning, I’m a management consultant with Big Consulting. By all measures, I’m doing quite well: excellent pay, good ratings from my managers and clients, promotions etc. The problem is, over the last month or so I’ve been feeling like a faker who plays on clients’ insecurities, selling them advice and solutions that are simplistic and cause more problems than they solve,” said Rich.

“Hmmm,” said the prof, “I’m curious. What triggered these thoughts after a decade in the game?”

“Well, I reckon it was an engagement that I completed a couple of months ago. I was the principal consultant for a big change management initiative at a multinational.  It was my first gig as a lead consultant for a change program this size. I was responsible for managing all aspects of the engagement – right from the initial discussions with the client,  to advising them on the change process and finally implementing it.” He folded his hands behind his head and leaned back in his chair as he continued,  “In theory I’m supposed to offer independent advice. In reality, though, there is considerable pressure to use our standard, trademarked solutions. Have you heard of our 5 X Model of Change Management?”

“Yes, I have,” nodded the prof.

“Well, I could see that the prescriptions of 5 X would not work for that organization. But, as I said, I had no choice in the matter.”

“Uh-huh, and then?”

“As I had foreseen,” said Rich, “the change was a painful, messy one for the organization. It even hit their bottom line significantly.  They are trying to cover it up, but everyone in the organization knows that the change is the real reason for the drop in earnings.  Despite this, Big Consulting has emerged unscathed. A bunch of middle managers on the client’s side have taken the rap.” He shook his head ruefully. “They were asked to leave,” he said.

“That’s terrible,” said the prof, “I can well understand how you feel.”

“Yes, I should not have prescribed 5 X. It is a lemon. The question is: what should I do now?” queried Rich.

“That’s for you to decide. You can’t change the past, but you might be able to influence the future,” said the prof with a smile.

“I was hoping you could advise me.”

“I have no doubt that you have reflected on the experience. What did you conclude?”

“That I should get out of this line of work,” said Rich vehemently.

“What would that achieve?” asked the prof gently.

“Well, at least I won’t be put into such situations again. I’m not worried about finding work, I’m sure I can find a job with the Big Consulting name on my resume,” said Rich.

“That’s true,” said the prof, “but is that all there is to it? There are other things to consider. For instance, Big Consulting will continue selling snake oil. How would you feel about that?”

“Yeah, that is a problem – damned if I do, damned if I don’t,” replied Rich. “You know, when I was sitting in your office, I recalled that you had spoken about such dilemmas in one of your classes. You said that the difficulty with such wicked issues is that they cannot be decided based on facts alone, because the facts themselves are either scarce or contested…or both!”

“That’s right,” said the prof, “and this is a wicked problem of a kind that is very common, not just in professional work but also in life.  Even relatively mundane issues such as  whether or not to switch jobs have wicked elements. What we forget sometimes, though, is that our decisions on such matters or rather, our consequent actions, might also affect others.”

“So you’re saying I’m not the only stakeholder (if I can use that term) in my problem. Is that right?”

“That’s right, there are other people to consider,” said the prof, “but the problem is you don’t know who they are. They are all the people who will be affected in the future by the decision you make now. If you quit, Big Consulting will go on selling this solution and many more people might be adversely affected. On the other hand, if you stay, you could try to influence the future direction of Big Consulting, but that might involve some discomfort for yourself. This makes your wicked problem an ethical one. I suspect this is why you’re having a hard time going with the ‘quit’ option.”

There was a brief silence. The prof could see that Rich was thinking things through.

“Prof, I’ve got to hand it to you,” said Rich shaking his head with a smile, “I was so absorbed by the quit/don’t quit dilemma from my personal perspective that I didn’t realize there are other angles to consider.  Thanks, you’ve helped immensely. I’m not sure what I will do, but I do know that what you have just said will help me make a more considered choice.  Thank you!”

“You’re welcome, Rich.”

…And as he boarded his flight later that evening, Rich finally understood why the prof continued to teach at a place where he remained unrecognized and unappreciated.

Written by K

February 13, 2014 at 10:31 pm

On the inherent ambiguities of managing projects

with 9 comments

Much of mainstream project management is technique-based – i.e.  it is based on  processes that are aimed at achieving well-defined ends. Indeed, the best-known guide in the PM world, the PMBOK, is structured as a collection of processes and associated  “tools and techniques” that are categorised into various “knowledge areas.”

Yet, as experienced project managers know, there is more to project management than processes and techniques: success often depends on a project manager’s ability to figure out what to do in unique situations. Dealing with such situations is more an art than a science. This process (if one can call it that) is difficult to formalize and even harder to teach. As Donald Schon wrote in a paper on the crisis of professional knowledge:

…the artistic processes by which practitioners sometimes make sense of unique cases, and the art they sometimes bring to everyday practice, do not meet the prevailing criteria of rigorous practice. Often, when a competent practitioner recognizes in a maze of symptoms the pattern of a disease, constructs a basis for coherent design in the peculiarities of a building site, or discerns an understandable structure in a jumble of materials, he does something for which he cannot give a complete or even a reasonably accurate description. Practitioners make judgments of quality for which they cannot state adequate criteria, display skills for which they cannot describe procedures or rules.

Unfortunately this kind of ambiguity is given virtually no consideration in standard courses on project management. Instead, like most technically oriented professions such as engineering, project management treats problems as well-defined and amenable to standard techniques and solutions. Yet, as Schon tells us:

…the most urgent and intractable issues of professional practice are those of problem-finding. “Our interest”, as one participant put it, “is not only how to pour concrete for the highway, but what highway to build? When it comes to designing a ship, the question we have to ask is, which ship makes sense in terms of the problems of transportation?

Indeed, the difficulty in messy project management scenarios often lies in figuring out what to do  rather than how to do it.  Consider the following situations:

  1. You have to make an important project-related decision, but don’t have enough information to make it.
  2. Your team is overworked and your manager has already turned down a request for more people.
  3. A key consultant on your project has resigned.

Each of the above is a not-uncommon scenario in the world of projects. The problem in each of these cases lies in  figuring out what to  do given  the unique context of the project. Mainstream project management offers little advice on how to deal with such situations, but their ubiquity suggests that they are worthy of attention.

In reality, most project managers deal with such situations using a mix of common sense, experience and instinct, together with a deep appreciation of the specifics of the environment (i.e. the context). Often their actions are in complete contradiction to textbook techniques. For example, in the first case described above, the rational thing to do is to gather more data before making a decision. However, when faced with such a situation, a project manager might make a snap decision based on his or her knowledge of the politics of the situation. Often the project manager will not be able to adequately explain the rationale for the decision beyond knowing that “it felt like the right thing to do.” It is more an improvisation than a plan.

Schon used the term reflection-in-action to describe how practitioners deal with such situations, and offered the following example to illustrate how it works in practice:

Recently, for example, I built a wooden gate. The gate was made of wooden pickets and strapping. I had made a drawing of it, and figured out the dimensions I wanted, but I had not reckoned with the problem of keeping the structure square. I noticed, as I began to nail the strapping to the pickets that the whole thing wobbled. I knew that when I nailed in a diagonal piece, the structure would become rigid. But how would I be sure that, at that moment, the structure would be square? I stopped to think. There came to mind a vague memory about diagonals-that in a square, the diagonals are equal. I took a yard stick, intending to measure the diagonals, but I found it difficult to make these measurements without disturbing the structure. It occurred to me to use a piece of string. Then it became apparent that I needed precise locations from which to measure the diagonal from corner to corner. After several frustrating trials, I decided to locate the center point at each of the corners (by crossing diagonals at each corner), hammered in a nail at each of the four center points, and used the nails as anchors for the measurement string. It took several moments to figure out how to adjust the structure so as to correct the errors I found by measuring, and when I had the diagonal equal, I nailed in the piece of strapping that made the structure rigid…
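
As an aside (Schon’s point here is about reflection-in-action rather than geometry, so the notation below is mine, not his), the check he improvised rests on a standard fact: a four-sided frame whose opposite sides are of equal length has square corners exactly when its diagonals are equal. Treating the wobbling frame as a parallelogram with side lengths $a$ and $b$ and corner angle $\theta$, the law of cosines gives

$d_1^2 = a^2 + b^2 - 2ab\cos\theta, \qquad d_2^2 = a^2 + b^2 + 2ab\cos\theta$

so $d_1 = d_2$ precisely when $\cos\theta = 0$, that is, when the corners are right angles.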

Such encounters with improvisation are often followed by a retrospective analysis of why the actions taken worked (or didn’t). Schon called this latter process reflection-on-action. I think it isn’t a stretch to say that project managers hone their craft through reflection in and on ambiguous situations. This knowledge cannot be easily codified into techniques or practices, but it is worthy of study in its own right. To this end, Schon advocated an epistemology of (artistic) practice – a study of what such knowledge is and how it is acquired. In his words:

…the study of professional artistry is of critical importance. We should be turning the puzzle of professional knowledge on its head, not seeking only to build up a science applicable to practice but also to reflect on the reflection-in-action already embedded in competent practice. We should be exploring, for example, how the on-the-spot experimentation carried out by practicing architects, physicians, engineers and managers is like, and unlike, the controlled experimentation of laboratory scientists. We should be analyzing the ways in which skilled practitioners build up repertoires of exemplars, images and strategies of description in terms of which they learn to see novel, one-of-a-kind phenomena. We should be attentive to differences in the framing of problematic situations and to the rare episodes of frame-reflective discourse in which practitioners sometimes coordinate and transform their conflicting ways of making sense of confusing predicaments. We should investigate the conventions and notations through which practitioners create virtual worlds-as diverse as sketch-pads, simulations, role-plays and rehearsals-in which they are able to slow down the pace of action, go back and try again, and reduce the cost and risk of experimentation. In such explorations as these, grounded in collaborative reflection on everyday artistry, we will be pursuing the description of a new epistemology of practice.

It isn’t hard to see that similar considerations hold for project management and related disciplines.

In closing, project management as laid out in books and BOKs does not equip a project manager to deal with ambiguity.  As a start towards redressing this, formal frameworks need to acknowledge the limitations of the techniques and procedures they espouse.  Although there is no simple, one-size-fits-all way to deal with ambiguity in projects,  lumping it into a bucket called “risk” (or worse, pretending it does not exist)  is not the answer.

Written by K

January 30, 2014 at 9:07 pm
