Archive for July 2013
A central myth about decision making in organisations is that it is (or ought to be) a rational and objective process. Although this myth persists, there is a growing realisation that many organisational issues are wicked – i.e. they are hard to define, let alone solve. The difficulty in defining such problems arises from the fact that they are multifaceted, which in turn gives rise to a diversity of viewpoints about them. So it is that people involved in a wicked decision problem will have different opinions on what the problem is and how it should be tackled. This makes it impossible to decide on wicked issues on logical grounds alone, which is why rational decision-making processes are ineffective for such problems.
Oftentimes the wickedness of a problem is a consequence of the way in which it is framed. I’ll have more to say about frames later in this post; for now I’ll just note that the term frame refers to the perceived assumptions and context regarding a problem – and I say perceived because these are often matters of opinion and belief. For example, depending on one’s background and beliefs, the issue of crime may be seen as a law and order problem (lack of policing) or an economic one (poverty or lack of opportunity).
Although organisational issues are not as complex and multifaceted as social ones such as crime, most managers will have experienced situations in which they simply did not know what to do: the problem was not decidable based on their preconceptions regarding the facts and assumptions surrounding it, and the organisational situation in which it lives (the context).
My use of the word decidable in the previous sentence may raise some eyebrows because the term has a very precise meaning in mathematics. The notion of undecidability (or decidability) comes from the work of Kurt Goedel, who proved that any consistent system based on a set of axioms (premises), provided it is rich enough to express basic arithmetic, will necessarily contain statements that can neither be proven nor disproven within that system.
Now an axiomatic system is nothing but a framework consisting of a set of premises plus some logical rules from which one can derive statements that are true within the system (these true statements are theorems). One can thus make an analogy between axiomatic systems in mathematics and (for want of a better term) decision systems in organisations: decisions in organisations are outcomes of a set of premises plus some rules (not necessarily logical ones!) from which one can construct arguments supporting one or the other viewpoint. In terms of the analogy, it is clear that wicked problems in organisations are akin to undecidable problems in mathematics in that they are not solvable within the frame in which they are posed.
The interesting thing about undecidable problems in mathematics is that although statements may be undecidable within a particular system of axioms, they can sometimes be rendered decidable within another, broader system. Put simply, a proposition that is undecidable may be rendered decidable by modifying or expanding the underlying premises or assumptions. In even simpler terms, the decidability of a statement depends on one’s frame or viewpoint. In terms of the analogy this amounts to saying that wickedness (or the lack of it) depends on how the problem is framed.
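A concrete mathematical illustration (my own, not part of the original argument) is Euclid’s parallel postulate: it can neither be proven nor disproven from Euclid’s other axioms, yet it is settled the moment the frame is enlarged. Writing $E$ for Euclid’s first four axioms, $P$ for the parallel postulate and $\vdash$ for derivability:

```latex
% Frame-dependence of decidability, illustrated with the parallel postulate P.
% E denotes Euclid's first four axioms; \vdash denotes derivability.
\begin{align*}
  E \nvdash P \quad\text{and}\quad E \nvdash \neg P
      && \text{($P$ is undecidable within the frame $E$)}\\
  E \cup \{P\} \vdash P
      && \text{(Euclidean geometry: $P$ decided in the affirmative)}\\
  E \cup \{\neg P\} \vdash \neg P
      && \text{(hyperbolic geometry: $P$ decided in the negative)}
\end{align*}
```

Note that enlarging the premises does not make the question easier to answer within the old frame; it changes the frame so that the question has an answer – which is precisely the move available to decision-makers facing a wicked problem.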
Wicked (or undecidable) decision problems can sometimes be managed (or rendered decidable) by an appropriate choice of frame.
The metaphysics of organisational decision-making
I should hasten to add that the foregoing cannot be used as a justification for making a decision based on a convenient frame that is aligned to one’s own interests and opinions. Indeed, an appropriate choice of frame is one that takes into account the entire spectrum of interests and opinions relating to the decision problem. In this sense, the choice of a correct frame is a metaphysical issue because it forces the decision-maker(s) to choose how they view themselves in social and ethical terms – in short, as socially responsible human beings!
I realise this statement may sound over the top to many readers so I’ll try to argue for its plausibility, if not its truth, by drawing on a brilliant paper entitled, Ethics and Second-Order Cybernetics, by the eloquent cybernetician and polymath, Heinz von Foerster.
Noting that just about everything about metaphysics is controversial, von Foerster tells us:
When I invoke Metaphysics, I do not seek agreement with anybody else about her nature. This is because I want to say precisely what it is when we become metaphysicians, whether or not we call ourselves metaphysicians. I say we become metaphysicians whenever we decide upon in principle undecidable questions.
Why does our deciding on undecidable questions make us metaphysicians?
The answer lies in the difference between decidable and undecidable questions. The former are unambiguously decided by the framework within which they are posed whereas the latter are not. Therefore we are forced to make a choice (of framework and consequent decision) based on our interests and opinions. The point is, our interests and opinions tell us something about who we are, so the choices we make when deciding on undecidable questions define our individual human qualities.
As von Foerster states:
Decidable questions are already decided by the framework in which they are asked, and by the rules of how to connect what we call “the question” with what we may take for an “answer.” In some cases it may go fast, in others it may take a long, long time, but ultimately we will arrive, after a sequence of compelling logical steps, at an irrefutable answer: a definite Yes, or a definite No. But we are under no compulsion, not even under that of logic, when we decide upon in principle undecidable questions. There is no external necessity that forces us to answer such questions one way or another. We are free! The complement to necessity is not chance, it is choice! We can choose who we wish to become when we decide on in principle undecidable questions.
The claim that we choose who we wish to become becomes evident when one notes that organisational decisions often put decision makers into situations in which they have to make ethical choices. For example, cost cutting measures may lead to job losses, changes in work policies may affect employee well being, wrong choices of technologies may pollute the environment and so on. The point is that most undecidable (or wicked!) problems in organisational life have ethical dimensions, and we define ourselves as human beings when we make decisions regarding them.
No wonder, then, that we have so many devices by which people try to avoid making decisions and the responsibility that comes with them. As von Foerster states:
With much ingenuity and imagination, mechanisms were contrived by which one could bypass this awesome burden. With hierarchies, entire institutions have been built where it is impossible to localize responsibility. Everyone in such a system can say: “I was told to do X.”
On the political stage we hear more and more the phrase of Pontius Pilate: “I have no choice but X.” In other words “Don’t make me responsible for X, blame others.” This phrase apparently replaces: “Among the many choices I had, I decided on X.”
Then, aiming squarely at rationality and objectivity, he writes:
I mentioned objectivity before and I mention it here again as another popular device of avoiding responsibility. Objectivity requires that the properties of the observer shall not enter the description of his observations. With the essence of observing, namely the processes of cognition, being removed, the observer is reduced to a copying machine, and the notion of responsibility has been successfully juggled away.
…and I take it as given that none of us wish to be reduced to mere copying machines.
The mechanisms of decision making in organisations encourage decision makers to avoid the burden of responsibility rather than accept it – “Sorry, but it is business” or “I’m just following orders” are common phrases that flag such avoidance. From personal experience, I’m painfully aware of how easy it is to sweep ethical issues out of one’s field of vision when dealing with wicked problems…and I now also understand that metaphysics is not a rarefied academic discipline, but one that holds practical lessons for us all – you, me, our peers, and those who sit on the floors below and above us.
Management, as it is practiced, is largely about “getting things done.” Consequently management education and research tends to focus on improving the means by which specified ends are achieved. The ends themselves are not questioned as rigorously as they ought to be. The truth of this is reflected in the high profile corporate scandals that have come to light over the last decade or so, not to mention the global financial crisis.
Today, more than ever, there is a need for a new kind of management practice, one in which managers critically reflect on the goals they pursue and the means by which they aim to achieve them. In their book entitled, Making Sense of Management: A Critical Introduction, management academics Mats Alvesson and Hugh Willmott describe what such an approach to management entails. This post is a summary of the central ideas described in the book.
Critical theory and its relevance to management
The body of work that Alvesson and Willmott draw from is Critical Theory, a discipline based on the belief that knowledge ought to be arrived at through dialectical reasoning – i.e. reasoning through dialogue – rather than scientific rationality alone. The main reason is that science (as it is commonly practiced) is value-free and is therefore incapable of addressing problems that have a social or ethical dimension. This idea is not new; even scientists such as Einstein have commented on the limits of scientific reasoning.
Although Critical Theory has its roots in the Renaissance and Enlightenment, its modern avatar is largely due to a group of German social philosophers who were associated with the Frankfurt-based Institute of Social Research which was established in the 1920s. Among other things, these philosophers argued that knowledge in the social sciences (such as management) can never be truly value-free or objective. Our knowledge of social matters is invariably coloured by our background, culture, education and sensibilities. This ought to be obvious, but it isn’t: economists continue to proclaim objective truths about the right way to deal with economic issues, and management gurus remain ready to show us the one true path to management excellence.
The present day standard bearer of the Frankfurt School is the German social philosopher Juergen Habermas, who is best known for his theory of communicative rationality – the idea that open dialogue, free from any constraints, is the most rational way to decide on matters of importance. For a super-quick introduction to the basic ideas of communicative rationality and its relevance in organisational settings, see my post entitled, More than just talk: rational dialogue in project environments. For a more detailed (and dare I say, entertaining) introduction to communicative rationality with examples drawn from The Borg and much more, have a look at Chapter 7 of my book, The Heretic’s Guide to Best Practices, co-written with Paul Culmsee.
The demise of command and control?
Many professional managers see their jobs in purely technical terms, involving things such as administration, planning and monitoring. They tend to overlook the fact that these technical functions are carried out within a particular social and cultural context. More to the point, and this is crucial, managers work under constraints of power and domination: they are not free to do what they think is right but have to do whatever their bosses order them to, and so, in turn, behave with their subordinates in exactly the same way.
As Alvesson and Willmott put it:
Managers are intermediaries between those who hire them and those whom they manage. Managers are employed to coordinate, motivate, appease and control the productive efforts of others. These ‘others’ do not necessarily share managerial agendas…
Despite the talk of autonomy and empowerment, modern day management is still very much about control. However, modern day employees are unlikely to accept a command and control approach to being managed, so organisations have taken recourse to subtler means of achieving the same result. For example, organisational culture initiatives aimed at getting employees to “internalise” the values of the organisation are attempts at “control sans command.”
A critical look at the status quo
A good place to start with a critical view of management is in the area of decision-making. Certain decisions, particularly those made at executive levels, can have a long term impact on an organisation and its employees. Business schools and decision theory texts tell us that decision-making is a rational process. Unfortunately, reality belies that claim: decisions in organisations are more often made on the basis of politics and ideology than objective criteria. This being the case, it is important that decisions be subject to critical scrutiny. Indeed it is possible that many of the crises of the last decade could have been avoided if the decisions that led to them had been subjected to critical review.
Many of the initiatives that are launched in organisation-land have their origins in executive-level decisions that are made on flimsy grounds such as “best practice” recommendations from Big 4 consulting companies. Mid-level managers who are required to see these through to completion are then faced with the problem of justifying these initiatives to the rank and file. Change management in modern organisation-land is largely about justifying the unjustifiable or defending the indefensible.
The critique, however, goes beyond just the practice of management. For example, Alvesson and Willmott also draw attention to things such as the objectives of the organisation. They point out that short-sighted objectives such as “maximising shareholder value” are what led to the downfall of companies such as Enron. Moreover, they also remind us of an issue that is becoming increasingly important in today’s world: natural resources are not unlimited and should be exploited in a judicious, sustainable manner.
As interesting and important as these “big picture” issues are, in the remainder of this post I’ll focus attention on management practices that impact mid and lower level employees.
A critical look at management specialisations
Alvesson and Willmott analyse organisational functions such as Human Resource Management (HRM), Marketing and Information Systems (IS) from a critical perspective. It would take far too many pages to do justice to their discussion so I’ll just present a brief summary of two areas: HR and IS.
The rhetoric of HRM in organisations stands in stark contradiction to its actions. Despite platitudinous sloganeering about empowerment etc., the actions of most HR departments are aimed at getting people to act and behave in organisationally acceptable ways. Seen in a critical light, seemingly benign HR initiatives such as organizational culture events or self-management initiatives are exposed as being but subtle means of managerial control over employees. (see this paper for an example of the former and this one for an example of the latter).
Since the practice of IS focuses largely on technology, much IS research follows technology trends and “best practices.” As might be expected, the focus is on the “fad of the month” and thus turns stale rather quickly. As examples: the 1990s saw an explosion of papers and projects in business process re-engineering; the flavour of the decade in the 2000s was service-oriented architecture; more recently, we’ve seen a great deal of hot air about the cloud. Underlying much technology-related decision-making is the tacit assumption that choices pertaining to technology are value-free and can be decided on the basis of technical and financial criteria alone. The profession as a whole tends to take an overly scientific/rational approach to design and implementation, often ignoring issues such as power and politics. It can be argued that many failures of large-scale IS projects are due to the hyper-rational approach taken by practitioners.
In a similar vein, most management specialisations can benefit from the insights that come from taking a critical perspective. Alvesson and Willmott discuss marketing, accounting and other functions. However, since my main interest is in solutions rather than listing the (rather well-known) problems, I’ll leave it here, directing the interested reader to the book for more.
Towards an enlightened practice of management
In the modern workplace it is common for employees to feel disconnected from their work, at least from time to time if not always. In a prior post, I discussed how this sense of alienation is a consequence of our work and personal lives being played out in two distinct spheres – the system and the lifeworld. In brief, the system refers to the professional and administrative sphere in which we work and/or interact with institutional authority, and the lifeworld is the everyday world that we share with others. Actions in the lifeworld are based on a shared understanding of the issue at hand whereas those in the system are not.
From the critical analysis of management specialisations presented in the book, it is evident that the profession, being mired in a paradigm of prescriptive, top-down practices, serves to perpetuate the system by encroaching on the lifeworld values of employees. There are those who will say that this is exactly how it should be. However, as Alvesson and Willmott state in their book, this kind of thinking is perverse because it is ultimately self-defeating:
The devaluation of lifeworld properties is perverse because …At the very least, the system depends upon human beings who are capable of communicating effectively and who are not manipulated and demoralized to the point of being incapable of cooperation and productivity.
Alvesson and Willmott use the term emancipation to describe any process whereby employees are freed from the shackles of system-oriented thinking, even if only partially (note: here I’m using the term system in the sense defined above – not to be confused with systems thinking, which is another beast altogether). Acknowledging that it is impossible to do this at the level of an entire organisation or even a department, they coin the term micro-emancipation to describe any process whereby sub-groups of organisations are empowered to think through issues and devise appropriate actions by themselves, free (to the extent possible) from management constraints or directives.
Although this might sound much too idealistic to some readers, be assured that it is eminently possible to implement micro-emancipatory practices in real world organisations. See this paper for one possible framework that can be used within a multi-organisation project along with a detailed case study that shows how the framework can be applied in a complex project environment.
Alvesson and Willmott warn that emancipatory practices are not without costs, both for employers and employees. For example, employees who have gained autonomy may end up being less productive, which will in turn affect their job security. In my view, this issue can be addressed through an incrementalist approach wherein both employers and employees work together to come up with micro-emancipatory projects at the grassroots level, as in the case study described in the paper mentioned in the previous paragraph.
…and so to conclude
Despite the rhetoric of autonomy and empowerment, much of present-day management is stuck in a Taylorist/Fordist paradigm. In modern day organisations command and control may not be overt, but they often sneak in through the back door. For example, employees almost always know that certain things are simply “out of bounds” for discussion, and that the consequences of breaching those unstated boundaries can be severe.
In its purest avatar, a critical approach to management seeks to remove those boundaries altogether. This is unrealistic because nothing would ever get done in an organisation in which everything is open for discussion; as in all social systems, compromise is necessary. The concept of micro-emancipation offers just such a compromise. To be sure, one has to go beyond the rhetoric of empowerment and actually create an environment that enables people to speak their minds and debate issues openly. Though it is impossible to do this at the level of an entire organisation, it is definitely possible to achieve it (albeit approximately) in small workgroups.
To conclude: the book is worth a read, not just by management researchers but also by practicing managers. Unfortunately the overly-academic style may be a turn off for practitioners, the very people who need to read it the most.
The term corporate immune system was coined by Julian Birkinshaw as a way to describe the tendency of corporate head offices to resist entrepreneurial initiatives by their subsidiaries. In the present day, the term has also been used to refer to the tendency of organisations to reject or suppress novel ideas or processes that employees may come up with. This post is about the latter usage of the phrase.
The metaphor of an immune system is an apt one: apart from being a good description of what happens, it also suggests ways in which one can overcome or bypass managerial resistance to initiatives that are seen as threats. In this post I build on Stefan Lindegaard’s excellent article to discuss how the Dengue virus can teach us a trick or two about how employees can get around the corporate immune system.
The mechanics of Dengue infection
Dengue fever, also known as breakbone fever, is endemic to many tropical countries. Its symptoms are fever, severe headaches, muscle and joint pains and a characteristic skin rash. Dengue is caused by a virus that is transmitted by the Aedes aegypti mosquito, which can be identified by the white bands on its legs. Although it originated in Africa, the Aedes species is now found in most tropical and sub-tropical countries throughout the world.
There are four closely related strains (or serotypes) of the Dengue virus, imaginatively named Dengue 1 through Dengue 4. This has interesting consequences, as we shall see shortly. First, let’s have a quick look at what goes on in the human body after a bite from a carrier mosquito. My discussion is based on this article from the Scitable website.
Once a person is bitten by a carrier mosquito, the virus starts to infect skin cells and specialised immune cells (called Langerhans cells) near the site of the bite. The infected Langerhans cells then travel to the lymph nodes, which mobilise white blood cells (WBCs) to combat the infection.
The WBCs are the body’s first line of defence against an infection. The problem is that WBCs generally do not succeed in destroying the Dengue virus; worse, they actually end up getting infected by it. The infected white blood cells then help spread the virus to other organs in the body.
However, all is not lost because the body has another line of defence – the adaptive immune system – which produces antibodies that target specific intruders. Once the infection spreads, the adaptive immune system kicks in, producing antibodies that recognise and neutralise the virus. The fever an infected person experiences is a manifestation of the battle between the antibodies and the virus. In a healthy person, the immune system eventually wins and the person recovers.
Now here’s the interesting bit: a person who has been infected by the virus gains long term immunity, but only against the particular Dengue serotype that he or she was infected by. If the person is bitten by a mosquito carrying another serotype, the antibodies for the old serotype actually assist the new strain to spread within the body. Essentially this happens because the antibodies mistake the new strain for the old one and attempt to bind with it. However, because the virus is different, the antibodies cannot bind with it completely. The result is an antibody-virus complex within which the virus is still capable of replicating.
These circulating antibody-virus complexes then infect other white blood cells which in turn carry the virus to other parts of the body. This results in a higher volume of virus in the bloodstream than would have occurred otherwise, and hence a more severe infection. This is well known: subsequent infections of Dengue often lead to considerably more severe symptoms than the first one.
The above description is sufficient for the present discussion, but you may want to see this article to learn more about this fascinating virus.
Overcoming the corporate immune system
The processes of primary and secondary Dengue infections hold some lessons for those who want to gain executive support for proposals that might be just a tad too radical for their workplaces. A direct approach, wherein the idea is pitched straight to executives, is unlikely to work for at least a couple of reasons:
- The generic corporate immune system (akin to white blood cells in the human body) will attempt to take it down. This is typified by the generic, “It will never work here (so let’s not try it)” response.
- Let’s assume that you are at your persuasive best and manage to get past the generic first line of corporate defence. You still cannot rest easy because, in time, managerial ingenuity will come up with specific objections to the idea (these are akin to strain-specific antibodies).
However, all is not lost: we can take inspiration from the secondary infection process described in the previous section. The second serotype is able to do a more thorough job of infecting its host because the antibodies actually help transport the virus through the body. This happens because the antibodies do not fully recognise the virus and thus bind with it incompletely.
So the trick to getting your idea past the corporate immune system is to cast it in terms that are familiar to managers and to get them to have a stake in it. Here’s one way to do this:
- Make a connection between your idea and an already well-established element or aspect of your organisation. Be sure to stress this connection in your pitch (see point 2). This way, the idea is seen as a logical continuation of what already exists – i.e. it is seen as old rather than new, much as the old serotype antibodies see the new strain as the old one.
- Present your idea to a manager who may be in a position to help you, seeking her advice on it.
- Take the advice offered seriously – i.e. modify the idea in a way that incorporates the advice.
- Re-present the idea to the manager, thanking her for her advice and emphasising how it makes a difference.
- If she is receptive, ask her if she would be willing to socialise the idea amongst her peers. If you have genuinely taken her advice, chances are she’ll be willing to do this. After all, the idea is now hers too.
The above are generic steps that can be tailored to specific situations. For example, the same principles apply when writing a business case for a new system or whatever – emphasise continuity and get people to be a part of the idea by offering them a stake in it. The bottom line is that the corporate immune response can be “tricked” into accepting novel ideas, much as the human immune system is fooled by the Dengue virus.
The metaphor of a corporate immune system not only provides an evocative description of how organisations kill novel ideas, but also suggests how such organisational resistance can be overcome. In this post I have described one such strategy, based on the fiendishly clever Dengue virus.