Eight to Late

Sensemaking and Analytics for Organizations

Archive for the ‘Organizational Culture’ Category

The “value add” tax – a riff on corporate communication


A mainstay of team building workshops is the old “what can we do better” exercise.  Over the years I’ve noticed that “improving communication” is an item that comes up again and again in these events.  This is frustrating for managers. For example, during a team-building debrief some years ago, an exasperated executive remarked, “Oh don’t pay any attention to that [better communication], it keeps coming up no matter what we do.”

The executive had a point.  The organisation had invested much effort in establishing new channels of communication – social media, online, face-to-face forums etc.  The uptake, however, was disappointing:  turnout at the face-to-face meetings was consistently low as was use of other channels.

As far as management was concerned, they had done their job by establishing communication channels and making them available to all. What more could they  be expected to do? The matter was dismissed with a collective shrug of suit-clad shoulders…until the next team building event, when the issue was highlighted by employees yet again.

After much hand-wringing, the organisation embarked on another “better communication cycle.”  Much effort was expended…again, with the same disappointing results.

Anecdotal evidence via conversations with friends and collaborators suggests that variants of this story play out in many organisations. This makes the issue well worth exploring. I won’t be so presumptuous as to offer answers; I’m well aware that folks much better qualified than I have spent years attempting to do so. Instead I raise a point which, though often overlooked, might well have something to do with the lack of genuine communication in organisations.

Communication experts have long agreed that face-to-face dialogue is the most effective mode of communication. Backing for this comes from the interactional or pragmatic view, which is based on the premise that communication is more about building relationships than conveying information. Among other things, face-to-face communication enables the communicating parties to observe and interpret non-verbal signals such as facial expression and gestures and, as we all know, these often “say” much more than what’s being said.

A few months ago I started paying closer attention to non-verbal cues. This can be hard to do because people are good at disguising their feelings. Involuntary expressions indicative of people’s real thoughts can be fleeting. A flicker of worry, fear or anger is quickly covered by a mask of indifference.

In meetings, difficult topics tend to be couched in platitudinous language. Platitudes are empty words that sound great but can be interpreted in many different ways. Reconciling those differences often leads to pointless arguments that are emotionally draining. Perhaps this is why people prefer to take refuge in indifference.

A while ago I was sitting in a meeting where the phrase "value add activity" (sic) cropped up once, then again…and then many times over. Soon it was apparent that everyone in the room had a very different conception of what constituted a "value add activity." Some argued that project management is a value add activity; others disagreed vehemently, arguing that project management is a bureaucratic exercise and that real value lies in creating something. Round and round the arguments went, but there was no agreement on what constituted a "value add activity." The discussion generated a lot of heat but shed no light whatsoever on the term.

A problem with communication in the corporate world is that it is loaded with such platitudes. To make sense of these, people have to pay what I call a “value add” tax – the effort in reaching a consensus on what the platitudinous terms mean. This can be emotionally extortionate because platitudes often touch upon issues that affect people’s sense of well-being.

Indifference is easier because we can then pretend to understand and agree with each other when we would rather not understand, let alone agree, at all.

Written by K

November 19, 2015 at 8:02 am

From the coalface: an essay on the early history of sociotechnical systems


The story of sociotechnical systems began a little over half a century ago, in a somewhat unlikely setting: the coalfields of Yorkshire.

The British coal industry had just been nationalised and new mechanised mining methods were being introduced in the mines. It was thought that nationalisation would sort out the chronic labour-management issues and mechanisation would address the issue of falling productivity.

But things weren’t going as planned. In the words of Eric Trist, one of the founders of the Tavistock Institute:

…the newly nationalized industry was not doing well. Productivity failed to increase in step with increases in mechanization. Men were leaving the mines in large numbers for more attractive opportunities in the factory world. Among those who remained, absenteeism averaged 20%. Labour disputes were frequent despite improved conditions of employment. – excerpted from The evolution of Socio-technical systems – a conceptual framework and an action research program, E. Trist (1980)

Trist and his colleagues were asked by the National Coal Board to come in and help. To this end, they did a comparative study of two mines that were similar except that one had high productivity and morale whereas the other suffered from low performance and had major labour issues.

Their job was far from easy: they were not welcome at the coalface because workers associated them with management and the Board.

Trist recounts that around the time the study started, there were a number of postgraduate fellows at the Tavistock Institute. One of them, Ken Bamforth, knew the coal industry well as he had been a miner himself. Postgraduate fellows who had worked in the mines were encouraged to visit their old workplaces after a year and write up their impressions, focusing on things that had changed since they had worked there. After one such visit, Bamforth reported back with news of a workplace innovation that had occurred at a newly opened seam at Haighmoor. Among other things, morale and productivity at this particular seam were high compared to other similar ones. The team's way of working was entirely novel, a world away from the hierarchically organised setup that was standard in most mechanised mines at the time. In Trist's words:

The work organization of the new seam was, to us, a novel phenomenon consisting of a set of relatively autonomous groups interchanging roles and shifts and regulating their affairs with a minimum of supervision. Cooperation between task groups was everywhere in evidence; personal commitment was obvious, absenteeism low, accidents infrequent, productivity high. The contrast was large between the atmosphere and arrangements on these faces and those in the conventional areas of the pit, where the negative features characteristic of the industry were glaringly apparent. Excerpted from the paper referenced above.

To appreciate the radical nature of practices at this seam, one needs to understand the backdrop against which they occurred. To this end, it is helpful to compare the  mechanised work practices introduced in the post-war years with the older ones from the pre-mechanised era of mining.

In the days before mines were mechanised, miners would typically organise themselves into workgroups of six miners, who would cover three work shifts in teams of two. Each miner was able to do pretty much any job at the seam and so could pick up where his work-mates from the previous shift had left off. This was necessary in order to ensure continuity of work between shifts. The group negotiated the price of their mined coal directly with management and the amount received was shared equally amongst all members of the group.

This mode of working required strong cooperation and trust within the group, of course. However, as workgroups were reorganised from time to time due to attrition or other reasons, individual miners understood the importance of maintaining their reputations as reliable and trustworthy workmates. It was important to get into a good workgroup because such groups were more likely to get more productive seams to work on. Seams were assigned by bargaining, which was typically the job of the senior miner in the group. There was considerable competition for the best seams, but this was generally kept within bounds of civility via informal rules and rituals.

This traditional way of working could not survive mechanisation. For one, mechanised mines encouraged specialisation because they were organised like assembly lines, with clearly defined job roles each with different responsibilities and pay scales. Moreover, workers in a shift would perform only part of the extraction process leaving those from subsequent shifts to continue where work was left off.

As miners were paid by the job they did rather than the amount of coal they produced, no single group had end-to-end responsibility for the product. Delays due to unexpected events tended to get compounded as no one felt the need to make up time. As a result, it would often happen that work that was planned for a shift would not be completed. This meant that the next shift (which could well be composed of a group with completely different skills) could not or would not start their work because they did not see it as their job to finish the work of the earlier shift. Unsurprisingly, blame shifting and scapegoating were rife.

From a supervisor’s point of view, it was difficult to maintain the same level of oversight and control in underground mining work as was possible in an assembly line. The environment underground is simply not conducive to close supervision and is also more uncertain in that it is prone to unexpected events.  Bureaucratic organisational structures are completely unsuited to dealing with these because decision-makers are too far removed from the coalface (literally!).  This is perhaps the most important insight to come out of the Tavistock coal mining studies.

As Claudio Ciborra  puts it in his classic book on teams:

Since the production process at any seam was much more prone to disorganisation due to the uncertainty and complexity of underground conditions, any 'bureaucratic' allocation of jobs could be easily disrupted. Coping with emergencies and coping with coping became part of workers' and supervisors' everyday activities. These activities would lead to stress, conflict and low productivity because they continually clashed with the technological arrangements and the way they were planned and subdivided around them.

Thus we see that the new assembly-line, bureaucracy-inspired work organisation was totally unsuited to the work environment because there was no end-to-end responsibility, and decision making was far removed from the action. In contrast, the traditional workgroup of six was able to deal with the uncertainties and complexities of underground work because team members had a strong sense of responsibility for the performance of the team as a whole. Moreover, teams were uniquely placed to deal with unexpected events because they were actually living them as they occurred and could therefore decide on the best way to deal with them.

What Bamforth found at the Haighmoor seam was that it was possible to recapture the spirit of the old ways of working by adapting these to the larger specialised groups that were necessary in the mechanised mines. As Ciborra describes it in his book:

The new form of work organisation features forty one men who allocate themselves to tasks and shifts. Although tasks and shifts are those of the conventional mechanised system, management and supervisors do not monitor, enforce and reward single task executions. The composite group takes over some of the managerial tasks, as it had in the pre-mechanised marrow group, such as the selection of group members and the informal monitoring of work…Cycle completion, not task execution becomes a common goal that allows for mutual learning and support…There is a basic wage and a bonus linked to the overall productivity of the group throughout the whole cycle rather than a shift. The competition between shifts that plagued the conventional mechanised method is effectively eliminated…

Bamforth and Trist's studies on Haighmoor convinced them that there were viable (and better!) alternatives to the modes of work organisation that were typical of mid to late 20th century workplaces. Their work led them to the insight that the best work arrangements come out of seeking a match between the technical and social elements of the modern day workplace, and thus was born the notion of sociotechnical systems.

Ever since the assembly-line management philosophies of Taylor and Ford, there has been an increasing trend towards division of labour, bureaucratisation and mechanisation / automation of work processes.  Despite the early work of the Tavistock school and others who followed, this trend continues to dominate management practice, arguably even more so in recent years. The Haighmoor innovation described above was one of the earliest demonstrations that there is a better way.   This message has since been echoed by many academics and thinkers,  but remains largely under-appreciated or ignored by professional managers who have little idea – or have completely forgotten – what it is like to work at the coalface.

Written by K

April 7, 2015 at 10:30 pm

Scapegoats and systems: contrasting approaches to managing human error in organisations


Much can be learnt about an organization by observing what management does when things go wrong.  One reaction is to hunt for a scapegoat, someone who can be held responsible for the mess.  The other is to take a systemic view that focuses on finding the root cause of the issue and figuring out what can be done in order to prevent it from recurring.  In a highly cited paper published in 2000, James Reason compared and contrasted the two approaches to error management in organisations. This post is an extensive summary of the paper.

The author gets to the point in the very first paragraph:

The human error problem can be viewed in two ways: the person approach and the system approach. Each has its model of error causation and each model gives rise to quite different philosophies of error management. Understanding these differences has important practical implications for coping with the ever present risk of mishaps in clinical practice.

Reason’s paper was published in the British Medical Journal and hence his focus on the practice of medicine. His arguments and conclusions, however, have a much wider relevance as evidenced by the diverse areas in which his paper has been cited.

The person approach – which, I think is more accurately called the scapegoat approach – is based on the belief that any errors can and should be traced back to an individual or a group, and that the party responsible should then be held to account for the error. This is the approach taken in organisations that are colloquially referred to as having a “blame culture.”

To an extent, looking around for a scapegoat is a natural emotional reaction to an error. The oft unstated reason behind scapegoating, however, is to avoid management responsibility.  As the author tells us:

People are viewed as free agents capable of choosing between safe and unsafe modes of behaviour.  If something goes wrong, it seems obvious that an individual (or group of individuals) must have been responsible. Seeking as far as possible to uncouple a person’s unsafe acts from any institutional responsibility is clearly in the interests of managers. It is also legally more convenient

However, the scapegoat approach has a couple of serious problems that hinder effective risk management.

Firstly, an organization depends on its frontline staff to report any problems or lapses. Clearly, staff will do so only if they feel that it is safe to do so – something that is simply not possible in an organization that takes scapegoat approach. The author suggests that the Chernobyl disaster can be attributed to the lack of a “reporting culture” within the erstwhile Soviet Union.

Secondly, and perhaps more important, is that the focus on a scapegoat leaves the underlying cause of the error unaddressed. As the author puts it, “by focusing on the individual origins of error it [the scapegoat approach] isolates unsafe acts from their system context.” As a consequence, the scapegoat approach overlooks systemic features of errors – for example, the empirical fact that the same kinds of errors tend to recur within a given system.

The system approach accepts that human errors will happen. However, in contrast to the scapegoat approach, it views these errors as being triggered by factors that are built into the system. So, when something goes wrong, the system approach focuses on the procedures that were used rather than the people who were executing them. This shift in focus makes a world of difference.

The system approach looks for generic reasons why errors or accidents occur. Organisations usually have a series of measures in place to prevent errors – e.g. alarms, procedures, checklists, trained staff etc. Each of these measures can be looked upon as a “defensive layer” against error. However, as the author notes, each defensive layer has holes which can let errors “pass through” (more on how the holes arise a bit later).  A good way to visualize this is as a series of slices of Swiss Cheese (see Figure 1).

The important point is that the holes on a given slice are not at a fixed position; they keep opening, closing and even shifting around, depending on the state of the organization.  An error occurs when the ephemeral holes on different layers temporarily line up to “let an error through”.
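To make this intuition concrete, here is a toy Monte Carlo sketch of the model (my own illustrative example, not something from Reason's paper; it assumes, purely for illustration, that each layer's holes appear independently with a fixed probability at the moment an error arrives):

    import random

    def breach_probability(num_layers: int, hole_prob: float, trials: int = 1_000_000) -> float:
        """Estimate how often an error slips through every defensive layer.

        Each layer is assumed to have a momentary 'hole' with probability
        hole_prob at the instant the error arrives; a breach occurs only when
        the holes on all layers happen to line up.
        """
        breaches = 0
        for _ in range(trials):
            if all(random.random() < hole_prob for _ in range(num_layers)):
                breaches += 1
        return breaches / trials

    if __name__ == "__main__":
        for layers in (1, 2, 3, 4):
            print(f"{layers} layer(s): ~{breach_probability(layers, 0.10):.3%} of errors get through")

With a one-in-ten chance of a hole per layer, a single defence lets through roughly 10% of errors while four defences let through only about 0.01% – provided the holes on different layers really are independent, an assumption that system-wide latent conditions (discussed below) can undermine.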

There are two reasons why holes arise in defensive layers:

  1. Active errors: These are unsafe acts committed by individuals. Active errors could be violations of set procedures or momentary lapses. The scapegoat approach focuses on identifying the active error and the person responsible for it. However, as the author points out, active errors are almost always caused by conditions built into the system, which brings us to…
  2. Latent conditions: These are flaws that are built into the system. The author uses the term resident pathogens to describe these – a nice metaphor that I have explored in a paper review I wrote some years ago. These “pathogens” are usually baked into the system by poor design decisions and flawed procedures on the one hand, and ill-thought-out management decisions on the other. Manifestations of the former include faulty alarms, unrealistic or inconsistent procedures or poorly designed equipment; manifestations of the latter include things such as unrealistic targets, overworked staff and the lack of  funding for appropriate equipment.

The important thing to note is that latent conditions can lie dormant for a long period before they are noticed. Typically a latent condition comes to light only when an error caused by it occurs…and only if the organization does a root cause analysis of the error – something that is simply not done in an organization that takes a scapegoat approach.

The author draws a nice analogy that clarifies the link between active errors and latent conditions:

…active failures are like mosquitoes. They can be swatted one by one, but they still keep coming. The best remedies are to create more effective defences and to drain the swamps in which they breed. The swamps, in this case, are the ever present latent conditions.

“Draining the swamp” is not a simple task.  The author draws upon studies of high performance organisations (combat units, nuclear power plants and air traffic control centres) to understand how they minimised active errors by reducing system flaws. He notes that these organisations:

  1. Accept that errors will occur despite standardised procedures, and train their staff to deal with and learn from them.
  2. Practice responses to known error scenarios and try to imagine new ones on a regular basis.
  3. Delegate responsibility and authority, especially in crisis situations.
  4. Do a root cause analysis of any error that occurs and address the underlying problem by changing the system if needed.

In contrast, an organisation that takes a scapegoat approach assumes that standardisation will eliminate errors, ignores the possibility of novel errors occurring, centralises control and, above all, focuses on finding scapegoats instead of fixing the system.

Acknowledgement:

Figure 1 was taken from the Patient Safety Education website of Duke University Hospital.

Further reading:

The Swiss Cheese model was first proposed in 1991. It has since been applied in many areas. Here are a couple of recent applications and extensions of the model to project management:

  1. Stephen Duffield and Jon Whitty use the Swiss Cheese model as a basis for their model of Systemic Lessons Learned and Knowledge Captured (SLLKC model) in projects.
  2. In this post, Paul Culmsee extends the SLLKC model to incorporate aspects relating to teams and collaboration.

Written by K

July 29, 2014 at 8:43 pm

Towards a critical practice of management – a book review


Introduction

Management, as it is practiced, is largely about “getting things done.”  Consequently management education and research tends to focus on improving the means by which specified ends are achieved. The ends themselves are not questioned as rigorously as they ought to be. The truth of this is reflected in the high profile corporate scandals that have come to light over the last decade or so, not to mention the global financial crisis.

Today, more than ever, there is a need for a new kind of management practice, one in which managers critically reflect on the goals they  pursue and the means by which they aim to achieve them.  In their book entitled,  Making Sense of Management: A Critical Introduction,  management academics Mats Alvesson and Hugh Willmott describe what such an approach to management entails.  This post is a summary of the central ideas described in the book.

Critical theory and its relevance to management

The body of work that Alvesson and Willmott draw from is Critical Theory, a discipline based on the belief that knowledge ought to be grounded in dialectical reasoning – i.e. reasoning through dialogue – rather than scientific rationality alone. The main reason for this is that science (as it is commonly practiced) is value-free and is therefore incapable of addressing problems that have a social or ethical dimension. This idea is not new; even scientists such as Einstein have commented on the limits of scientific reasoning.

Although Critical Theory has its roots in the Renaissance and Enlightenment, its modern avatar is largely due to a group of German social philosophers who were associated with the Frankfurt-based Institute of Social Research, which was established in the 1920s. Among other things, these philosophers argued that knowledge in the social sciences (such as management) can never be truly value-free or objective. Our knowledge of social matters is invariably coloured by our background, culture, education and sensibilities. This ought to be obvious, but it isn't: economists continue to proclaim objective truths about the right way to deal with economic issues, and management gurus remain ready to show us the one true path to management excellence.

The present day standard bearer of the Frankfurt School is the German social philosopher Juergen Habermas, who is best known for his theory of communicative rationality – the idea that open dialogue, free from any constraints, is the most rational way to decide on matters of importance. For a super-quick introduction to the basic ideas of communicative rationality and its relevance in organisational settings, see my post entitled More than just talk: rational dialogue in project environments. For a more detailed (and dare I say, entertaining) introduction to communicative rationality with examples drawn from The Borg and much more, have a look at Chapter 7 of my book, The Heretic's Guide to Best Practices, co-written with Paul Culmsee.

The demise of command and control?

Many professional managers see their jobs in purely technical terms, involving things such as administration, planning, monitoring etc. They tend to overlook the fact that these technical functions are carried out within a particular social and cultural context. More to the point, and this is crucial, managers work under constraints of power and domination: they are not free to do what they think is right but have to do whatever their bosses order them to, and so, in turn, behave in exactly the same way with their subordinates.

As Alvesson and Willmott put it:

Managers are intermediaries between those who hire them and those whom they manage. Managers are employed to coordinate, motivate, appease and control the productive efforts of others. These ‘others’ do not necessarily share managerial agendas…

Despite the talk of autonomy and empowerment, modern day management is still very much about control. However, modern day employees are unlikely to accept a command and control approach to being managed, so organisations have taken recourse to subtler means of achieving the same result. For example, organisational culture initiatives aimed at getting employees to "internalise" the values of the organisation are attempts to "control sans command."

The point is, despite the softening of the rhetoric of management, its principal focus remains much the same as it was in the days of Taylor and Ford.

A critical look at the status quo

A good place to start with a critical view of management is in the area of decision-making. Certain decisions, particularly those made at executive levels, can have a long term impact on an organisation and its employees. Business schools and decision theory texts tell us that decision-making is a rational process. Unfortunately, reality belies that claim: decisions in organisations are more often made on the basis of politics and ideology rather than objective criteria. This being the case, it is important that decisions be subject to critical scrutiny. Indeed it is possible that many of the crises of the last decade could have been avoided if the decisions that led to them had been subjected to a critical review.

Many of the initiatives that are launched in organisation-land have their origins in  executive-level decisions that are made on flimsy grounds  such as “best practice”  recommendations  from Big 4 consulting companies. Mid-level managers  who are required to see these through to completion are then faced with the problem of justifying these initiatives to the rank and file. Change management in modern organisation-land is largely about justifying the unjustifiable or defending the indefensible.

The critique, however, goes beyond just the practice of management. For example, Alvesson and Willmott also draw attention to things such as the objectives of the organisation. They point out that short-sighted objectives such as "maximising shareholder value" are what led to the downfall of companies such as Enron. Moreover, they also remind us of an issue that is becoming increasingly important in today's world: that natural resources are not unlimited and should be exploited in a judicious, sustainable manner.

As interesting and important as these “big picture” issues are, in the remainder of this post I’ll focus attention on management practices that impact mid and lower level employees.

A critical look at management specialisations

Alvesson and Willmott analyse organisational functions such as Human Resource Management (HRM), Marketing and Information Systems (IS) from a critical perspective. It would take far too many pages to do justice to their discussion so I’ll just present a brief summary of two areas: HR and IS.

The rhetoric of HRM in organisations stands in stark contradiction to its actions. Despite platitudinous sloganeering about empowerment etc., the actions of most HR departments are aimed at getting people to act and behave in organisationally acceptable ways.  Seen in a critical light, seemingly benign HR initiatives such as organizational culture events or self-management initiatives are exposed as being but subtle means of managerial control over employees.  (see this paper for an example of the former and this one for an example of the latter).

Since the practice of IS centres largely on technology, much of IS research and practice tends to focus on technology trends and "best practices." As might be expected, the focus is on the "fad of the month" and thus turns stale rather quickly. As examples: the 1990s saw an explosion of papers and projects in business process re-engineering; the flavour of the decade in the 2000s was service-oriented architecture; more recently, we've seen a great deal of hot air about the cloud. Underlying a lot of technology related decision-making is the tacit assumption that choices pertaining to technology are value-free and can be decided on the basis of technical and financial criteria alone. The profession as a whole tends to take an overly scientific/rational approach to design and implementation, often ignoring issues such as power and politics. It can be argued that many failures of large-scale IS projects are due to the hyper-rational approach taken by many practitioners.

In a similar vein, most management specialisations can benefit from the insights that come from taking a critical perspective.  Alvesson and Willmott discuss marketing, accounting and other functions. However,  since my main interest is in solutions rather than listing the (rather well-known) problems,  I’ll leave it here, directing the interested reader to the book for more.

Towards an enlightened practice of management

In the modern workplace it is common for employees to feel disconnected from their work, at least from time to time if not always. In a prior post, I discussed how this sense of alienation is a consequence of our work and personal lives being played out in two distinct spheres – the system and the lifeworld. In brief, the system refers to the professional and administrative sphere in which we work and/or interact with institutional authority, and the lifeworld is the everyday world that we share with others. Actions in the lifeworld are based on a shared understanding of the issue at hand whereas those in the system are not.

From the critical analysis of management specialisations presented in the book, it is evident that the profession, being mired in a paradigm consisting of prescriptive, top-down practices, serves to perpetuate the system by encroaching on the lifeworld values of employees. There are those who will say that this is exactly how it should be. However, as Alvesson and Willmott have stated in their book, this kind of thinking is perverse because it is ultimately self-defeating:

The devaluation of lifeworld properties is perverse because …At the very least, the system depends upon human beings who are capable of communicating effectively and who are not manipulated and demoralized to the point of being incapable of cooperation and productivity.

Alvesson and Willmott use the term emancipation to describe any process whereby employees are freed from the shackles of system-oriented thinking, even if only partially (Note: here I'm using the term system in the sense defined above – not to be confused with systems thinking, which is another beast altogether). Acknowledging that it is impossible to do this at the level of an entire organisation or even a department, they coin the term micro-emancipation to describe any process whereby sub-groups of organisations are empowered to think through issues and devise appropriate actions by themselves, free (to the extent possible) from management constraints or directives.

Although this might sound much too idealistic to some readers, be assured that it is eminently possible to implement micro-emancipatory  practices in real world organisations. See this paper  for one possible framework that can be used within a multi-organisation project along with a detailed case study that shows how the framework can be applied in a complex project environment.

Alvesson and Willmott warn that emancipatory practices are not without costs, both for employers and employees. For example, employees who have gained autonomy may end up being less productive, which will in turn affect their job security. In my view, this issue can be addressed through an incrementalist approach wherein both employers and employees work together to come up with micro-emancipatory projects at the grassroots level, as in the case study described in the paper mentioned in the previous paragraph.

…and so to conclude

Despite the rhetoric of autonomy and empowerment, much of present-day management is stuck in a Taylorist/Fordist paradigm. In modern day organisations command and control may not be overt, but they often sneak in through the backdoor in not-so-obvious ways. For example, employees almost always know that certain things are simply "out of bounds" for discussion and that the consequences of breaching those unstated boundaries can be severe.

In its purest avatar, a critical approach to management seeks to remove those boundaries altogether. This is unrealistic because nothing will ever get done in an organisation in which everything is open for discussion; as is the case in all social systems, compromise is necessary. The concept of micro-emancipation offers just such a compromise. To be sure, one has to go beyond the rhetoric of empowerment to actually creating an environment that enables people to speak their minds and debate issues openly. Though it is impossible to do this at the level of an entire organisation, it is definitely possible to achieve it (albeit approximately) in small workgroups.

To conclude: the book is worth a read, not just by management researchers but also by practicing managers. Unfortunately, the overly academic style may be a turn-off for practitioners, the very people who need to read it the most.

Written by K

July 18, 2013 at 12:17 am

Overcoming the corporate immune system – some lessons from the dengue virus


Introduction

The term corporate immune system was coined by Julian Birkinshaw as a way to describe the tendency of corporate head offices to resist entrepreneurial initiatives by their subsidiaries. In the present day, the term has also been used to refer to the tendency of organisations to reject or suppress novel ideas or processes that employees may come up with. This post is about the latter usage of the phrase.

The metaphor of an immune system is an apt one: apart from being a good description of what happens, it also suggests ways in which one can overcome or bypass managerial resistance to initiatives that are seen as threats.   In this post I build on Stefan Lindegaard’s  excellent article,  to discuss how the Dengue virus  can teach us a trick or two about how employees can  get around the corporate immune system.

The mechanics of Dengue infection

Dengue fever, also known as breakbone fever, is endemic to many tropical countries. Its symptoms are fever, severe headaches, muscle and joint pains and a characteristic skin rash. Dengue is caused by a virus that is transmitted by the Aedes aegypti mosquito, which can be identified by the white bands on its legs. Although it originated in Africa, the Aedes species is now found in most tropical and sub-tropical countries throughout the world.

There are four closely related strains (or serotypes) of the Dengue virus – imaginatively named Dengue 1 through Dengue 4. This has interesting consequences as we shall see shortly. First let's have a quick look at what goes on in the human body after a bite from a carrier mosquito. My discussion is based on this article from the Scitable website.

Once a person is bitten by a carrier mosquito, the virus starts to infect skin cells and specialised immune cells  (called Langerhans cells) that are near the site of the bite. The  infected Langerhans cells travel via the bloodstream to the lymph nodes which are responsible for producing  white blood cells (WBCs) that combat infections.

The WBCs are the body’s first line of defence against an infection. The problem is WBCs generally do not succeed in destroying the Dengue virus; worse, they actually end up getting infected by it.  The infected white blood cells then  help in spreading the virus to other organs in the body.

However, all is not lost because the body has another line of defence – the adaptive immune system – which produces antibodies that target specific intruders. Once the infection spreads, the adaptive immune system kicks in, producing antibodies that recognise and neutralise the virus.  The fever an infected person experiences is a manifestation of the battle between the antibodies and the virus. In a healthy person, the immune system eventually wins and the person recovers.

Now here’s the interesting bit: a person who has been infected by the virus gains long term immunity, but only against the particular Dengue serotype that he or she was infected by.  If  the person is bitten by a  mosquito carrying another serotype, the antibodies for the old serotype actually assist the new strain to spread within the body.  Essentially this happens because the antibodies for the old strain   see the new strain as the old one and thus attempt to engulf it. However, because the virus is different, the antibody cannot bind with it completely. It thus forms an antibody-virus complex within  which the virus is still capable of replicating.

These circulating antibody-virus complexes then infect other white blood cells which in turn carry the virus to other parts of the body. This results in a higher volume of virus in the bloodstream than would have occurred otherwise, and hence a more severe infection. This is well known: subsequent infections of Dengue often lead to considerably more severe symptoms than the first one.

The above description is sufficient for the present discussion, but you may want to see this article to learn more about this fascinating virus.

Overcoming the corporate immune system

The processes of primary and secondary Dengue infections hold some lessons for those who want to gain executive support for proposals that might be just a tad too radical for their workplaces. A direct approach, wherein the idea is pitched directly to executives, is unlikely to work for at least a couple of reasons:

  1. The generic corporate immune system (akin to white blood cells in the human body) will attempt to take it down. This is typified by the  generic, “It will never work here (so let’s not try it)” response.
  2. Let's assume that you are at your persuasive best and manage to get past the generic first line corporate defence. You still cannot rest easy because, in time, managerial ingenuity will come up with specific objections to the idea (these are akin to strain-specific antibodies).

However, all is not lost: we can take inspiration from the secondary infection process described in the previous section. The second serotype is able to do a more thorough job in infecting its host because antibodies actually help in transporting the virus through the body. This happens because the antibodies do not fully recognise the virus and thus bind with it incompletely.

So the trick to getting your idea past the corporate immune system is to cast it in terms that are familiar to managers and to get them to have a stake in it. Here’s one way to do this:

  1. Make a connection between your idea and an already well-established element or aspect of your organisation. Be sure to stress this connection in your pitch (see point 2). This way, the idea is seen as a logical continuation of what already exists – i.e. it is seen as old rather than new, much as the old serotype antibodies see the new strain as the old one.
  2. Present your idea to a manager who may be in a position to help you, seeking her advice on it.
  3. Take the advice offered seriously – i.e. modify the idea in a way that incorporates the advice.
  4. Re-present the idea to the manager, thanking her for her advice and emphasising how it makes a difference.
  5. If she is receptive, ask her if she would be willing to socialise the idea amongst her peers. If you have genuinely taken her advice, chances are she'll be willing to do this. After all, the idea is now hers too.

The above are generic  steps that can be tailored to specific situations. For example, the same principles apply when writing a business case for a new system or whatever – emphasise continuity and get people to be a part of the idea by offering them a stake in it. The bottom line is that the corporate immune response can be “tricked” into accepting novel ideas, much  as the human immune system is fooled by the Dengue virus.

Conclusion

The metaphor of a corporate immune system not only provides an evocative description of how organisations kill novel ideas, but also suggests how such organisational resistance can be overcome. In this post  I have described  one such strategy based on the fiendishly clever dengue virus.

Written by K

July 3, 2013 at 10:04 pm

A stupidity-based theory of organisations – a paper review


Introduction

The platitude “our people are  our most important asset”  reflects a belief that the survival and evolution of organisations depends  on the intellectual and cognitive capacities of the individuals who comprise them.   However,  in view of the many well documented examples of actions that demonstrate a lack of  foresight and/or general callousness about the fate of organisations or those who work in them,  one has to wonder if such a belief is justified, or even if it is really  believed by those who spout such platitudes.

Indeed, cases such as Enron or Worldcom (to mention just two) seem to suggest that stupidity may be fairly prevalent in present day organisations. This point is the subject of a brilliant paper by Andre Spicer and Mats Alvesson entitled A stupidity-based theory of organisations. This post is an extensive summary and review of the paper.

Background

The notion that the success of an organization depends on the intellectual and rational capabilities of its people seems almost obvious. Moreover, there is a good deal of empirical research that seems to support this. In the opening section of their paper, Alvesson and Spicer cite many studies which appear to establish that developing the knowledge (of employees) or hiring smart people  is the key to success in an ever-changing, competitive environment.

These claims are mirrored in theoretical work on organizations. For example Nonaka and Takeuchi's model of knowledge conversion acknowledges the importance of tacit knowledge held by employees. Although there is still much debate about the tacit/explicit knowledge divide, models such as these serve to perpetuate the belief that knowledge (in one form or another) is central to organisational success.

There is also a broad consensus that decision making in organizations, though subject to bounded rationality and related cognitive biases,  is by and large a rational process. Even if a decision is not wholly rational, there is usually an attempt to depict it as being so. Such behaviour attests to the importance attached to rational thinking in organization-land.

At the other end of the spectrum there are decisions that can only be described as being, well… stupid. As Rick Chapman discusses in his entertaining book, In Search of Stupidity, organizations occasionally make decisions that are plain dumb. However, such behaviour seldom remains hidden because of its rather obvious negative consequences for the organisation. Such stories thus end up being immortalized in business school curricula as canonical examples of what not to do.

Functional stupidity

Notwithstanding the above remarks on obvious stupidity, there is another category of foolishness that is perhaps more pervasive but remains unnoticed and unremarked. Alvesson and Spicer use the term functional stupidity to refer to such "organizationally supported lack of reflexivity, substantive reasoning, and justification."

In their words, functional stupidity amounts to the “…refusal to use intellectual resources outside a narrow and ‘safe’ terrain.”   It is reflected in a blinkered approach to organisational problems, wherein people display  an unwillingness  to consider or think about solutions that lie outside an arbitrary boundary.  A common example of this is when certain topics are explicitly or tacitly deemed as being “out of bounds” for discussion. Many “business as usual” scenarios are riddled with functional stupidity, which is precisely why it’s often so hard to detect.

As per the definition offered above, there are three cognitive elements to functional stupidity:

  1. Lack of reflexivity: this refers to the inability or unwillingness to question claims and commonly accepted wisdom.
  2. Lack of substantive reasoning: This refers to  reasoning that is based on a small set of concerns that do not span the whole issue. A common example of this sort of myopia is when organisations focus their efforts on achieving certain objectives with little or no questioning of the objectives themselves.
  3. Lack of justification: This happens when  employees do not question managers or, on the other hand, do not provide explanations regarding their  own actions. Often this is a consequence of power relationships in organisations. This may, for example, dissuade employees from “sticking their necks out” by asking questions that managers might deem out of bounds.

It should be noted that functional stupidity has little to do with limitations of human cognitive capacities. Nor does it have anything to do with ignorance, carelessness or lack of thought. The former can be  rectified through education and/or the hiring of consultants with the requisite knowledge,  and the latter via the use of standardised procedures and checklists.

It is also important to  note that  functional stupidity is not necessarily a bad thing. For example, by placing certain topics out of bounds, organisations can avoid discussions about potentially controversial topics and can thus keep conflict and uncertainty at bay.  This maintains  harmony, no doubt, but it also strengthens the existing organisational order which  in turn serves to reinforce functional stupidity.

Of course, functional stupidity also has negative consequences, the chief one being that it prevents organisations from finding solutions to issues that involve topics that have been arbitrarily deemed as being out of bounds.

Examples of functional stupidity

There are many examples of functional stupidity in recent history, a couple being the irrational exuberance in the wake of the internet boom of the 1990s, and the lack of critical examination of the complex mathematical models that led to the financial crisis of the last decade.

However, one does not have to look much beyond one’s own work environment to find examples of functional stupidity.  Many of these come under the category of  “business as usual”  or “that’s just the way things are done around here” – phrases that are used to label practices that are ritually applied without much thought or reflection.  Such practices often remain unremarked because it is not so easy to link them to negative outcomes.  Indeed, the authors point out that “most managerial practices are adopted on the basis of faulty reasoning, accepted wisdom and complete lack of evidence.”

The authors cite the example of companies adopting HR practices that are actually detrimental to employee and organisational wellbeing. Another common example is when organisations place a high value on gathering information which is then not used in a meaningful way. I have discussed this "information perversity" at length in my post entitled The unspoken life of information in organisations, so I won't rehash it here. Alvesson and Spicer point out that information perversity is a consequence of the high cultural value placed on information: it is seen as a prerequisite to "proper" decision making. However, in reality it is often used to justify questionable decisions or simply "hide behind the facts."

These examples suggest that functional stupidity may be the norm rather than the exception. This is a scary thought…but I suspect it may not be surprising to many readers.

The dynamics of stupidity

Alvesson and Spicer claim that functional stupidity is a common feature in organisations. To understand why it is so pervasive, one has to look into the dynamics of stupidity – how it is established and the factors that influence it. They suggest that the root cause lies in the fact that organisations attempt to short-circuit critical thinking through what they call economies of persuasion: corporate culture initiatives, leadership training, team / identity building, relabelling positions with pretentious titles – and many other such activities that are aimed at influencing employees through the use of symbols and images rather than substance. Such symbolic manipulation, as the authors call it, is aimed at increasing employees' sense of commitment to the organisation.

As they put it:

Organizational contexts dominated by widespread attempts at symbolic manipulation typically involve managers seeking to shape and mould the 'mind-sets' of employees. A core aspect of this involves seeking to create some degree of good faith and conformity and to limit critical thinking.

Although such efforts are not always successful, many employees do buy in to them and thereby identify with the organisation. This makes employees uncritical of the organisation’s  goals and the means by which these will be achieved. In other words, it sets the scene for functional stupidity to take root and flourish.

Stupidity management and stupidity self-management

The authors use the term stupidity management to describe managerial actions that prevent or discourage organisational actors (employees and other stakeholders) from thinking for themselves.   Some of the ways in which this is done include the reinforcement of positive images of the organisation, getting employees to identify with the organisation’s vision and myriad other organisational culture initiatives aimed at burnishing the image of the corporation. These initiatives are often backed by organisational structures (such as hierarchies and reward systems) that discourage employees from raising and exploring potentially disruptive issues.

The monitoring and sanctioning of activities that might disrupt the positive image of the organisation can be overt (in the form of warnings, say). More often, though, it is subtle. For example, in many meetings, participants know that certain issues cannot be raised. At other times, discussion and debate may be short circuited by exhortations to "stop thinking and start doing." Such occurrences serve to create an environment in which stupidity flourishes.

The net effect of  managerial actions that encourage stupidity is that employees start to cast aside their own doubts and questions and behave in corporately acceptable ways – in other words, they start to perform their jobs in an unreflective and unquestioning way. Some people may actually internalise the values espoused by management; others may psychologically  distance themselves from the values but still act in ways that they are required to. The net effect of such stupidity self-management (as the authors call it) is that employees stop questioning what they are asked to do and just do it. After a while, doubts fade and this becomes the accepted way of working. The end result is the familiar situation that many of us know as “business as usual” or  “that’s just the way things are done around here.”

The paradoxes and consequences of stupidity

Functional stupidity can cause both feelings of certainty and dissonance in members of an organisation. Suppressing  critical thinking  can result in an easy acceptance of  the way things are.  The feelings of certainty that come from suppressing difficult questions can be comforting. Moreover, those who toe the organisational line are more likely to be offered material rewards and promotions than those who don’t. This can act to reinforce functional stupidity because others who see stupidity rewarded may also be tempted to behave in a similar fashion.

That said,  certain functionally stupid actions, such as ignoring obvious ethical lapses, can result in serious negative outcomes for an organisation. This has been amply illustrated in the recent past. Such events can prompt formal inquiries  at the level of the organisation, no doubt accompanied by  informal soul-searching at the individual level. However, as has also been amply illustrated, there is no guarantee that inquiries or self-reflection lead to any major changes in behaviour. Once the crisis passes, people seem all too happy to revert to business as usual.

In the end, though, when stark differences between the rhetoric and reality of the organisation emerge – as they eventually will – employees will see the contradictions between the real organisation and the one they have been asked to believe in. This can result in alienation from and cynicism about the organisation and its objectives. So, although stupidity management may have beneficial outcomes in the short run, there is a price to be paid in the longer term.

Nothing comes for free, not even stupidity…

Conclusion

The authors' main message is that despite the general belief that organisations enlist the cognitive and intellectual capacities of their members in positive ways, the truth is that organisational behaviour often exhibits a wilful ignorance of facts and/or a lack of logic. The authors term this behaviour functional stupidity.

Functional stupidity has the advantage of maintaining harmony, at least in the short term, but its longer term consequences can be negative. Members of an organisation "learn" such behaviour by becoming aware that certain topics are out of bounds and that they broach these at their own risk. Conformance is rewarded by advancement or material gain whereas dissent is met with overt or less obvious disciplinary action. Functional stupidity thus acts as a barrier that can stop members of an organisation from developing potentially interesting perspectives on the problems their organisations face.

The paper makes an interesting and very valid point about the pervasiveness of wilfully irrational behaviour in organisations. That said, I  can’t help but think that the authors  have written it with tongue firmly planted in cheek.

Pseudo-communication in organisations


Introduction

Much of what is termed communication in organisations is but a  one-way, non-interactive process of information transfer. It doesn’t seem right to call this communication, and other terms  such as propaganda  carry too much baggage. In view of this, I’ve been searching for an appropriate term for some time. Now –  after reading a paper by Terence Moran entitled  Propaganda as Pseudocommunication  – I think I have found one.

Moran's paper discusses how propaganda, particularly in the social and political sphere, is packaged and sold as genuine communication even though it isn't – and hence the term pseudo-communication. In this post, I draw on the paper to show how one can distinguish between communication and pseudo-communication in organisational life.

Background

Moran’s paper was written in 1978, against a backdrop of political scandal and so, quite naturally, many of the instances of pseudo-communication he discusses are drawn from the politics of the time. For example, he writes:

As Watergate should have taught us, the determined and deliberate mass deceptions that  are promulgated via the mass media by powerful political figures cannot be detected, much less combated easily.

Such propaganda is not the preserve of politicians alone, though. The wonderful world of advertising illustrates how pseudo-communication works in insidious ways that are not immediately apparent. For example, many car or liquor advertisements attempt to associate the advertised brand with sophistication and style, suggesting that somehow those who consume the product will be transformed into sophisticates.

As Moran states:

It was reported in the Wall Street Journal of August 14, 1978 that the Federal Trade Commission finally has realized that advertisements carry messages via symbol systems other than language. The problem is in deciding how to recognise, analyse and legislate against deceptive messages.

Indeed! And I would add that the problem has only become worse in the 30-odd years since Mr. Moran wrote those words.

More relevant to those of us who work in organisation-land, however, is the fact that sophisticated pseudo-communication has wormed its way into the corporate world, a prime example being the mission/vision statements that seem to be de rigueur for corporations. Such pseudo-communications are rife with platitudes, a point that Paul Culmsee and I explore at length in Chapter 1 of our book.

Due to the increasing sophistication of pseudo-communication, it can sometimes be hard to distinguish it from the genuine stuff. Moran offers some tips that can help us do this.

Distinguishing between communication and pseudo-communication

Moran describes several characteristics of pseudo-communication vis-à-vis its authentic cousin. I describe some of these below, with particular reference to pseudo-communication in organisations.

1. Control and interpretation

In organisational pseudo-communication, the receiver is not free to interpret the message as per his or her own understanding. Instead, the sender determines the meaning of the message and receivers are expected to “interpret” it as the sender requires them to. An excellent example of this is the corporate mission/vision statement: employees are required to understand it as per the officially endorsed interpretation.

Summarising: in communication, control is shared between the sender and receiver, whereas in pseudo-communication control rests solely with the sender.

2. Stated and actual purpose

To put it quite bluntly, the aim of most employee-directed corporate pseudo-communication is to get employees to behave in ways that the organisation would like them to. Thus, although pseudo-communiqués may use words like autonomy and empowerment, they are directed towards achieving organisational objectives, not those of employees.

Summarising: in communication, the stated and actual goals are the same, whereas in pseudo-communication they are different. Specifically, in pseudo-communication the actual purposes are hidden and often contradict the stated ones.

3. Thinking and analysis

Following from the above, it seems pretty clear that the success of organisational pseudo-communication hinges on employees not analysing messages in an individualistic or critical way. If they did, they would see such messages for the propaganda they actually are. In fact, it isn't a stretch to say that most organisational pseudo-communication is aimed at encouraging groupthink at the level of the entire organisation.

A corollary of this is that in communication it is assumed that the receiver will act on the message in ways that he or she deems appropriate, whereas in pseudo-communication the receiver is encouraged to act in “organisationally acceptable” ways.

Summarising: in communication, it is expected that receivers will analyse the message individually and critically so as to reach their own conclusions. In pseudo-communication, however, receivers are expected to think about the message in a standard, politically acceptable way.

4. Rational vs. emotional appeal

Since pseudo-communication works best by dulling the critical faculties of recipients, it seems clear that it should aim to evoke an emotional response rather than a rational (or carefully considered) one. Genuine communication, on the other hand, makes clear the relationship between the elements of the message and the supporting evidence, so that receivers can evaluate it for themselves and reach their own conclusions.

Summarising: communication appeals to the receivers’ critical/rational side, whereas pseudo-communication aims to make an emotional connection with receivers.

5. Means and ends

In organisational pseudo-communication, such as mission/vision statements and the strategies that arise from them, the ends are seen as justifying the means. The means are generally assumed to be value-free, in that it is OK to do whatever it takes to achieve organisational goals, regardless of the ethical or moral implications. In contrast, in (genuine) communication, means and ends are intimately entwined and are open to evaluation on rational and moral/ethical grounds.

Summarising: in pseudo-communication the ends are seen as justifying the means, whereas in communication they are not.

6. World view

In organisational pseudo-communication, the organisation’s world is seen as being inherently simple, so much so that it can be captured using catchy slogans such as “Delivering value” or “Connecting people” or whatever. Communication, on the other hand, acknowledges the existence of intractable problems and alternate worldviews, and thus views the world as inherently complex. As Moran puts it, “the pseudo-communicator is always endeavouring to have us accept a simplified view of life.” Most corporate mission and vision statements will attest to the truth of this.

Summarising: pseudo-communication over-simplifies or ignores difficult or inconvenient issues, whereas communication acknowledges them.

Conclusion

Although Moran wrote his paper over 30 years ago, his message is now more relevant and urgent than ever. Not only is pseudo-communication prevalent in politics and advertising, it has also permeated organisations and even our social relationships. In view of this, it is ever more important that we are able to distinguish pseudo-communication from the genuine stuff. Incidentally, I highly recommend reading the original paper; it is very readable and even laugh-out-loud funny in parts.

Finally, to indulge in some speculation: I wonder why pseudo-communication is so effective in the organisational world when even a cursory analysis exposes its manipulative nature. I think the answer lies in the fact that modern organisations use powerful, non-obtrusive techniques such as organisational culture initiatives to convince their people of the inherent worth of the organisation and of their roles in it. Once this is done, employees become less critical and hence more receptive to pseudo-communication. Anyway, that is fodder for another post. For now, I leave you to ponder the points made above and perhaps use them in analysing (pseudo)communication in your own organisation.

Written by K

January 23, 2013 at 9:36 pm
