Eight to Late

Sensemaking and Analytics for Organizations

Archive for the ‘Management’ Category

From ambiguity to action – a paper preview


The powerful documentary The Social Dilemma highlights the polarizing effect of social media, and how it hinders our collective ability to address problems that impact communities, societies and even nations. Towards the end of the documentary, the technology ethicist, Tristan Harris, makes the following statement:

“If we don’t agree on what is true or that there’s such a thing as truth, we’re toast. This is the problem beneath all other problems because if we can’t agree on what is true, then we can’t navigate out of any of our problems.”

The central point the documentary makes is that the strategies social media platforms use to enhance engagement also tend to encourage the polarization of perspectives. A consequence is that people on two sides of a contentious issue become less likely to find common ground and build a shared understanding of a complex problem.

A similar dynamic plays out in organisations, albeit on a smaller and less consequential scale. For example, two departments – say, sales and marketing – may have completely different perspectives on why sales are falling. Since their perspectives differ, the mitigating actions they advocate may diverge or even contradict each other. In a classic paper published half a century ago, Horst Rittel and Melvin Webber coined the term wicked problem to describe such ambiguous dilemmas.

In contrast, problems such as choosing the cheapest product from a range of options are unambiguous because the decision criteria are clear. Such problems are sometimes referred to as tame problems. As an aside, it should be noted that organisations often tend to treat wicked problems as tame, with less-than-optimal consequences down the line. For example, choosing the cheapest product might lead to larger long-term costs due to increased maintenance, repair and replacement.

The problem with wicked problems is that they cannot be solved using rational approaches to decision making. The reason is that rational approaches assume that a) the decision options can be unambiguously determined upfront, and b) the options can be objectively rated. This implicitly assumes that all those who are impacted by the decision will agree on the options and the rating criteria. Anyone who has been involved in making a contentious decision will know that these are poor assumptions. Consider, for example, management and employee perspectives on an organizational restructuring.

In a book published in 2016, Paul Culmsee and I argued that the difference between tame and wicked problems lies in the nature of uncertainty associated with the two. In brief, tame problems are characterized by uncertainties that can be easily quantified (e.g., cost or time in projects) whereas wicked problems are characterized by uncertainties that are hard to quantify (e.g., the uncertainties associated with a business strategy).  One can think of these as lying at the opposite ends of an ambiguity spectrum, as shown below:

Figure 1: The Ambiguity Spectrum

It is important to note that most real-world problems involve both quantifiable and unquantifiable uncertainties. The first thing to do when confronted with a decision-making situation is therefore to figure out, qualitatively, where the problem lies on the ambiguity spectrum:

Figure 2: Where does your problem lie on the ambiguity spectrum?

The key insight is that problems that have quantifiable uncertainties can be tackled using rational decision making techniques whereas those with unquantifiable uncertainties cannot. Problems of the latter kind are wicked, and require a different approach – one that focuses on framing the problem collectively (i.e., involving all impacted stakeholders) prior to using rational decision making approaches to address it. This is the domain of sensemaking, which I like to think of as the art of extracting or framing a problem from a messy situation.

Sensemaking is something we all do instinctively when we encounter the unfamiliar – we try to make sense of the situation by framing it in familiar terms. However, in an unfamiliar situation, it is unlikely that a single perspective on a problem will be an appropriate one. What is needed in such situations is for people with different perspectives to debate their views openly and build a shared understanding of the problem that synthesizes the diverse viewpoints. This is sometimes called collective sensemaking.

Collective sensemaking is challenging because it involves exactly the kind of cooperation that Tristan Harris calls for in the quote at the start of this piece.

But when people hold conflicting views on a contentious topic, how can they ever hope to build common ground? It turns out there are ways to build common ground, and although they aren’t perfect (and require diplomacy and doggedness), they do work in many, if not all, situations. A technique I use is dialogue mapping, which I have described in several articles and a book co-written with Paul Culmsee.

Figure 3: An example dialogue (or issue) map
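For readers curious about the mechanics, dialogue mapping is built on the IBIS notation: questions (issues), ideas that respond to them, and arguments for or against each idea. The sketch below is purely illustrative – the Node class and the falling-sales example are hypothetical, not part of any dialogue mapping tool:

```python
from dataclasses import dataclass, field

# A dialogue (issue) map in IBIS notation: questions (issues), ideas that
# respond to them, and pro/con arguments attached to ideas.
# Illustrative sketch only; real tools (e.g. Compendium) are far richer.

@dataclass
class Node:
    kind: str          # "question", "idea", "pro", or "con"
    text: str
    children: list = field(default_factory=list)

    def add(self, kind, text):
        """Attach a child node and return it, so maps can be built fluently."""
        child = Node(kind, text)
        self.children.append(child)
        return child

    def render(self, depth=0):
        """Return the map as indented lines, one per node."""
        lines = [f"{'  ' * depth}[{self.kind}] {self.text}"]
        for c in self.children:
            lines.extend(c.render(depth + 1))
        return lines

# A hypothetical map for the falling-sales example discussed earlier.
root = Node("question", "Why are sales falling?")
idea1 = root.add("idea", "Pricing is uncompetitive")
idea1.add("pro", "Competitors undercut us last quarter")
idea2 = root.add("idea", "Marketing targets the wrong segment")
idea2.add("con", "The segment data is six months old")

print("\n".join(root.render()))
```

The value of the notation is not the tree itself but that it makes the structure of a contested conversation visible to all participants.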

Regardless of the technique used, the point I’m making is that when dealing with ambiguous problems one needs to use collective sensemaking to frame the problem before using rational decision making methods to solve it. When dealing with an ambiguous problem, the viability of a decision hinges on the ability of the decision maker to: a) help stakeholders distinguish facts from opinions, b) take the sensemaking actions necessary to find common ground between holders of conflicting opinions, and c) build a base of shared understanding from which a commonly agreed set of “facts” emerges. These “facts” will not be absolute truths but contingent ones. This is often true even of so-called facts used in rational decision making: a cost quotation does not point to a true cost; rather, it is an estimate that depends critically on the assumptions made in its calculation. Such decisions, therefore, cannot be framed based on facts alone but ought to be co-constructed with those affected by the decision. This approach is the basis of a course on decision making under uncertainty that I designed and have been teaching across two faculties at the University of Technology Sydney for the last five years.

In a paper, soon to be published in Management Decision, a leading journal on decision making in organisations, Natalia Nikolova and I describe the principles and pedagogy behind the course in detail. We also highlight the complementary nature of collective sensemaking and rational decision making, showing how the former helps in extracting (or framing) a problem from a situation while the latter solves the framed problem. We also make the point that decision makers in organisations tend to jump into “solutioning” without spending adequate time framing the problem appropriately.  

Finally, it is worth pointing out that the hard sciences have long recognized complementarity to be an important feature of physical theories such as quantum mechanics. Indeed, the physicist Niels Bohr was so taken by this notion that he inscribed the following on his coat of arms: contraria sunt complementa (opposites are complementary). The integration of apparently incompatible elements into a single theory or model can lead to a more complete view of the world and hence, how to act in it. Summarizing the utility of our approach in a phrase: it can help decision makers learn how to move from ambiguity to action.

For copyright reasons, I cannot post the paper publicly. However, I’d be happy to share it with anyone interested in reading / commenting on it – just let me know via a comment below.

Note added on 13 May 2022:

The permalink to the published online version is: https://www.emerald.com/insight/content/doi/10.1108/MD-06-2021-0804/full/html

Written by K

May 3, 2022 at 7:59 am

Conversations and commitments: an encounter with emergent design 


Many years ago, I was tasked with setting up an Asia-based IT development hub for a large multinational.   I knew nothing about setting up a new organisation from scratch. It therefore seemed prudent to take the conventional route – i.e., engage experts to help.

I had conversations with several well-known consulting firms. They exuded an aura of confidence-inspiring competence and presented detailed plans about how they would go about it. Moreover, they quoted costs that sounded very reasonable.  

It was very tempting to outsource the problem.

–x–

Expert-centric approaches to building new technical capabilities are liable to fail because such initiatives often display characteristics of wicked problems – problems so complex and multifaceted that they are difficult to formulate clearly, let alone solve. This is because different stakeholder groups have different perspectives on what needs to be done and how it should be done.

The most important feature of such initiatives is that they cannot be tackled using rational methods of planning, design and implementation that are taught in schools, propagated in books, and evangelized by standards authorities and snake oil salespeople (read: big consulting firms).

This points to a broader truth that technical initiatives are never purely technical; they invariably have a social dimension. It is therefore more appropriate to refer to them as sociotechnical problems.

–x–

One day, not long after my conversations with the consulting firms, I came across an article on Oliver Williamson’s Nobel prize winning work on transaction costs. The arguments presented therein drew my attention to the hidden costs of outsourcing.

The consultants I’d spoken with had included only upfront costs, neglecting the costs of coordination, communication, and rework. The outsourcing option would be cost effective only if the scale was large enough. The catch was that setting up a large development centre from scratch would be risky, both politically and financially. There was too much that could go wrong.
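The structure of this argument can be illustrated with back-of-envelope arithmetic. All figures below are hypothetical; the point is simply that outsourcing carries fixed coordination and communication overheads that are amortised only at sufficient scale:

```python
# Back-of-envelope comparison of outsourced vs in-house annual cost.
# All numbers are hypothetical illustrations of the transaction-cost
# argument: the quoted per-developer rate is lower offshore, but hidden
# coordination and rework costs shift the break-even point.

def outsourced_cost(n_devs, rate=60_000, coordination_fixed=250_000, rework_per_dev=5_000):
    """Quoted rate plus hidden per-developer rework and fixed coordination overhead."""
    return n_devs * (rate + rework_per_dev) + coordination_fixed

def inhouse_cost(n_devs, rate=90_000):
    """Cost of hiring the same skills at the (more expensive) home location."""
    return n_devs * rate

for n in (2, 10, 50):
    cheaper = "outsourced" if outsourced_cost(n) < inhouse_cost(n) else "in-house"
    print(f"{n:>3} devs: outsourced={outsourced_cost(n):>9,} "
          f"in-house={inhouse_cost(n):>9,} -> {cheaper}")
```

With these (made-up) numbers the outsourced option only wins at large scale, which is exactly the catch described above: the scale needed to make it pay is also the scale at which the risk of failure is greatest.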

–x–

Building a new sociotechnical capability is a process of organisational learning. But learning itself is a process of trial and error, which is why planned approaches to building such capabilities tend to fail. 

All such initiatives are riddled with internal tensions that must be resolved before any progress can be made. To resolve these tensions successfully one needs to use an approach that respects the existing state of the organisation and introduces changes in an evolutionary manner that enables learning while involving those who will be affected by the change.  Following David Cavallo, who used such an approach in creating innovative educational interventions in Thailand, I call this process emergent design.

–x–

The mistake in my thinking was related to the fallacy of misplaced concreteness. I had been thinking about the development hub as a well-defined entity rather than an idea that needed to be fleshed out through a process of trial and error. This process would take time; it had to unfold in small steps, through many interactions and conversations.

It became clear to me that it would be safest to start quietly, without drawing much attention to what I was doing. That would enable me to test assumptions, gauge the organisation’s appetite for the change and, most importantly, learn by trial and error.

I felt an opportunity would present itself sooner rather than later.

–x–

In their book, Disclosing New Worlds, which I have discussed at length in this post, Spinosa et al. note that:

“[organisational] work [is] a matter of coordinating human activity – opening up conversations about one thing or another to produce a binding promise to perform an act … Work never appears in isolation but always in a context created by conversation.”

John Shotter and Ann Cunliffe flesh out the importance of conversations via their notion of managers as authors [of organisational reality]. Managers quite literally create (or author) realities through conversations that help people make sense of ambiguous situations and/or open up new possibilities.

Indeed, conversations are the lifeblood of organisations. It is through conversations that the myriad interactions in organisational life are transformed into commitments and thence into actions.

–x–

A few weeks later, a work colleague located in Europe called to catch up. We knew each other well from a project we had worked on a few years earlier. During the conversation, he complained about how hard it was to find database skills at a reasonable cost.

My antennae went up. I asked him what he considered to be a “reasonable cost.” The number he quoted was considerably more than one would pay for those skills at my location.  

“I think I can help you,” I said, “I can find you a developer for at most two thirds that cost here. Would you like to try that out for six months and see how it works?” 

“That’s very tempting,” he replied after a pause, “but it won’t work. What about equipment, workspace etc.? More importantly, what about approvals?” 

“I’ll sort out the workspace and equipment,” I replied, “and I’ll charge it back to your cost centre. As for the approval, let’s just keep this to ourselves for now. I’ll take the rap if there’s trouble later.” 

He laughed over the line. “I don’t think anyone will complain if this works. Let’s do it!” 

–x–

As Shotter and Cunliffe put it, management is about acting in relationally responsive ways. Seen in that light, conversations are more than just talk; they are about creating shared realities that lead to action.

How can one behave in a relationally responsive way? As in all situations involving human beings, there are no formulas, but there are some guiding principles that I have found useful in my own work as a manager and consultant:

Be a midwife rather than an expert: The first guideline is to realize that no one is an expert – neither you nor your Big $$$ consultant. True expertise comes from collaborative action. The role of the midwife is to create and foster the conditions for collaborative action to occur.

Act first, seek permission later (but exercise common sense): Many organisations have a long list of dos and don’ts. A useful guideline to keep in mind is that it is usually OK to launch exploratory actions as long as they are done in good faith, the benefits are demonstrable and, most importantly, the actions do not violate ethical principles. The dictum that it is easier to beg forgiveness than seek permission has a good deal of truth to it. However, you will need to think about the downsides of acting without permission in the context of your organisation, its tolerance for risk and the relationships you have with management.

Do not penalize people for learning: When setting up new capabilities, it is inevitable that things will go wrong. If you’re at the coalface, you will need to think about how you will deal with the fallout. A useful approach is to offer to take the rap if things go wrong. On the other hand, if you’re a senior manager overseeing an initiative that has failed, look for learnings, not scapegoats.

Distinguish between wicked and tame elements of your initiative: Some aspects of sociotechnical problems are wicked, others are straightforward (or tame). For example, in the case of the development centre, the wicked element was how to get started in a way that demonstrated value both to management and staff. The tame elements were the administrative issues: equipment, salary recharging etc. (though, as it turned out, some of these had longer-term wicked elements – a story to be told later perhaps).

Actively seek other points of view: Initially, I thought of the development centre in terms of a large monolithic affair. After talking to consultants and doing my own research, I realised there was another way.

Understand the need for different types of thinking: Related to the above, it is helpful to surround yourself with people who think differently from you.

Consider long term consequences:  Although it is important to act (the second point made above), it is also important to think through the consequences of one’s actions, the possible scenarios that might result and how one will deal with them.

Act so as to increase your future choices: This principle is from my intellectual hero, Heinz von Foerster, who called it the ethical imperative (see the last line of this paper). Given that one is acting in a situation that is inherently uncertain (certainly the case when one is setting up a new sociotechnical capability), one should be careful to ensure that one’s actions do not inadvertently constrain future choices.

–x–

With some trepidation, we decided to go ahead with the first hire.

A few months later, my colleague was more than happy with how things were going and started telling others about it. Word got around the organisation; one developer became three, then five, then more. Soon I was receiving more enquiries and requests than our small makeshift arrangement could handle. We had to rent dedicated office space, fit it out etc, but that was no longer a problem because management saw that it made good business sense.

–x–

This was my first encounter with emergent design. There have been many others since – some successful, others less so.   However, the approach has never failed me outright because a) the cost of failure is small and b) learnings gained from failures inform future attempts.

Although there are no set formulas for emergent design, there are principles.  My aim in this piece was to describe a few that I have found useful across different domains and contexts. The key takeaway is that emergent design increases one’s chances of success because it eschews expert-driven approaches in favour of practices tailored to the culture of the organisation.

As David Cavallo noted, “rather than having the one best way there can now be many possible ways. Rather than adapting one’s culture to the approach, one can adapt the approach to one’s culture.”

–x–x–

Written by K

September 14, 2021 at 4:43 am

Making sense of management – a conversation with Richard Claydon


KA 

Hi there. I’m restarting a series of conversations that I’d kicked off in 2014 but discontinued a year later for a variety of reasons. At that time, I’d interviewed a few interesting people who have a somewhat heretical view on things managers tend to take for granted. I thought there’s no better way to restart the series than to speak with Dr. Richard Claydon, who I have known for a few years.  Richard calls himself a management ironist and organisational misbehaviorist. Instead of going on and risking misrepresenting what he does, let me get him to jump in and tell you himself.

Welcome Richard, tell us a bit about what you do.

RC  

I position myself as having a pragmatic, realistic take on management. Most business schools have a very positivistic take on the subject, a “do A and get B” approach. On the other hand, you have a minority of academics – the critical theorists – who say, well actually if you do A, you might get B, but you also get C, D, E, F, G.  This is actually a more realistic take. However, critical management theory is full of jargon and deep theory so it’s very complex to understand. I try to position myself in the middle, between the two perspectives, because real life is actually messier than either side would like to admit.

I like to call myself a misbehaviourist because the mess in the middle is largely about misbehaviours – real but more often, perceived. Indeed, good behaviours are often misperceived as bad and bad behaviours misperceived as good. I should emphasise that my work is not about getting rid of the bad apples or performance managing people. Rather it’s about working out what people are doing and more importantly, why. And from that, probing the system and seeing if one can start effecting changes in behaviours and outcomes.

KA 

Interesting! What kind of reception do you get? In particular, is there an appetite for this kind of work – open-ended, with no guarantee of results?

RC 

Six of one, half a dozen of the other. I’ve noticed a greater appetite for what I do now than there was six or seven years ago. It might be that I’ve made what I do more digestible and more intelligible to people in the management space. Or it might be that people are actually recognising that what they’re currently doing isn’t working in the complex world we live in today. It’s probably a bit of both.

That said, I definitely think the shift in thinking has been accelerated by the pandemic. It’s sort of, we can’t carry on doing this anymore because it is not really helping us move forward. So, I am finding a larger proportion of people willing to explore new approaches.

KA 

Tell us a bit about the approaches you use.

RC 

As an example, I’ve used narrative analytics – collecting micro-narratives at massive scale across an organisation and then analysing them, akin to the stuff Dave Snowden does. Basically, we collect stories across the organisation, cluster them using machine learning techniques, and then get a team of people with different perspectives to look at the clusters. This gives us multiple readings on meaning. So, the team could consist of someone with leadership expertise, someone with expertise in mental health and wellbeing, someone with a behavioural background etc.
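To give a flavour of what clustering stories involves, here is a deliberately simplified sketch: it represents each story as a bag of words and greedily groups stories by cosine similarity. The real work Richard describes uses far more sophisticated machine learning; the threshold, the stories and the code below are all hypothetical:

```python
import math
import re
from collections import Counter

# Toy clustering of micro-narratives by text similarity.
# Each story becomes a bag of words; a story joins the first existing
# cluster whose seed story is similar enough, else it starts a new one.

def bag_of_words(text):
    return Counter(re.findall(r"[a-z']+", text.lower()))

def cosine(a, b):
    dot = sum(a[w] * b[w] for w in a)
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def cluster(narratives, threshold=0.4):
    clusters = []  # each cluster is a list of (index, bag) pairs
    for i, text in enumerate(narratives):
        bag = bag_of_words(text)
        for c in clusters:
            if cosine(bag, c[0][1]) >= threshold:  # compare to cluster seed
                c.append((i, bag))
                break
        else:
            clusters.append([(i, bag)])
    return [[i for i, _ in c] for c in clusters]

stories = [
    "I like working from home because I skip the commute",
    "Working from home means no commute, which I like",
    "I wish my manager trusted the team more",
]
print(cluster(stories))  # the two commute stories group together
```

In practice the clusters would then be read by a mixed panel, as described above – the machine only proposes groupings; the meaning comes from the human readers.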

We also use social network analysis to find out how information flows within an organisation. The aim here is to identify four very different types of characters: a) blockers – those who stop information from flowing, b) facilitators – those who enable information flow, c) connectors – information hubs, the go-to people in the organisation, and d) mavericks – those who are thinking differently. And if you do that, you can start identifying where interesting things are happening, where different thinking is manifesting itself, and who’s carrying that thinking across the organisation.
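A minimal sketch of the kind of analysis involved: surfacing “connectors” via degree centrality on a contact network. Real social network analysis uses much richer measures (betweenness, brokerage roles); the names and edges below are hypothetical:

```python
from collections import defaultdict

# Degree centrality on an undirected contact network: the person with the
# most distinct contacts is a candidate "connector" (information hub).
# Identifying blockers or mavericks needs richer, directed measures.

def degree_centrality(edges):
    """Count distinct contacts per person from a list of (a, b) pairs."""
    contacts = defaultdict(set)
    for a, b in edges:
        contacts[a].add(b)
        contacts[b].add(a)
    return {person: len(others) for person, others in contacts.items()}

edges = [
    ("asha", "ben"), ("asha", "chen"), ("asha", "dev"),
    ("ben", "chen"), ("dev", "eve"), ("asha", "eve"),
]
scores = degree_centrality(edges)
connector = max(scores, key=scores.get)
print(connector, scores[connector])  # asha has the most distinct contacts
```

Even this crude measure shows how network data can point to the go-to people at a scale where walking the floor is impossible.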

KA 

Interesting! What sort of scale do you do this at?

RC 

Oh, we can scale to thousands of people – organisations that have 35,000 to 40,000 people – well beyond the scale at which one can wander around and do the ethnography oneself.

KA 

How do you elicit these micro-narratives?

RC 

I’ll give you an example. For a study we did on remote working during COVID, we simply wrote: “When it comes to working from home in COVID, I like…, I don’t like…, I wish…, I wonder…”, plus some metadata to slice and dice – age bands, gender etc. Essentially, we try to ask a very open set of questions, to get people into a more reflective stance. That’s where you begin to get some really interesting stuff.
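The responses that come back lend themselves to simple slicing by the attached metadata. The sketch below shows the idea with hypothetical responses and field names; it is not the actual survey tooling:

```python
from collections import defaultdict

# Hypothetical micro-narrative responses: free-text completions of the
# open prompts, paired with demographic metadata for slicing.

responses = [
    {"i_like": "the flexibility", "i_wish": "we met more often", "age_band": "25-34"},
    {"i_like": "no commute", "i_wish": "clearer expectations", "age_band": "45-54"},
    {"i_like": "quiet focus time", "i_wish": "better equipment", "age_band": "25-34"},
]

def slice_by(responses, field):
    """Group responses by the value of a metadata field."""
    groups = defaultdict(list)
    for r in responses:
        groups[r[field]].append(r)
    return dict(groups)

by_age = slice_by(responses, "age_band")
print({band: len(rs) for band, rs in by_age.items()})
```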

KA 

Can you tell us about some of the interesting things you found from this study? I guess the more interesting and surprising things you’ve seen – the ones that are perhaps not so obvious at a cursory glance.

RC 

The one thing that was very clear from the COVID studies was that the organisation’s perception of work from home was the key to whether it actually worked or not. If management gives the impression that work from home is somehow not quite proper work, then you’re going to get a poor work from home experience for all. If management isn’t trusting a person to work from home, or isn’t trusting a team to work from home then you’ve got a problem with your management, not with your people. The bigger the trust gap, the worse the experience. Employees in such environments feel more overwhelmed, more isolated, and generally more limited and restricted in their lives. That was the really interesting finding that came out of this piece of work. 

KA 

That’s fascinating…but I guess should not be surprising in hindsight. Management attitudes play a large role in determining employee behaviours and attitudes, and one would expect this to be even more the case when there is less face-to-face interaction. This is also a nice segue into another area I’d like to get you to talk about:  the notion of organisational culture.  Could you tell us about your take on the concept?

RC 

How cynical do you want me to be?

KA 

Very, I expect nothing less!

RC 

Well, if you go back into why culture became such a big thing, the first person who talked about culture in organisations was Elliott Jaques, way back in the 50s. But it didn’t really catch on then. It became a thing in the early 80s. And how it did is a very interesting story.

Up until the early 70s you had – in America at least – a sort of American Dream being lived, underpinned by the illusion of continuous growth. Then came the challenges of the 70s: the oil crisis and numerous other shocks that resulted in a dramatic loss of confidence in the American system. At the same time, you had the Japanese miracle, where a country that had two nuclear bombs dropped on it thirty years earlier was, by the 1970s, the second biggest economy in the world. And there was this sort of frenzy of interest in what the Japanese were doing to create this economic miracle and, more important, what America could learn from it. There were legions of consultants and academics going back and forth between the two countries.

One of the groups that was trying to learn from the Japanese was McKinsey. But this wasn’t really helping build confidence in the US. On the contrary, this approach seemed to imply that the Japanese were in some way better, which didn’t go down particularly well with the local audience. There was certainly interest in the developments around continuous improvement,  The Toyota Way etc – around getting the workers involved with the innovation of products and processes, as well as the cultural notions around loyalty to the organisation etc.  However, that was not enough to excite an American audience.

The spark came from Peters and Waterman’s book, In Search of Excellence, which highlighted examples of American companies that were doing well. The book summarised eight features that these companies had in common – these were labelled principles of a good culture and that’s where the McKinsey Seven S model came from. It was a kind of mix of ideas pulled in from Peters/Waterman, the Japanese continuous improvement and culture stuff, all knocked together really quite quickly. In a fortunate (for Peters and Waterman) coincidence, the US economy turned the corner at around the time the book was published and sales took off. That said, it’s a very well-written book. The first half of In Search of Excellence is stunning. If you read it, you’ll see that the questions they asked then are relevant questions even today. Anyway, the book came out at exactly the right time: the economy had turned the corner, McKinsey had a Seven S model to sell, and then two universities jumped into the game, Stanford and Harvard… and lo and behold, organisational culture became a management buzz-phrase, and remains so to this day. Indeed, the idea that special cultures drive performance has bubbled up again in recent years, especially in the tech sector. In the end, though, the notion of culture is very much a halo effect, in that the proponents of culture tend to attribute performance to certain characteristics (i.e. culture). The truth is that success may give rise to a culture, but there is no causal effect the other way round.

KA 

Thanks for that historical perspective. In my experience in large multinationals, I’ve found that the people who talked about culture the most were from HR. And they were mostly concerned about enforcing a certain uniformity of thought across the organisation. It was around that time I came across the work of some of the critical management scholars you alluded to at the start of this conversation. In particular, Hugh Willmott’s wonderful critique of organisational culture: Strength is Ignorance; Slavery is Freedom. I thought that was a brilliant take on why people tend to push back on HR-driven efforts to enforce a culture mindset – the workshops and stuff that are held to promote it. I’m surprised that people in high places continue to be enamoured of this concept when they really should know better, having come up through the ranks themselves.

RC 

Yeah, the question is whether they have come through the ranks themselves. A lot of them have come through MBA programmes or have been parachuted in. This is why, when I teach in the MBA, I try to teach this wider appreciation of culture, because I know what the positivists are teaching – they are telling their students that culture is a good lever for getting the behaviours that managers want.

KA 

Totally agree, the solution is to teach diverse perspectives instead of the standard positivist party line. I try to do the same in my MBA decision-making class – that is, I challenge the positivistic mindset by drawing students’ attention to the fact that in real life, problems are not given but have to be taken from complex situations (to paraphrase Russell Ackoff). Moreover, how one frames the problem determines the kind of answer one will get. Analytical decision-making tools assume the decision problem is given, but one is never given a problem in real life. So, I spend a lot of time teaching sensemaking approaches that can help students extract problems from complex situations by building context around the situation.

Anyway, we’ve been going for quite a bit, and there’s one thing I absolutely must touch upon before we close this conversation – the use of irony in management. I know your PhD work was around this concept, and it’s kind of an unusual take. I’m sure my readers would be very interested to hear more about your take on irony and why it’s useful in management.

RC 

I think we’ve set the stage quite nicely in terms of the cultural discussion. So, what I was looking at in my PhD was a massive cultural change in an Australian company, a steelworks. We had unfettered access to the company for six and a half years, which is kind of unheard of. So anyway, one of the interesting things we noticed during our fieldwork was that everybody was identifying the same group of people as being the ones that were giving them the best information, were the easiest to talk to, had the most useful data sources, etc.

We then noticed that these people seemed to have an ironic sensibility. What does that mean? Well, they poked fun at themselves, their teammates, managers and the organisation… and indeed even at our research, but in very subtle ways. However, these people were also doing their work exceptionally well: they had more knowledge about what the hell was going on than anybody else in the company. Everybody liked them, everybody wanted to work with them, everybody was coming to them as problem solvers. You know, they had all of this interesting stuff happening around them.

So, what does it mean to have an ironic stance or an ironic sensibility in the midst of a shifting culture while doing quite complex work in challenging conditions? Well, there are three elements to it: firstly, there’s a perspective that you take; secondly, there’s a performance that you give; and thirdly, there’s a personality or character you develop.

The ironic perspective is that you see the gap between the rhetoric and reality, you see the gaps that most others do not. Then you’ve got this feeling that maybe it’s only you that sees the gap, and that can be quite scary. Especially if you’re trying to transmit that there’s a gap to powerful people who haven’t seen it,  and may even think everything’s going well.

How do you do this without losing your head?  And I mean that both literally (as in going crazy) and metaphorically as in losing your job.

That’s where the ironic performance comes in  – you say one thing while actually meaning something else. You’re trying to get people to deconstruct your message and work out where the gap is for themselves rather than confronting them with it and saying, “look, here is the gap”. So, this is where all the witticisms and the play on words and the humour come in. These are devices through which this message is transmitted in a way that helps the ironist keep her head – both metaphorically and in terms of her own sanity. These people are critical to the organisation because they call things out in a way that is acceptable. Moreover, since such people also tend to be good at what they do, they tend to have an outsized influence on their peers as well as on management.

So, our argument was that these folks with an ironic sensibility are not just useful to have around, they’re absolutely vital, and the contemporary organisation should do everything it can to find them and look after them.

KA 

So, there’s a clear distinction between a cynical and an ironic personality, because the cynic will call it out quite bluntly, in a way that puts people off. The ironists get away with it because they call it out in a way so subtle that it could even be construed as not calling it out. It requires a certain skill and talent to do that.

RC 

Yes, and there’s a different emotional response as well. The cynic calls it out and hates it; the ironist expects it and takes joy in its absurdity.

KA 

So, the ironist is a bit like the court jester of yore: given licence to call out bullshit in palatable, even entertaining ways.

RC 

I like that. The original ironist was Socrates – pretending to be this bumbling fool but actually ridiculously sharp. The pretence is aimed at exposing an inconsistency in the other’s thinking and starting a dialogue about it. That’s the role the ironist plays in achieving change.

KA 

That’s fascinating because it ties in with something I’ve noticed in my travels through various organisations. I do a lot of dialogic work with groups – trying to use conversations to frame different perspectives on complex situations. When doing so, I’ve often found that the people with the most interesting things to say have this ironic sensibility – they are able to call out bullshit using a memorable one-liner or gentle humour, in a way that doesn’t kill a conversation but actually encourages it. There is this important dialogic element to irony.

RC 

It’s what they call the soft irony of Socrates – the witticisms and the elegance that keep a difficult conversation going for long enough to surface different perspectives. The thing is, you can keep going because in a complex situation there isn’t a single truth or just one right way of acting.

KA 

It gets to a possible way of acting. In complex situations there are multiple viable paths, and the aim of dialogue is to open up different perspectives so that these paths become apparent. I see that irony can be used to draw attention to these in a memorable way. These ironists are revolutionaries of sorts: they have the gift of the gab, they’re charismatic, they’re fun to talk to. People open up to them and engage with them, in contrast to cynics, whose bitterness tends to shut down dialogue completely.

RC 

Yeah, and the conversation can continue even after the ironist departs. As an extreme example, Socrates chose to die in the final, ironic act of his life. Sure, he was old and his time was coming anyway, but the way he chose to go highlighted the gap between principles and practice in Athens in an emphatic way. So emphatic that we talk about it now, millennia later.

The roll call is long: Socrates drank hemlock, Cicero was murdered, Voltaire was exiled, Oscar Wilde went to jail, Jonathan Swift was sent to a parish in the middle of Ireland – and so on. All were silenced so that they wouldn’t cause any more trouble. So however witty, however elegant your rhetoric, and however hard you try to keep these conversations going and get people to see the gap, there’s always a risk that a sword will be plunged into your abdomen.

KA 

The system will get you in the end, but the conversation will continue! I think that’s a great note on which to conclude our chat. Thanks very much for your time, Richard. I really enjoyed the conversation and learnt a few things, as I always do when chatting with you.

RC 

It’s been a pleasure, always wonderful to talk to you.

Written by K

March 29, 2021 at 7:35 pm
