Eight to Late

Sensemaking and Analytics for Organizations

Archive for the ‘sensemaking’ Category

Uncertainty, ambiguity and the art of decision making


A common myth about decision making in organisations is that it is, by and large, a rational process.   The term rational refers to decision-making methods that are based on the following broad steps:

  1. Identify available options.
  2. Develop criteria for rating options.
  3. Rate options according to criteria developed.
  4. Select the top-ranked option.

Although this appears to be a logical way to proceed, it is often difficult to put into practice, primarily because of uncertainty about matters relating to the decision.
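To see what this recipe assumes, here’s a minimal sketch of the process in code (the options, criteria, weights and ratings are entirely hypothetical, made up purely for illustration). When every input is known, the decision reduces to arithmetic:

```python
# A toy illustration of the four-step rational process described above.
# Options, criteria, weights and ratings are hypothetical.

# Step 1: identify available options (each rated 1-10 per criterion, higher = better)
options = {
    "Option A": {"cost": 7, "benefit": 6, "risk": 8},
    "Option B": {"cost": 5, "benefit": 9, "risk": 6},
    "Option C": {"cost": 8, "benefit": 5, "risk": 7},
}

# Step 2: develop criteria and their relative importance (weights sum to 1)
weights = {"cost": 0.3, "benefit": 0.5, "risk": 0.2}

# Step 3: rate each option as a weighted sum over the criteria
scores = {
    name: sum(weights[criterion] * rating for criterion, rating in ratings.items())
    for name, ratings in options.items()
}

# Step 4: select the top-ranked option
best = max(scores, key=scores.get)
print(scores)             # Option B scores highest (~7.2)
print("Selected:", best)  # Selected: Option B
```

The trouble, as we’ll see, is that for consequential decisions one rarely knows the options, criteria or ratings with anything like this certainty.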

Uncertainty can manifest itself in a variety of ways: one could be uncertain about facts, the available options, decision criteria or even one’s own preferences for options.

In this post, I discuss the role of uncertainty in decision making and, more importantly, how one can make well-informed decisions in such situations.

A bit about uncertainty

It is ironic that the term uncertainty is itself vague when used in the context of decision making. There are at least five distinct senses in which it is used:

  1. Uncertainty about decision options.
  2. Uncertainty about one’s preferences for options.
  3. Uncertainty about what criteria are relevant to evaluating the options.
  4. Uncertainty about what data is needed (data relevance).
  5. Uncertainty about the data itself (data accuracy).

Each of these is qualitatively different: uncertainty about data accuracy (item 5 above) is very different from uncertainty regarding decision options (item 1). The former can potentially be dealt with using statistics whereas the latter entails learning more about the decision problem and its context, ideally from different perspectives. Put another way, item 5 is essentially a technical matter whereas item 1 is a deeper issue that may have social, political and – as we shall see – even behavioural dimensions. It is therefore reasonable to expect that the two situations call for vastly different approaches.

Quantifiable uncertainty

A common problem in project management is the estimation of task durations. In this case, what’s requested is a “best guess” of the time (in hours or days) it will take to complete a task. Many project schedules represent task durations by point estimates, i.e. by single numbers. The Gantt Chart shown in Figure 1 is a common example: each task is represented by its expected duration. This is misleading because a single number conveys a sense of certainty that is unwarranted. It is far more accurate, not to mention safer, to quote a range of possible durations.

Figure 1: Gantt Chart (courtesy Wikimedia)


In general, quantifiable uncertainties, such as those conveyed in estimates, should always be quoted as ranges – something along the following lines: task A may take anywhere between 2 and 8 days, with a most likely completion time of 4 days (Figure 2).

Figure 2: Task completion likelihood (3 point estimates)


In this example, aside from stating that the task will finish sometime between 2 and 8 days, the estimator implicitly asserts that the likelihood of finishing before 2 days or after 8 days is zero. Moreover, she implies that some completion times are more likely than others. Although it may be difficult to quantify the likelihood exactly, one can begin by making simple (linear!) approximations as shown in Figure 3.

Figure 3: Simple probability distribution based on the estimates in Figure 2


The key takeaway from the above is that quantifiable uncertainties are shapes rather than single numbers. See this post and this one to get a sense of how far this kind of reasoning can take you. That said, one should always be aware of the assumptions underlying the approximations. Failure to do so can be hazardous to the credibility of estimators!
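To make the “shapes rather than single numbers” point concrete, here’s a minimal sketch in Python that treats the 2/4/8-day estimate above as the triangular distribution of Figure 3. The triangular shape is itself an assumption, but once a shape is specified one can ask questions that a single number cannot answer:

```python
# Minimal sketch: the 2/4/8-day estimate of Figure 2, approximated by the
# triangular distribution of Figure 3 (the triangular shape is an assumption).
import numpy as np

low, mode, high = 2, 4, 8   # optimistic, most likely and pessimistic durations (days)

rng = np.random.default_rng(42)
durations = rng.triangular(low, mode, high, size=100_000)   # simulated completion times

print(f"mean completion time : {durations.mean():.1f} days")
print(f"80th percentile      : {np.percentile(durations, 80):.1f} days")
print(f"P(more than 6 days)  : {(durations > 6).mean():.0%}")
```

Under this assumption the expected duration is about 4.7 days rather than 4, and there is roughly a one-in-six chance of the task taking longer than 6 days – information that the single-number estimate hides entirely.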

Although I haven’t explicitly said so, estimation as described above has a subjective element. Among other things, the quality of an estimate depends on the judgement and experience of the estimator. As such, it is prone to being affected by errors of judgement and cognitive biases.  However, provided one keeps those caveats in mind, the probability-based approach described above is suited to situations in which uncertainties are quantifiable, at least in principle. That said, let’s move on to more complex situations in which uncertainties defy quantification.

Introducing ambiguity

The economist Frank Knight was possibly the first person to draw the distinction between quantifiable and unquantifiable uncertainties. To make things really confusing, he called the former risk and the latter uncertainty. In his doctoral thesis, published in 1921, he wrote:

…it will appear that a measurable uncertainty, or “risk” proper, as we shall call the term, is so far different from an unmeasurable one that it is not in effect an uncertainty at all. We shall accordingly restrict the term “uncertainty” to cases of the non-quantitative type (p.20)

Terminology has moved on since Knight’s time; the term uncertainty now means different things depending on context. In this piece, we’ll use the term uncertainty to refer to quantifiable uncertainty (as in the task estimate of the previous section) and use ambiguity to refer to nonquantifiable uncertainty. In essence, then, we’ll use the term uncertainty for situations where we know what we’re measuring (i.e. the facts) but are uncertain about its numerical or categorical values, whereas we’ll use the word ambiguity for situations in which we are uncertain about what the facts are or which facts are relevant.

As a test of understanding, you may want to classify each of the five points made in the second section of this post as either uncertain or ambiguous (answer below).

Answer: 1 through 4 are ambiguous and 5 is uncertain.

How ambiguity manifests itself in decision problems

The distinction between uncertainty and ambiguity points to a problem with quantitative decision-making techniques such as cost-benefit analysis, multicriteria decision making methods or analytic hierarchy process. All these methods assume that decision makers are aware of all the available options, their preferences for them, the relevant evaluation criteria and the data needed. This is almost never the case for consequential decisions. To see why, let’s take a closer look at the different ways in which ambiguity can play out in the rational decision making process mentioned at the start of this article.

  1. The first step in the process is to identify available options. In the real world, however, options often cannot be enumerated or articulated fully. Furthermore, as options are articulated and explored, new options and sub-options tend to emerge. This is particularly true if the options depend on how future events unfold.
  2. The second step is to develop criteria for rating options. As anyone who has been involved in deciding on a contentious issue will confirm, it is extremely difficult to agree on a set of decision criteria for issues that affect different stakeholders in different ways.  Building a new road might improve commute times for one set of stakeholders but result in increased traffic in a residential area for others. The two criteria will be seen very differently by the two groups. In this case, it is very difficult for the two groups to agree on the relative importance of the criteria or even their legitimacy. Indeed, what constitutes a legitimate criterion is a matter of opinion.
  3. The third step is to rate options. The problem here is that real-world options often cannot be quantified or rated in a meaningful way. Many of life’s dilemmas fall into this category. For example, a decision to accept or decline a job offer is rarely made on the basis of material gain alone. Moreover, even where ratings are possible, they can be highly subjective. For example, when considering a job offer, one candidate may give more importance to financial matters whereas another might consider lifestyle-related matters (flexi-hours, commuting distance etc.) to be paramount. Another complication here is that there may not be enough information to settle the matter conclusively. As an example, investment decisions are often made on the basis of quantitative information that is based on questionable assumptions.

A key consequence of the above is that such ambiguous decision problems are socially complex – i.e. different stakeholders could have wildly different perspectives on the problem itself.   One could say the ambiguity experienced by an individual is compounded by the group.

Before going on, I should point out that acute versions of such ambiguous decision problems go by many different names in the management literature (wicked problems and messes, for example).

All these terms are more or less synonymous: the root cause of the difficulty in every case is ambiguity (or unquantifiable uncertainty), which prevents a clear formulation of the problem.

Social complexity is hard enough to tackle as it is, but there’s another issue that makes things even harder: ambiguity invariably triggers negative emotions such as fear and anxiety in individuals who make up the group.  Studies in neuroscience have shown that in contrast to uncertainty, which evokes logical responses in people, ambiguity tends to stir up negative emotions while simultaneously suppressing the ability to think logically.  One can see this playing out in a group that is debating a contentious decision: stakeholders tend to get worked up over issues that touch on their values and identities, and this seems to limit their ability to look at the situation objectively.

Tackling ambiguity

Summarising the discussion thus far: rational decision making approaches are based on the assumption that stakeholders have a shared understanding of the decision problem as well as the facts and assumptions around it. These conditions are clearly violated in the case of ambiguous decision problems. Therefore, when confronted with a decision problem that has even a hint of ambiguity, the first order of the day is to help the group reach a shared understanding of the problem.  This is essentially an exercise in sensemaking, the art of collaborative problem formulation. However, this is far from straightforward because ambiguity tends to evoke negative emotions and attendant defensive behaviours.

The upshot of all this is that any approach to tackling ambiguity must begin by taking the concerns of individual stakeholders seriously. Unless this is done, it will be impossible for the group to coalesce around a consensus decision. Indeed, ambiguity-laden decisions in organisations invariably fail when they overlook the concerns of specific stakeholder groups. The high failure rate of organisational change initiatives (60-70% according to this Deloitte report) is largely attributable to this point.

There are a number of techniques that one can use to gather and synthesise diverse stakeholder viewpoints and thus reach a shared understanding of a complex or ambiguous problem. These techniques are often referred to as problem structuring methods (PSMs). I won’t go into these in detail here; for an example check out Paul Culmsee’s articles on dialogue mapping and Barry Johnson’s introduction to polarity management. There are many more techniques in the PSM stable. All of them are intended to help a group reconcile different viewpoints and thus reach a common basis from which one can proceed to the next step (i.e., make a decision on what should be done). In other words, these techniques help reduce ambiguity.

But there’s more to it than a bunch of techniques. The main challenge is to create a holding environment that enables such techniques to work. I am sure readers have been involved in a meeting or situation where the outcome seemed predetermined by management or was undermined by self-interest. When stakeholders sense this, no amount of problem structuring is going to help. In such situations one needs to first create the conditions for open dialogue to occur. This is precisely what a holding environment provides.

Creating such a holding environment is difficult in today’s corporate world, but not impossible. Note that this is not an idealist’s call for an organisational utopia. Rather, it involves the application of a practical set of tools that address the diverse, emotion-laden reactions that people often have when confronted with ambiguity.   It would take me too far afield to discuss PSMs and holding environments any further here. To find out more, check out my papers on holding environments and dialogue mapping in enterprise IT projects, and (for a lot more) the Heretic’s Guide series of books that I co-wrote with Paul Culmsee.

The point is simply this: in an ambiguous situation, a good decision – whatever it might be – is most likely to be reached by a consultative process that synthesises diverse viewpoints rather than by an individual or a clique.  However, genuine participation (the hallmark of a holding environment) in such a process will occur only after participants’ fears have been addressed.

Wrapping up

Standard approaches to decision making exhort managers and executives to begin with facts, and if none are available, to gather them diligently prior to making a decision. However, most real-life decisions are fraught with uncertainty, so it may be best to begin with what one doesn’t know and figure out how to make the best possible decision under those “constraints of ignorance.” In this post I’ve attempted to outline what such an approach would entail. The key point is to figure out the kind of uncertainty one is dealing with and to choose an approach that works for it. I’d argue that most decision making debacles stem from a failure to appreciate this point.

Of course, there’s a lot more to this approach than I can cover in the span of a post, but that’s a story for another time.

Note: This post is written as an introduction to the Data and Decision Making subject that is part of the core curriculum of the Master of Data Science and Innovation program, run by the Connected Intelligence Centre at UTS. I’m coordinating the subject this semester, and am honoured to be co-teaching it with my erstwhile colleague Sean Heffernan and my longtime collaborator Paul Culmsee.

Written by K

March 9, 2017 at 10:04 am

Data science and sensemaking – tales from two hackathons


“It isn’t that they can’t see the solution. It is that they can’t see the problem.” – GK Chesterton

Introduction

Examples of vendor-generated hype about data science are not hard to find; I found one on the very first site I visited: a large technology and services vendor who, in their own words, claim their analytics solutions help organisations “engage with data to answer the toughest business questions, uncover patterns and pursue breakthrough ideas.” I’ve deliberately avoided linking to the guilty party because there are many others that spout similar rhetoric.

Unfortunately it seems to work: according to Gartner, “by 2020, predictive and prescriptive analytics will attract 40% of enterprises’ net new investment in business intelligence and analytics.” This trend is accompanied by a concomitant increase in demand for data science education, fuelled by remarks to the effect that data science is “The Sexiest Job of the 21st Century.”

By and large, data science education tends to focus on algorithms and technology, but its practice involves much more. The vendor who claims that technology can help organisations grapple with the “toughest business questions” and “pursue breakthrough ideas” is singularly silent about where these questions or ideas come from. Data is meaningless without a meaningful hypothesis. The problem is that in the real world, questions and hypotheses aren’t obvious; one has to work to formulate them. As the management icon Russell Ackoff once said, “Outside of school, problems are seldom given; they have to be taken, extracted from complex situations…”

The art of taking problems is what sensemaking is all about.

Unfortunately, it is a skill that is typically ignored by data science educators.

Why?

Probably because it is hard to teach…but the good news is that it can be learnt. Like most tacit skills, sensemaking is best learnt by doing, that is, by formulating problems in real-world situations.  Before I get to that, however, let’s take a brief detour.

Real world problems are characterised by ambiguity

An important aspect of real-world problems – as opposed to classroom ones – is that they are invariably fraught with ambiguity. For example, a customer’s requirements may be vague or the available data incomplete and messy. What this means is that there is no guarantee one will be able to formulate a well-posed problem, let alone get a useful answer.   Worse, unlike a risk-based situation in which uncertainty can be quantified, one cannot even figure out the odds of success.

The human brain processes quantifiable uncertainty (aka risk) and ambiguity very differently. The former, which can be calculated, is dealt with by the prefrontal cortex which is responsible for decision making and goal-oriented thinking. Ambiguity, on the other hand, is processed by the amygdala, which deals with emotions.  The upshot of this is that ambiguity evokes an emotional response, the most common one being anxiety.

Although some people are innately better at coping with anxiety than others, it is possible to get better at it by repeatedly putting oneself in high-pressure (yet safe) situations that are ambiguous.  For data science students, hackathons provide a perfect opportunity to do this.

Ambiguity in data science – tales from two hackathons

Over the last two months, I’ve had the privilege of being a part of the Master of Data Science and Innovation (MDSI) program run by the Connected Intelligence Centre at UTS. The course director, Theresa Anderson, sees hackathons as a great way for students to learn how to handle ambiguity. So, apart from regular coursework assignments, students are encouraged to participate in external hackathons sponsored by industry and government organisations. This gives them opportunities to gain practical experience in formulating problems in ambiguous and high-pressure environments.

Datacake at GovHack

A few MDSI student teams participated in a GovHack event earlier this year. Here’s what William Azevedo, a member of a team that called itself Datacake, wrote about the team’s problem formulation journey at the event:

The challenge is simple: the competitors should form teams, identify a problem and use data from government agencies from Australia and New Zealand to present a solution to the problem. Naturally, this solution should bring some benefit to the society.

I’m not sure I’d use the word simple…but the importance of problem formulation comes through quite clearly. Here’s how he and his team went about it:

 As a starting point, our team published an online survey to understand how safe people feel when walking on the streets, especially at night. As we didn’t have much time, we spread the message via social networks. In a couple of hours, we received 44 answers. It gave us enough information to back our idea.

Notice the process used in defining the problem – the team realised they did not know enough to define a meaningful problem so they went and got relevant data. Following this:

Our team analysed the answers of the survey, engaged in passionate discussions, took tips from the mentors, had lots of coffee and designed some cool diagrams on the blackboard.

…and then his description of the Aha moment when a good idea emerged:

Then the magic happened. We had this idea of merging information about crime, demographics, weather, land zoning and street illumination to provide a map of the safe and unsafe areas within a suburb.

An important point is that sensemaking is best done collaboratively. Since the problem is ambiguous or even undefined (as in this case), no individual has privileged access to the “truth.” It is therefore important to bring diverse perspectives to bear on the problem. Indeed, sensemaking may be thought of as collaborative problem formulation and solving. In view of this, it is interesting to hear what other members of Team Datacake had to say about their problem formulation process. Here’s a comment from Anthony So:

During the whole weekend we really forced ourselves to go deep and asked “Why is it happening? Why is it happening? Why is it happening?” every time we found an interesting pattern. We really wanted to understand the true root causes of those accidents. We didn’t want to stay at a descriptive level. We knew the answers were behavioural. We knew there were multiple problems and therefore require different answers and solutions. We did different techniques to do so: machine learning, stats, data visualisation. It didn’t matter which we used the only important point was how can we get to answers of those questions.

The specific area they looked at was pedestrian safety. They found that obvious variables, such as driver fatigue and hazards, were not significant, so they started looking for other potential factors. Here’s how Anthony put it:

For instance we built a classification model on the severity of the accidents involving children but we didn’t use it to make predictions. We used it to identify the important features (and unimportant) for those cases. We found out that some of the variables related to the environment (Primary_hazardous_feature, Surface_condition, Weather…) and to the drivers (Fatigue_involved_in_crash…) were not important. This gave us a good indication that those accidents are mostly related directly to the behaviour of the children. So we kept diving further and further and found 3 postcodes with higher numbers of accidents than others. We focused on those 3 areas and we kept going deeper and deeper…
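As an aside, here’s a minimal sketch of the kind of analysis Anthony describes: fitting a classifier not to make predictions but to read off which features matter. The dataset, file name, target column and choice of model are assumptions I’ve made for illustration; only the feature names mentioned in the quote are taken from it.

```python
# Illustrative sketch: use a classifier's feature importances to see which factors
# matter for accident severity, rather than to make predictions.
# The file name, target column and model choice are assumptions for illustration.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier

df = pd.read_csv("accidents_involving_children.csv")   # hypothetical dataset

feature_cols = ["Primary_hazardous_feature", "Surface_condition",
                "Weather", "Fatigue_involved_in_crash"]
X = pd.get_dummies(df[feature_cols])    # one-hot encode categorical variables
y = df["Severity"]                      # hypothetical target column

model = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)

# Rank features by importance; unimportant environmental and fatigue variables
# would point the investigation elsewhere, as the team describes.
importances = pd.Series(model.feature_importances_, index=X.columns)
print(importances.sort_values(ascending=False))
```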

In the end Datacake came up with a few suggestions for improving pedestrian safety. They were awarded a prize for their efforts, so the problem they formulated and solved was clearly valuable to the sponsors.

Pepper Money Hackathon

A couple of weekends ago, Pepper Money, Australia’s largest non-bank lender, sponsored a day-long internal hackathon for MDSI students, with a hefty winner-take-all prize as an incentive. The challenge was quite open-ended, and had to do with helping the organisation develop a consistent brand voice. Participants were given a small corpus of text files from the organisation’s public and social media sites, along with very general guidelines on how to proceed. Details were left entirely to the teams.

As one might expect, most teams spent the first few hours struggling to define a relevant and tractable problem – relevance being paramount for the client and tractability for the teams. Being a mentor at the event, I was able to observe how different teams handled this. Among other things, I was particularly impressed by how some teams with very little text mining experience were able to – in a few hours – come up with a good problem, an approach to solve it…and, most importantly, make decent progress by day’s end.

I won’t go into details except to say that the approaches were diverse, ranging from the somewhat philosophical to the very technical.
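To give a flavour of the technical end of that spectrum, here’s the sort of minimal sketch a team might start from: characterising the dominant vocabulary of the corpus using TF-IDF. This is purely illustrative (the file paths are hypothetical and it isn’t a reconstruction of any team’s actual solution), but it’s the kind of starting point on which a “brand voice” analysis could be built:

```python
# Illustrative sketch: summarise the dominant vocabulary of a text corpus with TF-IDF.
# File paths are hypothetical; this is not any team's actual solution.
import glob
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer

docs = [open(path, encoding="utf-8").read() for path in glob.glob("corpus/*.txt")]

vectoriser = TfidfVectorizer(stop_words="english", max_features=5000)
tfidf = vectoriser.fit_transform(docs)

# Terms with the highest average TF-IDF weight give a rough picture of the
# vocabulary that dominates the organisation's published writing.
mean_weights = np.asarray(tfidf.mean(axis=0)).ravel()
terms = np.array(vectoriser.get_feature_names_out())
print(terms[mean_weights.argsort()[::-1][:20]])
```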

I was amazed at the diversity of solutions the groups came up with, and so were the other mentors and the sponsor. Blair Hudson, Innovation Portfolio Manager at Pepper Money, summed the day up very well when he said:

#PepxUTS was our first hackathon event, challenging students to build data science solutions in a day to allow everyone at Pepper to communicate using a consistent brand voice. Our Co-Group CEOs both joined in for judging and awarded the winners. It was a rewarding day for all involved

(For some vignettes from the day, check out the #PepxUTS hashtag on Twitter.)

The day’s experiences left me ever more convinced that hackathons are an excellent vehicle for learning and demonstrating the practical utility of sensemaking skills.

Wrapping up

The two case studies highlight the benefits of sensemaking skills, both for students and organisations.  On the one hand,  students who participated got valuable experience in formulating problems collaboratively in high-pressure, high-ambiguity situations. This is a skill that cannot be learnt in classrooms, MOOCs or even in online data challenges (like Kaggle) where problems tend to be clearly defined. On the other hand, sponsoring organisations have benefited from new insights into longstanding problems.

Finally, it should be clear that although I’ve focused on educational settings,  what I’ve said for students applies to organisational settings too: there’s nothing to stop organisations from using hackathons as a means to help their employees learn sensemaking skills.

To conclude, the main point I want to make is that the most important situations we encounter at work (and even in our personal lives) are usually fraught with ambiguity. Our first reaction is to jump into problem solving mode because it feels like the right thing to do. In reality, one is generally better off stepping back and taking the time to think the situation through, preferably with a group of diversely skilled individuals. All too often this sensemaking step is neglected, and teams end up solving an irrelevant problem.

To paraphrase Chesterton, in order to see the right solution, one must first see the right problem.

Acknowledgements

Many thanks to Blair Hudson, William Azevedo and Anthony So for their contributions to this piece.

Written by K

October 18, 2016 at 6:23 pm

What is sensemaking?


I’ve recently set up a consulting practice specializing in sensemaking and analytics. Most people understand the analytics bit, but many have questions about sensemaking. I got that question so many times that I decided to do a short (2.5 minute) whiteboard video explaining what the term means to me (and my definition is not the same as Wikipedia’s).

Here it is:

 

For those who prefer the written word, here’s the script (minus the advertising):

“Most organizations are very good at solving problems. This is no surprise: much of training, right from school to university, focuses on teaching us the skills required to solve problems. Now regardless of the specific technique used, the problem-solving process is essentially a logical or analytical one. It goes something like this:

  • Gather information about the problem.
  • Analyse the information.
  • Formulate candidate solutions.
  • Implement the solution of choice.

This so-called GAFI method works by breaking problems down into manageable parts, solving each of the parts separately and then assembling these into a solution. The method works very well for most scientific and engineering problems – even one  as complicated as sending a spacecraft to Saturn. Indeed, it is so successful that it underpins all of science and modern technology.

However, there is a serious gap in the GAFI method – it assumes that problems are given; it does not tell us how to formulate them. And as the management luminary Russell Ackoff once said:

Outside of school, problems are seldom given; they have to be taken, extracted from complex situations…”

The art of taking problems is what sensemaking is all about.

Unlike analytical thinking, which is purely logical, sensemaking involves skills such as collaboration, imagination and a healthy tolerance for ambiguity. It is an art that is absolutely essential for surviving…no, thriving, in the increasingly complex world of the 21st century.

The two modes of thinking – sensemaking and analytical – are as different as chalk and cheese but both are necessary for a successful outcome. We like to think of them as lying at opposite ends of a spectrum of thinking styles. When approaching a new situation or problem, one should always begin at the sensemaking end and move towards the analytical end as one understands the problem better. Unfortunately, time pressures in corporate environments often force managers and employees into analytical mode without a full appreciation of the problem they are attempting to solve. As a result, the solutions are often less than optimal. Sensemaking techniques equip organisations with tools that cover the entire problem lifecycle, from definition to solution.”

As a closing remark (that might be construed as advertising…) I’ll mention that I’ve discussed a number of these techniques on Eight to Late. Here are a couple of examples:

The Approach: a dialogue mapping story

The dilemmas of enterprise IT

…and, of course, you can always have a look at my book or ping me for a no-obligation chat to find out more 🙂

Written by K

March 15, 2016 at 6:02 pm
