Eight to Late

Sensemaking and Analytics for Organizations

The DRS controversy and the undue influence of technology on decision-making


The Decision Review System (DRS) is a technology used to reduce umpiring errors in cricket. It consists of the following components:

  1. High-speed visualisation to track the trajectory of the ball as it goes past or is hit by the batsman.
  2. Infra-red and sound-based devices to detect whether or not the bat has actually made contact with the ball.

There were some misgivings about the technology when it was first introduced a few years ago, but the general feeling was that it would be beneficial (see this article by Rob Steen, for example). However, because of concerns about its reliability, the International Cricket Council did not make the use of DRS mandatory.

In the recent Ashes series between England and Australia, there have been some questionable decisions involving DRS. In one case, a human umpire’s decision was upheld even though DRS evidence did not support it, and in another, an umpire’s decision was upheld when the DRS evidence only partially supported it. See the sidebar in this news item for a summary of these decisions.

Now, as Dan Hodges points out in an astute post, DRS does not make decisions – it only presents a human decision-maker (the third umpire) with more, and allegedly better, data than is available to another human decision-maker (the on-field umpire). This is a point that is often ignored when decision support systems are used in any kind of decision-making, not just in sports: data does not make decisions, people do. Moreover, people often reach these decisions based on factors that cannot be represented as data.

This is as it should be: technology can at best provide us with more and/or better data but, in situations that really matter, we would not want it making decisions on our behalf. Would we be comfortable with machine diagnoses of our X-rays or CT scans?
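
To make the division of labour concrete, here is a toy sketch in Python. It is purely illustrative: the names and structure below are my own and have nothing to do with the software actually used in DRS. It simply encodes the point made above – the system assembles and presents the evidence, while the verdict remains a human judgement.

    # A toy illustration (not real DRS code): the technology assembles and
    # presents evidence; the decision itself is requested from a person.

    from dataclasses import dataclass

    @dataclass
    class Evidence:
        ball_tracking: str   # e.g. "projected to clip leg stump"
        hotspot: str         # e.g. "no mark on the bat"
        snicko: str          # e.g. "faint spike as the ball passes the bat"

    def present_evidence(evidence: Evidence) -> None:
        """The technology's job: surface the data, nothing more."""
        print("Ball tracking:", evidence.ball_tracking)
        print("Hot Spot:     ", evidence.hotspot)
        print("Snicko:       ", evidence.snicko)

    def review(evidence: Evidence) -> str:
        """The third umpire's job: weigh possibly conflicting data and decide."""
        present_evidence(evidence)
        # The judgement cannot be delegated to the system; here it is simply
        # asked of a person at the console.
        return input("Third umpire's decision (out / not out): ")

Note that better cameras or extra sensors would change only present_evidence; the review step stays stubbornly human.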

Taking a broader view, it is undeniable that technology has influenced the decisions we make: from the GPS that directs us when we drive, to Facebook, LinkedIn and other social media platforms that suggest whom we might want to “friend” or “connect with.” In his book, To Save Everything, Click Here, Evgeny Morozov argues that this is not a positive development. He takes aim at what he calls technological solutionism: the tendency to view all problems as amenable to technology-based solutions, ignoring social, human and ethical concerns.

Morozov’s interest is largely in the social and political sphere, so many of his examples are drawn from social networking and search engine technologies. His concerns relate to the unintended consequences of these pervasive technologies – for example, the loss of privacy that comes with using social media, or the subtle distortion of human behaviour through techniques like gamification.

The point I’m making is rather more modest: technology-based decision-making tools can present us with more, better or more refined data, but they cannot absolve us of our responsibility for making decisions. This is particularly evident in the case of ambiguous issues. Indeed, this is why decision-making on such matters has ethical, even metaphysical, implications.

And so it is that sport needs human umpires, just as organisations need managers who can make decisions they are willing to stand by, especially when situations are ambiguous and data is open to interpretation.

Written by K

August 14, 2013 at 7:46 pm

7 Responses


  1. There’s an old joke in American baseball about three umpires talking at a bar. The first one says, “I call ’em as I see ’em”. The second one says, “I call ’em as they are!” And the third one says, “They ain’t nothing until I call ’em!”

    In other words, the umpire’s decision is a declaration, not a description.


    Jeff Conklin

    August 15, 2013 at 1:55 am

    • Hi Jeff,

      Thanks for your comment – a terrific summary of the main point.

      Regards,

      Kailash.


      K

      August 15, 2013 at 7:47 am

  2. K,
    Interesting take as always; you tie together several threads. On my mind lately have been:
    1) getting the right data rather than volumes that don’t improve the decision process, and
    2) reliance on passively collected data that may lose context and surely becomes problematic regarding privacy.

    I plan to look up a couple of your references.


    Bill Nichols

    August 15, 2013 at 3:23 am

    • Hi Bill,

      Thanks for your comment. Good to hear from you; it has been a while.

      A couple of remarks on the excellent points you have made:

      1. I agree that it is important to get the right data. However, in many situations it may be difficult to gauge the relevance of a piece of data a priori. I suspect this is why people often go on a data-gathering spree – they can’t tell what is relevant so they collect everything they can lay their hands on. Of course this only leads to confusion and/or analysis paralysis.

      2. Your point about context and privacy is a very important one. Indeed, most databases attempt to force-fit data into a predefined context that can be modelled as a neat schema. The problem with this approach, particularly for data about individuals, is that humans operate in multiple contexts. So there is always the danger that data collected in one context will be used in another context in which the data is not applicable or relevant – for example, when social media data is used in making judgements about suitability for employment.

      Regards,

      Kailash.


      K

      August 15, 2013 at 7:47 am

  3. Good point on the need to keep humans in the loop in areas such as this. Looking at DRS and the motives behind it (to stop glaring errors at the top level), I think the ICC has missed the point, and I think the recent furore over DRS highlights two other areas we can learn from.

    The first is the situation where the world had access to a technology (specifically the Snickometer, which can indicate contact when there is no hot spot and can also differentiate bat from pad). We were left with a situation where ‘snicko’ gave a clear indication and Hot Spot didn’t. The lesson – if there is a tool that your client base is widely using to measure your performance, then you should be tracking that metric as well, even if only as a tertiary reference point or to be able to defend a decision.

    Another issue was the tactical use of DRS. The purpose of DRS was to overturn howling errors, and yet we have a situation where the third umpire has to be called in by the teams. I agree that we need to limit the teams questioning every call, but the third umpire should have the ability to let the field umpires know there has been a howler. The lesson – if your leadership makes a mistake that is picked up lower down the organization, there should be an expectation and a process to swiftly get the correct information to the top and have it acted upon.


    Utsire

    August 17, 2013 at 4:50 am

    • Hi Utsire,

      Thanks for reading and for elaborating on some very relevant points that I overlooked.

      I entirely agree that when multiple technologies are involved, one needs to consider all the relevant data they provide, unless there is a good justification for ignoring a particular data point. In the case you mention, no such justification was given.

      The second point you make is one close to my heart: every organisation ought to have mechanisms through which information that contradicts decisions / expectations of leaders can be brought to their attention. Sadly, such mechanisms are not so common.

      Regards,

      Kailash.


      K

      August 17, 2013 at 2:00 pm

  4. […] Kailash Awati consider how a decision support system is used in Cricket, and by extension, how they should be used in business. […]


