The DRS controversy and the undue influence of technology on decision-making
- High-speed visualisation to track the trajectory of the ball as it passes, or is struck by, the batsman.
- Infra-red and sound-based devices to detect whether or not the bat has actually made contact with the ball.
There were some misgivings about the technology when it was first introduced a few years ago, but the general feeling was that it would be beneficial (see this article by Rob Steen, for example). However, because of concerns raised about the reliability of the technology, the International Cricket Council did not make the use of DRS mandatory.
In the recent Ashes series between England and Australia, there were some questionable decisions that involved DRS. In one case, a human umpire's decision was upheld even though DRS evidence did not support it, and in another, an umpire's decision was upheld when DRS evidence only partially supported it. See the sidebar in this news item for a summary of these decisions.
Now, as Dan Hodges points out in an astute post, DRS does not make decisions – it only presents a human decision-maker (the third umpire) with more, and allegedly better, data than is available to another human decision-maker (the on-field umpire). This is a point that is often ignored when decision support systems are used in any kind of decision-making, not just in sports: data does not make decisions, people do. Moreover, they often reach these decisions based on factors that cannot be represented as data.
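The division of labour described above can be sketched in code. This is purely illustrative: the names, the confidence score and the threshold are invented for the sketch, not drawn from any real DRS implementation. What it shows is the structural point: the technology only supplies evidence, and the decision rule defers to the human on-field call whenever that evidence is inconclusive.

```python
from dataclasses import dataclass

@dataclass
class Evidence:
    supports_out: bool   # what the tracking/audio data suggests
    confidence: float    # how conclusive the data is, from 0.0 to 1.0

def review(on_field_decision: str, evidence: Evidence,
           threshold: float = 0.9) -> str:
    """Return the final decision. The technology never decides on its own:
    unless the evidence is conclusive enough to overturn, the human
    on-field decision stands."""
    suggested = "out" if evidence.supports_out else "not out"
    if suggested != on_field_decision and evidence.confidence >= threshold:
        return suggested          # conclusive contrary evidence: overturn
    return on_field_decision      # inconclusive data: the umpire's call stands

# Marginal evidence against the on-field decision does not overturn it:
print(review("out", Evidence(supports_out=False, confidence=0.6)))   # out
print(review("out", Evidence(supports_out=False, confidence=0.95)))  # not out
```

Note that the interesting design choice is not in the tracking technology at all, but in the threshold and the default: when the data is ambiguous, responsibility reverts to the human.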
This is as it should be: technology can at best provide us with more and/or better data, but in situations that really matter, we would not want it making decisions on our behalf. Would we be comfortable with machine diagnoses of our X-rays or CT scans?
Taking a broader view, it is undeniable that technology has influenced the decisions we make: from the GPS that directs us when we drive, to Facebook, LinkedIn and other social media platforms that suggest whom we might want to "friend" or "connect with." In his book, To Save Everything, Click Here, Evgeny Morozov argues that this is not a positive development. He takes aim at what he calls technological solutionism: the tendency to view all problems as being amenable to technology-based solutions, ignoring other aspects such as social, human and ethical concerns.
Morozov's interest is largely in the social and political sphere, so many of his examples are drawn from social networking and search engine technologies. His concerns relate to the unintended consequences of these pervasive technologies: for example, the loss of privacy that comes with using social media, or the subtle distortion of human behaviour through techniques like gamification.
The point I'm making is rather more modest: technology-based decision-making tools can present us with more, better or more refined data, but they cannot absolve us of our responsibility for making decisions. This is particularly evident in the case of ambiguous issues. Indeed, this is why decision-making on such matters has ethical, even metaphysical, implications.
And so it is that sports needs human umpires, just as organisations need managers who can make decisions that they are willing to stand by, especially when situations are ambiguous and data is open to interpretation.