I’m returning to the open discussion thread format, which I tried out earlier in the year. In that first thread, I raised the topic of death and digital media, taking, as a point of departure, a viral tweet from a student who discovered his online professor was, in fact, dead. A great discussion unfolded, and I thought the experiment was a success. Here, then, is another, long overdue discussion thread for you. Comments are open to anybody who wants to jump in.
This time, we take our point of departure from a somewhat less grave situation:
[Embedded video, July 9th, 2021]
I hope the context—an American minor league baseball game—doesn’t put too many of you off. What you’re watching is a rather egregious third strike call made by a robot umpire. I’ve been thinking a lot about this clip ever since Will Oremus posted it on Twitter with the following observation: “this is a beautiful visual metaphor for the perils of automation, right down to the body language of the batter who can't believe he struck out because of a dumb robot umpire and the human umpire who's like ‘hey i just work here, take it up with the algorithm.’”
Here was my own initial comment: “What strikes me here is that disbelief and resignation replace anger. Anger is pointless, it has no one to attach to. This is just a minor league game, but extrapolating to other spheres of society, the consequences appear demoralizing.”
You don’t need to be an avid baseball fan to know that under ordinary circumstances such a call would likely elicit a vehement response from the batter and his manager, with a good chance that one or both might be thrown out of the game. Of course, similar scenes play out in other sports involving judgment calls from umpires or referees.
My interest in what happens here, of course, extends beyond the game of baseball (although I am interested in what this means for the game). I want to use it as a case study for the integration of automation into society more broadly. So here are some possible questions and angles of approach.
Is the emotional dynamic here suggestive of the way automated systems may demoralize human beings in the face of their determinations? What exactly is being outsourced to the machine? Most evidently judgment. But also responsibility? Is this where the anger comes in? Does outsourcing judgment amount to an evasion of responsibility on the part of human beings?
In this case, the machine erred, and obviously so. But I’m perhaps more interested in what happens when the machine becomes more accurate on average than human umpires. You can read this article in the New Yorker for some more context, but it appears that this may already be the case. Would it then be reasonable to replace human umpires? This seems to presume that accuracy is the preeminent or even only good at stake. But is it? Are there other goods that may be lost in the pursuit of accuracy?
What would be the media ecological approach? How would we benefit from seeing this not just as the introduction of a discrete technology into an existing context but as something with ecological, that is, ecosystem-wide, consequences for the game as a whole?
Also, as philosopher Alva Noë put it in the New Yorker article, “What we’re seeing in baseball is something that is kind of a core dispute in Western civilization. It really is about ‘What is objectivity?’ Is objectivity something that is physical? Is it mathematical? Is it knowable?” Is the kind of judgment involved here susceptible to computationally derived solutions? Does it exclude other functions of the strike zone in relation to the game? Are there cases where the correct solution is not the good one?
Okay, so there you have it. A general field of discussion with a few more specific questions to consider. Again, all are welcome to comment. I look forward to seeing where this goes. FYI: I won’t be chiming in much through the first couple of hours, but will be able to participate more actively a bit later in the afternoon.

Cheers,
Michael