Diane Michelfelder and Technological Paternalism

As part of Ethics Week 2013, VT’s Philosophy Department invited Diane Michelfelder to deliver “Should your Volvo be your mother? Some thoughts on technological paternalism and design.” The broad question Dr. Michelfelder raised was what the moral implications of technology adoption are, but she spoke more specifically on the issue of “technological paternalism,” a concept advanced by Spiekermann and Pallas. Whereas the traditional concept of paternalism is forcing a person to do something that is in her best interest but perhaps against her will (think of Mayor Bloomberg’s failed drink regulations as a strong example of statist paternalism), technological paternalism is the way gadgets influence a person’s behavior for her own good, often in ways she doesn’t fully perceive. The seatbelt warning or low-fuel light on your car’s dashboard are invasive examples. Antilock brakes are less apparent behavior modifiers, especially for those who’ve never driven an entirely manual vehicle.

While cars were Michelfelder’s vehicle for this discussion, she offered three reasons why we should care about technological paternalism in all aspects of our daily lives:

  1. Companies develop these technologies, profit from them, and sometimes do not let us know that they’re there; or, conversely, they use them to make their products more enticing, as Volvo does
  2. Paternalistic legislation is difficult to pass and enforce in the US (but we still try, e.g. gun control, nutrition information on food packaging)
  3. Our status as rational beings is increasingly under scrutiny on a variety of fronts (that is, we can’t make sound decisions for ourselves)

Michelfelder sees companies (1) doing the moral work that governments (2) cannot do because we (3) need more technology to save us from ourselves. The erosive effect, in Joel Feinberg’s estimation, is that we adults become more infantile and in need of more and more paternalistic interventions to complete tasks that humans once did with little or no assistance. She admits that we do not have to buy cars that sense whether we’re dozing off at the wheel or that detect pedestrians crossing the street, but we live in a world where people do. These technologies are part of our ambient environment and affect the way we live phenomenologically.

The three major downsides to technological paternalism, as helpful as it may be, are:

  1. A loss of attention because we become less critically aware of our environments
  2. A loss of shareability of the material world because everything is tailored to our personal needs (we no longer need others to help us)
  3. A loss of trust in our own senses because we grow to trust technology more than ourselves

The result, she claims, is that we have weaker relationships with other people because, taken together, these losses deplete our ability to acknowledge, empathize with, and be generous toward others. Hence, we are less likely to consider ethics when acting.

A few things I found interesting about this line of reasoning that will certainly crop up in future blog posts:

  1. If we define technology as tools to help us accomplish existing tasks more easily and new tasks we couldn’t handle before, then isn’t all tech paternalistic?
  2. Object-oriented ontology is making its mark on rhetoric studies. Where does the rhetoricity of the paternalism built into devices reside? In the creator, the object, or both?
  3. How does this thinking work with and against narratives of the internet as a global village? (more on this one soon)