What happens when trust breaks down completely

I had an interesting hour-long discussion with a pair of Flint residents who had lived through the water crisis.  When they finished telling me their experience, which included repeatedly feeling ignored when they expressed their concerns, I thanked them for their time.  One of them very politely explained that he didn’t think I believed them either, which came as a bit of a shock.  Some of their concerns might have been scientifically incorrect, but they were morally right when they talked about how they had been mistreated.

What happens when people are morally right, but scientifically incorrect?  The thought came up again after reading this story in the Post over the weekend:

http://www.washingtonpost.com/sf/world/2017/08/07/perus-glaciers-have-made-it-a-laboratory-for-adapting-to-climate-change-its-not-going-well/?utm_term=.cc7d33105c53

The story makes the point that scientists trying to study the melting of glaciers in South America, or engineers trying to install warning devices for the flash floods that the melting can create, find themselves threatened by communities that believe the technical professionals and their equipment are causing the droughts actually driven by climate change.

Science can function internally without public trust, so long as there is just enough trust to pay the bills.  Intellectual credibility will keep policy makers writing the checks.   But science cannot change society without some amount of moral credibility.

Whether or not the public trusted science in Flint after the lead was detected may or may not grow into a larger health problem.  Science not trusting the public may create another health problem of its own.  Neither problem is separable from the larger story of mistrust in a city that was written off before the lead appeared in the drinking water.

Similarly, in Peru we see scientific trust as completely inseparable from public trust.  The scientists and engineers might have been able to do a better job of winning public trust, but that may also have been impossible when trust in other institutions had been completely erased by corruption.

Science is not democratic.  But science cannot impact a society without the trust of a truly democratic society.  And for that reason, science is in a forced engagement with the public, and may find it almost impossible to function as an autocracy dictating to the public.

Why does science view the external critique so negatively?

“I did not even know that the British Empire is dying, still less did I know that it is a great deal better than the younger empires that are going to supplant it.”  (George Orwell, “Shooting an Elephant.”)

Reading through Donovan Hohn’s New York Times article on Marc Edwards, “Flint’s Water Crisis and the ‘Troublemaker’ Scientist,” brought to mind George Orwell’s comments on the British Empire in his famous story “Shooting an Elephant.”

Orwell’s story focuses on the moment when he, as a young British colonial police officer in Burma, realized that the British Empire was morally wrong and unsustainable.  He describes how he had to kill an elephant that had gone rogue, killed a man, and then calmed down.  He kills the elephant not because it poses a danger, but because a crowd of Burmese is watching him and he has to maintain his status.  Because the story was published in 1936, a year before he went to fight as a volunteer with the Republicans in Spain, he takes a moment to mention that the flawed British Empire had still not sunk to the level of fascism.

The narrative of Professor Edwards, and science in general, as the hero of the Flint water crisis, or of other environmental crises, is a difficult one to dismiss.  Hohn does portray Professor Edwards as a hero, but as one with flaws, and as one whose hero status has come at the expense of others whose contributions were critical.  In spite of this, Professor Edwards’ status as the hero of the crisis has remained strong in scientific and policy-making circles.  His experience and outspokenness are held up as an ideal for how scientists should behave in such situations.

Part of the reason a larger narrative, in which Professor Edwards is shown as just one of many contributors, has not taken hold is that the scientist-hero narrative is so extraordinarily useful.  It is useful not just to the scientific community, which holds him up as a hero even as he brutally critiques the values of 21st-century academic authority.  A figure like Dr. Edwards is incredibly useful to anyone fighting for clean water or accurate environmental data.  The Flint residents whose suspicions Dr. Edwards confirmed are less useful in creating this narrative.

For all of these flaws, then, Dr. Edwards is put forward as the hero of the crisis, because of the very real risk that a flawed empire based on hierarchical expertise will be replaced by one that permits even more destructive behavior.  Bruno Latour’s 2004 article, “Has Critique Run out of Steam?  From Matters of Fact to Matters of Concern,” raises a similar point.  Latour wonders whether critiques of scientific knowledge, which have the potential to improve science, were instead allowed to become excuses for undermining certainty, and positive action, in areas like climate change.

The question for a scientist interested in the environment and health protection, reading a critique of scientific expertise, is this: will it lead to a better science, or will it lead to an undermining of scientific authority so complete that action on the environment, or on public health, becomes impossible?  Perhaps part of the answer depends on how successful science is in absorbing and embracing the critique, but would even that absorption be complete?

(Note: Originally published with an incorrect title for George Orwell’s story.)

Does looking at Flint and DC give a limited view of expertise and the public?

We’ve been looking, over most of this semester, in great detail at what went wrong in Flint and DC during their lead-in-water crises, and at the flawed rules governing lead in drinking water.  We’re getting one model of expertise and interaction with the public, but is it a complete model?

Looking at the discussion of how Rachel Carson prepared “Silent Spring” was a revelation.   Here we have a technical expert who had a goal, and a clearly calculated model for sharing her knowledge of the subject with the public.  (I’ve realized that I really need to read “Silent Spring,” even if the scientific information is not as relevant anymore.)

The interesting thing is that Ms. Carson might not be considered an expert in the scientific sense: she had no major discoveries.  She was, however, a skillful synthesizer of what was discovered by the scientific community.  Does that make her a truer “expert,” in the sense that she did not have an agenda of pushing her own work?  Are masterful science writers or science policy makers experts, publics, or somewhere in between?

Then there’s the case of climate scientists, who are at the opposite extreme from the experts we’ve been looking at.  Instead of scientists refusing to take knowledge from non-experts seriously, we’re dealing with non-scientists refusing to take “expert knowledge” seriously.  Is this just the result of a failure to message as effectively as Ms. Carson did?  Or is it somehow the flip side of the expert/non-expert divide we’ve been working around?

Do scientists learn to ignore non-experts by learning to ignore grad students?

Reading through Robert Musil’s “Rachel Carson and Her Sisters,” I was struck by how strongly the author argues that “Silent Spring” was heavily influenced by the network of colleagues Carson built throughout her life.

I couldn’t help but wonder why we are so reluctant to define scientists in this manner.  Popular accounts of scientists are often built around a “lone genius” stereotype of Einstein coming up with relativity on his own.  We do this even though we know that Einstein was swimming in a community of scientists questioning the limits of classical mechanics.

Does this model change how we perceive scientists and engineers interacting with non-experts?  We see plenty of discussion of how Marc Edwards and other experts engaged with the Flint water crisis.  These discussions obscure the fact that Professor Edwards functioned as part of a team of researchers at Virginia Tech when looking at the lead levels.  We are writing graduate students, postdoctoral researchers, and technicians out of the story.

If we are writing out the highly trained people who work with the lead scientist on a project like this, doesn’t that make it easier to write out the non-expert as well?  Ignoring the narrative of a lab head sharing authority with his or her grad students makes it easier to embrace a narrative where non-scientists are passive bystanders.

Does ignoring that scientists function internally with each other, with different levels of experience and expertise, make it easier to write off non-scientists as non-experts?

Reading about the Flint charges

Any time there is a major story on Flint, I find myself checking the Detroit newspapers to see the local coverage.  The Free Press had a story on just how rare the charges are:

http://www.freep.com/story/news/local/michigan/2017/06/14/criminal-charges-michigan-officials/395575001/

The first thing I found surprising was that the charges aren’t linked to the lead in the water, but to related water treatment issues that led to cases of Legionnaires’ disease.

The main point of the article, however, was that officials being prosecuted for endangering public health is very rare, while prosecutions for corruption are fairly normal.  Yet both are fundamentally failures to live up to public trust.

Perhaps it’s a little easier to cross the line to pressing charges when the case is “he or she took a suitcase full of cash” than when the case is “he or she knew this was serious and didn’t make the phone call they should have made.”

Or is there something else at work here?  As a scientist, I’m deeply uncomfortable with the idea of criminalizing a mistake in professional judgement.  The scientific community was deeply upset by the prosecution of geologists for “failing” to state the dangers of the aftershocks of an earthquake in Italy a few years ago.

http://www.nature.com/news/italian-seismologists-cleared-of-manslaughter-1.16313

Do we artificially extend the protection we should give to scientific judgement to cover moral judgement?  This seems to me to be linked to why these prosecutions are considered so unusual.


Is forming the public as “other” part of professional identity?

Ian Welsh and Brian Wynne’s article “Science, Scientism and Imaginaries of Publics in the UK: Passive Objects, Incipient Threats” raises many questions about how scientists, engineers, and other professionals respond to public opinion.  What I found interesting was how implicit the assumption seemed to be that the public was an “other” to scientific experts.   There were different models of how scientists viewed the public, but there did not seem to be a model where scientists viewed themselves as part of a public in decision-making.   Welsh and Wynne assume that expertise always creates a boundary.

This becomes puzzling when one appreciates that scientists presumably drink the same water, and breathe the same air, as the general public.  As an engineer working on commercial aircraft, I have seen this dynamic firsthand.  We flew on the same aircraft as the general public, with all of the same hassles.  We did have some knowledge that the public did not have: we knew, for instance, that an aircraft actually gets more fresh air per person than most buildings do, but we also knew why the aircraft could still be uncomfortable.  Yet the response was often to dismiss any suggestion that flying was not as comfortable as it could be.

And this was when we were very much “like” our fellow travelers, since both Boeing engineers and air travelers are largely middle-class.  What happens when the public is not made up of middle-class college graduates?  What happens when experts have educational, economic, racial, and religious backgrounds that are different from those of the people their expertise represents?  Is this gap even bridgeable?