“Trust us…”??

Growing up, it seems that all of our parents gave us advice about whom to trust: “firemen and policemen keep us safe”, “don’t take candy from strangers”, “respect your elders because they know what is best”. As we get older and begin to learn more about how the world works, we become better able to make our own judgments about who is trustworthy, or at least who should be. Ever since I was young, I have had a love of science, so I was drawn to my teachers and to family members who were scientists to feed my interest. I naively believed that they represented the greater scientific community: that they had integrity in the information they relayed to me, and that they were interested in remaining a trusted source of information for a ‘new generation’ like myself. Even during my fourth year as a Ph.D. student, I still had that naïve hope that scientists would be truthful, moral, and unbiased in their data collection and conclusions. What a difference 9 weeks makes.

A recurring theme of the past few weeks has been the idea that scientists and engineers are important creators and distributors of information in the public and policy arenas. As Resnik (2011) clearly outlines, trustworthiness is a virtue toward which practitioners must strive. To me, this ‘public trust’ in scientists and engineers (Resnik 2011) suggests an unwritten, tacit contract: data are provided to help members of society make informed decisions about their well-being, and any attempt by professionals to do otherwise would be dishonest and unethical (Harris et al. 2009). Reading through Harris et al. (2009), and thinking back on the issues involved in the DC Lead Crisis (e.g., Edwards 2010), I wonder whether the benefits of dishonesty (lying, data fabrication, etc.) outweigh the risks of getting caught. This is a purely ‘devil’s advocate’ question, but the benefits to the various parties involved in the DC Lead Crisis cannot be ignored: Tee Guidotti made money as a WASA consultant, the CDC was able to claim ‘authoritative’ status for its erroneous MMWR report (and use it as ammunition against any claims contrary to its findings), and EPA Region 3 was even given awards despite its despicable behavior. As Dr. David Lewis indicated during last week’s class, it is challenging to stand up, as he and Edwards did, against a monolith of bureaucracy, lies, and money while serving as the sole source of truth and objectivity. Given this unfortunate fact, these organizations (and the decision makers within them) seem to gain an inflated sense of invincibility, so there is no incentive for them to even attempt to be trustworthy (i.e., they can ‘lawyer up’ or resort to character assassination).

In spite of all of this, who were the losers? In the DC case, it was the public. Potentially thousands of children were impacted (Edwards 2010), and parents were unable to make informed choices on behalf of their children (Harris et al. 2009). Of course, Dr. Edwards lost time and money and had his character attacked as well, but the difference was that he was able to rise above it all and fight the system using his ethical judgment and scientific training. The actions of the CDC, WASA, DC DOH, and the multiple players within them were simply a ploy to discourage an informed response from those members of the public whose health was in the greatest jeopardy (and who thus had the least capability to stand up and be heard). These groups bet that no one would call them out, and mostly they won. Years after the crisis broke, one would expect these organizations to want to improve their image, given the obvious lack of trust they now have from the public; instead, they continue to laud their incompetent science by refusing to retract any reports and by giving themselves ‘gold medals’ (Edwards 2010). The public trust is important, but what the CDC, WASA, DC DOH, and other government agencies seem to forget is the “public trust fund” and the tax dollars that gave them their jobs in the first place.

Works Cited:

Edwards, M. 2010. “Experiences and observations from the 2001-2004 ‘DC Lead Crisis’”: Testimony. U.S. House of Representatives’ Committee on Science and Technology, 111th Congress.

Harris, C. E., Jr., et al. 2009. “Trust and Reliability.” In Engineering Ethics: Concepts & Cases, pp. 115-134. Belmont, CA: Wadsworth.

Resnik, D.B. 2011. “Scientific research and the public trust.” Science and Engineering Ethics 17: 399-409.

