Share the Power

In Street Science, Corburn deftly illustrates how the “technocratic” model of doing science in the public sphere fails the public and often falls short of producing “usable knowledge.”  Corburn asserts that a big part of the problem is the power dynamic between the scientist and the public.  Our class discussion yesterday raised similar issues regarding the distribution of power and decision making.  The presentations piqued my interest and reminded me of two arenas where I’ve heard or seen this concept advanced before, and which I think we can learn from.

In some fields at least, I think it is possible to conceive of the relationship between academia and the engineering consultant/practitioner as another level of the scientist-public dynamic that we discussed today.  As a practitioner, it is easy to become disillusioned with the type and quality of research performed by our universities.  Research is too often far afield from the problems and difficulties faced by the working engineer, as paging through many journals will attest.  This is not to say that knowledge isn’t pushed forward when we think outside the box.  However, the funding structures and decisions about what research to pursue are often made with little input from the practicing world.  A good example of how to break this paradigm exists in the geotechnical program at VT.  We host the Center for Geotechnical Practice and Research (CGPR), a partnership between private consulting firms and the department that funds multiple research projects each year.  Key to our discussion here, the funding decisions are made at yearly meetings where practitioners and professors discuss the need for and merit of various topics.  This is a simple yet powerful model of sharing power in academia, not to mention a great way to fund grad students.

A second and more closely related parallel springs from my experience with GRAD 5114 – Contemporary Pedagogy, a course offered by the Graduate School at VT.  As anyone who has taken it knows, the course revolves around the concept of learner-centered teaching.  It challenges the traditional approach to education, in which the teacher is seen as superior and the bearer of all knowledge, while the student is an uninterested, ignorant, empty vessel to pour knowledge into.  These descriptions of the teacher and student are much the same as those used by the deficit model of scientist-public interactions.  Learner-centered teaching breaks these notions, showing that students have much to contribute to the education process if given the agency to do so.  They learn much more when allowed to share responsibility for their learning with the teacher.  This is the same concept as the public contributing to science/research through their knowledge.  I could keep drawing parallels, but I would prefer to end by observing that much of the advice given to teachers trying learner-centered methods applies to “street science” as well.  This might include:

  • Involving the public/students in the work might be messy or unpredictable.
  • You must be willing to learn from or with them.
  • Expect resistance, both from the public/students and from other scientists/teachers.  People often don’t like new things.
  • Keep trying.  The rewards are worth the difficulties.

Ethics and Engineering Failures

During my first semester as a Ph.D. student in 2010, I authored a report with my advisors entitled “Lessons Learned from Dam Failures.”  This was a wonderful experience for me both as an engineer and an academic.  It was fascinating to see the many ways in which failure can occur and to consider the lessons that we as engineers should learn from past mistakes made by our profession.

Davis’s use of the Challenger failure to discuss codes and ethics in engineering reminded me of these dam failures in a couple of ways.  First, the engineers involved with some of the failures in the early 1900s expressed a need for wider accountability within the civil engineering profession.  For dam engineering, this eventually came in the form of state dam safety organizations and review boards.  I would be surprised, however, if the adoption of ASCE’s first code was not influenced to some extent by the prominent dam failures of that era, especially Austin Dam and South Fork Dam.  The latter, which killed over 2,200 people in the 1889 Johnstown Flood, was one of the worst engineering disasters in our country’s history.

My second observation bears on our consideration of almost all ethics violations, including the cases we’re studying as a class.  Davis points out that it is almost always difficult to pinpoint a single cause of a failure.  In our report, we quote Sowers (1979), who said of a landslide:

Often the final factor [in a slope failure] is nothing more than a trigger that sets a body of earth in motion that was already on the verge of failure. Calling the final factor the cause is like calling the match that lit the fuse that detonated the dynamite that destroyed the building the cause of the disaster.

As Davis and Sowers both point out, we need to be careful when considering failures like the Challenger, the DC Lead scandal, or the TCC case not to narrow our vision to a single cause or try to blame only one person or organization.  Almost always, the situation is vastly more complicated, with many shades of responsibility.

Spheres of Influence

After Thursday’s class, I began thinking about the boundaries of our ethical responsibility.  Do they exist, and if so, where?  Do we have an obligation to address every issue and problem we face?  That option is paralyzing.  Is there a caring, ethical, yet tenable path forward?

Let me throw out the concept of spheres of influence to help with this dilemma.  I’ve heard it expressed something like this: our moral obligation grows the closer a person or situation is to us, both physically and intellectually.  Our greatest duty is to those in our closest communities, and that duty diminishes as we move progressively farther out.  This never excuses me from responding to danger or harm immediately before me, regardless of whether it falls within my expertise.  Following this principle, I can’t walk right past a person in great distress, but I may not have to step up to face every issue.  Moving out to wider “spheres,” my responsibility narrows depending on my expertise.

Zooming out to the national or global scale, my biggest personal responsibility for justice and ethical issues relates to my professional expertise in, for example, slope stability.  In this way I’m not ethically bound to confront water pollution issues such as those in DC, but I should be ready to do something about landslides killing people.

For ethical problems outside my sphere of influence, I can still learn, becoming an educated member of the public.  In doing so, I can help others be informed about important issues and possibly advocate for justice.  For example, my wife and I are concerned about the state of agriculture in our country and the perils agribusiness poses to us and our communities.  On a large scale, this issue lies outside our sphere of influence.  On a local scale, we can make decisions, such as buying local food, that are within our influence.  On the other hand, this might be an issue that another scientist, say one in agriculture, is morally bound to confront actively.

I see this general framework as a way to act ethically without “passing the buck” yet at the same time not becoming paralyzed by the myriad ethical issues that face us each day.

Ethics in the Consulting World

This week’s readings on organizations have made me reflect on my experience working in a mid-sized geotechnical and geoenvironmental consulting firm.  While some of the statements by Harris et al. and Alford resonate, I’m not sure that all types of companies and engineers fit within their paradigms.

For example, I’m not sure that every organization or company is feudalistic.  In general, my experience in consulting was not that way.  I rarely, if ever, felt like a vassal under the thumb of an all-powerful engineering manager.  That’s not to say things were perfect, or that politics weren’t present.  But we had a relatively fluid structure and a significant amount of autonomy within the firm.  Because of its size, our firm’s owners were also the primary managers and the principal engineers, which meant that no real separation existed between the engineering staff and the managerial staff.

I was also intrigued by the distinction Harris et al. draw between managerial and engineering decisions.  In the consulting context I was exposed to, it is really difficult to think of many situations where this type of dilemma would have come up.  Our typical job was to provide geotechnical recommendations to an outside client, who could then choose to ignore them, for good or for ill.  During the design phase of projects, we would provide both general and specific recommendations as required by our contract, with little managerial consideration aside from engineering judgment calls – more about this in a minute.  On the construction end of a project, managerial and engineering roles are often split among different firms.  Managerial duties fall to the construction company or construction manager, while engineering duties are performed by separate firms with contracts directly with the owner or, sometimes, through the construction manager.  The final decision falls to the owner, with the combined advice of the other professionals on the project team.

Geotechnical engineers, arguably more than any other branch of civil engineering, must make engineering judgment calls based on limited data.  Are these ethical decisions?  Can they be examples of the engineering vs. managerial conflict presented by Harris?  At least sometimes, yes.

Let me give an example.  My firm was asked to provide construction observation and materials testing services (a lucrative and badly needed project) for a new shopping center being built by a good client of ours.  Design recommendations had been provided by another firm and included bearing foundations directly on the shale bedrock present on the site.  Based on nearby experience, our firm knew this shale to be expansive, meaning that it could cause serious structural issues for the buildings in the future.  We did not have data showing that the conditions were problematic, but in our best engineering judgment we could not go along in silence with a project we felt was doomed to problems down the road.  We tried convincing the client to change the design, but that proved cost-prohibitive for them.  In the end we decided to turn down the work, based on our engineering judgment and the managerial consideration of the potential liability incurred by being part of the project team.  The managerial concerns of pleasing a client and earning a profit were superseded by concerns rooted in engineering.

The legal quandary faced by whistle-blowers, which Alford describes, also has a parallel for many small consulting firms.  These firms often face lawsuits where they did nothing wrong, aside from being associated with a project where something went awry.  Whistle-blowers and small firms alike are, in a relative sense, the little guy in our expensive legal system.  While the lawsuits may come for very different reasons, neither party has the financial wherewithal to weather long lawsuits that are rarely decided in their favor.  As defendants, these firms almost always find it cheaper to settle by paying up than to fight the case in court.  This raises the question: Is it unethical to implicitly admit wrongdoing by settling just to save money?  More importantly, what can be done about our system to remedy these situations?

Thinking Clearly About Moral Humility

In his November 2011 TED talk, Nitin Nohria claims: “We haven’t really understood moral over-confidence.”  While I agree with Nohria’s call for moral humility, I believe it is more truthful to say that we have forgotten our tendency toward immorality.  As Nohria terms it, we suffer from moral overconfidence; we think that only “those bad people” do wrong things.

By forgotten, I mean that people, religions, and philosophies have long recognized that humankind is NOT inherently good and is certainly capable of, in fact rather prone to, evil.  Consider, for example, the concept of “total depravity” – that man is completely unable to do good apart from God’s help – taught by the likes of Augustine and Calvin based on their understanding of the Bible.  While some see this belief as arrogant or judgmental, these men were actually practicing moral humility, seeing the penchant for evil in themselves and the people around them.  Whether or not you believe in this or similar doctrines, I think Nohria’s point helps to show us the truth about ourselves.  We too easily overestimate ourselves and our ability to “do the right thing.”

Lewis makes a similar point in “The Inner Ring” by pointing out how quickly we can become “scoundrels,” oftentimes unwittingly and despite our intentions.  In Chapter 1 of his book Mere Christianity, Lewis echoes Nohria:

“None of us are really keeping the Law of Nature…I am only trying to call attention to a fact; the fact that this year, or this month, or, more likely, this very day, we have failed to practice ourselves the kind of behavior we expect from other people…They [human beings] know the Law of Nature; they break it.  These two facts are the foundation of all clear thinking about ourselves and the universe we live in.”

Both Nohria and Lewis call us to think clearly and truthfully about our own hearts and, in the context of this class, about the decisions we make as engineers and scientists.  A big part of the battle is simply admitting our own wrongdoing, or our tendency toward it.  We are more likely to see the moral and ethical implications of our engineering decisions if we cultivate this type of mindset.  But humility, and the courage to act out of that humility, are not easily come by.  Nohria seems to tell us to simply try harder and somehow we’ll make it.  Others, myself included, know that we need outside help to overcome this hurdle.

Engineer-ese

Since part of this class is about the ethics of communicating clearly with the public, I thought it appropriate to post this video, which I received a link to during the first week of class.  While many may have already seen it, it’s always good to step back and evaluate how we as engineers and scientists sometimes sound to the public.

Enjoy!

A New Chapter – Engineering Ethics

So I’m off on another chapter in my short and sporadic blogging career – a class in engineering ethics.  I’m actually quite enthused about the class and its pedagogy.

I was inspired earlier today by the stories of Palchinsky and Cuny told in the first chapter of Harris et al. (2009).  In addition to being top-notch engineers, these men devoted much of their lives to helping people.  They observed need within their workplace or world and then acted within their capabilities as engineers to alleviate that need, in both cases to their own ultimate peril.  Too seldom in engineering are we encouraged to consider the plight of those less privileged than ourselves, or how our engineering decisions can change those circumstances.

Another point in the Harris book struck me as well, regarding the importance of case studies in fields like engineering.  Our problems are real and use-oriented.  They impact real people and are inherently not abstract.  Yet too often we approach ethics from an ivory tower, with moral theory and philosophical posturing.  Studying real cases helps us to resist that temptation and live in the concrete reality of complicated situations in a messed-up world.  I’m especially excited about exploring this through the two case studies in this class.  Pedagogically speaking, I’m intrigued by the idea of plunging students into a case like that in Tonawanda, and I’m curious to see how this immersion-type learning actually works out in person.