Something’s missing

I appreciate the even-handedness of the discussion of moral theory that we’ve had in class.  We’ve all been given some more tools to better assess the ethical decisions and dilemmas before us, and better yet to understand the philosophies that others bring to the table.  We’ve seen that none of these theories is foolproof, yet each provides a useful perspective on our moral decisions.

I find myself drawn to the concept of Care Ethics.  It moves away from the cold, calculating ideals of many other theories to a place where moral decisions are made with both our head and our heart.  While it may be relatively new as an “officially” articulated moral theory, it seems that care ethics has been around in practice for a very long time.  It seems to embody the way that many people actually make decisions, whether consciously or not.  Care ethics also seems very close to the Christian idea that only “faith working through love” counts for anything (Galatians 5:6).  Actions are to be guided by love, which is really just a stronger form of the “care” ideal.

However, at a deeper level, something is missing from the discussion.  Care ethics, at least as presented by van de Poel and Royakkers, lacks justification for its claims.  It champions the moral significance of our relationships with other people.  People have dignity, and thus care for them is at the heart of moral action.  But it must be asked: why are relationships good?  Whence comes humankind’s inherent dignity?  Similar questions arise with utilitarianism, duty ethics, and virtue ethics:

  • What definitions of good, evil, pleasure, pain, and happiness are to be used within the utilitarian rubric?
  • Why is Mill’s freedom principle morally right?
  • On what basis do we accept Kant’s concept of good will as true or moral?
  • Who defines what is virtuous or not?
  • How does one judge what is virtue to begin with?

The answers to these and many other questions regarding the starting points of moral theory are not philosophically self-evident.  They don’t answer themselves.  They beg for the existence of objective moral truths or force us to acknowledge that, at the very core, our moral judgments rest on a relativistic foundation.  In their discussion of moral theory, van de Poel and Royakkers don’t tackle the question of absolute vs. subjective truth.  And maybe they didn’t intend to.  Similarly, this class has whetted our appetite for the intriguing world of moral philosophy, and its job is not to answer these questions.  But I do believe that as we wade through the world of moral philosophy, we must all struggle with these epistemological issues – what we believe and how we know it to be true – before, or at least concurrently with, our striving to choose an appropriate moral worldview or lens.  If not, we will ultimately find ourselves confused about the most basic assumptions of our moral framework or stuck in the morass of relativism.

Disillusioned in Blacksburg?

(Cartoon from Union of Concerned Scientists)

Dear Government Agency,

I was taught to believe that your purpose for existence was to watch out for the public good.  I thought that your mission was to serve selflessly, unbiased, unswayed by the concerns of private interest.  You were supposed to be the policeman on the corner to protect us from the big bad bullies.  Where were you when the coke plant next door belched toxic fumes into the air every 30 minutes and claimed to be law-abiding?  Where were you when the paper company polluted our streams?  Why didn’t you act when you knew the drinking water was polluted?  I just don’t understand.

Disillusioned in Blacksburg

While this isn’t exactly my reaction, it is the sentiment I’ve heard echoing about our classroom and from others who have taken this class before, not to mention folks like the Bresslers or CACWNY.  Regardless of our own prior opinions of the competencies of government agencies, we’ve all seen the blatant failures of the EPA, NYDEC, CDC etc. over the past few weeks.

The situation raises the question, “Is there anything to be done?”  In Thursday’s video conference, Erin Heaney asserted that the activist work CACWNY is doing is a civic duty.  Since we can’t hope for government officials to do their jobs without oversight, we must police them as citizens.  This may be a workable solution, but it is certainly not an ideal one.

My thoughts turned to the way that engineering duties are allocated on state highway projects in Ohio, where I worked for seven years.  On road projects, technicians must test construction materials and observe activities to assure compliance with specifications.  These duties are roughly analogous to the enforcement of pollution control laws.  In most cases, the Ohio Department of Transportation (ODOT) does not carry out these tasks in-house.  Rather, the responsibility is subcontracted to private-sector engineering firms.  The engineering design of roads and bridges is often contracted out in the same way, with the state providing oversight but most of the work being done by private firms.  Organizations like ODOT still wield most of the power, but some of their “big stick” is transferred to these firms.  The firms have all the incentives of private practice to do their jobs well and efficiently.  Unlike their public-sector counterparts, the engineers and technicians in these private firms can easily be fired if they underperform or fail to do their work.

I’m sure there would be plenty of sticky details applying this sort of model to environmental regulation.  Maybe it is done this way in some locales.  Still, privatization of enforcement seems like it might be a practical way to get the work done.  It also would allow the work to be performed by local firms with more familiarity with a region’s political, economic, and social climate than regulators from a state or federal agency.  Perhaps it might boil down to convincing the government to share the power – not necessarily an easy task.

Validate your experiment here.

A news blurb about “The Reproducibility Initiative” caught my attention a few days ago.  Started by cancer researcher Elizabeth Iorns, this program offers scientists a means of validating their experiments at outside laboratories.  Validation tests are performed by one of more than 1000 expert providers (including VT for some tests) through the Science Exchange.  The tests are performed blind, so there are no worries about “stealing” ideas or results, and are provided on a fee-for-service basis.  Certificates are provided for experiments that are found to have reproducible results.  This type of validation has apparently been endorsed by the likes of Nature and PLOS One.

(image from The Science Exchange)

Is this type of validation the wave of the future in scientific research?  Will all studies be subject to outside review at this level?  What are the benefits?

  • Anomalous results would be highlighted by the verification tests.
  • May help to improve the quality of published research.
  • Gives outsiders an independent verification that data is correct.  The public doesn’t have to solely trust in affiliation, funding source, publication reputation, etc. (Corburn p.67) to assure them.
  • Consortia like Science Exchange give even small organizations access to high-tech research level tests and knowledge without having to own the equipment and expertise themselves.

Are there drawbacks to such rigorous validation?  Probably, yes.

  • It doesn’t solve most issues related to the science-public power dynamic.
  • Requiring validation would be inherently distrustful, not to mention expensive.
  • Core ethical issues of how to practice science are not addressed; rather, honesty is enforced through policing of results.

Any thoughts??

Share the Power

In Street Science, Corburn deftly illustrates how the “technocratic” model of doing science in the public sphere fails the public and also often falls short of producing “usable knowledge.”  Corburn asserts that a big part of the problem is the power dynamic between the scientist and the public.  Our class discussion yesterday raised similar issues regarding the distribution of power and decision making.  These presentations piqued my interest and reminded me of two arenas where I’ve heard or seen this concept advanced before, which I think we can learn from.

In some fields at least, I think it is possible to conceive of the relationship between academia and the engineering consultant/practitioner as another level of the scientist-public dynamic that we discussed today.  It is easy as a practitioner to become disillusioned with the type and quality of research that is performed by our universities.  Research is too often so far afield from the problems and difficulties faced by the working engineer.  This is easily attested to by paging through many journals. This is not to say that knowledge isn’t pushed forward when we think outside the box. However, the funding structures and decisions about what research to pursue are often made with little input from the practicing world.  A good example of how to break this paradigm exists in the geotechnical program at VT.  We host the Center for Geotechnical Practice and Research (CGPR), which is a partnership between private consulting firms and the department.  This center funds multiple research projects each year.  Key to our discussion here, the funding decisions are made at yearly meetings between the practitioners and professors where the need and merit of various topics are discussed.  This is a simple, yet powerful, model of sharing power in academics, not to mention a great way to fund grad students.

A second and more closely related parallel springs from my experience with the course GRAD 5114 – Contemporary Pedagogy, offered by the Graduate School at VT.  As anyone who has taken it knows, the course revolves around the concept of learner-centered teaching.  This course challenges the traditional approach to education, in which the teacher is seen as superior and the bearer of all knowledge, while the student is disinterested, ignorant, and an empty vessel to pour knowledge into.  These descriptions of the teacher and student are much the same as those used by the deficit model of scientist-public interactions.  Learner-centered teaching breaks these notions, showing that students have much to contribute to the education process, if given the agency to do so.  They learn much more when allowed to share responsibility for their learning with the teacher.  This is the same concept as the public contributing to science/research through their knowledge.  I could keep drawing parallels, but would prefer to end by observing that much of the advice given to teachers trying learner-centered methods is applicable to “street science” as well.  This might include:

  • Involving the public/students in the work might be messy or unpredictable.
  • You must be willing to learn from or with them.
  • Expect resistance, both from the public/students and from other scientists/teachers.  People often don’t like new things.
  • Keep trying.  The rewards are worth the difficulties.

Ethics and Engineering Failures

During my first semester as a Ph.D. student in 2010, I authored a report with my advisors entitled “Lessons Learned from Dam Failures.”  This was a wonderful experience for me both as an engineer and an academic.  It was fascinating to see the many ways in which failure can occur and to consider the lessons that we as engineers should learn from past mistakes made by our profession.

Davis’s use of the Challenger failure to discuss codes and ethics in engineering reminded me of these dam failures in a couple of ways.  First, the engineers involved with some of the failures in the early 1900s expressed a need for wider accountability within the civil engineering profession.  For dam engineering, this eventually came in the form of state dam safety organizations and review boards.  I would be surprised, however, if the adoption of ASCE’s first code was not influenced to some extent by the prominent dam failures of that era, especially Austin Dam and South Fork Dam.  Killing over 2,200 people, the latter was one of the worst engineering disasters in our country’s history.

My second observation has bearing on our consideration of almost all ethics violations, including the case studies we’re studying as a class.  Davis points out that it is almost always difficult to pinpoint a single cause of a failure.  In our report, we quote Sowers (1979) who said, concerning a landslide,

Often the final factor [in a slope failure] is nothing more than a trigger that sets a body of earth in motion that was already on the verge of failure. Calling the final factor the cause is like calling the match that lit the fuse that detonated the dynamite that destroyed the building the cause of the disaster.

As Davis and Sowers both point out, we need to be careful when considering failures, like the Challenger or the DC Lead scandal or the TCC case, not to narrow our vision to a single cause or try to blame only one person or organization.  Almost always, the situation is vastly more complicated with many shades of responsibility.

Spheres of Influence

After Thursday’s class, I began thinking about the boundaries of our ethical responsibility.  Do they exist, and if so, where?  Do we have an obligation to every issue and problem we face?  That option is paralyzing.  Is there a caring, ethical, yet tenable, path forward?

Let me throw out the concept of spheres of influence to help with this dilemma.  I’ve heard it expressed something like this: we have more moral obligation the closer a person or situation is to us, both physically and intellectually.  Our duty is greatest to those in our closest communities and diminishes as one goes progressively further out.  This principle never excuses neglecting danger or harm immediately before me, regardless of whether it falls within my expertise or not.  Following it, I can’t walk right past a person in great duress, but I may not have to step up to face every issue.  Going out to wider “spheres,” my responsibility narrows depending on my expertise.

Zooming out to the national or global scale, my biggest personal responsibility for justice and ethical issues is related to my professional expertise of slope stability, for example.  In this way I’m not ethically bound to confront water pollution issues such as those in DC but I should be ready to do something about landslides killing people.

For ethical problems outside my sphere of influence, I can still learn, becoming an educated member of the public.  In doing so, I can help others be informed about important issues and possibly advocate for justice.  For example, my wife and I are concerned about the state of agriculture in our country and the perils of agribusiness to us and our people.  On a large scale, this issue lies outside of our sphere of influence.  On a local scale, we can make decisions such as buying local food that are within our influence.  On the other hand, this issue might be an important moral issue for another scientist, say in agriculture, to be active in confronting.

I see this general framework as a way to act ethically without “passing the buck” yet at the same time not becoming paralyzed by the myriad ethical issues that face us each day.

Ethics in the Consulting World

This week’s readings on organizations have made me reflect on my experience working in a mid-sized geotechnical and geoenvironmental consulting firm.  While some of the statements by Harris et al. and Alford resonate, I’m not sure that all types of companies and engineers fit within their paradigms.

For example, I’m not sure that every organization/company is feudalistic.  In general, my experience in consulting was not that way.  I rarely if ever felt like a vassal under the thumb of an all-powerful engineering manager.  That’s not to say things were perfect, or that politics weren’t present.  But we had a relatively fluid structure and a significant amount of autonomy within the firm.  Because of its size, our firm’s owners were also the primary managers and the principal engineers.  This meant that no real separation existed between the engineering staff and the managerial staff.

I was also intrigued by the distinction between managerial and engineering decisions made by Harris et al.  In the context of consulting engineering that I was exposed to, it is really difficult to think of many situations where this type of dilemma would have come up.  Our typical job was to provide geotechnical recommendations to an outside client.  The client could choose to ignore these recommendations, for good or for bad.  During the design phase of projects, we would provide both general and specific recommendations as required by our contract with little managerial consideration, aside from engineering judgment calls – more about this in a minute.  On the construction end of a project, the managerial and the engineering are often separated among different firms.  Managerial duties fall to the construction company or manager, while engineering duties are performed by separate firms with contracts directly with the owner or sometimes through the construction manager.  The final decision falls to the owner with the combined advice of the other professionals on the project team.

Geotechnical engineers, arguably more than any other branch of civil engineering, must make engineering judgment calls based on limited data.  Are these ethical decisions?  Can they be examples of the engineering vs. managerial conflict presented by Harris?  At least sometimes, yes.

Let me give an example.  My firm was asked to provide construction observation and materials testing services (a lucrative and badly needed project) for a new shopping center being built by a good client of ours.  Design recommendations had been provided by another firm and included bearing foundations directly on the shale bedrock present on the site.  Based on nearby experience, this shale was known by our firm to be expansive, meaning that it could cause serious structural issues to the buildings in the future.  We did not have data showing that the conditions were problematic, but in our best engineering judgment could not go along in silence with a project we felt was doomed to problems in the future.  We tried convincing the client to change the design but that proved cost-prohibitive for them.  In the end we decided to turn down the work based on our engineering judgment and the managerial consideration of the potential liability incurred by being part of the project team.  The managerial concerns of pleasing a client and potential profits were superseded by concerns rooted in engineering.

The legal quandary faced by whistle-blowers, which Alford describes, also has a parallel for many small consulting firms.  These firms often face lawsuits on projects where they did nothing wrong, aside from being associated with a project where something went awry.  Both parties are, in a relative sense, the little guy in our expensive legal system.  While the lawsuits may come for very different reasons, neither party has the financial wherewithal to weather long lawsuits that rarely decide in their favor.  In their position as defendants, it is almost always cheaper to settle the case by paying up than to fight it in court.  The question arises: Is it unethical to implicitly admit wrongdoing by settling just to save money?  More importantly, what can be done about our system to remedy these situations?

Thinking clearly about moral humility

In his November 2011 TED talk, Nitin Nohria claims: “We haven’t really understood moral over-confidence.”  While I agree with Nohria’s call for moral humility, I believe it is more truthful to say that we have forgotten our tendency toward immorality.  As Nohria terms it, we suffer from moral overconfidence.  We think that only “those bad people” do wrong things.

By forgotten, I mean that people, religions, and philosophies have long recognized that humankind is NOT inherently good and is certainly capable of, in fact rather prone to, evil.  Consider, for example, the concept of “total depravity” – that man is completely unable to do good apart from God’s help – taught by the likes of Augustine and Calvin based on their understanding of the Bible.  While some see this belief as arrogant or judgmental, these men were actually practicing moral humility, seeing the penchant for evil in themselves and the people around them.  Whether or not you believe in this or similar doctrines, I think Nohria’s point helps to show us the truth about ourselves.  We too easily overestimate ourselves and our own ability to “do the right thing.”

Lewis makes a similar point in “The Inner Ring” by pointing out how quickly we can become “scoundrels,” oftentimes unwittingly and despite our intentions.  In Chapter 1 of his book Mere Christianity, Lewis echoes Nohria:

“None of us are really keeping the Law of Nature…I am only trying to call attention to a fact; the fact that this year, or this month, or, more likely, this very day, we have failed to practice ourselves the kind of behavior we expect from other people…They [human beings] know the Law of Nature; they break it.  These two facts are the foundation of all clear thinking about ourselves and the universe we live in.”

Both Nohria and Lewis call us to think clearly and truthfully about our own hearts, and in the context of this class, the decisions we make as engineers and scientists.  A big part of the battle is simply admitting our own wrongdoing or tendency for it.  We are more likely to see the moral/ethical implications of our engineering decisions if we cultivate this type of mindset.  But humility, and the courage to act out of that humility, are not easily come by.  Nohria seems to tell us to simply try harder and somehow we’ll make it.  Others, myself included, know that we need outside help to overcome this hurdle.


Since part of this class is about the ethics of communicating clearly with the public, I thought it appropriate to post this video that I received a link to during the first week of class.  While many may have already seen it, it’s always good to step back and evaluate how we sometimes sound to the public as engineers and scientists.


A New Chapter – Engineering Ethics

So I’m off on another chapter in my short and sporadic blogging career – a class in engineering ethics.  I’m actually quite enthused about the class and its pedagogy.

I was inspired earlier today by the stories of Palchinsky and Cuny told in the first chapter of Harris et al. (2009).  In addition to being top-notch engineers, these men devoted much of their lives to helping people.  They observed need within their workplace or world and then acted within their capabilities as engineers to alleviate that need, in both cases to their own ultimate peril.  Too seldom in engineering are we encouraged to consider the plight of those who have received less privilege than ourselves, or how our engineering decisions can change those circumstances.

Another point in the Harris book also struck me, regarding the importance of case studies in fields like engineering.  Our problems are real and use-oriented.  They impact real people and are inherently not abstract.  Yet too often we approach ethics from an ivory tower with moral theory and philosophical posturing.  Studying real cases helps us to break that temptation and live in the concrete reality of complicated situations in a messed-up world.  I’m especially excited about exploring this through the two case studies in this class.  Pedagogically speaking, I’m intrigued by the idea of plunging students into a case like that in Tonawanda, and am curious to see how this immersion-type learning actually works out in person.