
The Ethical Failures Behind the Boeing Disasters

Two Boeing 737 MAX 8 airplanes crashed shortly after takeoff: on October 29, 2018, near Jakarta, Indonesia, and on March 10, 2019, near Addis Ababa, Ethiopia. The disasters cost the lives of 346 passengers and crew. Black box data recovered from the two planes indicate that bad engineering practices and surprisingly simple design errors contributed to both calamities. The Boeing 737 MAX 8 had only recently gone into service, in May 2017.

The question I wish to raise is whether anyone at Boeing behaved unethically in approving the plane for sale. My tentative answer is yes. I believe that at least three ethical principles may have been violated by Boeing engineers and managers.

The Fundamental Canon

According to the first “fundamental canon” of the National Society of Professional Engineers (NSPE) Code of Ethics, engineers “shall hold paramount the safety, health and welfare of the public.” According to preliminary findings from the ongoing investigations leaked to the New York Times, both disasters were caused by a single faulty sensor, which triggered a new automatic anti-stall system to repeatedly push the plane’s nose down. Several newspapers have reported that Boeing until recently charged extra for relatively simple and cheap warning displays in the cockpit that alert pilots to divergent sensor readings. If such displays had been installed on the two 737 MAX 8s that crashed, it is likely (though not certain) that the pilots would have been able to diagnose the malfunctioning anti-stall system. An aircraft manufacturer that attempts to increase its profit by charging extra for relatively simple but vital safety devices does not “hold paramount the safety, health, and welfare of the public.”

Does it matter that the decision to charge extra for the displays was most likely made by managers in the sales department rather than by engineers? This is likely to depend on what opinions engineers expressed as the decision was made. The NSPE Code clearly states: “If engineers’ judgment is overruled under circumstances that endanger life or property, they shall notify their employer or client and such other authority as may be appropriate.”

There is, of course, a limit to how much money aircraft manufacturers can be asked to spend on making their products safe, but that does not seem to have been a relevant consideration in this case. Compare, for instance, the automobile industry. Consumers are permitted to buy cars that are less safe than the safest models on the market, but regulators do not permit manufacturers to offer cheap and simple safety systems as optional upgrades. Seatbelts, ABS brakes, and airbags are mandatory equipment in all new cars sold in almost all countries. The plausible idea that engineers shall “hold paramount the safety … of the public” explains why this is so.

Informed Consent

Pilots were never informed that the 737 MAX 8 had been equipped with the new automatic anti-stall system, nor that it could be activated by a faulty reading from a single sensor. Because pilots did not know that the automatic anti-stall system existed, they were unable to understand why the onboard computers repeatedly pushed the nose of the jet down. This can be construed as a violation of the principle of informed consent. Just as doctors are obliged to ask patients for informed consent prior to any medical intervention, aircraft manufacturers arguably have a similar obligation to make sure that pilots responsible for the safe operation of their products are properly informed about all critical systems, and consent to using systems that take away control from the pilots ultimately responsible for the safety of the passengers.

The principle of informed consent is widely accepted in medical ethics but arguably deserves more attention by engineering ethicists. It is, for instance, uncontroversial to demand that cell phone manufacturers ought to ask customers for consent before their gadgets share the cell phone’s position with third parties. This moral requirement can be understood as an application of the principle of informed consent. That said, the principle of informed consent is sometimes not as easy to apply in engineering contexts as in medical ethics. The doctor-patient relationship is more direct and predictable than the engineer-user relationship. Engineers seldom interact directly with the user and technological devices are sometimes (mis)used in ways that cannot be reasonably foreseen by engineers.

The Precautionary Principle

The third ethical principle violated by Boeing is the precautionary principle. Several days after the 737 MAX 8 was grounded by aviation authorities around the world, Boeing CEO Dennis Muilenburg called President Trump to assure him that there was no need to ground the model in the United States. It was still unclear what had caused the crashes, Muilenburg claimed. From this epistemic premise, he inferred that it was too early to take action. For several days, the Federal Aviation Administration agreed with this policy. The regulators claimed that foreign civil-aviation authorities had not “provided data to us that would warrant action.”

According to a plausible formulation of the precautionary principle I defend in The Ethics of Technology, “reasonable precautionary measures” should be taken by engineers and others “to safeguard against uncertain but nonnegligible threats.” Few would dispute that it would have been a reasonable precautionary measure to ground the 737 MAX 8 immediately after the second crash. If two brand new airplanes of the same model crash within months of each other under what appear to be similar circumstances, regulators do not have to wait until they know for sure what caused the crashes before they take action. The second crash changed the epistemic situation enough to warrant action, even if it did not prove that the anti-stall system was to blame.

To avoid some of the objections associated with the precautionary principle, it is appropriate to think of it as an epistemic principle rather than as a principle that should directly guide our actions. In essence, it is better (from a moral point of view) to believe that something is unsafe when it is not, than to believe that something is safe when it is not. If construed as a belief-guiding principle grounded on moral consideration, the precautionary principle is compatible with the principle of maximizing expected value. We should first adjust our beliefs about the world by applying the precautionary principle and then maximize expected value relative to those modified beliefs.
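The two-step procedure described above can be illustrated with a small worked example. This is my own hypothetical sketch, not anything from the post: the probabilities, payoffs, and the `precaution_factor` parameter are invented purely to show how a belief-guiding precautionary principle can coexist with expected-value maximization.

```python
# Hypothetical illustration of the two-step procedure: first adjust
# beliefs in a precautionary direction, then maximize expected value
# relative to the adjusted beliefs.  All numbers are invented.

def adjust_belief(p_unsafe, precaution_factor=2.0):
    """Belief-guiding precaution: inflate the believed probability that
    the system is unsafe, reflecting that it is morally worse to believe
    'safe' when it is not than the reverse.  The factor is arbitrary."""
    return min(p_unsafe * precaution_factor, 1.0)

def expected_value(p_unsafe, value_if_safe, value_if_unsafe):
    """Standard expected value over the two possible states."""
    return (1 - p_unsafe) * value_if_safe + p_unsafe * value_if_unsafe

# After two similar crashes: an uncertain but nonnegligible threat.
p_unsafe_adjusted = adjust_belief(0.05)  # raw estimate 0.05 -> 0.10

# Option A: keep flying (catastrophic downside if the model is unsafe).
ev_keep_flying = expected_value(p_unsafe_adjusted,
                                value_if_safe=10,
                                value_if_unsafe=-1000)
# Option B: ground the fleet (certain economic cost, no catastrophe).
ev_ground = -50

decision = "ground" if ev_ground > ev_keep_flying else "keep flying"
print(decision)  # -> ground
```

With the raw probability of 0.05, keeping the fleet flying would have the higher expected value; the precautionary adjustment of beliefs is what tips the calculation toward grounding, which matches the claim that the principle operates on beliefs rather than directly on actions.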

Addendum: In my recently published textbook Ethics for Engineers, I discuss all three principles mentioned in this post in greater detail.

Martin Peterson

Martin Peterson is Sue G. and Harry E. Bovay Jr. Professor of the History and Ethics of Professional Engineering in the Department of Philosophy at Texas A&M University. His most recent book is Ethics for Engineers (New York: Oxford University Press, 2019).

8 COMMENTS

  1. Ralph Nader’s Unsafe at Any Speed comes to mind. Manufacturing and service industries ought to be ethical first. How many ergonomics engineers work for any company? Does a company steer between underdoing and overdoing ethics? My book, Unified Philosophy: Interdisciplinary Metaphysics, Cyberethics, and Liberal Arts, notes the human factors engineering approach to philosophy, metaphysics, and ethics. Let’s expand ethics: we ought to significantly restrict, and always think about the extent of, pilotless aircraft, cars, etc., and piloted but unsafe aircraft. Even the mechanically safest planes, cars, etc., are unethical if totally computerized and robotized.

  2. People in general, not just ethicists or philosophers, react after hearing news about tragedies. This is especially true when fatalities occur. Currently, philosophers and others are discussing a company’s unethical behavior leading to the crash of its misdesigned commercial jet. Decades ago, it was the misdesigned auto where the gas tank exploded on impact. But professionals misdesign cities and other aspects of civilization which do not result in immediate fatalities. Urban sprawl is unethical, gobbling up the environment and forcing people to travel long distances requiring much energy.
    Bureaucracy is unethical. Persons deal with clerk after clerk, procedure after procedure, department after department. We have misdesigned processes and complicated life. Was it Parkinson’s Law that said work expands to fill the time available? Ergonomics needs an expanded definition. I would replace “human factors” engineering with “limit factors” engineering. I am writing about this. Thomas Aquinas, who was too early to get an engineering degree from MIT, Caltech, or other engineering school, warned that laws ought to be ethical, and not just legal.

  3. Martin Peterson and others: I share your concern with potential unethical actions by somebody or somebodies that lead to the two disastrous 737 crashes and am well aware of many engineering codes’ statements that protection of health, safety, and welfare is paramount. However, we need to know a lot more about this disaster and the players before we can invoke a code obligation to protect the public.

    I am not being legalistic, just realistic.

    Maybe what I think we need to know is in your/Martin’s book – congratulations, by the way, on its publication.

    Some questions:

    1) Were any of the engineers licensed? If yes, that code provision or something very close to it probably applies.
    2) Were any of the engineers members of NSPE? If yes, that code provision applies.
    3) Were any of the engineers members of discipline-specific engineering societies like ASME, IEEE, etc.? If yes, that code provision applies.
    4) Does Boeing have a code and, if so, what does it say? There may be a “protect public” provision.

    Given manufacturing/industrial exemptions, many engineers are oblivious to licensure, professional societies, and codes. Even so, they may have a sense of humanity. However, I studied the GM ignition switch disaster and question the basic humanity of that engineering team – it denied and dithered for a decade while people died or were injured.

  4. These accidents were the culmination of many factors, including a flight deck crew that was not prepared to handle a simple sensor malfunction, and a maintenance crew that did not diagnose and repair a simple sensor malfunction the day before one of the accidents. Why were some pilots able to identify the malfunction quickly while others could not at all? Pilots are there for the very reason that machines do malfunction. They need to be prepared to handle these situations.

    • Agreed.

      I’m not sure how the entire blame is being put on Boeing for this.

      In the USA, there were hundreds of flights a day of the 737 MAX 8, yet no crashes, compared to maybe dozens a day in these other countries combined. Those stats tell me one thing: pilot error!

      • In Boeing internal memos released, it came to light that Lion Air wanted more training on the MAX, but Boeing pulled a “Jedi mind trick” and talked them out of it.
        They deserve the majority of the blame.

  5. It is a known fact that Boeing is schedule driven. Why not reverse the procedure and put quality first, and then work to attain quantity? What does the Boeing company have to lose with this switch? The latest 737-9 incident is a good example of the change that is needed in order to prevent continuing losses to the Boeing company.

