April 21, 2019 01:34 am

'How the Boeing 737 Max Disaster Looks to a Software Developer'

Slashdot reader omfglearntoplay shared this article from IEEE Spectrum. In "How the Boeing 737 Max Disaster Looks to a Software Developer," pilot (and software executive) Gregory Travis argues that Boeing tried to avoid costly hardware changes to its 737s with a flawed software fix -- specifically, the Maneuvering Characteristics Augmentation System (or MCAS):

It is astounding that no one who wrote the MCAS software for the 737 Max seems even to have raised the possibility of using multiple inputs, including the opposite angle-of-attack sensor, in the computer's determination of an impending stall. As a lifetime member of the software development fraternity, I don't know what toxic combination of inexperience, hubris, or lack of cultural understanding led to this mistake. But I do know that it's indicative of a much deeper problem. The people who wrote the code for the original MCAS system were obviously terribly far out of their league and did not know it.

So Boeing produced a dynamically unstable airframe, the 737 Max. That is big strike No. 1. Boeing then tried to mask the 737's dynamic instability with a software system. Big strike No. 2. Finally, the software relied on systems known for their propensity to fail (angle-of-attack indicators) and did not appear to include even rudimentary provisions to cross-check the outputs of the angle-of-attack sensor against other sensors, or even the other angle-of-attack sensor. Big strike No. 3...

None of the above should have passed muster. None of the above should have passed the "OK" pencil of the most junior engineering staff... That's not a big strike. That's a political, social, economic, and technical sin... The 737 Max saga teaches us not only about the limits of technology and the risks of complexity, it teaches us about our real priorities. Today, safety doesn't come first -- money comes first, and safety's only utility in that regard is in helping to keep the money coming.

The problem is getting worse because our devices are increasingly dominated by something that's all too easy to manipulate: software... I believe the relative ease -- not to mention the lack of tangible cost -- of software updates has created a cultural laziness within the software engineering community. Moreover, because more and more of the hardware that we create is monitored and controlled by software, that cultural laziness is now creeping into hardware engineering -- like building airliners. Less thought is now given to getting a design correct and simple up front because it's so easy to fix what you didn't get right later.

The article also points out that "not letting the pilot regain control by pulling back on the column was an explicit design decision. Because if the pilots could pull up the nose when MCAS said it should go down, why have MCAS at all?" MCAS, it notes, "is implemented in the flight management computer, even at times when the autopilot is turned off, when the pilots think they are flying the plane."
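To make the cross-check criticism concrete, here is a minimal sketch of the kind of redundancy logic the article says was missing: compare both angle-of-attack sensors and refuse to command automatic nose-down trim when they disagree. The function name, thresholds, and trim values are illustrative assumptions, not Boeing's actual design or code.

```python
# Hypothetical sketch of a sensor cross-check, assuming two angle-of-attack
# (AoA) sensors feed the trim logic. All names and numbers are illustrative.

AOA_DISAGREE_LIMIT_DEG = 5.5   # assumed maximum allowed sensor disagreement
STALL_AOA_DEG = 14.0           # assumed angle at which nose-down trim triggers

def mcas_trim_command(aoa_left_deg: float, aoa_right_deg: float) -> float:
    """Return a nose-down trim increment, or 0.0 if no action is safe."""
    # Cross-check: if the two sensors disagree, distrust both and do nothing,
    # leaving the pilots in control rather than trimming on bad data.
    if abs(aoa_left_deg - aoa_right_deg) > AOA_DISAGREE_LIMIT_DEG:
        return 0.0

    # Use the average of the two validated readings, not a single sensor.
    aoa = (aoa_left_deg + aoa_right_deg) / 2.0
    if aoa > STALL_AOA_DEG:
        return 0.6  # small nose-down trim increment (illustrative units)
    return 0.0
```

With this kind of check, a single failed vane reading an absurd angle (as in the crash flights) would produce a disagreement and no automatic trim, rather than repeated nose-down commands based on one faulty input.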



Original Link: http://rss.slashdot.org/~r/Slashdot/slashdot/~3/fYjgA2VuJ1w/how-the-boeing-737-max-disaster-looks-to-a-software-developer


Slashdot

Slashdot was originally created in September 1997 by Rob "CmdrTaco" Malda. Today it is owned by Geeknet, Inc.
