Soji: "- Un-Boeing-like Accident?"

Aviation news and discussion for anything aviation related.

Post by u2fly » Sun Mar 17, 2019 3:45 pm

Soji Yamakawa wrote:
  • 03/12/2019

    Un-Boeing-like Accident?

    We don't know yet whether the crash of the Ethiopian Airlines 737 MAX 8 had the same cause as Lion Air's crash a while ago.  However, even if the causes of the two accidents turn out to be different, the background of both accidents may be traced back to a change in Boeing's design philosophy.

    Lion Air's accident reminded me of the crash of a China Airlines Airbus A300-600R at Nagoya airport in 1994.  I don't know if you remember it, but that crash obviously stemmed from a design error by Airbus.  In go-around mode, the A300-600R mixed control inputs from the human pilot and the autopilot: the autopilot controlled the pitch trim, while the pilot's input controlled the elevator.  On April 26, 1994, the pilot accidentally pulled the go-around lever, putting the airplane into go-around mode.  The pilot wanted to continue the landing, but the autopilot pulled the nose up to go around.  The pilot pushed the yoke harder, fully deflecting the elevator to the nose-down position, while the autopilot ran the stabilizer trim toward the nose-up position to counter the elevator.  Eventually the trim overpowered the human pilot, and the aircraft could no longer lower its nose.  It climbed violently, stalled, and crashed, killing 264 of the 271 people on board.  The obvious cause was that the aircraft could be put into a state that mixed inputs from the autopilot and the human pilot.
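    The failure mode described above can be sketched as a toy pitch model.  All numbers and names here are hypothetical, chosen only to illustrate the point: when elevator and stabilizer trim both feed the same pitching moment and the trim surface is the more powerful one, trim driven far enough nose-up wins even against full nose-down elevator.

```python
# Toy pitch model (illustrative numbers, not real A300 data).
# The net pitching moment is the sum of the elevator's contribution
# and the stabilizer trim's contribution. The trim surface is modeled
# as more powerful, so trim commanded far enough nose-up overpowers
# even full nose-down elevator.

ELEVATOR_AUTHORITY = 1.0   # moment per unit deflection (hypothetical)
TRIM_AUTHORITY = 2.5       # stabilizer trim is stronger (hypothetical)

def net_pitch_moment(elevator, trim):
    """Positive = nose up. Both inputs range from -1.0 to +1.0."""
    return ELEVATOR_AUTHORITY * elevator + TRIM_AUTHORITY * trim

# Pilot holds full nose-down elevator while the autopilot runs the
# trim to full nose-up:
m = net_pitch_moment(elevator=-1.0, trim=+1.0)
print(m)  # 1.5 -> still nose up: the trim wins despite full elevator
```

    With the pilot's input alone (`trim=0.0`) the moment is -1.0, nose down; the summed inputs flip it to nose up, which is the mixed-input trap the paragraph describes.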

    Airbus, on the other hand, did everything it could to deny the design error.  It announced right away that the accident was caused by human error.  Its argument was that since the human pilot may make a mistake, the autopilot must be trusted in an emergency; therefore the accident was human error.

    You don't need a Ph.D. to see that Airbus's logic was absurd.  We can debate whether input from the human pilot should be prioritized over the autopilot or vice versa.  Both the human pilot and the autopilot have some probability of making a mistake, and we have to bet on one of them.  Humans cannot drive the accident rate to absolute zero; all we can do is minimize the probability of an accident.  If, as Airbus said, it bet on the autopilot and the human pilot happened to be right, that would be unfortunate, but not a design error.

    Well, let's say Airbus was right that the autopilot should be prioritized.  Then the inputs from the human pilot and the autopilot must not be mixed.  If Airbus trusts the autopilot over the human pilot, the human pilot's input should be shut off.  Or, if it trusts the human pilot over the autopilot, the autopilot's input must be shut off.  In either case, the inputs must not be mixed.  However, the A300-600R mixed them in go-around mode.  Therefore, it was an obvious design error.
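    The "pick one, never mix" rule argued for above can be shown structurally.  The function and names below are hypothetical (real flight-control laws are far more involved); the point is only that the control surface is driven by exactly one source, with the other side's input discarded rather than summed in.

```python
# Sketch of the "pick one source, never mix" arbitration rule.
# Whichever side is trusted, the other side's input is shut off
# entirely instead of being blended into the same control surface.

def pitch_command(pilot_input, autopilot_input, trust_pilot=True):
    """Return the single input that drives the elevator."""
    if trust_pilot:
        return pilot_input       # autopilot input discarded
    return autopilot_input       # pilot input discarded

# Pilot commands nose down while the autopilot commands nose up;
# either philosophy gives a coherent answer, but never a blend:
print(pitch_command(-1.0, +1.0, trust_pilot=True))   # -1.0
print(pitch_command(-1.0, +1.0, trust_pilot=False))  # 1.0
```

    Either branch is a defensible design philosophy; the error the text identifies is a structure in which both inputs reach the airframe at once.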

    To my surprise, the Japanese counterpart of the NTSB accepted what Airbus was saying and concluded that the accident was caused by human error.  Hello??  As a Ph.D. in Mechanical Engineering, I think it was obviously caused by a design error.  Airbus should have been held responsible.  By the way, Airbus should have kept Concorde flying instead of developing the A380.

    I guess someone working for the Japanese counterpart of the NTSB got richer.  Airbus probably made a quiet change to the control system of the A300-600R, typically issued as an SB, or Service Bulletin.  No similar accident was reported after the Nagoya accident.

    Anyway, Boeing's reaction to this accident was this: Boeing trusts the human pilot over the autopilot in an emergency, therefore a similar accident is impossible in Boeing aircraft.

    Or so it was supposed to be.

    I have had some chances to talk with Boeing employees at conferences.  They said Boeing likes pilots and trusts the pilot in an emergency.  Boeing never adopted the side-stick, because pilots love having the control yoke right in front of the pilot seat.  Boeing aircraft disengage the autopilot and follow the pilot's input when strong pressure is applied to the control yoke.  Boeing was always pilot-centered.  That's what I heard then.
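    The yoke-force override described above can be sketched as a simple breakout rule.  The threshold value and all names here are made up for illustration, not taken from any Boeing manual: if the measured force on the column exceeds a threshold while the autopilot is engaged, the autopilot drops out and the pilot's command passes through.

```python
# Hedged sketch of a yoke-force autopilot override (hypothetical
# names and numbers): a strong enough push or pull on the column
# disengages the autopilot, after which the pilot's command drives
# the control surface.

OVERRIDE_FORCE_N = 120.0  # hypothetical breakout force, in newtons

def effective_command(pilot_cmd, autopilot_cmd, yoke_force_n, ap_engaged):
    """Return (surface_command, autopilot_still_engaged)."""
    if ap_engaged and abs(yoke_force_n) > OVERRIDE_FORCE_N:
        ap_engaged = False            # strong input kicks the AP off
    cmd = autopilot_cmd if ap_engaged else pilot_cmd
    return cmd, ap_engaged

cmd, engaged = effective_command(-0.8, +0.3, yoke_force_n=150.0,
                                 ap_engaged=True)
print(cmd, engaged)  # -0.8 False -> pilot wins past the threshold
```

    Below the threshold the autopilot's command stands; above it, authority transfers wholly to the pilot, which is the pilot-centered behavior the paragraph attributes to Boeing.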

    But what about Lion Air's accident?  Boeing's reaction was a copy of Airbus's reaction after the Nagoya accident.  Boeing said the solution was to make sure the pilot was informed and trained to deal with this situation.  No, it's not.  Still, I wouldn't say it is a design error.  The human pilot may make an error in an emergency; that is true.  If the design philosophy trusts the autopilot over the human pilot, it is a valid philosophy.  There is no right or wrong.  But we need to keep in mind that the computer may make a mistake, too.

    I'm a programmer.  I know that programmers make mistakes because I make mistakes.  I'm not a god.  The control program used by the autopilot is written by a human programmer and may contain an error, so it is no surprise if the autopilot makes a mistake.  Just as trusting the human pilot will never eliminate airplane crashes, trusting the autopilot will never eliminate them either.  All we can do is make a crash less likely to happen.

    Which philosophy do I like?  Obviously, I would trust the human pilot.  A human pilot can better adapt to an unknown flight condition; but if the autopilot hits a programming error, it cannot correct itself.  Well, we could in theory make it wirelessly updatable, but do you want to start updating the control software of an airplane in flight?  "The flight-control computer needs a reboot.  It will take 10 minutes to restart."  Would you click "Yes"?  Machine learning?  It does well if the flight condition is an interpolation of the known training conditions, but it may go berserk if the condition happens to be an extrapolation.
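    The interpolation-versus-extrapolation point can be shown with a toy fit.  This is a deliberately simple illustration, not a claim about any real autopilot: a straight line fitted by least squares to samples of y = x² looks tolerable inside its training range but is wildly wrong far outside it.

```python
# Toy illustration of interpolation vs. extrapolation: fit a line
# to samples of y = x^2 and compare the error inside and far outside
# the training range.

xs = [0.0, 1.0, 2.0, 3.0]
ys = [x * x for x in xs]            # "training data" from y = x^2

# Ordinary least-squares fit of y = a + b*x:
mx = sum(xs) / len(xs)
my = sum(ys) / len(ys)
b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
    sum((x - mx) ** 2 for x in xs)
a = my - b * mx                     # fitted line: y = 3x - 1

def model(x):
    return a + b * x

print(abs(model(1.5) - 1.5 ** 2))    # 1.25: inside the training range
print(abs(model(10.0) - 10.0 ** 2))  # 71.0: far outside, badly wrong
```

    The same fitted model that errs by about 1.25 at an interpolated point errs by 71 at an extrapolated one; a learned controller facing a flight condition outside its training data faces the same hazard.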

    So, I liked Boeing's pilot-centered design.

    In Lion Air's crash, the human pilot tried to pull the nose up while the autopilot was trying to dive.  The autopilot thought the aircraft was about to stall; the pilot was trying to keep the aircraft from diving into the ocean.  From the aircraft's point of view, it had no idea who was right.  Aviation-accident history has numerous cases in which the human pilot pulled the nose up too high, stalled, and crashed.  The aircraft trusted the autopilot, and the autopilot happened to be wrong this time.  It was unlucky.

    Again, it is not a design error.  Both the human pilot and the autopilot can make mistakes.  Whom to trust depends on the design philosophy.  If the design philosophy trusts the autopilot over the human pilot, that is a choice, not an error.  I don't like it, though.

    If Boeing still trusted the human pilot as it did before, Lion Air's accident could have been averted.  If today's Boeing trusts the autopilot over the human pilot, that is not an error, but I feel it is unfortunate.
P.S.: According to the underlined message, the training missions in the latest YSFlight 2017 - 2018 versions are broken: some controls are locked in hard positions, so there is no chance to land safely...
/!\ READ YSFLIGHT HANDBOOK (online + PDF) | updated 2019/04/25

