Nine facets of AF447

Because we believe they had a material bearing on the AF447 disaster, in Air Crashes and Miracle Landings we alluded to private matters one would not normally divulge: first among them, those behind the captain's sleep deficit, and even his very likely euphoric and soporific frame of mind early in the flight, at the very time he might have bothered to adjust course to avoid the storms, as the captains of other airliners did that night.

In fact, we give details of nine facets of the disaster which in combination resulted in tragedy.

Unlikely to Recur
Lessons learnt should mean that never again will there be such a stall following a high-altitude autopilot disengagement due to a loss of credible airspeed data. Pilots will be reminded that all they need do is nothing, apart from keeping the wings level (not easy without the computer) and continuing to fly level at the already proven power setting.

They are also being taught how to recover from a high-altitude stall. Besides reducing power to stop the engines pushing the nose up in a stall, it might also be necessary to wind off the resulting nose-up trim on the horizontal stabilizer in the most unlikely event of someone, Bonin-like, having been continuously pulling back on his sidestick. Unfortunately, by the time copilot Bonin, the pilot flying (PF), announced that was what he had been doing all along, they were dropping at ten thousand feet a minute with only half that height remaining, which at that rate of descent meant barely thirty seconds. It was quite hopeless, for even after putting the nose down and regaining airspeed, considerable time, or rather height, would still have been needed to break that vertiginous fall.

The Future—Artificial Intelligence (AI)
Research is being done on an autopilot featuring artificial intelligence that can, for instance, guesstimate airspeed from additional data sources, such as GPS and inertial guidance systems, before "giving up" and disengaging. The team developing it found it could even learn from experience gained on different models of aircraft, including cases where pilots had flown on manually with no problems after the autopilot disengaged for lack of airspeed data.
Such an autopilot could continue, as the lawyers say, "under advisement", and spare the pilots from being thrown in at the deep end, where the startle effect and lack of manual flying dexterity are liable to make them do something untoward, as tragically happened here.
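
To make the idea concrete, here is a minimal sketch of such fallback logic in Python. It is purely illustrative and not the research team's actual system: every name, signature, and threshold is hypothetical, and it shows only the core substitution of a GPS-and-wind-based estimate when pitot data becomes implausible.

    # Minimal, hypothetical sketch: when pitot airspeed stops being
    # credible, substitute an estimate built from GPS groundspeed and the
    # last remembered wind, flag the mode as degraded, and keep flying
    # "under advisement" rather than handing control to a startled crew.

    from dataclasses import dataclass
    from typing import Optional

    PLAUSIBLE_KT = (60.0, 700.0)  # crude sanity band for a jet transport

    @dataclass
    class AirspeedEstimate:
        knots: float      # best available true-airspeed estimate
        degraded: bool    # True when flying on the fallback estimate

    def estimate_airspeed(pitot_kt: Optional[float],
                          gps_groundspeed_kt: float,
                          last_headwind_kt: float) -> AirspeedEstimate:
        """Prefer pitot data; otherwise estimate from GPS plus wind.

        pitot_kt is None or implausible when the probes ice over, as on
        AF447; last_headwind_kt is the headwind component remembered from
        before the data loss (a hypothetical input).
        """
        if pitot_kt is not None and PLAUSIBLE_KT[0] <= pitot_kt <= PLAUSIBLE_KT[1]:
            return AirspeedEstimate(pitot_kt, degraded=False)
        # True airspeed is roughly groundspeed plus the headwind component.
        # Coarse, but enough to hold the aircraft within a safe envelope
        # while the crew is alerted rather than thrown in at the deep end.
        return AirspeedEstimate(gps_groundspeed_kt + last_headwind_kt,
                                degraded=True)

    # Pitot probes iced over: 460 kt over the ground into a 20 kt headwind
    print(estimate_airspeed(None, 460.0, 20.0))
    # AirspeedEstimate(knots=480.0, degraded=True)

A real system would of course fuse many more inputs, but even this crude substitution illustrates how an autopilot could keep flying on degraded data while alerting the crew, instead of disengaging abruptly.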

Frequently cited as the ultimate example of how automation has made pilots lose their flying skills and crash, this disaster was also due to other incidental human and technical factors. As Sullenberger so wisely says, “Bad outcomes are almost never the result of a single fault, a single error, a single failure—instead they are the end result of a causal chain of events.”
Rather than an example of pilots losing their flying skills, this could equally be a prime example of how complex the underlying situations can be.

See the book to understand in detail the interplay of the nine facets of the disaster, and how the delay in the captain's return to the cockpit meant the more experienced non-flying copilot was distracted calling him, and that he arrived too late to grasp the situation. Investigators are wary of speculating over matters of personality, though the NTSB did delve into the private life of the copilot whose rudder inputs snapped off the tail of the Airbus that encountered wake turbulence on taking off from New York's JFK. However, the French investigators are particularly wary, as the pilots' union opposes publication of any such details, including the transcription of the cockpit voice recorder (CVR).