AI and Aviation

First, let’s talk about the Boeing 737 MAX. An add-on to the control system was made to guard against a natural tendency to pitch up into a climb and then stall, caused by the position of the engines, which in turn was forced by the low undercarriage, a relic of the 1970s design. The add-on relied on a single angle of attack sensor on the outside of the aircraft. Boeing was rightly pilloried for abandoning the redundancy required of all aviation systems. It should also have been pilloried for giving an add-on the power to crash the plane even though the plane’s other systems could see it was wrong: the altimeter would have been steady, then clearly falling once the add-on forced a dive, and the combination of speed and altitude would have matched the thrust for horizontal flight (the dumbness of the whole thing is just incredible).
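To make the point concrete, here is a rough sketch in Python of the kind of cross-check being argued for. The names, numbers and thresholds are invented for illustration; this is not Boeing’s actual logic, just the shape of a sanity check that listens to the other instruments before trusting a single vane.

```python
# Minimal sketch of a cross-check: before acting on one angle-of-attack reading,
# compare it against what the rest of the aircraft's state implies.
# All names and thresholds below are hypothetical, chosen only to illustrate.

from dataclasses import dataclass

@dataclass
class FlightSnapshot:
    aoa_deg: float            # reported angle of attack from the single sensor
    altitude_rate_fpm: float  # vertical speed from the baro/altimeter system
    airspeed_kt: float
    thrust_pct: float

def aoa_reading_is_plausible(s: FlightSnapshot,
                             stall_aoa_deg: float = 14.0,
                             level_band_fpm: float = 300.0) -> bool:
    """Return False when the AoA sensor claims an imminent stall but everything
    else looks like ordinary level flight."""
    claims_near_stall = s.aoa_deg >= stall_aoa_deg
    looks_level = abs(s.altitude_rate_fpm) < level_band_fpm
    has_cruise_energy = s.airspeed_kt > 220 and s.thrust_pct > 50
    if claims_near_stall and looks_level and has_cruise_energy:
        return False  # the lone sensor is contradicted by the other systems
    return True

# The situation described above: steady altitude, cruise speed and thrust,
# yet the single AoA vane reports a stall. The cross-check rejects it.
snapshot = FlightSnapshot(aoa_deg=20.0, altitude_rate_fpm=50.0,
                          airspeed_kt=260.0, thrust_pct=85.0)
print(aoa_reading_is_plausible(snapshot))  # False
```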


It is being suggested that a very crude form of AI be stuck on top of an existing complex system as a way of eliminating the pilot. It would be a good idea to understand the complexity the pilot is managing before dismissing the person’s role: the unconscious processing, the reliance on their own instrumentation (the “seat of the pants” accelerometer, their hearing), their understanding of the physics of what is happening, and the visual overview of the instruments, with its unconscious tracking of change. Once that is understood, the pilot can be dismissed with confidence.


The A-380 with the blown engine is a good example of what happens now. The plane is heavily instrumented, and can override the pilot. When there is a major emergency, pages of warnings appear. The pilot must remain calm and address the causes, not the downstream effects: in other words, ignore 90% of what the control system is telling them. This will be hard to replicate without replicating the model the pilot carries in their unconscious mind.
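As a toy illustration of the triage the pilot is doing, here is a sketch in Python that keeps only the warnings not explained by another active warning. The warning names and the cause map are invented for this example and only loosely resemble real messages; actual warning logic is far richer.

```python
# Toy sketch of the triage problem: dozens of warnings appear, but most are
# downstream consequences of one root failure. Names and structure are
# illustrative only.

CAUSED_BY = {
    # warning           -> the upstream failure that explains it
    "GEN 2 OFF":           "ENG 2 FAILURE",
    "HYD GREEN LO PR":     "ENG 2 FAILURE",
    "FUEL LEAK LEFT":      "ENG 2 FAILURE",
    "ENG 2 FAILURE":       None,   # a root cause
    "BRAKES HOT":          None,   # unrelated, still needs attention
}

def root_causes(active_warnings):
    """Keep only warnings that are not explained by another active warning."""
    active = set(active_warnings)
    return [w for w in active_warnings if CAUSED_BY.get(w) not in active]

warnings = ["GEN 2 OFF", "HYD GREEN LO PR", "FUEL LEAK LEFT",
            "ENG 2 FAILURE", "BRAKES HOT"]
print(root_causes(warnings))   # ['ENG 2 FAILURE', 'BRAKES HOT']
```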


But that can’t be done with existing AI! Precisely. We don’t understand the problem and the existing systems well enough. Cue Four Pieces Limit.


But we have data! Yes, the last plane that had this very problem crashed and burned, with the loss of all souls. We have the problem now. Do we accept the data, or do we try to do something about it? 


If we have the data about a problem, and seem to be heading for it, we can form a hypothesis about how to avoid it, but existing AI doesn’t form hypotheses (while pilots obviously form them all the time).


There is another problem. There have recently been several cases where the pilot has dived the plane into the ground: Germanwings, a Chinese case with a vertical dive, and probably the Malaysia Airlines plane that disappeared. Pilots love their job: the prestige, being in command, the chance to fly an A-380. Telling them they will soon be redundant may let loose the black dog of depression; after an A-380 it is a bit of a comedown to be flying Cessnas, or even driving a taxi. It would be wise to evaluate the forces being let loose before doing so.
