Function creep and systems abuse

In recent news, a US pilot accidentally discharged his gun in the cockpit of a plane. Since 9/11, pilots have been given guns to increase safety, and this is the first time a gun from this program has been discharged (ABC News). Using this as an example, Obsessed makes a very clear argument about the flaws inherent in arming pilots:

We can assume that a trained pilot, when facing piloty thingies, will act like a trained pilot. WE CANNOT ASSUME THAT A TRAINED PILOT WILL ACT LIKE A TRAINED LION-TAMER WHEN FACING A WILD LION.

The example also shows that once installed, any social or technical system has the potential to fail. All the right intentions were present in the arguments for supplying pilots with guns, and I will venture a guess that the pilot deeply regrets the incident. Despite all these regrets and good intentions, the pilot is to blame for the shot and will most probably be seriously punished.

But what about those who advocated and argued for the system itself? They will most probably be able to absolve themselves of legal, social and moral responsibility by blaming the results on the pilot. This is a typical response from those who create and regulate systems. But it is also a way of shirking responsibility. Those who create and regulate systems must become more aware of the effects of their decisions and must not be allowed to hide behind good intentions. The side effects enabled by the system – in this case the gun being discharged at the wrong time – must be factored into the decision.

This is not the same as requiring that systems builders prepare for every conceivable situation, only that they be required to take into account the added risks entailed by abuse of the system. Stated simply: the pilot could not have discharged a gun in the cockpit if there had been no gun in the cockpit.
