The South African National Defence Force is probing whether a software glitch led to an antiaircraft cannon malfunction that killed nine soldiers and seriously injured 14 others during a shooting exercise. (ITWeb)
Tragic errors such as this are only to be expected: the more dependent we become on software, the more often software errors will occur. And if we build software into lethal weaponry, it is no surprise when the side effects of a programming flaw prove fatal.
A software flaw almost started World War Three in 1983. That year the Soviet missile warning system indicated that five intercontinental ballistic missiles had been launched from a base in Montana. Standard procedure was to launch the USSR's retaliatory strike. The man responsible, Soviet Air Defence Forces lieutenant colonel Stanislav Yevgrafovich Petrov, deviated from standard Soviet doctrine by correctly identifying the missile attack warning as a false alarm. (Wikipedia)
This human analysis of a computer error probably prevented a nuclear war.
The bravery involved in questioning technology needs to be encouraged and cultivated, so that when computer errors occur they can be overridden by human judgement. Indeed, systems should be built to allow human intervention, as sketched below.
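As a minimal sketch of what such a design might look like (all names, messages, and confidence values here are hypothetical illustrations, not drawn from any real warning system), an automated decision can be gated behind an explicit human confirmation step, with silence or uncertainty treated as a veto:

```python
from dataclasses import dataclass


@dataclass
class Alert:
    """A hypothetical automated warning produced by a detection system."""
    description: str
    confidence: float  # 0.0 to 1.0, as reported by the detector


def request_human_confirmation(alert: Alert) -> bool:
    """Present the alert to an operator and require an explicit decision.

    The system never acts on its own: anything other than a clear "yes"
    is treated as a veto, so a silent or uncertain operator halts action.
    """
    print(f"ALERT: {alert.description} (confidence {alert.confidence:.0%})")
    answer = input("Confirm automated response? Type 'yes' to proceed: ")
    return answer.strip().lower() == "yes"


def handle_alert(alert: Alert) -> None:
    if request_human_confirmation(alert):
        print("Operator confirmed: executing response.")
    else:
        print("Operator vetoed or did not confirm: standing down.")


if __name__ == "__main__":
    # Even when the detector is highly confident, the human can still
    # override it, as Petrov did.
    handle_alert(Alert("Inbound missile launch detected", confidence=0.97))
```

The key design choice is that the default path is inaction: the automated system may recommend, but only a deliberate human "yes" lets it act, which is exactly the room for judgement that Petrov exploited.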