The Machine Stopped

As a software tester I get how difficult software development is, and why software technology so often fails us. The problem space of testing for all possible scenarios is too vast to traverse within the time frame of a fundable development project. Some scenario is bound to be missed in the journey from design concept, which always looks great on paper, to actual implementation, which often falls short of expectations, usually because those expectations were never accurately communicated.

Therefore I was not surprised to learn that two recent Boeing 737 MAX crashes were the result of software failure. This article by a former crash investigator explains it a bit, even comparing the software that aids airplane pilots to the software you use on your mobile phone. If you’ve ever been frustrated using your phone, imagine how airplane pilots must feel operating their software-laden fly-by-wire systems. The stakes are obviously much higher for them when those devices fail.

As the article points out, there is a paradox where reliance on safety technology sometimes makes us less safe. Relying on computer software, which is bound to be error-prone, seems insane. It’s only possible because our genius system of corporate capitalism deflects liability away from individuals and underwrites risk through insurance payments. But take it from someone who has been testing software for most of his adult life: all of those automated and networked computer systems that pervade our lives are full of bugs. You won’t see me getting into a self-driving car any time soon.
