
When technology fails to fail

Critical system failures that don't trigger corrective action cause disasters

Yet another airliner has fallen out of the sky without warning. The reason: fail-safe systems that, when they themselves fail, fail to alert the pilots to the real problem.

If that sounds a little confusing, that’s exactly what human operators experience when autonomous systems default to manual control – the ultimate fallback when a critical failure strikes.

Even double-redundant systems can be completely taken out by, say, a solar flare or a massive electromagnetic pulse; something as simple as a direct lightning strike can do it. When that happens, the human backup needs to be completely in sync with the current status of the system, whether it’s a driverless car negotiating a tricky curve or a jet on autopilot.
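What might "in sync" look like in practice? Here is a minimal sketch in Python – every name is hypothetical, and it stands in for no particular vehicle or avionics system – of a fallback controller that mirrors its status to the human continuously, so the handover arrives with context rather than as a bolt from the blue.

```python
# A minimal sketch (all names hypothetical) of a fallback controller that
# mirrors its status to the human operator continuously, so that a sudden
# default to manual control arrives with context, not as a surprise.
from dataclasses import dataclass

@dataclass
class SystemStatus:
    mode: str        # "auto" or "manual"
    situation: str   # e.g. "negotiating curve, radius 40 m"
    last_fault: str  # most recent detected fault, if any

class FallbackController:
    def __init__(self, notify_operator):
        # notify_operator: callback into the human-facing display
        self.notify_operator = notify_operator
        self.status = SystemStatus(mode="auto", situation="unknown", last_fault="none")

    def update(self, situation):
        # Keep the operator's picture current while things are fine,
        # not only at the moment of failure.
        self.status.situation = situation
        self.notify_operator(self.status)

    def on_critical_failure(self, fault):
        # Hand over with context: what failed, and where the system was.
        self.status.mode = "manual"
        self.status.last_fault = fault
        self.notify_operator(self.status)

ctrl = FallbackController(notify_operator=print)
ctrl.update("negotiating curve, radius 40 m")
ctrl.on_critical_failure("dual sensor bus offline (suspected lightning strike)")
```

The point of the sketch is the continuous `update` calls: the operator who inherits control has been watching the same picture all along.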

Part of the problem is that we’ve come to trust and rely on automated systems to such an extent that we’re taken completely by surprise – and confused – when they don’t work as expected. After all, they hardly ever go wrong. But when they do, we’re often at a loss.

When things are running smoothly, as they do 99.9% of the time – that’s our service level guarantee – we get complacent. We know we can always take over in the event of a disaster, but if we never have to, how well are we prepared for that one-in-a-million failure? Our training needs to change.
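To put that 99.9% in perspective, a quick back-of-the-envelope calculation – plain Python, assuming nothing beyond the figure quoted above – shows what the guarantee still permits:

```python
# Back-of-the-envelope: how much failure does a given uptime guarantee allow?
HOURS_PER_YEAR = 365 * 24  # 8760

for uptime in (0.999, 0.99999):
    downtime_hours = HOURS_PER_YEAR * (1 - uptime)
    print(f"{uptime:.3%} uptime -> {downtime_hours:.2f} hours of failure per year")
```

Nearly nine hours of failure a year at three nines – and even five nines leaves a few minutes. The question is whether the humans on duty during those hours know what to do.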

In fact, the perfect automated system is one that employs fuzzy logic and trips out occasionally, to keep us on our toes. This is one situation where zero defect is actually too little of a bad thing. We need a bit of unpredictability to keep the humans sharp.
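Taken literally, that idea resembles what engineers call fault injection or chaos engineering: deliberately and randomly forcing a manual takeover so operators stay practised. A toy sketch, with made-up names and an illustrative drill probability:

```python
# A toy sketch of deliberate unpredictability: with small probability, a
# control cycle becomes a drill that hands control to the human. The drill
# probability and function names are made up for illustration.
import random

def maybe_inject_drill(drill_probability=0.001, rng=random.random):
    """Return True if this cycle should force a manual-takeover drill."""
    return rng() < drill_probability

def control_cycle(step):
    if maybe_inject_drill():
        print(f"step {step}: DRILL – manual control, respond as if real")
    else:
        print(f"step {step}: automation nominal")

for step in range(10):
    control_cycle(step)
```

The design choice worth noting: the drills are indistinguishable from real failures at the moment they occur, which is precisely what keeps the training honest.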

Technology that almost never fails is not ideal – it creates the ultimate disaster when it finally does.

Warning: Hazardous thinking at work

Despite appearances to the contrary, Futureworld cannot and does not predict the future. Our Mindbullets scenarios are fictitious and designed purely to explore possible futures, challenge and stimulate strategic thinking. Use these at your own risk. Any reference to actual people, entities or events is entirely allegorical. Copyright Futureworld International Limited. Reproduction or distribution permitted only with recognition of Copyright and the inclusion of this disclaimer.