It’s a scenario that would make Tesla’s CEO, Elon Musk, shudder: a future where self-driving cars are the norm but a catastrophic electronic breakdown traps thousands of people inside them.
This dystopian vision of the future was one sketched out by science fiction writers at an event this week where experts were asked to prepare Britain for threats ranging from pandemics to cyber and nuclear attacks.
The writers joined researchers and policymakers working in crisis management and resilience at the gathering organised by RBOC (Resilience Beyond Observed Capabilities), a network of academics whose funders include the Ministry of Defence (MoD).
Instead of using so much human knowledge and ingenuity to deal with an inevitable future … why not use this knowledge to prevent the dystopian future we worry so much about?
This is like riding a runaway train to a cliff with broken tracks and instead of trying to figure out how to stop the train, everyone is trying to figure out where the best seats are.
So the best way to stop a runaway train is usually to derail it. I’m not saying we won’t get to the point where derailing the train becomes a socially acceptable idea, but I don’t think we’re there just yet.
When you think about it … no matter what we do or don’t do, the train will get derailed anyway.
But it also implies that we have no control at all. We are able to apply brakes, to stop or slow the engine or do things to slow the carriage … we just don’t want to because we think the ride is fun and instead blissfully ignore the end of the track and happily increase speed.
Perhaps in theory we do, but that involves a whole load of people changing their behaviour at the same time, and that’s pretty difficult to achieve.
I agree, and I also agree that we probably won’t change on our own any time soon. The point I was trying to make was that if we objectively step back and look at it all … we are going to change in the near future whether we want to or not. Right now we have a choice to change … there is coming a time when we won’t have any choice but to change. This also doesn’t imply that the future changes will be good or bad, beneficial or detrimental, enlightened or dystopian. Right now, if we stopped or slowed down, we could figure out where we’re headed or where we’ll end up, but if we continue, once we fly off those tracks and into the abyss, who knows where we’ll land.
Paging Charlie Brooker…
They could save a lot on consulting fees with a cheap Netflix plan and binge-watching Black Mirror.
See, the problem is they’re not trying to prevent the dystopian future — they still want that. They’re just trying to figure out how to manage the inevitable disasters it causes.
Maybe we could also have writers creating optimistic sci-fi worlds, to change the public’s perception of what the future might hold.
I read that as “more Star Trek” and agree completely.
How much do you want to bet they’ll stop listening to the writers as soon as protecting profit is part of the issue, then wonder why the plan didn’t work?