Driverless cars: the road ahead is difficult
Driverless cars are a great idea, but we need to cut through the hype: there are tough technical and social challenges that need to be solved first
We all want driverless cars. For consumers, they promise a safer, cheaper and stress-free form of transport. For city planners, they are a chance to make more efficient use of road space, and to clear car parks out of city centres. For companies like Google and Tesla, they are a chance to make money.
For the more excitable parts of the tech press, they are just around the corner. Maybe even already with us: Google’s prototypes are already on the roads of California and luxury carmakers now fit autopilots that can handle motorway driving in their new cars.
For all that I want this future to happen, I am convinced that the road to get there will be longer and more difficult than we’re being led to believe. The recent news of the first fatal crash implicating a car’s autopilot only reinforces that view.
New technology does not inevitably lead to social, economic or regulatory change, nor does demand magically result in innovation. Fully automated metro trains have been running in Lille since before I was born, but are still niche. I am willing to bet that train drivers will still be around long after I’m gone: the technical challenges in replacing them all with computers are tough, and the demand is too soft. There’s nothing inevitable about self-driving cars either.
So if my daily sauna-commute on London’s Central Line is to be replaced by the cool and comfortable air-conditioned car I want, there are some serious issues the carmakers will need to grapple with. (They are probably well aware of these already, but the tech press don’t seem to be.)
The challenges involved in making fully autonomous vehicles actually work are enormous.
Current prototypes only work when everything goes right. They can’t really cope with roadworks or temporary lights. They can’t recognise policemen. They can’t drive routes that haven’t been meticulously 3D mapped in advance.
They are only on public roads at all because an army of mappers and software engineers is making it possible.
Driving is full of unpredictable and bizarre situations. The Tesla crash appears to have been caused by the car’s sensors failing to see a white truck against a white sky. Fully autonomous vehicles, ones that can drive even without a person on board, will have to be able to cope with everything that the road can throw at them. If you want a car to be fully autonomous, it’s not enough to make it able to guide itself through 99%, or even 99.99%, of circumstances.
To solve these problems you need to add complexity: more lines of computer code, extra sensors, additional levels of redundancy. The trouble is that complex systems aren’t just difficult because they are big and unwieldy, but because the complexity itself causes problems.
Charles Perrow’s brilliant study of accidents, Normal Accidents, explains how complex systems fail. As you add complexity, you raise the probability that something will go wrong (there are more components that can break down), and also the probability of unforeseen interactions (there are more connections between those components). Dangerous situations can emerge from the complexity of the system itself. Even adding safety mechanisms can make things worse.
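Perrow’s point about component count can be made with a back-of-the-envelope calculation (my own toy sketch, not from his book): if every subsystem must work for the whole to work, and the subsystems fail independently, overall reliability decays exponentially with the number of parts, before you even count interaction failures.

```python
# Toy model (an illustrative assumption, not Perrow's): a vehicle with n
# independent subsystems, each of which works on a given trip with
# probability p. The whole system works only if all n do.

def system_reliability(p: float, n: int) -> float:
    """Probability that all n independent subsystems work, each with reliability p."""
    return p ** n

# Each subsystem is 99.9% reliable:
print(round(system_reliability(0.999, 10), 3))    # 10 subsystems  -> 0.99
print(round(system_reliability(0.999, 100), 3))   # 100 subsystems -> 0.905
print(round(system_reliability(0.999, 1000), 3))  # 1000 subsystems -> 0.368
```

With a thousand parts at 99.9% each, the system as a whole works barely a third of the time, which is one way of seeing why piling on code, sensors and redundancy does not straightforwardly buy safety.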
There isn’t a simple fix to this problem - just a long, hard slog to make each component of the system safer, more reliable, more predictable. I suspect this means human drivers sticking around for longer than people think, even if the progress in self-driving tech means they will have less and less to do.
This is actually an opportunity. Autopilot technology that gradually improves is still extremely useful.
For users, there is a promise of gradually easier and safer driving - parking, changing lanes, merging, navigating confusing intersections.
From the perspective of carmakers, gathering data from billions of miles driven by their customers lets their software gradually learn the myriad bizarre situations that everyday driving brings about, with the safety net of a human driver to take over if the autopilot gets confused.
It is an opportunity to slowly and methodically iron out the kinks so that when they eventually do launch, self-driving cars are genuinely ready for everything the road can throw at them - rather than an embarrassing piece of vapourware that isn’t ready for prime time. But I think this will take more time than people realise, because unlike most tech innovations, lives are at stake.
The next challenge is one of control. People get far more flustered by the fear of flying, food additives and childhood vaccines than by driving, smoking or over-the-counter medication.
In all of these cases, the issue is that taking a risk is different to being subjected to one, not that the public haven’t understood the magnitude of the risk. A risk you haven’t volunteered for, which you can’t control, or which you don’t understand is a lot more frightening.
Air travel, processed food and vaccines are tolerated because they are genuinely very safe, with sources of risk ruthlessly weeded out. They are also very useful.
Self-driving cars won’t be accepted unless they are manifestly safer and more useful than traditional vehicles. The carmakers have an opportunity here to have a proper process of engagement with the public. And that goes both ways: yes, they will need to educate the public about the risks. But they will also have to learn from how the public perceive risks - and hold themselves to significantly tougher standards than we, as a society, currently hold cars to. We will all gain from this.
Some foolhardy predictions
The technology for fully autonomous vehicles will take longer to mature than expected. The complexity of the task will turn out to be a huge challenge. The requirement for a human driver to be present will be with us for a long time.
In the short term, Tesla will patch their software so that the autopilot can only engage if the driver’s hands stay on the wheel. (Mercedes-Benz already does this.) Drivers will be - and feel - more in control, but the industry will have taken a small step back from fully autonomous driving.
Accidents like the Tesla crash will continue to cause consternation, even while routine car crashes (1,732 reported road deaths in the UK in 2015) barely make the headlines.
In the longer term, even if they solve the technical challenges, self-driving cars will only take off if carmakers learn the lessons of the airline industry: when your customers aren’t in control, you have to keep them very safe indeed.
Tesla Model S by mangopulp2008 on Flickr (CC-BY-NC-ND)
Google Self-Driving Car by smoothgrover22 on Flickr (CC-BY-SA)
Airliners by ARTS_fox1fire on Flickr (CC-BY-NC-ND)