How safe are autonomous cars?
The ride-sharing company Uber has temporarily suspended its self-driving car tests after one of the company's vehicles killed a pedestrian in the US. Some commentators argue that computers behind the wheel are not yet ready for the roads. Others counter that human drivers pose a greater risk to safety.
Still far too dangerous
Until driverless car technology is more mature, these vehicles will remain an unacceptable safety risk, warns Mica Endsley, former Chief Scientist of the United States Air Force, in the Financial Times:
“Most automation is not 100 per cent reliable, and is unable to handle the wide variety of situations that can happen in the real world. ... Much more work is needed to ensure that autonomous vehicles can sense and understand their driving environment, to develop driver displays that will overcome the fundamental challenges of low engagement and driver complacency, and to create training programmes to help drivers better understand the autonomy. Until that work is done, highly autonomous vehicles are still too dangerous.”
Computers are the better drivers
The accident should not call the entire technology of autonomous driving into question, Spiegel Online warns:
“Engineers, accident researchers and insurance mathematicians all agree: the sensors or software in a self-driving car may occasionally fail, but the number of cases in which accidents are caused by human drivers is far higher. Experts calculate that the number of accidents would be reduced by ninety percent if cars were driven only by computers. ... The computer pilot is so superior to its human counterpart that we really shouldn't be asking whether computers should be allowed to drive cars, but rather why they still don't.”