Safety experts warn over driverless car ‘grey area’

Trials of fully-automated driverless vehicles were launched in London in April

So – it turns out not all automated vehicles will be fully automated… and not knowing exactly what self-driving cars are capable of could result in a spike in accidents.

Drivers of automated vehicles (AVs) may grapple with a “grey area” between what their cars can actually do and what they believe their cars can do, a confusion that could put them at risk, according to a white paper released this week by safety researchers the Automated Driving Insurer Group (ADIG) and Thatcham Research.

The paper, titled “Regulating Automated Driving”, highlights the enormous differences between assistance systems and fully-automated systems.

Experts are now calling on regulators to require manufacturers to make clear distinctions between the two.

“[Assisted systems] are fast emerging and unless clearly regulated, could convince drivers that their car is more capable than it actually is,” said Thatcham’s CEO Peter Shaw.

Cars with assisted driving still require immediate driver intervention in emergency situations, and not knowing where the functions of your car start and finish could lead to a short-term increase in crashes, says Shaw.

Robot cars by 2018

Experts predict you’ll soon be able to drive from door to door without needing to touch the wheel, as self-driving cars could be on the road as early as next year.

But fully-automated vehicles are significantly different to current cars that come with assisted functions such as cruise control, Autonomous Emergency Braking (AEB) systems, and Traffic Jam Assist, which automatically accelerates and brakes depending on congestion.

Although technologically advanced, today’s functions still require motorists to keep their eyes on the road and be ready to take back control if the car encounters a situation it cannot handle, and drivers who don’t understand this distinction could find themselves in danger.

Clear and explicit marketing 

Autopilot, Intelligent Drive and Pilot Assist are just a few of the different names given to assisted systems, which vary depending on the manufacturer.

The paper warns these are open to interpretation.

“Vehicle manufacturers should be judicious in badging and marketing such systems, avoiding terms which could be misinterpreted as denoting full autonomy,” it says. “Hybrid systems which creep into the intermediate grey area between assisted and automated should also be avoided.”

But ADIG and Thatcham Research say the rules need to go further, so that vehicles may only be explicitly marketed as automated when:

  • the car has sufficient capabilities to deal with virtually all situations and the driver can safely disengage
  • the vehicle will come to a complete stop when facing any situation it can’t handle
  • the system avoids all conceivable crash types
  • in the event of a partial system failure, it continues to function adequately
  • accident fault can be determined by the car’s data, without ambiguity

