Automated vehicles will need to be programmed with a clear and agreed set of rules for decision-making, according to new research published this week by law firm Gowling WLG.
The firm is a partner on the Government-backed UK Autodrive project to trial the use of connected and self-driving vehicles on the streets of Milton Keynes and Coventry. Its new report, ‘The Moral Algorithm’, suggests that harmonised safety regulations will be needed for decisions made by automated vehicles (AVs).
It says that policy should be developed regarding how vehicles’ moral algorithms should operate when it comes to major safety situations, obedience to the rules of the road and behaviour around other road users.
However, it points out that concerns over the so-called ‘trolley problem’ – where a vehicle must choose between hitting defined individuals in the event of an unavoidable incident – may have been exaggerated.
“At present the public debate has centred on fine-grained decisions made by AVs in life-and-death situations. But this may be a distraction, since at present the technology is not capable of making such decisions and may not be for some time,” the report says.
Arup’s UK Autodrive project director Tim Armitage said: “As with any complex new technology, AVs cannot be specifically programmed to respond to every possible scenario. This simply isn’t practical when a machine is expected to interact with humans in a complex environment on a day-to-day basis.”
“AVs will drive to the speed limits and will not be distracted from the task of safe driving; they will make practical decisions based on their programming, but they cannot be expected to make moral decisions around which society provides no agreed guidance.
“To allow AVs to demonstrate their capacity for practical decision-making in complex environments, and to begin to establish public trust through contact, the first step is allowing testing in relatively simple and well-defined environments.”
The report also calls for the creation of an independent regulator to balance the legality, safety and commerciality issues surrounding automated vehicles. It adds that there is a pressing need for regulation to secure public trust and ensure that those developing the technology are not working at legal risk.
Commenting on the research, Gowling WLG partner Stuart Young said: “It is important not to equate regulation with a burden. It can, in fact, facilitate new markets and important developments.
“Unless completely new legislation is implemented to accommodate new products in advance of their production, regulatory uncertainty is likely to impose huge additional risks on the companies producing them.”
This article first appeared on ITS UK Review.