Automated driving – what's coming down the road?
The Centre for Connected & Autonomous Vehicles (CCAV) has asked the Law Commission of England & Wales and the Scottish Law Commission to undertake a far-reaching review of the legal framework for automated vehicles, and their use as part of public transport networks and on-demand passenger services. CCAV is a joint policy team formed in 2015 by the Department for Transport and the Department for Business, Energy & Industrial Strategy. The aim of CCAV is to make the UK a premier development location for connected and automated vehicles.
The purpose of the project is to assist in developing a regulatory framework for automated vehicles which will operate on Britain’s existing road network, with the safety of both passengers and the wider public as the key objective.
The technology of automated driving is an extremely fast-moving and constantly developing area which frequently raises legal issues for which there is no direct precedent. This project is unlike any previous project undertaken by either Commission as, for the first time, we have been asked to make recommendations on how the law should be adapted to circumstances which, in the main, do not yet exist.
One of the biggest challenges is to introduce regulation on automated vehicles at the right time. Premature intervention could stifle innovation, but late intervention could jeopardise safety.
Project route map
Automated vehicles do not readily fit within current legal frameworks. Many existing requirements apply to human drivers. The task of the Law Commissions is to recommend a legal framework which can remain effective for vehicles which may no longer require a human driver. Our work will be part of a national conversation on this important future technology. It is not our task to determine whether increased automation in driving is a good thing or not, or to cover consequential policy issues related to the advent of automation and its impact on future workforce planning for Britain.
Begun in 2018, the project is due to run until 2021. The first public consultation and subsequent analysis of responses have already been completed. A second consultation paper is due to be published in the autumn. A third paper, drawing on the responses to the first two, will be published in 2020 and will allow the Commissions to formulate recommendations for a report in 2021.
The project covers road-based vehicles. It will not cover drones or vehicles for use solely on the pavement. By an automated vehicle we mean a vehicle that is capable of driving itself – without being controlled or monitored by an individual – for at least part of a journey. Although these issues are integral to delivering effective policy in this area, the project will not cover data protection and privacy, theft and cybersecurity, or land use policy. These areas are being addressed by others as part of the UK Government’s industrial strategy.
The crucial level: paper 1
The first consultation paper looked at issues of safety assurance, civil and criminal liability and the adaptation of road rules for artificial intelligence decision-making. This paper drew a fundamental distinction between “driver assistance” features, which appear in a substantial number of modern cars, and “automated driving systems” which enable the car to drive itself without control or monitoring for at least part of a journey. It is the latter – the automated driving systems – which are the focus of our project.
This paper gave an introduction to various technical terms, including an explanation of SAE levels. The SAE levels are published by SAE International (formerly the Society of Automotive Engineers) and are analytical tools which describe the respective roles of the human driver and the automated driving system. They are guides, and crucially do not define legal consequences.
At one end of the scale is SAE level 0, where there is no driving automation and the human driver performs the entire driving task; at the other is SAE level 5, full driving automation, where the vehicle can drive itself anywhere a human could, with no human input. A key transition happens when the task of monitoring the driving environment passes from the human driver (at level 2) to the automated driving system (at level 3, “conditional automation”, and above). This raises many difficult issues in respect of ensuring a safe and timely takeover by the human driver.
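For those who find the distinction easier to follow in code, the short Python sketch below is our own illustration only: the level names are paraphrased from the SAE scheme and form no part of the consultation paper or of any legal definition. It simply sets out the six levels and who monitors the driving environment at each.

from enum import IntEnum

class SAELevel(IntEnum):
    # Illustrative labels paraphrased from the SAE levels of driving automation
    NO_AUTOMATION = 0           # human driver performs the entire driving task
    DRIVER_ASSISTANCE = 1       # system assists with steering or speed
    PARTIAL_AUTOMATION = 2      # system handles steering and speed; human monitors
    CONDITIONAL_AUTOMATION = 3  # system monitors the environment; human takes over on request
    HIGH_AUTOMATION = 4         # no human fallback needed within a defined operating domain
    FULL_AUTOMATION = 5         # vehicle can drive itself anywhere a human could

def environment_monitored_by(level: SAELevel) -> str:
    # The key transition discussed above sits between levels 2 and 3
    return "human driver" if level <= SAELevel.PARTIAL_AUTOMATION else "automated driving system"

for level in SAELevel:
    print(f"Level {int(level)} ({level.name}): monitored by the {environment_monitored_by(level)}")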
Responses: safety first
We received 178 responses to this first paper, comprising 2,000 pages of written material. They came from a wide range of consultees including individuals, car manufacturers and developers, transport research experts, safety and disability groups, the police, insurers, local government, academics and lawyers. We have published a 190-page analysis, which can be accessed via the law reform projects page at www.scotlawcom.gov.uk
The strongest message emerging was the critical importance of setting up a safety assurance scheme for automated driving systems. Its task would be to ensure that the initial system is safe, then to oversee all software updates, continuing roadworthiness, and information and training. Such a scheme would complement the current system of international approvals. In addition, many road user groups favoured a new road traffic investigation branch to look at the causes of road collisions rather than allocating blame. The UK currently has specialist units to investigate the causes of aviation, maritime and rail accidents, and they enjoy high reputations for their independence and technical expertise.
The Automated and Electric Vehicles Act 2018 received Royal Assent in July 2018. The Act requires insurers to pay compensation to victims where the automated vehicle is driving itself and an accident occurs. The operational provisions of the Act have not yet been brought into force, and the Commissions will be keeping the situation under review. On other civil liability issues, consultees thought there was a need to review product liability law and its application to “pure” software, sold separately from any physical product. This is relevant to “over the air” software updates which add driving automation features.
Turning to the field of criminal liability, many consultees stressed the need to apply sanctions proportionately and to deal with patterns of incidents rather than single occurrences. They emphasised a co-operative approach, allowing developers the chance to put things right. The great majority of consultees supported the idea that each automated driving system should be backed by an entity which would need a minimum capital requirement and insurance to deal with recalls. These are issues which will be considered further in future papers. Many consultees pointed to flaws in the law of corporate manslaughter and corporate homicide: it does not apply to non-fatal injuries, and is difficult to use against large companies.
Handovers between machines and humans are a matter of acute public concern. Consultees pointed out how quickly passive humans become distracted, and how distracted drivers lack the situational awareness needed to resume driving effectively. They also highlighted widespread consumer misunderstanding about the difference between advanced driver assistance and automated driving.
Finally, the paper addressed the challenges of adapting road rules into a more precise “digital highway code” to govern the actions of highly automated vehicles. We asked three sample questions about whether automated vehicles should ever mount the pavement, exceed the speed limit or “edge through” pedestrians. All these issues proved controversial. On mounting the pavement, a small majority thought this would be acceptable to allow emergency vehicles to pass. Views were split on exceeding the speed limit, and a majority thought that it would never be acceptable to edge through pedestrians.
It appears to us that a digital highway code which attempts to set precise rules for every instance is not possible; nor is it realistic to ask developers to prescribe deterministically, in advance, how automated driving systems will behave in every scenario. However, it is possible to provide a more structured dialogue between developers and regulators which allows developers to raise issues of concern. A potential model would be to follow Singapore’s approach by setting up a working group bringing together a panel of developers and regulators chaired by an independent expert.
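By way of illustration only (the fragment below is our own hypothetical sketch and forms no part of any proposed scheme), even one candidate rule from the consultation acquires an open-ended list of exceptions once written down in machine-readable form, which is why we doubt that behaviour can be exhaustively prescribed in advance.

from dataclasses import dataclass, field

@dataclass
class RoadRule:
    # One entry in a hypothetical "digital highway code"
    description: str
    exceptions: list = field(default_factory=list)

mount_pavement = RoadRule(
    description="Do not mount the pavement",
    exceptions=[
        "to allow an emergency vehicle to pass",  # the exception a small majority of consultees accepted
        # further exceptions (parked obstructions, police direction, temporary works)
        # would need to be enumerated, and the list has no obvious end point
    ],
)
print(mount_pavement)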
Passenger services: paper 2
The second consultation paper will be published later this year and will focus on automated vehicles used to provide passenger services, both within and alongside the current public transport network. We will consider how best to achieve the aims of the UK Government’s Inclusive Transport Strategy and the importance of widening access to mobility for older and disabled people.
Traditionally, road passenger services have been divided into public service vehicles, taxis, private hire cars and rental vehicles, with separate regulatory systems applying to each. These distinctions are becoming increasingly blurred. Rather than fit automated services into the current regulatory structures, we envisage a new form of regulation for automated road passenger services.
We welcome responses to the second consultation paper later in the year. We hope that solicitors will take the opportunity to contribute to the evolution of this area of the law. All papers and other information relating to the project can be found on the Law Commissions’ websites: lawcom.gov.uk and scotlawcom.gov.uk