Liability Perspective for Users of Autonomous Vehicles in the EU

Many artificial intelligence applications lend themselves to personal use, and such use has implications for liability. Autonomous vehicles are a particularly prominent example. It is therefore important to establish the obligations and liability of users of autonomous vehicles. Since the EU lacks harmonized rules for autonomous vehicle users, the liability laws of the Member States, and the differences between them, come to the fore. This leads to the question of whether an EU-wide approach should be adopted.

An article by Didem Polad

Human-machine interaction has become the typical paradigm of artificial intelligence (AI) development. As a result, the actions of personal users significantly affect the performance of AI-enabled products, but they can also pose risks. Many products incorporating software and AI are geared toward non-professional, personal use. One of the most prominent examples is the autonomous vehicle (AV). According to the criteria set by the Society of Automotive Engineers (SAE), the levels of driving automation range from 0 to 5. The term “autonomous vehicle” typically refers to levels 3 and above, with higher levels indicating greater vehicle autonomy and less driver control.

Achieving a high level of technical sophistication in fully autonomous vehicles is a challenging task, and the dream of having fully autonomous vehicles soon may not be realistic. The relationship between the driver and the AV will therefore become increasingly important; it exemplifies human-machine interaction and the question of how liability is distributed between the two. It is thus important to examine how the interaction between driver and vehicle differs from that in conventional vehicles and to define the driver’s obligations and liability. Currently, the legal rules on the liability of personal users are established mainly at the national level. Such rules, however, may result in a fragmented legal landscape that hinders the free movement of goods and services. This raises an essential question: are there, and should there be, specific liability rules for personal users of AVs at the EU level?

Pioneer in the EU: German Autonomous Driving Act

The German Road Traffic Act provides for presumed-fault liability for drivers of conventional vehicles. The Act was amended in 2017 and 2021 to introduce rules for AVs. The German road traffic rules on AVs distinguish between level 3 and level 4 vehicles. Level 3 vehicle users are referred to as “drivers” under § 1a(4) of the Road Traffic Act, as for conventional vehicles, whereas level 4 vehicle users are termed “technical supervisors” under § 1d(3) of the Road Traffic Act. The driver is subject to presumed-fault liability pursuant to § 18(1)(1) of the Road Traffic Act, while the technical supervisor is subject to fault-based liability pursuant to § 823(1) of the German Civil Code. The difference is that, in the event of damage, the burden of proof lies with the driver for level 3 vehicles and with the injured party for level 4 vehicles.

The obligations of the driver and the technical supervisor differ pursuant to § 1b(2) and § 1f(2) of the Road Traffic Act. Drivers are obliged to take control of the vehicle when the vehicle so requests, or when they notice, or should notice, an obvious situation in which the intended conditions for the automated driving functions are no longer met. The technical supervisor must evaluate alternative driving maneuvers, deactivate the vehicle when requested by the autonomous system, and, where necessary, initiate traffic safety measures. It is therefore safe to say that the obligations of the level 4 technical supervisor are less extensive than those of the level 3 driver.

Looking at the obligations of the technical supervisor and the driver, the duty to take over the vehicle in an obvious situation seems to apply only to the level 3 driver. A level 3 driver may or may not take control of the vehicle in an obvious situation and may cause an accident either way. The driver must be able to grasp the obvious situation in order to take control of the vehicle. Defining what is obvious, however, can be challenging. Since the law does not define an obvious situation, what counts as one will have to be determined by the case law on the facts of each individual case.

In addition, the autonomous system can issue a warning to take control of the vehicle, which applies to both level 3 and level 4. At level 4, the technical supervisor is only required to evaluate alternative driving maneuvers or deactivate the vehicle when the vehicle issues a warning, whereas the level 3 driver is required to take full control of the vehicle. That the driver must take more control after a warning than the technical supervisor reflects the difference in the autonomy of the two vehicle levels. Failure to take control of the vehicle despite a proper warning may result in liability. That said, since the autonomous system can record all the data, it may be easier to prove whether the accident was caused by the driver or by a system malfunction.

Many different components of the vehicle can trigger a warning. The time it takes for the AV to notify the user of a sensor failure, or to inform the user that it has failed to detect the pedestrian in front of it, is crucial. If the warning comes too late for the user to take over the vehicle, or if the warning is unclear, the user’s liability must be assessed on a case-by-case basis. Although users are obliged to take control of the vehicle when they receive a warning, this may not be possible within very short time frames. Moreover, if the notification from the vehicle is unclear, the user may not know how to intervene. The timing and clarity of warnings, the level of vehicle autonomy, and the effectiveness of the human-machine interaction are therefore all important in establishing liability. Given the limitations and difficulties of human-machine interaction, each case must be assessed individually.

Other EU Member States have their own traffic liability rules. France, for instance, has a strong tradition of strict liability for traffic accidents under the influence of the Loi Badinter. It remains to be seen whether France will continue to apply strict liability to level 3 vehicles, how it will establish the obligations and liability of level 3 and level 4 AV users, and whether it will follow a different path than Germany.

Risk Regulation or Liability?

The AI Act has been widely discussed from various angles since it was proposed. The Regulation introduces a risk-based approach to AI and sets out the various actors involved in the use of AI and their obligations.

The Act defines providers and deployers. Providers are persons who develop an AI system, or have an AI system developed, in order to place it on the market or put it into service, pursuant to Article 3(2), while deployers are persons who use an AI system under their authority, except where the AI system is used in the course of a personal, non-professional activity, pursuant to Article 3(4) of the AI Act. Non-professional users of AI are thus not defined in the AI Act and are excluded from its scope. The driver of an AV is ultimately a non-professional user of AI; therefore, AV users are not covered by the AI Act.

In addition to the AI Act, a proposal for an AI Liability Directive (AILD) has been published to complement it. The AILD introduces procedural provisions for victims of damage caused by the use of an AI system under the AI Act. Although the name of the AI Liability Directive might suggest that it contains substantive tort rules, the proposal does not go beyond procedural provisions. Overall, the AI Act does not recognize non-professional users of AVs, and the AILD does not provide comprehensive tort liability rules for them. For the moment, therefore, it remains up to the national laws of the Member States to establish the liability of personal users of an AV.

EU level: What rules should we have?

Establishing the liability of personal users of AVs is important because defined rules are needed for the proper distribution of liability among the actors involved. This would ensure that personal users face effective liability for damages while victims remain protected.

What also needs to be considered is the current situation within the EU, where the central debate is whether Member States other than Germany will follow a similar path and introduce separate AV liability rules, and whether they will differentiate the obligations and liability of level 3 and level 4 vehicle users. Since there is currently no EU-level law and the rules for non-professional AV users are left to the discretion of the Member States, quite different practices can be observed within the EU.

There is reluctance to adopt EU-wide liability rules, since liability rules are rooted in national traditions and differ across Member States. At the same time, there is an effort to promote the development and use of AI and to harmonize the rules on AI systems within the EU internal market. Given that AVs are a prominent example of the use of AI, a future consideration could be whether similar policies could be used to set liability rules, preventing divergence in the application and use of AI in AVs within the EU.

Published under licence CC BY-NC-ND.

This Blogpost was written by

Didem Polad

Didem Polad is a doctoral researcher at the Legal Tech Lab, University of Helsinki Faculty of Law. She holds a master's degree in International Business Law from the University of Helsinki. She is also a registered lawyer in the Istanbul Bar Association. Her research interests are the regulation of artificial intelligence, autonomous systems, and civil liability. She conducts her doctoral research on civil liability for non-professional personal use of artificial intelligence.
