Review

Automated Driving: A Literature Review of the Take over Request in Conditional Automation

by Walter Morales-Alvarez 1,*, Oscar Sipele 2, Régis Léberon 3, Hadj Hamma Tadjine 4 and Cristina Olaverri-Monreal 1,*
1 Chair ITS-Sustainable Transport Logistics 4.0, Johannes Kepler University, 4040 Linz, Austria
2 Computer Science Department, Universidad Carlos III de Madrid, 28911 Leganés, Spain
3 IAV France S.A.S.U., 4, Rue Guynemer, 78280 Guyancourt, France
4 IAV GmbH Entwicklungszentrum, Carnotstraße 1, 10587 Berlin, Germany
* Authors to whom correspondence should be addressed.
Electronics 2020, 9(12), 2087; https://doi.org/10.3390/electronics9122087
Submission received: 9 November 2020 / Revised: 29 November 2020 / Accepted: 1 December 2020 / Published: 7 December 2020
(This article belongs to the Special Issue Autonomous Vehicles Technology)

Abstract:
In conditional automation (level 3), human drivers can hand over the Dynamic Driving Task (DDT) to the Automated Driving System (ADS) and need only be ready to resume control in emergency situations, allowing them to be engaged in non-driving related tasks (NDRT) whilst the vehicle operates within its Operational Design Domain (ODD). Outside the ODD, a safe transition process from the ADS engaged mode to manual driving should be initiated by the system through the issue of an appropriate Take Over Request (TOR). In this case, the driver’s state plays a fundamental role, as a low attention level might increase the driver’s reaction time to take over control of the vehicle. This paper summarizes and analyzes previously published works in the field of conditional automation and the TOR process. It introduces the topic in the appropriate context, also describing a variety of concerns associated with the TOR. It additionally provides theoretical foundations on implemented designs and reports concrete examples targeted at designers and the general public. Moreover, it compiles guidelines and standards related to automation in driving, highlights the research gaps that need to be addressed in future research, discusses approaches and limitations, and provides conclusions.

1. Introduction

Vehicles with some degree of driving automation have been anticipated for decades. The series of automated actions that these vehicles perform to transport people or goods make it possible to define them as vehicular robots, since they move between two points without the intervention of humans [1,2].
However, the full automation that characterizes level 5 vehicles requires mastery of the many challenges that pertain to their development and introduction to the market, including the detection of other road users and the monitoring of driver behavior in case manual control needs to be reinstated [3].
Even if the absence of human intervention in the control of autonomous vehicles (AV) increases road safety [4], the implementation of AV represents a complex multidisciplinary problem that has not yet been fully solved. Hierarchical steps have been introduced to define the system’s capabilities and to address the roles of the human and the system with regard to control, environment monitoring, and fallback, depending on the different levels of driving automation [5].
However, the most advanced vehicle functionality currently available on the market (AutoPilot from Tesla [6]) only corresponds to partial automation or level 2.
In conditional automation or level 3, human drivers are required to be ready to resume control in emergency situations, such as a system failure, if a Take Over Request (TOR) from the system has been triggered or where the circumstances exceed the Operational Design Domain (ODD) of the system. In this process, the perception and comprehension of the road and traffic environment information or situational awareness [7] play a fundamental role in deciding imminent actions.
When a TOR is issued, drivers need to deviate their attention from whatever tasks they are performing on the road. To this end, they must process the perceived information from the environment and react by activating the vehicle actuators to perform the dynamic driving task based on their understanding of the present situation. Figure 1 illustrates the process by depicting the components involved in a take over request.
Before a TOR is triggered, both internal and external information about the driver’s state and the driving situation should be gathered to guarantee a safe outcome. This is particularly important in take over processes that are required in emergency situations. Therefore, it is necessary to develop Driver Monitoring Systems (DMS) that are able to estimate the driver’s physical and mental state and can correlate this information with the specific road situation. To this end, a DMS collects baseline values of different driver state indicators, such as the driver’s gaze, pose, and physiological metrics, and calculates a deviation measure with respect to these values, which then allows it to determine the level of the driver’s attention [8].
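The baseline-deviation idea described above can be illustrated with a minimal Python sketch. The function names, the choice of a z-score as the deviation measure, and the threshold are illustrative assumptions, not details of the systems surveyed in [8]:

```python
import statistics

def attention_deviation(baseline: list[float], current: list[float]) -> float:
    """How far the current values of a driver state indicator (e.g., gaze
    dispersion) drift from the driver's baseline, expressed in baseline
    standard deviations (a z-score of the current mean)."""
    mu = statistics.mean(baseline)
    sigma = statistics.stdev(baseline)
    return abs(statistics.mean(current) - mu) / sigma

def is_attentive(baseline: list[float], current: list[float],
                 threshold: float = 2.0) -> bool:
    # Flag the driver as inattentive when the indicator deviates more
    # than `threshold` standard deviations from the baseline.
    return attention_deviation(baseline, current) <= threshold
```

A real DMS would of course fuse several such indicators over time rather than thresholding a single one.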
The National Transportation Safety Board (NTSB) promotes the use of DMS in response to system failures, as exemplified by the fatal crash of the Uber automated test vehicle in 2018 [9]. Driver monitoring should assure appropriate driver engagement in all vehicles with at least partial automation (level 2), as drivers are required to resume control at any time [5].
This work compiles and analyzes literature in the field of conditional automation, or SAE level 3, focusing on the TOR, while emphasizing concerns, describing theoretical foundations of solution designs, and mentioning concrete examples, particularly highlighting areas in need of further research. It reviews the basic groundwork for an understanding of important human and environmental factors that influence the design and testing of the TOR. Furthermore, it provides referenced information to help practitioners improve systems that trigger TOR.
It is not the aim of this work to provide an exhaustive overview of all guidelines for all possible automated functions in the vehicle, but rather to review concepts and suggest a foundation for an understanding of some important factors that influence the design and testing of TOR systems. It focuses on the major achievements in the field, the main areas of debate, and outstanding research questions.
The remaining sections are organized as follows: Section 2 gives an overview of the challenges related to the take over request process associated with level 3. Section 3 presents concepts that pertain to the design of systems to transfer vehicle control. Section 4 presents a summary of the main points covered in the previous sections. Section 5 highlights the theoretical foundations of solution designs, presents a selection of tested design concepts and findings, and gives concrete examples of how to apply these theories to specific designs. Section 6 outlines guidelines and standards related to specific features of automated driving, also mentioning policies and regulations. Finally, Section 7 recapitulates the main issues described in the paper, uncovers research gaps, and proposes new research questions for further work.

2. Conditional Automation Challenges and Complexity

Level 3 automation is the next milestone in developing fully autonomous cars. However, there is an ongoing discussion about whether it is better to simply shift the focus to the development of systems with automation level 4 or higher [10]. Companies such as Ford and Toyota are currently developing systems with level 4 driving automation [11,12], leaving level 3 aside. One of the main reasons for this decision lies in human factors, since level 3 systems must successfully return inattentive drivers to the Dynamic Driving Task (DDT) [13]. This interaction between system and driver depends to a large extent on the capabilities of each individual driver, as detailed in the sections below. Therefore, even a well-designed information transfer does not ensure that all drivers gain control of the vehicle within the time required to ensure road safety. Moreover, the uncertainty that pertains to level 3 regarding liability in case of an accident and the lack of regulatory measures and a legal framework support the decision to avoid level 3.
Based on the information presented so far, we can conclude that regaining control of the vehicle after automation is a complex process that requires the driver to be aware of the specific emergency situation and of the information provided by the system, and at the same time to identify, process, and comprehend the current traffic conditions.
Several factors contribute to this complexity and they are listed below.
  • The potential boredom and road monotony associated with higher levels of driving automation might lead to a reduction in driver situational awareness [14,15]. This hypovigilance needs to be taken into account when a vehicle control is expected from the driver [16].
  • The reaction time (RT) to a TOR after the driver has been involved in a non-driving related task (NDRT) does not return to its baseline performance level immediately after the distraction. This means that drivers can be distracted up to 27 s after finishing a highly distracting task and up to 15 s after interacting with a moderately distracting system [17], and secondary tasks may affect the driver even after a task/distraction phase has been completed [18].
Following the classification of complexity into objective and subjective by the authors in [19], we have compiled the different factors that determine the complexity of the TOR in Table 1. We additionally describe several concrete examples.

2.1. Objective Complexity

Objective complexity, or descriptive complexity [20], refers to the inherent properties of novel systems and therefore varies independently of the individuals to whom the system is exposed [21]. Accordingly, there are factors in a TOR that depend not only on what people perceive, but also on objective situational characteristics that are independent of the observer.
The objective complexity of a TOR is determined by factors such as:
  • traffic density,
  • road conditions,
  • environmental conditions,
  • the specific system for transferring control from automated to manual driving.
We describe in the following sections how each of these factors affects the TOR process and the driver’s ability to regain control of the driving task.

2.1.1. Traffic Density

Driving is a social activity that demands interaction with other road users and that varies according to the specific environment. Therefore, several works studied the influence of the traffic situation on the TOR, the results of which showed that complex traffic situations negatively affected the quality of the process. The dynamic state of the surrounding vehicles may prevent certain maneuvers from being carried out by the driver, such as a lane change maneuver in a situation of high traffic density. In this case, the number of braking maneuvers might increase.
Under high density traffic conditions, the time to collision was reduced, thereby increasing the probability of collision and augmenting the lateral acceleration of the vehicle [22,23].
There is also a direct relationship between the time to regain control of the vehicle and traffic density, with a high traffic density implying a longer time to regain control (7 s in a situation with 10 to 20 vehicles per kilometer, according to [23]), which in turn increases the risk of collision [24].
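The time-to-collision metric referred to above can be sketched as follows. This is a simplified constant-speed formulation with assumed names and SI units; the cited studies may compute it differently:

```python
def time_to_collision(gap_m: float, ego_speed_mps: float,
                      lead_speed_mps: float) -> float:
    """Time to collision (s) with a lead vehicle, assuming constant
    speeds; infinite when the ego vehicle is not closing the gap.
    Denser traffic means smaller gaps and hence smaller TTC values."""
    closing_speed = ego_speed_mps - lead_speed_mps
    return gap_m / closing_speed if closing_speed > 0 else float("inf")
```

For example, a 50 m gap closed at 10 m/s leaves a 5 s time to collision, which is in the range of the take-over times reported above.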

2.1.2. Road Conditions

Another critical factor that affects the TOR is road geometry, as it directly influences driving by imposing inherent speed and acceleration limits. The curvature of the road affects the driver’s ability to keep the vehicle positioned in the center of the lane. At the same time, curved roads obstruct visibility of upcoming vehicles. Therefore, emergency maneuvers are required in many cases where drivers enter curves at high speed [25].
Vehicles with a high degree of autonomy can adapt their speed to the curvature based on the control algorithms with which they are developed. However, in vehicles with level 3 autonomy, road curvature negatively influences TOR performance compared to straight roads by increasing driver reaction time and lateral deviation [26]. Furthermore, road geometry influences driver deceleration patterns when they regain control of the vehicle, deceleration being more abrupt when entering a curve [27]. Moreover, a relationship exists between road conditions and the urgency of the TOR: on straight segments, a longer time is required for the driver to control the vehicle if the urgency is low, whereas in curves the time that drivers need to control the vehicle is longer if the urgency is high [28].
As a consequence, road conditions play an important role in the design of TOR systems, which should incorporate predictive algorithms to establish transition protocols. Such an approach would allow the driver to take over control of the vehicle while minimizing lateral deviation.

2.1.3. Control Transfer Systems

There is a time interval for the driver to take control of the vehicle safely after having received the TOR message. In this interval, the driver is required to follow a process of adaptation in the transition from a state of low situational awareness to a higher one [29]. One of the biggest challenges is to create a system that conveys the message in a clear, explicit way, while at the same time allowing for the possibility of continued automated control of the vehicle in the event that the driver cannot take over [30]. To this end, during the mentioned interval, shared control between the vehicle and the driver should be guaranteed.
Some approaches rely on guidance systems based on haptic devices that give feedback to the driver through the actuators of the vehicle. This guidance occurs when an action has been performed that differs from the maneuver that the automated system had selected. The use of haptic guidance reduces the cognitive workload of the driver.
Several works in the field developed and tested shared control policies. The studies showed that the use of these systems decreased the lateral error of the drivers, at the same time increasing comfort in the handling of the vehicles [31,32,33].
In the same context, further studies measured the situational awareness of the drivers by defining three levels of human participation that would determine the specific level of guidance of the vehicle, automated dominance being the lowest level and human dominance the highest level. Using simulations, the authors concluded that their guidance systems were capable of guiding driver collaboration with the automation systems and of resuming manual control safely and smoothly [34,35].
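The three levels of human participation mentioned above can be sketched as a simple control-authority blend. The 0/0.5/1 split and the linear blending rule are illustrative assumptions, not the actual policies of [34,35]:

```python
def control_authority(sa_level: int) -> tuple[float, float]:
    """Map a discrete situational-awareness level (0 = lowest,
    2 = highest) to a (human, automation) control-authority split."""
    splits = {
        0: (0.0, 1.0),  # automation dominance
        1: (0.5, 0.5),  # shared control
        2: (1.0, 0.0),  # human dominance
    }
    return splits[sa_level]

def blended_steering(sa_level: int, human_cmd: float,
                     auto_cmd: float) -> float:
    # Weighted blend of the human and automation steering commands.
    h, a = control_authority(sa_level)
    return h * human_cmd + a * auto_cmd
```

In practice, the awareness estimate would evolve continuously during the transition, shifting authority gradually toward the driver.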

2.2. Subjective Complexity

Subjective complexity encompasses the factors that are affected by the individual cognitive adaptation processes of the driver and their influence on the response to a TOR. This includes the driver’s state, such as vigilance, stress level, and cognitive load due to non-driving secondary tasks (NDST) [36]. The Human Machine Interface (HMI) establishes the dialog with the driver in order to support the driver’s decision-making process, maintain their active supervision, and request their intervention. Moreover, automated driving interfaces that are easy to understand and use can create the level of trust required for the driver to feel that the vehicle is functioning correctly [3]. In line with this, adhering to the guidelines described in [4], different HMI were designed in the TrustVehicle project [37] to promote trust by relying on adaptive and intuitive interfaces, measuring the driver state, and identifying risky traffic conditions to prioritize information.
Accordingly, one of the most pressing current research questions is how TOR methodologies can take subjective complexity into account.
As previously mentioned in Section 1 and discussed later in Section 3, current approaches rely principally on promoting driver vigilance with regard to the dynamic driving task when the automation system is active. The current automated driving regulatory framework does not allow the ADS to perform the DDT without human supervision, because limited level 3 automation is extremely dangerous outside the ODD limits. Therefore, techniques oriented toward preserving driver vigilance regarding the DDT are crucial for a safe TOR response, and they can make cognitive reassignment much easier in case of need.
According to the issues exposed above, the two main aspects that subjective complexity covers are:
  • The assessment of the driver’s readiness to intervene after a TOR.
  • The assessment of the appropriate interfaces used for interacting with the driver.
Each aspect is further detailed in the following sections.

2.2.1. Driver’s Readiness Assessment

Different approaches enable the assessment of the driver’s readiness to interact with the vehicle when a TOR has been issued.
  • Firstly, sensors located on the steering wheel are able to sense periodic interaction. However, the main downside of this approach is the compromised user experience, or reduced joy of use [38], that the obtrusive system demands, and hence the high tendency of drivers to cheat the system, a consequence of overtrust.
To solve the cheating problem, an integrated indicator for interaction assessment such as an applied torque on the steering wheel can ensure a reliable detection of a real steering wheel interaction [34,35].
  • Secondly, an eyes-on-road approach is based on glance analysis in order to determine the regularity of vigilance. These driver glance models allow the assessment of cognitive engagement during the TOR process [39]. During the transition to manual driving after conducting an NDRT, the driver’s pupil diameter, glance duration, and glances to certain Areas of Interest (AOI) are used to assess performance and gaze [40,41]. Here, it is crucial to address the need for measures to regulate the privacy issues that arise from the collection of such data. It should be clear who will gain access to the data, particularly when they concern personal information.
  • Further studies include facial expression as an aspect for the assessment of TOR quality [42], whereby face changes might indicate an unfit condition for TOR attendance.
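The glance-based indicators above can be illustrated with a small sketch that aggregates per-frame gaze labels into glance time per AOI and an eyes-off-road ratio. The AOI labels and the sampling interval are hypothetical, not taken from [39,40,41]:

```python
from collections import defaultdict

def glance_durations(gaze_labels: list[str], dt: float = 1 / 30) -> dict:
    """Total glance time (s) per Area of Interest, given per-frame
    gaze labels sampled every `dt` seconds (e.g., 30 Hz eye tracking)."""
    totals: dict = defaultdict(float)
    for aoi in gaze_labels:
        totals[aoi] += dt
    return dict(totals)

def eyes_off_road_ratio(gaze_labels: list[str]) -> float:
    """Fraction of samples in which the driver's gaze is off the road."""
    off = sum(1 for aoi in gaze_labels if aoi != "road")
    return off / len(gaze_labels)
```

A readiness estimator would combine such aggregates with pupil diameter and other indicators over a sliding window.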

2.2.2. TOR Communication Interfaces

As depicted in Table 2, an HMI can use different communication channels to transmit the TOR to the driver. Visual displays convey clear information that facilitates the driver’s understanding of the messages [27]. In the same context, auditory warning interfaces are complementary to visual displays. Studies such as [43] compared the efficiency of generic warning tones and speech warnings for conveying a TOR, concluding that speech messages improved the driver’s response time.
As for vibrotactile displays, the most relevant aspects of their functioning can be described using several dimensions according to [44].
  • Frequency and amplitude are static aspects, and they are related mainly to comfort concerns.
  • Location and timing can dynamically adjust, thus encoding different urgency levels.
Different configurations help to determine which set of these aspects has the greatest influence on the transition from automated to manual driving. As a consequence, multimodal combinations of auditory, visual, and tactile displays have been shown to improve the driver’s perception of urgency [45].
Additional works refer to trust in the systems that trigger a TOR and show that prior familiarization with the system’s functioning positively affected drivers’ response times to a TOR [46,47].
The next sections define the concepts related to TOR and summarize the established metrics that determine a good response from the side of the driver.

3. TOR Design Concepts

Several works have investigated the transition from automated mode to manual vehicle control, handling the TOR as a process of successive stages that is triggered when the operational limit of the ADS has been reached [23,48,49].
Vehicle sensors detect whether a need to trigger a TOR exists, and, in affirmative cases, the interaction with the driver occurs through the vehicle’s HMI. Drivers, who might be engaged in NDRT, must completely shift their attention back to the road environment and immediately estimate the situation, to be able to regain control of the vehicle actuators in time. According to the authors in [50], this transfer of control can be considered as a task interruption handling process that involves several stages.
The TOR design concepts developed so far address not only how the TOR is issued and presented to the driver, but they also rely on high-level architectures based on vehicle metrics, situational factors, and driver state evaluations. Figure 2 illustrates a conceptual framework through an example. In this framework, an ADS controls the vehicle dynamics to perform a DDT while constantly monitoring the driver. Inside this ADS, a TOR system conveys information to the driver through an HMI depending on whether the ODD boundaries have been reached or an emergency situation has been detected. The information is then received by the driver through their sensory system. Finally, the driver responds to the TOR through their psychomotor system according to their state and cognitive capabilities.
A TOR can be modeled as a control switch between two systems, defining boundaries that establish the security level based on the structure and components of the system [51]. In line with this, a structured catalog of recommendations for user-centered designs was compiled for systems that convey TORs within the AdaptiVe project [52]. The project included an arbitration model to regulate the interaction between the components of the system, namely vehicle, automation, and driver [53]. In contrast to this approach, the work in [35] proposes a TOR system that operates as a shared control model that continuously evaluates driver situational awareness before giving complete control to the driver. To this end, a DMS relies on the driver’s physical responses and environment perception to estimate their cognitive state [16,54].
After detecting an upcoming TOR, an HMI conveys a timely message to the driver. The interaction needs to be dynamic depending on the urgency of the request, the message being in a prominent location within the vehicle [4,55] to attract the driver’s attention. The way of conveying the message can be classified as follows:
  • Visual: In a visual HMI, the system relies on images that can be either explicit [56] or icon-based [24]. Modern visual HMI can also rely on a vehicle’s ambient light systems to attract driver attention continuously but unobtrusively through a peripheral vision stimulus that is processed subconsciously [38]. This strategy has been shown to create a balanced level of automation and cognitive workload.
  • Auditory: Regarding auditory signals, HMIs tend to rely on acoustic sounds at different frequencies to convey urgency signals to the driver [57], although there is ongoing research that shows that additional explicit information beyond auditory signals is needed to achieve necessary driver situational awareness [55,58].
  • Haptic: Haptic interaction relies on kinesthetic communication to convey information to drivers through tactors that can be located either on the seat or the seat belt [45].
To provide designers and practitioners with an overview of the different ways to convey TOR-related information, we show in Table 2 the HMI modalities that have been investigated in several research studies. The advantages and disadvantages are outlined and sourced from the literature by analyzing the references where the different modalities were used.
The ideal concept to convey a TOR in level 3 automation is still a pressing issue that needs further research. Several studies agree that multimodal messages consisting of a combination between acoustic sounds, images, and vibrotactile messages are more effective in TOR situations [36,60], but there are still many open questions about how this should be achieved and which is the message that has to be conveyed. To this end, several works have investigated information prioritization and functionality clustering for different modules in Driver Information Systems (DIS) and Advanced Driver Assistance Systems (ADAS) [41,84,85,86,87] to ascertain where the increasing amount of vehicle information should be located within the vehicle to reduce the drivers’ eye time off the road when looking for it. To illustrate several design concepts, Figure 3 shows HMI examples that have been implemented in TOR related studies.
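The multimodal escalation discussed above can be sketched as a simple urgency-to-channel mapping. The thresholds (0.4 and 0.7) and the escalation order are illustrative assumptions, not values from the cited studies:

```python
def tor_channels(urgency: float) -> list[str]:
    """Select HMI channels for a TOR from an urgency score in [0, 1]:
    low urgency uses an unobtrusive visual cue; higher urgency adds
    auditory and finally haptic channels (multimodal escalation)."""
    if not 0.0 <= urgency <= 1.0:
        raise ValueError("urgency must be in [0, 1]")
    channels = ["visual"]
    if urgency >= 0.4:
        channels.append("auditory")
    if urgency >= 0.7:
        channels.append("haptic")
    return channels
```

How the urgency score itself should be computed, and which message content each channel should carry, remain open questions in the literature.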
In addition to the design approaches described so far, we argue that, from a system perspective and for the design of the take over, intuitive support should be provided that enables the driver to disengage the automated mode and return to manual driving. In addition, an instantaneous, intentional take back of control by the driver must be possible. For this, two points need to be considered.
  • First, actions by the driver that might unintentionally disengage the system, such as accidentally touching the steering wheel, should be filtered out.
  • Second, to secure the transition, pressing the pedals should not be sufficient to disengage the lateral control until the driver has recovered control of the steering wheel.

4. Take Home Messages and Recommendations

This section presents a summary of the main points covered so far.
  • Automation level 3 is the next milestone in developing fully autonomous cars. However, there is an ongoing discussion whether it is better to skip this level and give more attention to the development of systems with driving automation above level 3.
  • TOR design concepts rely on high-level architectures that are based on vehicle metrics, situational factors, and driver parameters evaluations.
  • Conditional automation (level 3) enables out-of-the-loop states during which drivers do not need to be aware of the driving functions or roadway conditions while the vehicle operates in its ODD. However, human drivers are required to be ready to resume control in emergency situations or when the system has reached its ODD boundary [5].
  • The lack of human intervention in the control of autonomous vehicles might increase road safety.
  • An HMI conveys timely information to the driver with a dynamic message located in a prominent location in the vehicle to attract the driver’s attention.
  • The most common approaches to issuing a TOR rely on visual, auditory and haptic information.
  • The complexity of a TOR is increased by the boredom and road monotony associated with higher levels of driving automation, which lead to a reduction in driver situational awareness.
  • The reaction time to a TOR does not return to its baseline performance level immediately after being involved in some NDRT.
  • Complex traffic situations increase the probability of collision.
  • A high density of traffic results in a longer time to regain vehicle control.
  • TOR performance is negatively affected by road geometry; road curves increase driver reaction time and lateral deviation and cause abrupt deceleration after control of the vehicle is regained.
  • A longer time is required for the driver to control the vehicle in straight lines if the urgency is low.
  • The time that drivers need to control the vehicle in curves is longer if the urgency is high.
  • Shared control mechanisms between the vehicle and the driver decrease the lateral error of the driver and increase fluidity of vehicle operation.
  • TOR guidance systems support drivers in resuming manual control safely and smoothly.
  • Driver state, such as vigilance level, stress level, and cognitive load affect the response to a TOR.
  • Factors such as the driver’s readiness to intervene (through sensors and/or eye/facial detection and tracking), as well as the interfaces used for TOR determine the subjective complexity of the system.
  • Auditory, visual, vibrotactile, and multimodal displays maximize TOR execution quality depending on the defined urgency levels.
  • Actions from the driver that might lead to a disengagement of the system unintentionally should be considered.
  • Lateral control should only be disengaged when the driver has recovered the control of the steering wheel.
  • The use of haptic guidance reduces the cognitive workload of the driver. Thus, its implementation in the TOR serves as support for drivers to adapt to the current situation.
  • From a system perspective and the design of the take over, an intuitive support that enables a smooth transfer from automated to manual mode should be provided.
  • Familiarity with autonomy in vehicles is directly connected to trust: when repeatedly proven to function properly, these technologies can build trust and thereby support the use of more complex automated driving tasks.

5. Theoretical Foundations on Take over Assessment Metrics

This section provides the theoretical foundations behind the TOR concept, also referring to the metrics used to measure the quality of the process. It outlines its complexity, as already mentioned in Section 2, relying on the classification introduced in [72]. Concrete examples of how to apply these theories to specific designs are also provided.
To improve the performance of the transition from autonomous to manual control, it is necessary to determine the metrics that measure the quality of the action. We briefly introduce these factors in this section, as they are crucial to establish the relationship between a triggered TOR and driver’s response.
The correct, complete execution of the TOR in performing a certain maneuver is the most intuitive parameter for measuring performance, for example, the actual, successful avoidance of a collision with an obstacle, either by braking or by performing a lane change.
Driver situational awareness, as mentioned already in different sections, is crucial in the process, since it defines the perception and understanding of humans and allows them to project the future actions necessary to adapt to a dynamic environment [7,58]. Additional parameters to measure TOR are cognitive workload, trust, comfort, and the issue- and reaction times to the TOR, as described in Section 2.
The most commonly used metrics in driving performance studies to measure the quality of the take over are speed metrics. This is a straightforward regulating or monitoring driving performance metric that determines the speed-reducing effects of a specific event [88]. In many cases, the TOR represents an actual event that will cause a braking reaction. Additional driving performance assessment methods and metrics relate to lateral and longitudinal metrics as originally defined in [89] and time to collision. They have been used in TOR-related experiments that investigate different modalities to convey messages such as in [63].
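Two of the driving-performance metrics mentioned above, the speed-reducing effect of an event and lateral deviation, can be sketched as follows. These are illustrative definitions; the cited studies may operationalize them differently:

```python
import statistics

def speed_reduction(speeds_before: list[float],
                    speeds_after: list[float]) -> float:
    """Mean speed drop (m/s) caused by an event such as a TOR,
    comparing samples before and after the event."""
    return statistics.mean(speeds_before) - statistics.mean(speeds_after)

def lateral_deviation_rms(lane_offsets: list[float]) -> float:
    """Root-mean-square lateral offset (m) from the lane center over
    the take-over window."""
    return (sum(x * x for x in lane_offsets) / len(lane_offsets)) ** 0.5
```

A braking-heavy take over shows up as a large speed reduction; poor lateral control after resuming the DDT shows up as a large RMS offset.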
The quality of the transition from automated to manual control also depends on the controllability of the situation by the driver. Related to this, different naturalistic driving data recorded on video, such as longitudinal/lateral control of the vehicle, lane choices, braking response, system operation, and driver facial expressions, could be integrated into a global measure of controllability or a rating system in order to assess TOR situations [42].
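As an illustration of such a global measure, the following sketch fuses several normalized observation channels into a single weighted rating. The channel names and weights are hypothetical; [42] does not prescribe a specific formula:

```python
# Hedged sketch: fusing several observation channels (lane keeping, braking,
# driver gaze, ...) into one global controllability rating. Channel names and
# weights are illustrative assumptions, not taken from the cited work.

def controllability_score(observations: dict[str, float],
                          weights: dict[str, float]) -> float:
    """Weighted mean of per-channel scores, each normalized to [0, 1]."""
    total_w = sum(weights[k] for k in observations)
    return sum(observations[k] * weights[k] for k in observations) / total_w

weights = {"lane_keeping": 0.3, "braking": 0.3, "gaze_on_road": 0.4}
obs = {"lane_keeping": 0.8, "braking": 0.9, "gaze_on_road": 0.5}
print(round(controllability_score(obs, weights), 2))  # 0.71
```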
Despite previous efforts, there are currently no established standards for the assessment of the take over, since current solutions lack maturity and are not yet in use. For example, the exact moment at which a TOR is triggered, the take over time (TOT), and the moment at which the take over occurs are decisive in determining the functioning and performance of the system. The section below describes these metrics in detail.

5.1. Take Over-Related Definitions

As previously mentioned, the dynamic TOR process is triggered in emergency situations or in situations where the ODD boundary is predicted to be reached. As a consequence, the transition time of vehicle control from automated to manual driving mode, the so-called handover phase, is critical as a sufficiently comfortable transition time is necessary to guarantee road safety [55,90].
Accordingly, the authors in [30] define the Take Over Time (TOT) as the time interval from when the TOR is issued until the driver has successfully taken control of the vehicle and resumed the DDT.
Related to this is the Take Over Reaction Time (TOrt), the time needed to return control of the vehicle to the human driver [91]. This definition has later been used in different works (e.g., [59]) in combination with the lead time from a TOR to a critical event (TORlt) to determine the time it takes drivers to resume control from conditional automation in noncritical scenarios. In line with other works, it was concluded that drivers occupied by a secondary task exhibited larger variance and slower responses to requests to resume control [59].
A further work studied the impact of different TORlt on drivers, with the authors concluding that 7 s are required to regain control of the vehicle [24]. While some authors argued that the TOrt should be between 2 and 3.5 s [59] (see also [92] for an exhaustive analysis of the related literature), other studies such as [64,93] showed that participants needed less than 1 s to take the wheel.
The TORlt must give drivers a sufficient time budget to adapt to the current situation [94], but it must not be too long either, as in some instances a long lead time could confuse drivers due to the lack of an imminent emergency [74,75]. On the other hand, some drivers might check the mirrors or adjust their seating position before taking control of the vehicle [92].
There is no consensus in the literature regarding the exact moment at which the take over occurs (the event used to measure the TOrt, as distinct from the TOrt itself). It can be defined as the moment at which the driver first glances back to the road [24], the time at which the driver begins to brake [57], the moment the driver's hands move towards the steering wheel [95], or the moment the driver actually touches the steering wheel [93]. The TOrt is thus a complex parameter that depends not only on the intrinsic ability of the driver to react to sudden events, but also on the situation in which the TOR is issued.
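To make these competing definitions tangible, the following sketch derives the timing metrics from logged event timestamps, with the take over criterion left as an explicit parameter. The field names and example values are illustrative assumptions:

```python
# Hedged sketch: deriving take-over timing metrics from logged event
# timestamps (in seconds). Which event counts as the "take over" (eyes on
# road, hands on wheel, first brake input) is a design choice, as discussed
# above; field names and values here are illustrative.

from dataclasses import dataclass

@dataclass
class TakeOverLog:
    tor_issued: float        # time the TOR warning is issued
    eyes_on_road: float      # first glance back to the road
    hands_on_wheel: float    # driver touches the steering wheel
    critical_event: float    # predicted time of the hazard / ODD exit

    def tot(self, criterion: str = "hands_on_wheel") -> float:
        """Take Over Time under the chosen take-over criterion."""
        return getattr(self, criterion) - self.tor_issued

    def lead_time(self) -> float:
        """TORlt: time budget between the TOR and the critical event."""
        return self.critical_event - self.tor_issued

log = TakeOverLog(tor_issued=12.0, eyes_on_road=12.8,
                  hands_on_wheel=14.5, critical_event=19.0)
print(log.tot())          # 2.5 s under the hands-on-wheel criterion
print(log.lead_time())    # 7.0 s time budget
```

Note how the reported TOT shrinks if the eyes-on-road criterion is chosen instead, which is one reason the reaction times in the literature are hard to compare.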
As described in this section, driver reaction time depends on the complexity of the situation, which is determined by factors such as the NDRT, as well as on the specific definition of how the time is measured. Thus, the reaction times reported in the literature vary considerably from study to study.
As a consequence, common standard definitions and guidelines are needed to accurately investigate the TOR. To this end, further studies such as those of [49,96] focused on modeling the TOR in order to predict the reaction time of drivers, depending on the characteristics and complexity of each situation.
In order to provide the reader with an overview of the most representative findings regarding the assessment of the take over, Table 3 shows the times to react to a TOR depending on the modality used to convey the message and the type of action that determined that the control handover was successful.

5.2. Take Home Messages and Recommendations

We present here a recapitulation of the main points covered in this section. As described above, complex situations and road conditions play an important role in the design of TOR systems and should therefore be taken into account.
  • The most commonly used metrics to determine the performance of a TOR are driving performance metrics such as longitudinal/lateral control, speed, brake, time to collision, and driver physical state.
  • Currently, there are no developed standards for rating systems that assess TOR situations.
  • The vehicle control transition time from automated to manual driving mode, the so-called handover phase, is critical, as a sufficiently comfortable transition time is necessary to guarantee road safety.
Predictive algorithms to establish transition protocols would allow the driver to take over control of the vehicle while minimizing the vehicle's lateral deviation.
Common standard definitions and guidelines are needed to accurately investigate TOR situations. Additionally, more research is needed to predict the reaction time of drivers as it relates to the characteristics and complexity of each situation.
To guarantee a safe outcome, autonomous vehicles will need to consider road user and passenger safety along with other factors, including the detection of obstacles as well as weather and road conditions [3].
The next sections outline guidelines, standards, and regulations related to automation, summarize the main issues described in this work, and also propose further work.

6. Standards, Guidelines, Policies, and Regulations

Currently, vehicles with high levels of automation are still under development and are not mature enough to be launched on the market. For example, the Autopilot from Tesla [6] is marketed as level 2. A level 3 system, the Vision iNext, is currently being developed by BMW [114]. Mercedes-Benz, in association with Daimler, is also developing a level 3 concept called Drive Motion [115]. Table A1 in Appendix A lists a variety of systems with automation that are currently being developed by the automotive industry.
Audi intended to commercialize vehicles with limited level 3 capabilities, such as the Audi A8 with its traffic jam pilot system [116], but the project was dropped due to the lack of a legal framework to certify level 3 automation features [117]. The goal was to operate within an ODD restricted to traffic jams, in which the automation controlled the vehicle at a limited speed, making it possible for the driver to perform NDRT. Although this system has been tested, most countries' regulations prohibit the use of vehicles with driving automation above level 2 on public roads. Current regulations demand that drivers be attentive to the road at all times, prohibiting the execution of tasks other than driving. An example of these regulations is described in [118] for Austria.
Achieving widespread use of driving automation above level 2 is not only being pursued by the automobile industry, but also by other organizations such as the European Union (EU), which recognizes the societal impact that the new technologies might have regarding improved traffic efficiency and reduced emissions and energy consumption [119].
These technologies have the potential to improve traffic flow and road safety as well as to create new job opportunities, making related industries more competitive in offering new products in a variety of sectors such as transport, convenience stores, fleet, and insurance companies [120]. Therefore, the EU encourages investment in automation innovation and promotes the development of autonomous systems through a variety of research calls. A selection of funded research projects related to automated and connected vehicles has been compiled in Table A2.
Nowadays, governmental organizations such as the National Highway Traffic Safety Administration (NHTSA) in the United States or the Mobility and Transport branch of the European Commission investigate future challenges and issue legislation on driving automation systems with regard to road safety. Original Equipment Manufacturers (OEMs) and academic researchers contribute their know-how and experience to the regulations, which is included and represented in the standards and legal frameworks defined by these regulatory bodies. For example, an extended taxonomy of terms regarding the TOR was compiled in [121], covering its different related scopes, such as legislation, the insurance industry, and technical concepts, for the stakeholders interested in this complex process.
To help practitioners improve systems with several levels of automation, we have compiled referenced information that might be useful when considering the design of TOR. We include a selection of standards, guidelines, and European regulations to support the development process of new automation features in Table A3, extending the list with ISO, SAE, IEEE, and ETSI standards that define the characteristics, limits, terminology, technical aspects, and evaluation procedures of automation systems in Table A4.

7. Research Gaps and Conclusions

After having identified in this work the major achievements in the field and the main challenges and research questions, we found several research gaps whose closure could improve the TOR process. This section discusses open research directions in the TOR field.

7.1. TOR Models

The TOR needs to be modeled as a system that consists of the sub-processes that have been introduced in this paper. Works such as those of [30,34,35,51] show the first steps that are required to develop TOR systems. However, in order to cover all the issues that are related to the TOR, these models need to be extended. Although previous literature has largely formalized the characteristics of the TOR process, validations based on comparisons with naturalistic driving data from field tests using statistical analyses are missing.
Future work should be performed to classify situations in which a TOR is triggered in order to estimate the driver’s ability to regain control of the vehicle. For example, predictive models could be developed to determine the type of NDRT, and based on this establish the type of message to transmit to the driver, its urgency, and the suggested driving maneuver.

7.2. Vehicle-Driver Cooperation

As mentioned in this work, the integration of system information and driver state allows for a smooth transition from the ADS to manual driving. To this end, TOR-specific driver-system cooperation policies are required. Most of the current literature focuses on the factors that influence the driver's response to a TOR, but these studies lack solutions that address shared control policies. The existing works address driver-system cooperation to perform the DDT under the assumption that the driver's attention is on the road [122]. Therefore, it is fundamental to also consider cooperation in situations in which the driver performs NDRT.

7.3. Real World Tests

Due to current regulations, the lack of certified systems for road use, and driver safety concerns, most of the studies mentioned in this paper are based on driving simulations with different levels of realism. The use of simulations represents one of the most critical limitations in studying human factors in autonomous vehicles, since drivers tend to behave differently in real vehicles and less controlled environments. There are studies such as [93] that use a real platform to determine driver reaction time. However, such works are very limited, and a detailed study with a large sample of data needs to be performed to investigate all the different factors that affect the TOR in a real road environment.

7.4. Additional Factors That Require Further Study

To the best of our knowledge, the study of the relationship between TOR performance and factors such as gender, age, health, road and weather conditions, number of passengers, or the driver's previous knowledge of automated functions is still limited and needs more research.
Additionally, more studies should assess the relationship between physiological measures such as heart rate, pupil dilation, or brain waves and driver behavior under TOR conditions. Establishing these relationships would allow the development of models that can predict driver reactions before a fallback. Therefore, there is a growing need for vehicle data sets and driver metrics that could help the research community to train and validate models that consider a TOR.
Additional research gaps that have been identified in this work concern situations in which potential hazards are outside the Field of View of the driver (FOV):
  • Should a TOR be triggered in situations in which potential hazards are outside the driver’s FOV?
  • How should the warnings be conveyed in this case and would these warnings impact driver behavior even without the driver being able to confirm the threat visually?
An additional concern that could be investigated is how to reduce annoyance and maximize safety benefits. For example, could the measurement of stress levels help determine a better moment to trigger the TOR, i.e., produce a TOR at the most convenient moment given the situation?
Finally, we would like to emphasize the fact that, despite the guidelines and standards mentioned in Section 6, there is a lack of specific standards for the TOR. Most of the presented guidelines and recommendations refer to how the TOR should be performed without considering the time to issue and to understand the TOR, the specific road situation, and/or the driver's individual characteristics. Therefore, it is imperative to create standards that are based on the factors that affect the TOR, systematically establishing the requirements that must be met to deliver safe control of the vehicle to the driver. Furthermore, these standards must stipulate quantifiable quality measures for the TOR that must be met by level 3 ADS.

Author Contributions

Conceptualization, W.M.-A., O.S., H.H.T., R.L., and C.O.-M.; methodology, W.M.-A. and C.O.-M.; formal analysis, W.M.-A.; investigation, W.M.-A.; resources, W.M.-A. and C.O.-M.; data curation, W.M.-A.; writing—original draft preparation, W.M.-A. and O.S.; writing—review and editing, R.L., H.H.T., and C.O.-M.; visualization, W.M.-A. and C.O.-M.; supervision, C.O.-M.; project administration, R.L., H.H.T. and C.O.-M.; funding acquisition, C.O.-M. All authors have read and agreed to the published version of the manuscript.

Funding

This work was funded by the Austrian Ministry for Climate Action, Environment, Energy, Mobility, Innovation, and Technology (BMK) Endowed Professorship for Sustainable Transport Logistics 4.0; the Spanish Ministry of Economy, Industry and Competitiveness under the TRA201563708-R and TRA2016-78886-C3-1-R project; open access funding by the Johannes Kepler University Linz.

Conflicts of Interest

The authors declare no conflict of interest.

Appendix A

Table A1. Systems with automation currently being developed by the automotive industry.
Company | System | Level of Automation | Targeted Environment | Remarks
Tesla | Autopilot | level 2 | Highway | Current Autopilot features require active driver supervision and do not make the vehicle autonomous.
Audi | Traffic Jam Assistant | level 3 | Highway & traffic jam | The first series vehicle that not only maintains speed and the distance to the vehicle in front fully automatically, but can also change lanes automatically to overtake. However, so far no Audi A8 has been delivered with the corresponding software enabled, the reason being the lack of a legal basis. After an initial euphoria, even in politics, there are still no regulations on the use of automated driving functions in public road traffic.
BMW | Vision iNext | level 3 | Highway | The vehicle features two modes: a Boost mode, in which the driver retains the traditional controls and drives the vehicle, and an Ease mode, in which the driver can activate the automated driving system and take their eyes off the road to focus on other activities. First system on US roads to utilize conditional automation according to SAE standards.
Mercedes-Benz | Drive Pilot, Intelligent Drive & Highway Pilot | level 3 | Highway | The driver's readiness to take over is continuously monitored. When drivers are requested to take over, system control is maintained until the take over maneuver and the request time are finished and the failure mitigation strategy is triggered. Operation is very limited with respect to geographical areas and working conditions such as weather, traffic, lighting, and road types.
Hyundai Motor Co. & Kia Motors Corp | M.BILLY | level 3 | Highway | Hyundai Mobis has been conducting tests in South Korea, the United States, and Europe since last year. The proposed system allows lane changes and other autonomous driving functions to work without driver intervention. The driver must be available to take control of the vehicle if the need arises.
Hyundai Motor Co. & Kia Motors Corp | M.VISION | level 3 | Highway | In the concept phase; the vehicle is under evaluation. Aptiv and Hyundai founded a joint venture to develop this system.
Yandex with Hyundai | | level 4 | Highway & urban | R&D project resulting from the cooperation between the two companies. Around 100 Hyundai Sonata of the 2020 model year are used in the Moscow area.
Renault Alliance: Nissan | ProPilot 2.0 | level 3 | Highway | The destination is first given via the navigation system. On approaching the motorway, hands-free driving becomes available. The vehicle is kept permanently in the middle of the lane with a defined speed and distance to the vehicle in front. When the vehicle approaches a junction or a slower vehicle, a decision is made based on the time to collision. The driver is then responsible for taking the steering wheel with both hands and confirming the lane change by pressing a button.
Renault Alliance: Renault | Symbioz | level 4 | Highway chauffeur and valet parking | Concept designed to show what an electric, connected, and autonomous vehicle might look like in the future. It was officially presented in 2017. The concepts in the Symbioz have already been tested in a number of prototypes: a Renault Talisman known as 'Road Runner', another Talisman fitted with sensors to match the dimensions of the Symbioz called 'Mad Max', and an Espace called 'Ghost Rider'.
GM | Chevy Bolt | level 4–5 | Highway, urban, semi-urban | The company Cruise was conducting tests with a fleet of approximately 130 self-driving vehicles produced by General Motors (GM). GM is still awaiting approval from the National Highway Traffic Safety Administration to deploy the fleet of vehicles without steering wheels or pedals.
Geely | G-Pilot | level 4 | Valet parking & highway chauffeur | First connected autonomous prototype, close to series production. The vehicle is equipped with an automated valet parking system that will allow cars to self-park and be summoned to a location on demand using V2X and 5G systems. Geely Auto Group collaborated with the Ningbo government to establish an autonomous driving test zone in Hangzhou Bay and transform the district into a smart city.
Volvo | Highway Pilot | level 4 | Highway | With the XC90, Volvo planned to reach full level 4 by 2020, built on the SPA2 (Scalable Product Architecture) together with Zenuity and Veoneer in series production. Volvo announced a partnership with Baidu to develop a level 4 robotaxi service in China.
PSA Group | Mobil Lab | level 3 | Highway & urban | R&D applications for the evaluation of the TOR and highway chauffeur.
Toyota | Chauffeur | level 4 | Highway | Test vehicle developed with the Toyota Research Institute (TRI) using Platform 4 (P4) in a specific "mobility as a service" (MaaS) driving environment. The vehicle will be available for public demonstration in September in Tokyo. The company Tony.ai is a contributor.
Google's Waymo | Waymo One | level 4–5 | Highway, semi-urban & urban | Vehicle prototype used for R&D topics. It is one of the world's longest ongoing driving tests, covering millions of miles on public roads and billions of miles in simulation. Volvo, the Renault Alliance, Land Rover, FCA, Intel, and NVIDIA are partners in this project.
Argo AI | Argo | level 4–5 | Highway, semi-urban & urban | Ford and Volkswagen have co-invested in the autonomous vehicle specialist Argo AI. Argo will integrate all R&D solutions in the MEB platform.
Baidu | Apollo | level 4–5 | Highway, semi-urban & urban | With more than 200 autonomous vehicles equipped with functions to support the fully autonomous vehicle development process, from research to testing, Baidu is developing the world's biggest testing ground for autonomous driving. Huawei collaborates on 5G and V2X.
Amazon | | level 4–5 | Highway, semi-urban & urban | Partnership with Zoox, Aurora, AWS, and Rivian.
Table A2. Selection of European Projects with Focus on Systems with Automation.
Project | Partners | Scope | Duration
HAVEit [56]
- Continental Automotive GmbH
- Efkon AG
- Allemann Technologies Sàrl
- Volkswagen AG
- Stuttgart University
Further partners [56]
Improving traffic safety and efficiency by the development and validation of ADAS, focusing on the interaction between driver and automated vehicles. The project included the direct and indirect monitoring of the driver to measure the level of attention and optimize the system's feedback strategy.
February 2008–July 2011
AdaptIVe [52]
- Volkswagen AG
- Continental Automotive GmbH
- Volvo Technology AB
- Robert Bosch GmbH
- Daimler AG
Further partners [52]
Developing automated driving functions that are able to adapt to the situation and driver status. This project focused on the study of autonomous driving in situations such as close-distance maneuvers on highways and structured and unstructured urban environments, addressing as well the research of driver-system interactions.
January 2014–June 2017
CityMobil2 [123]
- University of Florence
- University of Southampton
- University of Leeds
- National Institute for Research in Computer Science and Automation (INRIA)
- NEC Laboratories Europe GmbH
Further partners [123]
This project aimed to implement Automated Road Transport Systems (ARTS) in European cities to study the long-term socio-economic impact of automation, and consecutively to define and demonstrate the legal and technical frameworks necessary to enable ARTS on the roads.
September 2012–August 2016
SCOUT [124]
- VDI/VDE Innovation + Technik GmbH
- Renault S.A.S
- Centro Ricerche Fiat SCPA (CRF)
- NXP Semiconductors GmbH
- Robert Bosch GmbH
Further partners [124]
Analyzing Intelligent Transport Systems to identify pathways for the development of Connected Automated Driving. This analysis aimed to consider the concerns and perspectives of users and suppliers of AV technologies.
July 2016–June 2018
C-ROADS [125]
- Intercor
- Flanders State of the Art
- Tractebel
- ITS.be
Further partners [125]
Testing and implementing cross-border Cooperative Intelligent Transport Systems services for road users, focusing on data exchange through wireless communication.
September 2016–October 2019
MAVEN [126]
- German Aerospace Center (DLR)
- Dynniq B.V.
- Hyundai Motor Europe Technical Center GmbH
- Czech Technical University in Prague
Further partners [126]
Development of infrastructure-assisted platoon organization for vehicle management at signalized intersections and highways. MAVEN aimed to build a system prototype for testing, and modeling for impact assessment, to contribute to the development of ADAS that includes vulnerable road users.
September 2016–September 2019
CARTRE [127]
- ERTICO—ITS Europe
- BMW Group
- Aptiv PLC
- Tecnalia Research & Innovation
- Delft University of Technology
Further partners [127]
Supporting the creation of policies for EU Member States for the development and deployment of automated road transport.
October 2016–October 2018
AUTO C-ITS [128]
- University of Aveiro
- Mapfre, S.A
- Institute of Systems and Robotics-University of Coimbra
- Anritsu
Further partners [128]
Demonstration of the advantages that Cooperative Intelligent Transport Systems bring to Connected Autonomous Driving by obtaining information from V2X communications. This transmitted information can be analyzed by the vehicle's control system along with the on-board sensory information to drive more safely through the streets.
November 2016–November 2018
TRANSFORMING TRANSPORT [129]
- Indra Sistemas, S.A
- Administrador de Infraestructuras Ferroviarias (ADIF)
- Boeing Research & Technology Europe S.L.U
- Technical University of Madrid
- Renault S.A.S
Further partners [129]
Demonstrating the transformation that Big Data is bringing to ITS and the logistics market. This project addressed important pilot domains for the mobility and logistics sector, such as smart highways, sustainable vehicle fleets, proactive rail infrastructures, ports as intelligent logistics hubs, efficient air transport, multi-modal urban mobility, and dynamic supply chains.
January 2017–August 2019
AUTOPILOT [130]
- ERTICO—ITS Europe
- Akka High Tech
- German Aerospace Center (DLR)
- Centro Ricerche Fiat SCPA (CRF)
Further partners [130]
Using IoT solutions related to autonomous vehicles, road infrastructure, and surroundings to design system architectures for the development of ADS-dedicated vehicles.
January 2017–February 2020
TrustVehicle [37]
- Valeo Vision SAS
- Infineon Technologies Austria AG
- AVL List GmbH
- University of Surrey
Further partners [37]
Aims to advance level 3 automated driving functions in adverse and non-adverse conditions. This project seeks to provide solutions that increase automation reliability and trustworthiness following a driver-centric approach.
June 2017–October 2020
L3 PILOT [131]
- Volkswagen AG
- BMW Group
- University of Genoa
- University of Leeds
- Toyota Motor Corporation
Further partners [131]
Testing the viability of ADS-dedicated driving as a safe and efficient means of transportation on public roads by performing large-scale piloting in a standardized Europe-wide piloting environment with passenger cars equipped with level 3 and 4 functions.
September 2017–August 2021
CLASS [132]
- Barcelona Supercomputing Center
- University of Modena and Reggio Emilia
- IBM Israel—Science and Technology LTD
- ATOS Spain
- Maserati S.p.A.
Further partners [132]
Developing software architecture frameworks to help big data developers distribute data analytics workloads along the compute continuum (from edge to cloud). These frameworks integrate the use of big data in critical real-time systems, providing them with enhanced data analytics capabilities for the implementation of new autonomous control applications.
January 2018–January 2021
SECREDAS [133]
- NXP Semiconductors BV
- Virtual Vehicle Research GmbH
- Transport & Mobility Leuven
- Brno University of Technology
- Indra Sistemas, S.A
Further partners [133]
Developing multi-domain architecture methodologies, reference architectures, and components for automated vehicles, combining security and privacy protection.
May 2018–May 2021
AVENUE [134]
- VIRTUAL VEHICLE Research Center (VIF)
- University of Geneva
- NAVYA
- Siemens AG
- AVL LIST GmbH
Further partners [134]
Designing and carrying out demonstrations of urban transport automation by deploying fleets of autonomous buses in Europe. AVENUE aims to set a new model of public transportation that takes into account the new concept of the Mobility Cloud and to assess public transportation paradigms such as availability, coverage, accessibility, and travel time.
May 2018–May 2022
ENSEMBLE [135]
- Renault Trucks
- Robert Bosch GmbH
- NXP Semiconductors GmbH
- University Paris-Est Marne-la-Vallée
- Vrije Universiteit Brussel (VUB)
Further partners [135]
Demonstrating the benefits of multi-brand truck platooning in Europe to improve fuel economy, traffic safety, and throughput. This project will address the requirements and standardization of different aspects of truck platooning, such as V2I communication, maneuvers, operational conditions, and safety mechanisms.
June 2018–June 2021
5G-MOBIX [136]
- Technical University of Berlin
- Akka Informatique Et Systemes
- Automotive Technology Centre of Galicia
- University of Luxembourg
- SIEMENS industry software and services
Further partners [136]
This project aims to develop and test automated vehicles using 5G technologies across different European environments, traffic conditions, and legal regulations. The aim of 5G-MOBIX is to conceptualize a 5G reference framework considering the life cycle for the design and deployment of CCAM as well as 5G network services.
November 2018–October 2021
HEADSTART [137]
- Idiada Automotive Technology S.A.
- Valeo
- Toyota Motor Corporation
- ERTICO—ITS Europe
- Virtual Vehicle Research GmbH
Further partners [137]
Defining testing and validation procedures for Connected and Automated Driving functions such as communications, cyber-security, and positioning. These tests will be performed both in simulations and in real environments to validate the reliability of Autonomous Driving.
January 2019–January 2022
NEW CONTROL [138]
- Infineon Technologies AG
- BMW
- Technical University of Munich
- University Carlos III of Madrid
- Virtual Vehicle Research GmbH
Further partners [138]
Developing holistic virtualized platforms for perception, decision, and control related to ADS-dedicated driving, to enable mobility as a service for the next generation of highly automated vehicles.
April 2019–April 2022
TRUSTONOMY [139]
- algoWatt S.p.A.
- Softeco Sismat S.r.l
- University of Leeds
- University Gustave Eiffel
- Intrasoft International S.A.
Further partners [139]
Increasing trust, safety, and acceptance of automated vehicles by addressing technical problems such as driver monitoring and the TOR, as well as non-technical problems such as the ethical implications of automated decision-making processes.
May 2019–May 2022
Drive2TheFuture [140]
- Centre for Research & Technology, Hellas
- Technical University of Munich
- Technical University of Berlin
- Fraunhofer Institute for Industrial Engineering
- Institut Vedecom
Further partners [140]
Preparing vehicle users to accept and use connected automated modes of transport and giving industry a path to develop autonomous technologies adapted to users' needs. This project will model the behavior of different automated vehicle drivers, predict acceptance for several automated driving scenarios, and develop specialized training tools and optimized HMIs for driver-vehicle handovers.
May 2019–May 2022
SUaaVE [141]
- Instituto de Biomecanica de Valencia
- Idiada Automotive Technology S.A.
- Technical University of Munich
- Institut Vedecom
- Centro Ricerche Fiat SCPA (CRF)
Further partners [141]
Developing the Automation Level Four+ Reliable Empathic Driver system (ALFRED). ALFRED will be a behavior layer that understands the emotions of the passenger on board and adapts the vehicle features to enhance the user experience.
May 2019–May 2022
PAsCAL [142]
- Luxembourg Institute of Science and Technology
- University of Liverpool
- University of Leeds
- LuxMobility
- Oply Mobility S.A.
Further partners [142]
Developing a set of recommendations and guidelines to understand public awareness of connected autonomous vehicles, measure the degree of acceptance of European citizens towards AVs, provide knowledge of how to integrate citizens' needs and interests when moving to higher levels of automation, and enable the education of future AV drivers, passengers, and those who will share the road.
June 2019–June 2022
HADRIAN [143]
- Virtual Vehicle Research GmbH
- Technical University of Delft
- Tecnalia Research & Innovation
- University of Granada
- University of Surrey
Further partners [143]
Studying the role of drivers within automated driving systems by developing a driving system solution focused on HMIs that take driver and environmental conditions into account. December 2019–May 2023
SHOW [144]
- International Association of Public Transport
- German Aerospace Center (DLR)
- Robert Bosch GmbH
- Siemens AG Austria
- e.GO MOOVE GmbH
Further partners [144]
Analyzing the role of autonomous vehicles in making urban transport more efficient by deploying shared, connected, and cooperative fleets of autonomous vehicles in coordinated public transport, demand-responsive transport, mobility as a service, and logistics as a service. January 2020–January 2024
Table A3. Most relevant standards, guidelines and regulations for the design and implementation of automated vehicle features in Intelligent Transportation Systems (ITS) and factors that relate to TOR (adapted and extended from [15]).
(a)
Targeted ITS Aspect | Targeted System | Relevant Standards
Driver Monitoring and Design of In-Vehicle Systems | Specifics for elliptical models in three dimensions to represent the location of the driver’s eyes and determine the field of view | ISO 4513 (2003)—“Road vehicles—Visibility. Method for establishment of eyellipses for driver’s eye location” [145]
SAE J1050—“Describing and Measuring the Driver’s Field of View” [146]
SAE J941—“Motor Vehicle Drivers’ Eye Locations” [147]
Warning messages and signals—to clearly perceive and differentiate alarms, warnings, and information signals while taking into account different degrees of urgency and combining warning modalities | ISO 11429:1996 “Ergonomics—System of auditory and visual danger and information signals” [148]
ISO/TR 12204:2012 “Road Vehicles—Ergonomic aspects of transport information and control systems—Introduction to integrating safety-critical and time-critical warning signals” [149]
ISO/TR 16352:2005 “Road vehicles—Ergonomic aspects of in-vehicle presentation for transport information and control systems—Warning systems” [150]
Human-centered design principles and activities for computer-based interactive systems | ISO 9241-210:2010 “Ergonomics of human–system interaction—Human-centred design for interactive systems” [151]
Driver’s visual behavior—assessment of the impact of human–machine interaction | ISO 15007-1:2014 “Road vehicles—Measurement of driver visual behavior with respect to transport information and control systems—Part 1: Definitions and parameters” [152]
ISO 15007-2:2014 “Road vehicles—Measurement of driver visual behavior with respect to transport information and control systems—Part 2: Equipment and procedures” [153]
In-vehicle displays (e.g., image quality, legibility of characters, color recognition) and procedures for determining the priority of on-board messages presented to drivers | ISO 15008:2017—“Road vehicles—Ergonomic aspects of transport information and control systems—Specifications and compliance procedures for in-vehicle visual presentation” [154]
ISO 15008:2009 “Road vehicles—Ergonomic aspects of transport information and control systems—Specifications and compliance procedures for in-vehicle visual presentation” [155]
ISO/TS 16951:2004 “Road Vehicles—Ergonomic aspects of transport information and control systems—Procedures for determining priority of on-board messages presented to drivers” [156]
Suitability of transport information and control systems (TICS) for use while driving | ISO 17287:2003 “Road vehicles—Ergonomic aspects of transport information and control systems—Procedure for assessing suitability for use while driving” [157]
Aptiv, Audi, Baidu, BMW, Continental, Daimler, FCA US LLC, HERE, Infineon, Intel, and Volkswagen | Framework for the development, testing, and validation | “Safety First for Automated Driving” (SaFAD) [158]
Verband der Automobilindustrie (VDA) | “Standardization Roadmap for Automated Driving” [159]
European Data Protection Board (EDPB) | Privacy terms of shared data in wireless vehicular networks | “Guidelines 1/2020 on processing personal data in the context of connected vehicles and mobility related applications” [160]
UN/ECE—Functional Requirements for Automated and Autonomous Vehicles (FRAV) | Adaptation and harmonization of regulation for OEMs and national legislation | “Guidelines on the Exemption Procedure for the EU Approval of Automated Vehicles” [161]
(b)
EC Regulation Framework—Law Reference | Title
2008/653/EC | “Commission Recommendation on safe and efficient in-vehicle information and communication systems: update of the European Statement of Principles on human–machine interface” [162]
COM/2006/0059 | Communication from the Commission to the Council, the European Parliament, the European Economic and Social Committee and the Committee of the Regions on the Intelligent Car Initiative—“Raising Awareness of ICT for Smarter, Safer and Cleaner Vehicles” [163]
COM/2019/464 | Implementation of Directive 2010/40/EU of the European Parliament and of the Council of 7 July 2010 on the framework for the deployment of Intelligent Transport Systems in the field of road transport and for interfaces with other modes of transport [164]
C/2019/5177 | Commission Implementing Regulation (EU) 2019/1213 of 12 July 2019 laying down detailed provisions ensuring uniform conditions for the implementation of interoperability and compatibility of on-board weighing equipment pursuant to Council Directive 96/53/EC [165]
2019/C 162/01 | European Parliament resolution of 13 March 2018 on a European strategy on Cooperative Intelligent Transport Systems (2017/2067(INI)) [166]
Table A4. Additional ITS standards relevant in vehicle automation.
Targeted ITS Aspect | Targeted System | Relevant Standards
Overall Safety and Trust | Main concepts of driving automation | SAE J3016—“Taxonomy and Definitions for Terms Related to Driving Automation Systems for On-Road Motor Vehicles” [5]
ETSI TS 103 300-2—“Vulnerable Road Users (VRU) awareness” [167]
Functional safety of vehicle electronic components | ISO 26262:2018—“Road vehicles—Functional safety” [168]
ISO/PAS 21448:2019—“Road Vehicles—Safety of the Intended Functionality (SOTIF)” [169]
Trust assessment of in-vehicle safety systems | UL 4600—“Standard for Safety for the Evaluation of Autonomous Products” [170]
ISO 16673:2017—“Road vehicles—Ergonomic aspects of transport information and control systems—Occlusion method to assess visual demand due to the use of in-vehicle systems” [171]
IEEE P7011—“Standard for the Process of Identifying and Rating the Trustworthiness of News Sources” [172]
IEEE P7009—“Standard for Fail-Safe Design of Autonomous and Semi-Autonomous Systems” [173]
V2X Communications and Security | Communication technologies for intelligent transport systems | SAE J2735—“Dedicated Short Range Communications (DSRC) Message Set Dictionary” [174]
ETSI EN 302 663 V1.3.1—“ITS-G5 Access layer specification for Intelligent Transport Systems operating in the 5 GHz frequency band” [175]
Management of secure access to wireless network services, and identification and avoidance of threats | IEEE 1609—“IEEE Wireless Access in Vehicular Environments (WAVE)” [176]
ISO/SAE DIS 21434—“Road vehicles—Cybersecurity engineering” [177]
ETSI TS 102 731: ITS; Security; Security Services and Architecture [178]
ETSI TR 102 893; ITS; Security, Threat, Vulnerability and Risk Analysis [179]
ETSI TS 102 940: ITS; Security; ITS communications security architecture & security management [180]
ETSI TS 102 941: ITS; Security; Trust and Privacy Management [181]
ETSI TS 102 942: ITS; Security; Access control [182]
ETSI TS 102 943: ITS; Security; Confidentiality services [183]
ETSI TS 103 097: ITS; Security; Security header and certificate formats [183]
Data Privacy and Ethics | Transversal standards for the development and management of intelligent transport systems information systems | IEEE P7001—“Transparency of Autonomous Systems” [184]
IEEE P7003—“Algorithmic Bias Considerations” [185]
IEEE P7007—“Ontological Standard for Ethically driven Robotics and Automation Systems” [186]
IEEE P7008—“Standard for Ethically Driven Nudging for Robotic, Intelligent and Autonomous Systems” [187]
IEEE P7010—“Wellbeing Metrics Standard for Ethical Artificial Intelligence and Autonomous Systems” [188]
IEEE P1228—“Standard for Software Safety” [189]
IEEE P2846—“A Formal Model for Safety Considerations in Automated Vehicle Decision Making” [190]
ISO 24100:2010—“Intelligent transport systems—Basic principles for personal data protection in probe vehicle information services” [191]
IEEE P7002—“Data Privacy Process” [192]
IEEE P7006—“Standard on Personal Data AI Agent Working Group” [193]
IEEE P7012—“Standard for Machine Readable Personal Privacy Terms” [194]

References

  1. Hussein, A.; Garcia, F.; Olaverri-Monreal, C. ROS and Unity Based Framework for Intelligent Vehicles Control and Simulation. In Proceedings of the 2018 IEEE International Conference on Vehicular Electronics and Safety (ICVES), Madrid, Spain, 12–14 September 2018; pp. 1–6. [Google Scholar]
  2. Wöber, W.; Novotny, G.; Mehnen, L.; Olaverri-Monreal, C. Autonomous Vehicles: Vehicle Parameter Estimation Using Variational Bayes and Kinematics. Appl. Sci. 2020, 10, 6317. [Google Scholar] [CrossRef]
  3. Olaverri-Monreal, C. Promoting trust in self-driving vehicles. Nat. Electron. 2020, 3, 292–294. [Google Scholar] [CrossRef]
  4. Olaverri-Monreal, C.; Jizba, T. Human factors in the design of human–machine interaction: An overview emphasizing V2X communication. IEEE Trans. Intell. Veh. 2016, 1, 302–313. [Google Scholar] [CrossRef]
  5. SAE on-Road Automated Driving Committee. SAE J3016. Taxonomy and Definitions for Terms Related to Driving Automation Systems for on-Road Motor Vehicles; Technical Report; SAE International: Warrendale, PA, USA, 2016. [Google Scholar]
  6. Tesla, Inc. Autopilot. Available online: https://www.tesla.com/autopilot (accessed on 25 June 2020).
  7. Endsley, M.R. Toward a Theory of Situation Awareness in Dynamic Systems. Hum. Factors J. Hum. Factors Ergon. Soc. 1995, 37, 32–64. [Google Scholar] [CrossRef]
  8. Goncalves, J.; Olaverri-Monreal, C.; Bengler, K. Driver Capability Monitoring in Highly Automated Driving: From State to Capability Monitoring. In Proceedings of the IEEE Conference on Intelligent Transportation Systems, Las Palmas, Spain, 15–18 September 2015; Volume 2015, pp. 2329–2334. [Google Scholar] [CrossRef]
  9. Self-Driving Uber Kills Arizona Woman in First Fatal Crash Involving Pedestrian—Uber—The Guardian. Available online: https://www.theguardian.com/technology/2018/mar/19/uber-self-driving-car-kills-woman-arizona-tempe (accessed on 18 October 2020).
  10. Paresh, D. Google Ditched Autopilot Driving Feature after Test User Napped Behind Wheel. Available online: https://www.reuters.com/article/us-alphabet-autos-self-driving/google-ditched-autopilot-driving-feature-after-test-user-napped-behind-wheel-idUSKBN1D00MD?il=0 (accessed on 18 October 2020).
  11. Ford Mobility. Autonomous Vehicles. Available online: https://www.ford-mobility.eu/autonomous-vehicles (accessed on 18 October 2020).
  12. Toyota to Offer Rides in SAE Level-4 Automated Vehicles on Public Roads in Japan Next Summer—Corporate—Global Newsroom—Toyota Motor Corporation Official Global Website. Available online: https://global.toyota/en/newsroom/corporate/30344967.html (accessed on 25 June 2020).
  13. Clark, J.R.; Stanton, N.A.; Revell, K.M. Automated Vehicle Handover Interface Design: Focus Groups with Learner, Intermediate and Advanced Drivers. Automot. Innov. 2020, 3, 14–29. [Google Scholar] [CrossRef] [Green Version]
  14. Olaverri-Monreal, C. Autonomous vehicles and smart mobility related technologies. Infocommun. J. 2016, 8, 17–24. [Google Scholar]
  15. Olaverri-Monreal, C. Road safety: Human factors aspects of intelligent vehicle technologies. In Smart Cities, Green Technologies, and Intelligent Transport Systems; Springer: Berlin/Heidelberg, Germany, 2017; pp. 318–332. [Google Scholar]
  16. Allamehzadeh, A.; Olaverri-Monreal, C. Automatic and manual driving paradigms: Cost-efficient mobile application for the assessment of driver inattentiveness and detection of road conditions. In Proceedings of the 2016 IEEE Intelligent Vehicles Symposium (IV), Gothenburg, Sweden, 19–22 June 2016; pp. 26–31. [Google Scholar]
  17. Strayer, D.; Cooper, J.; Siegel, L. Up to 27 s of Inattention after Talking to Your Car or Smartphone: Distraction Rated ‘High’ for Most Devices While Driving; The University of Utah: Salt Lake City, UT, USA, 2015; Available online: http://unews.utah.edu/up-to-27-seconds-of-inattention-after-talking-to-your-car-or-smart-phone/ (accessed on 25 October 2020).
  18. Winzer, O.M.; Conti, A.S.; Olaverri-Monreal, C.; Bengler, K. Modifications of driver attention post-distraction: A detection response task study. In Proceedings of the International Conference on HCI in Business, Government, and Organizations, Vancouver, BC, Canada, 9–14 July 2017; pp. 400–410. [Google Scholar]
  19. Sasangohar, F.; Cummings, M. Human-System Interface Complexity and Opacity Part II: Methods and Tools to Assess HSI Complexity; HAL2010-03 Rapport; Human Automation Laboratory, Massachusetts Institute of Technology: Cambridge, MA, USA, 2010. [Google Scholar]
  20. Schlindwein, S.L.; Ison, R. Human knowing and perceived complexity: Implications for systems practice. Emerg. Complex. Organ. 2004, 6, 27–32. [Google Scholar]
  21. Cummings, M.; Sasangohar, F.; Thornburg, K.M.; Xing, J.; D’Agostino, A. Human-System Interface Complexity and Opacity Part I: Literature Review; HAL2010-03 Rapport; Human Automation Laboratory, Massachusetts Institute of Technology: Cambridge, MA, USA, 2010. [Google Scholar]
  22. Radlmayr, J.; Gold, C.; Lorenz, L.; Farid, M.; Bengler, K. How Traffic Situations and Non-Driving Related Tasks Affect the Take-Over Quality in Highly Automated Driving. Proc. Hum. Factors Ergon. Soc. Annu. Meet. 2014, 58, 2063–2067. [Google Scholar] [CrossRef] [Green Version]
  23. Gold, C.; Körber, M.; Lechner, D.; Bengler, K. Taking Over Control From Highly Automated Vehicles in Complex Traffic Situations. Hum. Factors J. Hum. Factors Ergon. Soc. 2016, 58, 642–652. [Google Scholar] [CrossRef]
  24. Gold, C.; Damböck, D.; Lorenz, L.; Bengler, K. “Take over!” How long does it take to get the driver back into the loop? Proc. Hum. Factors Ergon. Soc. Annu. Meet. 2013, 57, 1938–1942. [Google Scholar] [CrossRef] [Green Version]
  25. Martens, M.H.; Compte, S.L.; Kaptein, N.A. The Effects of Road Design on Speed Behaviour: A Literature Review; Deliverable D1 (Report 2.3.1), Managing Speed on European Roads (MASTER) Project; VTT: Espoo, Finland, 1997. [Google Scholar]
  26. Naujoks, F.; Mai, C.; Neukum, A. The effect of urgency of take-over requests during highly automated driving under distraction conditions. In Proceedings of the 5th International Conference on Applied Human Factors and Ergonomics AHFE, Krakow, Poland, 19–23 July 2014. [Google Scholar]
  27. Brandenburg, S.; Chuang, L. Take-over requests during highly automated driving: How should they be presented and under what conditions? Transp. Res. Part F Traffic Psychol. Behav. 2019, 66, 214–225. [Google Scholar] [CrossRef]
  28. Borojeni, S.S.; Boll, S.C.; Heuten, W.; Bülthoff, H.H.; Chuang, L. Feel the movement: Real motion influences responses to Take-over requests in highly automated vehicles. In Conference on Human Factors in Computing Systems—Proceedings; Association for Computing Machinery: New York, NY, USA, 2018; Volume 2018. [Google Scholar] [CrossRef]
  29. Russell, H.E.; Harbott, L.K.; Nisky, I.; Pan, S.; Okamura, A.M.; Gerdes, J.C. Motor learning affects Car-To-Driver handover in automated vehicles. Sci. Robot. 2016, 1. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  30. Bahram, M.; Aeberhard, M.; Wollherr, D. Please take over! An analysis and strategy for a driver take over request during autonomous driving. In Proceedings of the IEEE Intelligent Vehicles Symposium, Seoul, Korea, 28 June–1 July 2015; Volume 2015, pp. 913–919. [Google Scholar] [CrossRef]
  31. Mulder, M.; Abbink, D.A.; Boer, E.R. The effect of haptic guidance on curve negotiation behavior of young, experienced drivers. In Proceedings of the IEEE International Conference on Systems, Man and Cybernetics, Singapore, 12–15 October 2008; pp. 804–809. [Google Scholar] [CrossRef]
  32. Steele, M.; Gillespie, R.B. Shared Control between Human and Machine: Using a Haptic Steering Wheel to Aid in Land Vehicle Guidance. Proc. Hum. Factors Ergon. Soc. Annu. Meet. 2001, 45, 1671–1675. [Google Scholar] [CrossRef] [Green Version]
  33. Mulder, M.; Abbink, D.A. Correct and faulty driver support from shared haptic control during evasive maneuvers. In Proceedings of the 2011 IEEE International Conference on Systems, Man, and Cybernetics, Anchorage, AK, USA, 9–12 October 2011; pp. 1057–1062. [Google Scholar]
  34. Lv, C.; Wang, H.; Cao, D.; Zhao, Y.; Sullman, M.; Auger, D.J.; Brighton, J.; Matthias, R.; Skrypchuk, L.; Mouzakitis, A. A Novel Control Framework of Haptic Take-Over System for Automated Vehicles. In Proceedings of the IEEE Intelligent Vehicles Symposium, Changshu, China, 26–30 June 2018; Volume 2018, pp. 1596–1601. [Google Scholar] [CrossRef]
  35. Li, Y.; Lv, C.; Xue, J. A novel predictive haptic control interface for automation-to-human takeover of automated vehicles. In Proceedings of the IEEE Intelligent Vehicles Symposium, Paris, France, 9–12 June 2019; Volume 2019, pp. 994–999. [Google Scholar] [CrossRef]
  36. Yoon, S.H.; Kim, Y.W.; Ji, Y.G. The effects of takeover request modalities on highly automated car control transitions. Accid. Anal. Prev. 2019, 123, 150–158. [Google Scholar] [CrossRef] [PubMed]
  37. Improved Trustworthiness and Weather-Independence of Conditionally Automated Vehicles in Mixed Traffic Scenarios—TrustVehicle Project. Available online: http://www.trustvehicle.eu/wp-content/uploads/2019/01/TrustVehicle-D4.1publishable-summary.pdf/ (accessed on 3 July 2020).
  38. Capalar, J.; Olaverri-Monreal, C. Hypovigilance in limited self-driving automation: Peripheral visual stimulus for a balanced level of automation and cognitive workload. In Proceedings of the IEEE Conference on Intelligent Transportation Systems ITSC, Yokohama, Japan, 16–19 October 2017; pp. 27–31. [Google Scholar] [CrossRef]
  39. Hayashi, H.; Kamezaki, M.; Manawadu, U.E.; Kawano, T.; Ema, T.; Tomita, T.; Catherine, L.; Sugano, S. A driver situational awareness estimation system based on standard glance model for unscheduled takeover situations. In Proceedings of the IEEE Intelligent Vehicles Symposium, Paris, France, 9–12 June 2019; Volume 2019, pp. 798–803. [Google Scholar] [CrossRef]
  40. Li, X.; Schroeter, R.; Rakotonirainy, A.; Kuo, J.; Lenné, M.G. Effects of different non-driving-related-task display modes on drivers’ eye-movement patterns during take-over in an automated vehicle. Transp. Res. Part F Traffic Psychol. Behav. 2020, 70, 135–148. [Google Scholar] [CrossRef]
  41. Olaverri-Monreal, C.; Hasan, A.E.; Bulut, J.; Körber, M.; Bengler, K. Impact of in-vehicle displays location preferences on drivers’ performance and gaze. IEEE Trans. Intell. Transp. Syst. 2014, 15, 1770–1780. [Google Scholar] [CrossRef]
  42. Naujoks, F.; Wiedemann, K.; Schömig, N.; Jarosch, O.; Gold, C. Expert-based controllability assessment of control transitions from automated to manual driving. MethodsX 2018, 5, 579–592. [Google Scholar] [CrossRef]
  43. Forster, Y.; Naujoks, F.; Neukum, A.; Huestegge, L. Driver compliance to take-over requests with different auditory outputs in conditional automation. Accid. Anal. Prev. 2017, 109, 18–28. [Google Scholar] [CrossRef]
  44. Petermeijer, S.M.; De Winter, J.C.; Bengler, K.J. Vibrotactile Displays: A Survey with a View on Highly Automated Driving. IEEE Trans. Intell. Transp. Syst. 2016, 17, 897–907. [Google Scholar] [CrossRef] [Green Version]
  45. Politis, I.; Brewster, S.; Pollick, F. Language-based multimodal displays for the handover of control in autonomous cars. In Proceedings of the 7th International Conference on Automotive User Interfaces and Interactive Vehicular Applications—AutomotiveUI’15, Nottingham, UK, 1–3 September 2015; Association for Computing Machinery (ACM): New York, NY, USA, 2015; pp. 3–10. [Google Scholar] [CrossRef] [Green Version]
  46. Hergeth, S.; Lorenz, L.; Krems, J.F. Prior Familiarization With Takeover Requests Affects Drivers’ Takeover Performance and Automation Trust. Hum. Factors 2017, 59, 457–470. [Google Scholar] [CrossRef]
  47. Hoff, K.A.; Bashir, M. Trust in automation: Integrating empirical evidence on factors that influence trust. Hum. Factors 2015, 57, 407–434. [Google Scholar] [CrossRef]
  48. Zeeb, K.; Buchner, A.; Schrauf, M. Is take-over time all that matters? the impact of visual-cognitive load on driver take-over quality after conditionally automated driving. Accid. Anal. Prev. 2016, 92, 230–239. [Google Scholar] [CrossRef]
  49. Zeeb, K.; Buchner, A.; Schrauf, M. What determines the take-over time? An integrated model approach of driver take-over after automated driving. Accid. Anal. Prev. 2015, 78, 212–221. [Google Scholar] [CrossRef] [PubMed]
  50. Janssen, C.P.; Iqbal, S.T.; Kun, A.L.; Donker, S.F. Interrupted by my car? Implications of interruption and interleaving research for automated vehicles. Int. J. Hum. Comput. Stud. 2019, 130, 221–233. [Google Scholar] [CrossRef]
  51. Venkita, S.R.; Willemsen, D.; Alirezaei, M.; Nijmeijer, H. Switching from autopilot to the driver: A transient performance analysis. Proc. Inst. Mech. Eng. Part D J. Automob. Eng. 2020, 234, 1346–1360. [Google Scholar] [CrossRef]
  52. Automated Driving Applications and Technologies for Intelligent Vehicles—AdaptIVe FP7 Project—Automated Driving Applications and Technologies for Intelligent Vehicles. Available online: http://www.adaptive-ip.eu/ (accessed on 3 July 2020).
  53. Kelsch, J. Arbitration between Driver and Automation: Why overriding is just the tip of the iceberg. In Proceedings of the InteractIVe Summer School, Corfu Island, Greece, 4–6 July 2012. [Google Scholar]
  54. Allamehzadeh, A.; De La Parra, J.U.; Hussein, A.; Garcia, F.; Olaverri-Monreal, C. Cost-efficient driver state and road conditions monitoring system for conditional automation. In Proceedings of the IEEE Intelligent Vehicles Symposium, Los Angeles, CA, USA, 11–14 June 2017; pp. 1497–1502. [Google Scholar] [CrossRef]
  55. Olaverri-Monreal, C.; Kumar, S.; Díaz-Álvarez, A. Automated Driving: Interactive Automation Control System to Enhance Situational Awareness in Conditional Automation. In Proceedings of the IEEE Intelligent Vehicles Symposium, Changshu, China, 26–30 June 2018; Volume 2018, pp. 1698–1703. [Google Scholar] [CrossRef]
  56. European Commission. Highly Automated Vehicles for Intelligent Transport—Final Report. Available online: https://trimis.ec.europa.eu/project/highly-automated-vehicles-intelligent-transport#tab-docs (accessed on 3 July 2020).
  57. Gold, C.; Lorenz, L.; Bengler, K. Influence of Automated Brake Application on Take-Over Situations in Highly Automated Driving Scenarios. In Proceedings of the FISITA 2014 World Automotive Congress KIVI, Maastricht, The Netherlands, 2–6 June 2014. [Google Scholar]
  58. Petersen, L.; Robert, L.; Yang, J.; Tilbury, D. Situational Awareness, Driver’s Trust in Automated Driving Systems and Secondary Task Performance. SSRN Electron. J. 2019. [Google Scholar] [CrossRef] [Green Version]
  59. Eriksson, A.; Stanton, N.A. Takeover Time in Highly Automated Vehicles: Noncritical Transitions to and from Manual Control. Hum. Factors 2017, 59, 689–705. [Google Scholar] [CrossRef] [PubMed]
  60. Bazilinskyy, P.; Petermeijer, S.M.; Petrovych, V.; Dodou, D.; de Winter, J.C. Take-over requests in highly automated driving: A crowdsourcing survey on auditory, vibrotactile, and visual displays. Transp. Res. Part F Traffic Psychol. Behav. 2018, 56, 82–98. [Google Scholar] [CrossRef]
  61. Lee, J.D.; McGehee, D.V.; Brown, T.L.; Marshall, D. Effects of Adaptive Cruise Control and Alert Modality on Driver Performance. Transp. Res. Rec. J. Transp. Res. Board 2006, 1980, 49–56. [Google Scholar] [CrossRef]
  62. Naujoks, F.; Forster, Y.; Wiedemann, K.; Neukum, A. A human–machine interface for cooperative highly automated driving. In Advances in Intelligent Systems and Computing; Springer: Berlin/Heidelberg, Germany, 2017; Volume 484, pp. 585–595. [Google Scholar] [CrossRef]
  63. Kim, J.W.; Yang, J.H. Understanding metrics of vehicle control take-over requests in simulated automated vehicles. Int. J. Automot. Technol. 2020, 21, 757–770. [Google Scholar] [CrossRef]
  64. Scott, J.J.; Gray, R. A comparison of tactile, visual, and auditory warnings for rear-end collision prevention in simulated driving. Hum. Factors 2008, 50, 264–275. [Google Scholar] [CrossRef] [PubMed]
  65. Kelsch, J.; Wilbrink, M. Joint driver-automation system design: Gradual action-oriented ambient stimuli. In Proceedings of the International Conference on Applied Human Factors and Ergonomics AHFE 2015, Las Vegas, NV, USA, 26–30 July 2015. [Google Scholar] [CrossRef]
  66. Dettmann, A.; Bullinger, A.C. Spatially distributed visual, auditory and multimodal warning signals—A comparison. In Proceedings of the Human Factors and Ergonomics Society Europe, HFES Europe Chapter 2017, Rome, Italy, 28–30 September 2017; pp. 185–199. [Google Scholar]
  67. Pfromm, M.; Cieler, S.; Bruder, R. Driver assistance via optical information with spatial reference. In Proceedings of the IEEE Conference on Intelligent Transportation Systems ITSC, The Hague, The Netherlands, 6–9 October 2013; pp. 2006–2011. [Google Scholar] [CrossRef]
  68. Meschtscherjakov, A.; Döttlinger, C.; Rödel, C.; Tscheligi, M. ChaseLight: Ambient LED stripes to control driving speed. In Proceedings of the 7th International Conference on Automotive User Interfaces and Interactive Vehicular Applications—AutomotiveUI’15, Nottingham, UK, 1–3 September 2015; Association for Computing Machinery (ACM): New York, NY, USA, 2015; pp. 212–219. [Google Scholar] [CrossRef]
  69. Borojeni, S.S.; Chuang, L.; Heuten, W.; Boll, S. Assisting drivers with ambient take-over requests in highly automated driving. In AutomotiveUI 2016—8th International Conference on Automotive User Interfaces and Interactive Vehicular Applications, Proceedings; Association for Computing Machinery, Inc.: New York, NY, USA, 2016; pp. 237–244. [Google Scholar] [CrossRef] [Green Version]
  70. Löcken, A.; Heuten, W.; Boll, S. Supporting lane change decisions with ambient light. In Proceedings of the 7th International Conference on Automotive User Interfaces and Interactive Vehicular Applications—AutomotiveUI’15, Nottingham, UK, 1–3 September 2015; Association for Computing Machinery (ACM): New York, NY, USA, 2015; pp. 204–211. [Google Scholar] [CrossRef]
  71. van den Beukel, A.P.; van der Voort, M.C. Design Considerations on User-Interaction for Semi-Automated Driving. In Proceedings of the FISITA World Automotive Congress, Maastricht, The Netherlands, 2–6 June 2014. [Google Scholar]
  72. Scharfe, M.S.L.; Zeeb, K.; Russwinkel, N. The Impact of Situational Complexity and Familiarity on Takeover Quality in Uncritical Highly Automated Driving Scenarios. Information 2020, 11, 115. [Google Scholar] [CrossRef] [Green Version]
  73. Mohebbi, R.; Gray, R.; Tan, H.Z. Driver Reaction Time to Tactile and Auditory Rear-End Collision Warnings While Talking on a Cell Phone. Hum. Factors J. Hum. Factors Ergon. Soc. 2009, 51, 102–110. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  74. Mok, B.; Johns, M.; Lee, K.J.; Ive, H.P.; Miller, D.; Ju, W. Timing of unstructured transitions of control in automated driving. In Proceedings of the IEEE Intelligent Vehicles Symposium, Seoul, Korea, 28 June–1 July 2015; Volume 2015, pp. 1167–1172. [Google Scholar] [CrossRef]
  75. Mok, B.; Johns, M.; Lee, K.J.; Miller, D.; Sirkin, D.; Ive, P.; Ju, W. Emergency, Automation Off: Unstructured Transition Timing for Distracted Drivers of Automated Vehicles. In Proceedings of the IEEE Conference on Intelligent Transportation Systems ITSC, Las Palmas, Spain, 15–18 September 2015; Volume 2015, pp. 2458–2464. [Google Scholar] [CrossRef]
  76. Sasse, M.A.; Johnson, C.; Johnson, C.W. Human-Computer Interaction. In Proceedings of the INTERACT’99: IFIP TC. 13 International Conference on Human-Computer Interaction, Edinburgh, UK, 30 August–3 September 1999. [Google Scholar]
  77. Schwalk, M.; Kalogerakis, N.; Maier, T. Driver Support by a Vibrotactile Seat Matrix—Recognition, Adequacy and Workload of Tactile Patterns in Take-over Scenarios During Automated Driving. Procedia Manuf. 2015, 3, 2466–2473. [Google Scholar] [CrossRef] [Green Version]
  78. Ho, C.; Tan, H.Z.; Spence, C. Using spatial vibrotactile cues to direct visual attention in driving scenes. Transp. Res. Part F Traffic Psychol. Behav. 2005, 8, 397–412. [Google Scholar] [CrossRef]
  79. Calhoun, G.L.; Draper, M.H.; Ruff, H.A.; Fontejon, J.V. Utility of a Tactile Display for Cueing Faults. Proc. Hum. Factors Ergon. Soc. Annu. Meet. 2002, 46, 2144–2148. [Google Scholar] [CrossRef]
  80. Fitch, G.M.; Hankey, J.M.; Kleiner, B.M.; Dingus, T.A. Driver comprehension of multiple haptic seat alerts intended for use in an integrated collision avoidance system. Transp. Res. Part F Traffic Psychol. Behav. 2011, 14, 278–290. [Google Scholar] [CrossRef]
  81. Borojeni, S.S.; Wallbaum, T.; Heuten, W.; Boll, S. Comparing Shape-Changing and Vibro-Tactile Steering Wheels for Take-Over Requests in Highly Automated Driving. In AutomotiveUI’17: 9th International Conference on Automotive User Interfaces and Interactive Vehicular Applications; Association for Computing Machinery: New York, NY, USA, 2017; pp. 221–225. [Google Scholar] [CrossRef]
  82. Scheiner, J. Veoneer Verkauft Seine Japanischen und Chinesischen Beteiligungen. Available online: https://www.automobil-industrie.vogel.de/veoneer-verkauft-seine-japanischen-und-chinesischen-beteiligungen-a-902861/ (accessed on 25 June 2020).
  83. Petermeijer, S.M.; Hornberger, P.; Ganotis, I.; de Winter, J.C.; Bengler, K.J. The design of a vibrotactile seat for conveying take-over requests in automated driving. In Advances in Intelligent Systems and Computing; Springer: Berlin/Heidelberg, Germany, 2018; Volume 597, pp. 618–630. [Google Scholar] [CrossRef]
  84. Lee, J.D.; Gore, B.F.; Campbell, J.L. Display alternatives for in-vehicle warning and sign information: Message style, location, and modality. Transp. Hum. Factors 1999, 1, 347–375. [Google Scholar] [CrossRef]
  85. Olaverri-Monreal, C.; Bengler, K.J. Impact of cultural diversity on the menu structure design of driver information systems: A cross-cultural study. In Proceedings of the 2011 IEEE Intelligent Vehicles Symposium (IV), Baden, Germany, 5–9 June 2011; pp. 107–112. [Google Scholar]
  86. Olaverri-Monreal, C.; Lehsing, C.; Trübswetter, N.; Schepp, C.A.; Bengler, K. In-vehicle displays: Driving information prioritization and visualization. In Proceedings of the 2013 IEEE Intelligent Vehicles Symposium (IV), Gold Coast, Australia, 23–26 June 2013; pp. 660–665. [Google Scholar] [CrossRef]
  87. Wittmann, M.; Kiss, M.; Gugg, P.; Steffen, A.; Fink, M.; Pöppel, E.; Kamiya, H. Effects of display position of a visual in-vehicle task on simulated driving. Appl. Ergon. 2006, 37, 187–199. [Google Scholar] [CrossRef]
  88. Olaverri-Monreal, C.; Gomes, P.; Silveria, M.K.; Ferreira, M. In-vehicle virtual traffic lights: A graphical user interface. In Proceedings of the 2012 7th Iberian Conference on Information Systems and Technologies (CISTI), Madrid, Spain, 20–23 June 2012; pp. 1–6. [Google Scholar]
  89. Östlund, J.; Peters, B.; Thorslund, B.; Engström, J.; Markkula, G.; Keinath, A.; Horst, D.; Juch, S.; Mattes, S.; Foehl, U. Driving Performance Assessment—Methods and Metrics; Technical Report, AIDE Deliverable 2.2.5; 2005. Available online: http://www.aide-eu.org/pdf/sp2_deliv_new/aide_d2_2_5.pdf (accessed on 26 June 2020).
  90. NHTSA. Preliminary Statement of Policy Concerning Automated Vehicles; NHTSA: Washington, DC, USA, 2013. [Google Scholar]
  91. Stanton, N.A.; Marsden, P. From fly-by-wire to drive-by-wire: Safety implications of automation in vehicles. Saf. Sci. 1996, 24, 35–49. [Google Scholar] [CrossRef] [Green Version]
  92. Zhang, B.; de Winter, J.; Varotto, S.; Happee, R.; Martens, M. Determinants of take-over time from automated driving: A meta-analysis of 129 studies. Transp. Res. Part F Traffic Psychol. Behav. 2019, 64, 285–307. [Google Scholar] [CrossRef]
  93. Alvarez, W.M.; Smirnov, N.; Matthes, E.; Olaverri-Monreal, C. Vehicle Automation Field Test: Impact on Driver Behavior and Trust. arXiv 2020, arXiv:2006.02737. [Google Scholar]
  94. Merat, N.; Jamson, A.H.; Lai, F.C.; Daly, M.; Carsten, O.M. Transition to manual: Driver behaviour when resuming control from a highly automated vehicle. Transp. Res. Part F Traffic Psychol. Behav. 2014, 27, 274–282. [Google Scholar] [CrossRef] [Green Version]
  95. Kerschbaum, P.; Lorenz, L.; Bengler, K. A transforming steering wheel for highly automated cars. In Proceedings of the IEEE Intelligent Vehicles Symposium, Seoul, Korea, 28 June–1 July 2015; Volume 2015, pp. 1287–1292. [Google Scholar] [CrossRef]
96. Deng, C.; Cao, S.; Wu, C.; Lyu, N. Modeling Driver Take-Over Reaction Time and Emergency Response Time using an Integrated Cognitive Architecture. Transp. Res. Rec. 2019, 2673, 380–390. [Google Scholar] [CrossRef]
  97. Rezvani, T.; Driggs-Campbell, K.; Sadigh, D.; Sastry, S.S.; Seshia, S.A.; Bajcsy, R. Towards trustworthy automation: User Interfaces that convey internal and external awareness. In Proceedings of the IEEE Conference on Intelligent Transportation Systems ITSC, Rio de Janeiro, Brazil, 1–4 November 2016; pp. 682–688. [Google Scholar] [CrossRef]
  98. You, F.; Wang, Y.; Wang, J.; Zhu, X.; Hansen, P. Take-Over Requests Analysis in Conditional Automated Driving and Driver Visual Research Under Encountering Road Hazard of Highway. In Advances in Intelligent Systems and Computing; Springer: Berlin/Heidelberg, Germany, 2018; Volume 592, pp. 230–240. [Google Scholar] [CrossRef]
  99. Merat, N.; Jamson, A.H.; Lai, F.C.; Carsten, O. Highly automated driving, secondary task performance, and driver state. Hum. Factors 2012, 54, 762–771. [Google Scholar] [CrossRef] [PubMed]
  100. Körber, M.; Weißgerber, T.; Kalb, L.; Blaschke, C.; Farid, M. Prediction of take-over time in highly automated driving by two psychometric tests. DYNA (Colombia) 2015, 82, 195–201. [Google Scholar] [CrossRef]
  101. Damböck, D.; Bengler, K.; Farid, M.; Tönert, L. Übernahmezeiten beim hochautomatisierten Fahren [Takeover times for highly automated driving]. Tagung Fahrerassistenz 2012, 5, 16–28. [Google Scholar]
  102. Van Den Beukel, A.P.; Van Der Voort, M.C. The influence of time-criticality on Situation Awareness when retrieving human control after automated driving. In Proceedings of the IEEE Conference on Intelligent Transportation Systems ITSC, The Hague, The Netherlands, 6–9 October 2013; pp. 2000–2005. [Google Scholar] [CrossRef]
  103. Feldhütter, A.; Gold, C.; Schneider, S.; Bengler, K. How the Duration of Automated Driving Influences Take-Over Performance and Gaze Behavior. In Advances in Ergonomic Design of Systems, Products and Processes; Springer: Berlin/Heidelberg, Germany, 2017; pp. 309–318. [Google Scholar] [CrossRef]
  104. Körber, M.; Gold, C.; Lechner, D.; Bengler, K. The influence of age on the take-over of vehicle control in highly automated driving. Transp. Res. Part F Traffic Psychol. Behav. 2016, 39, 19–32. [Google Scholar] [CrossRef] [Green Version]
105. Louw, T.; Merat, N.; Jamson, A. Engaging With Highly Automated Driving: To Be Or Not To Be In The Loop? In Proceedings of the 8th International Driving Symposium on Human Factors in Driver Assessment, Training and Vehicle Design, Salt Lake City, UT, USA, 22–25 July 2015. [Google Scholar] [CrossRef]
  106. Walch, M.; Lange, K.; Baumann, M.; Weber, M. Autonomous driving: Investigating the feasibility of car-driver handover assistance. In Proceedings of the 7th International Conference on Automotive User Interfaces and Interactive Vehicular Applications—AutomotiveUI’15, Nottingham, UK, 1–3 September 2015; Association for Computing Machinery (ACM): New York, NY, USA, 2015; pp. 11–18. [Google Scholar] [CrossRef]
  107. Lorenz, L.; Kerschbaum, P.; Schumann, J. Designing take over scenarios for automated driving: How does augmented reality support the driver to get back into the loop? Proc. Hum. Factors Ergon. Soc. Annu. Meet. 2014, 58, 1681–1685. [Google Scholar] [CrossRef]
  108. Schömig, N.; Hargutt, V.; Neukum, A.; Petermann-Stock, I.; Othersen, I. The Interaction Between Highly Automated Driving and the Development of Drowsiness. Procedia Manuf. 2015, 3, 6652–6659. [Google Scholar] [CrossRef] [Green Version]
  109. Louw, T.; Kountouriotis, G.; Carsten, O.; Merat, N. Driver Inattention During Vehicle Automation: How Does Driver Engagement Affect Resumption Of Control? In Proceedings of the 4th International Conference on Driver Distraction and Inattention, Sydney, Australia, 9–11 November 2015. [Google Scholar]
  110. Dogan, E.; Deborne, R.; Delhomme, P.; Kemeny, A.; Jonville, P. Evaluating the shift of control between driver and vehicle at high automation at low speed: The role of anticipation. In Proceedings of the Transport Research Arena (TRA) 5th Conference: Transport Solutions from Research to Deployment, Paris, France, 14–17 April 2014. [Google Scholar]
  111. Naujoks, F.; Neukum, A. Timing of in-vehicle advisory warnings based on cooperative perception. In Proceedings of the 4th Human Factors and Ergonomics Society Europe Chapter Annual Meeting, Turin, Italy, October 2013. [Google Scholar]
  112. Payre, W.; Cestac, J.; Delhomme, P. Fully Automated Driving: Impact of Trust and Practice on Manual Control Recovery. Hum. Factors 2016, 58, 229–241. [Google Scholar] [CrossRef] [PubMed]
  113. Melcher, V.; Rauh, S.; Diederichs, F.; Widlroither, H.; Bauer, W. Take-Over Requests for Automated Driving. Procedia Manuf. 2015, 3, 2867–2873. [Google Scholar] [CrossRef]
  114. BMW. The BMW Vision iNext, Future Focused. Available online: https://www.bmwgroup.com/BMW-Vision-iNEXT (accessed on 25 June 2020).
115. Daimler AG. Introducing DRIVE PILOT: An Automated Driving System for the Highway. Available online: https://www.daimler.com/documents/innovation/other/2019-02-20-vssa-mercedes-benz-drive-pilot-a.pdf (accessed on 5 July 2020).
  116. Audi AG. Audi A8: Audi AI Traffic Jam Pilot. Available online: https://www.audi-mediacenter.com/en/press-releases/presales-start-for-new-audi-a8-9406 (accessed on 3 July 2020).
  117. Autovista Group. Audi A8 Will Not Feature Level 3 Autonomy. Available online: https://autovistagroup.com/news-and-insights/audi-a8-will-not-feature-level-3-autonomy (accessed on 5 July 2020).
118. RIS. Automatisiertes Fahren Verordnung—Bundesrecht konsolidiert, Fassung vom 09.11.2020 [Automated Driving Regulation—Consolidated Federal Law, Version of 9 November 2020]. Available online: https://www.ris.bka.gv.at/GeltendeFassung.wxe?Abfrage=Bundesnormen&Gesetzesnummer=20009740 (accessed on 5 July 2020).
  119. European Union. Shaping Europe’s Digital Future: Connected and Automated Mobility in Europe. Available online: https://ec.europa.eu/digital-single-market/en/connected-and-automated-mobility-europe (accessed on 3 July 2020).
  120. Alonso Raposo, M.; Grosso, M.; Després, J.; Fernandez Macias, E.; Galassi, M.; Krasenbrink, A.; Krause, J.; Levati, L.; Mourtzouchou, A.; Saveyn, B.; et al. An Analysis of Possible Socio-Economic Effects of a Cooperative, Connected and Automated Mobility (CCAM) in Europe; Publications Office of the European Union: Luxembourg, 2018. [Google Scholar] [CrossRef]
  121. McCall, R.; McGee, F.; Mirnig, A.; Meschtscherjakov, A.; Louveton, N.; Engel, T.; Tscheligi, M. A taxonomy of autonomous vehicle handover situations. Transp. Res. Part A Policy Pract. 2019, 124, 507–522. [Google Scholar] [CrossRef]
  122. Marcano, M.; Díaz, S.; Pérez, J.; Irigoyen, E. A Review of Shared Control for Automated Vehicles: Theory and Applications. IEEE Trans. Hum. Mach. Syst. 2020, 1–17. [Google Scholar] [CrossRef]
  123. European Commission. Cities Demonstrating Cybernetic Mobility—CITYMOBIL2 Project—FP7—CORDIS. Available online: https://cordis.europa.eu/project/id/314190 (accessed on 3 July 2020).
  124. European Commission. Safe and Connected Automation in Road Transport—SCOUT Project—H2020—CORDIS. Available online: https://cordis.europa.eu/project/id/713843/es (accessed on 3 July 2020).
  125. C-ROADS The Platform of the Harmonised C-ITS Deployment in Europe. Available online: https://www.c-roads.es/ (accessed on 3 July 2020).
  126. Innovation and Networks Executive Agency. Managing Automated Vehicles Enhances Network, MAVEN Project. Available online: https://ec.europa.eu/inea/en/horizon-2020/projects/h2020-transport/automated-road-transport/maven (accessed on 3 July 2020).
  127. Connected Automated Driving Europe. Coordination of Automated Road Transport Deployment for Europe. Available online: https://connectedautomateddriving.eu/about-us/cartre/ (accessed on 3 July 2020).
  128. Regulation Study for Interoperability in the Adoption of Autonomous Driving in European Urban Nodes—AUTO C-ITS Project. Available online: https://www.autocits.eu/ (accessed on 3 July 2020).
  129. Intrasoft International S.A. Transforming Transport Project. Available online: https://transformingtransport.eu/ (accessed on 3 July 2020).
  130. Intelligent Transport Systems & Services Europe. AUTOmated Driving Progressed by Internet of Things—Autopilot Project. Available online: https://autopilot-project.eu/ (accessed on 3 July 2020).
  131. L3Pilot Consortium—L3Pilot Driving Automation Project. Available online: https://www.l3pilot.eu/ (accessed on 3 July 2020).
  132. Edge and Cloud Computation: A Highly Distributed Software for Big Data Analytics—CLASS Project. Available online: https://class-project.eu/ (accessed on 3 July 2020).
  133. ECSEL Joint Undertaking. SECREDAS Project. Available online: https://www.ecsel.eu/projects/secredas (accessed on 3 July 2020).
  134. The Avenue Consortium. Autonomous Vehicles to Evolve to a New Urban Experience—AVENUE Project. Available online: https://h2020-avenue.eu/ (accessed on 3 July 2020).
  135. ENabling SafE Multi-Brand Platooning for Europe—Platooning Ensemble Project. Available online: https://platooningensemble.eu/ (accessed on 3 July 2020).
  136. Driving forward Connected & Automated Mobility—5G-MOBIX Project. Available online: https://www.5g-mobix.com/ (accessed on 3 July 2020).
  137. Harmonised European Solutions for Testing Automated Road Transport—HEADSTART Project. Available online: https://www.headstart-project.eu/ (accessed on 3 July 2020).
  138. AVL List GmbH. NewControl Project. Available online: https://www.newcontrol-project.eu/ (accessed on 3 July 2020).
  139. Softeco Sismat SRL. Trustonomy Project. Available online: https://h2020-trustonomy.eu/ (accessed on 3 July 2020).
  140. Centre for Research and Technology Hellas (CERTH) and Hellenic Institute of Transport (HIT). Drive2Thefuture Project. Available online: http://www.drive2thefuture.eu/ (accessed on 3 July 2020).
  141. SUaaVE Consortium. Colouring Automated Driving with Human Emotions—SUaaVE Project. Available online: http://www.suaave.eu/ (accessed on 3 July 2020).
  142. Enhance Driver Behaviour and Public Acceptance of Connected and Autonomous Vehicles—PAsCAL Project. Available online: https://www.pascal-project.eu/ (accessed on 3 July 2020).
  143. European Commission. Holistic Approach for Driver Role Integration and Automation Allocation for European Mobility Needs—HADRIAN Project—H2020—CORDIS. Available online: https://cordis.europa.eu/project/id/875597 (accessed on 3 July 2020).
  144. European Commission. SHared Automation Operating Models for Worldwide Adoption—SHOW Project—H2020—CORDIS. Available online: https://cordis.europa.eu/project/id/875530/es (accessed on 3 July 2020).
  145. ISO 4513:2003—Road Vehicles—Visibility—Method for Establishment of Eyellipses for Driver’s Eye Location. Available online: https://www.iso.org/standard/36126.html (accessed on 4 July 2020).
146. SAE International. J1050: Describing and Measuring the Driver’s Field of View. Available online: https://www.sae.org/standards/content/j1050_200902/?src=j941_201003 (accessed on 4 July 2020).
  147. SAE International. J941: Motor Vehicle Drivers’ Eye Locations—SAE International. Available online: https://www.sae.org/standards/content/j941_201003/?src=j1050_200902 (accessed on 4 July 2020).
  148. ISO 11429:1996—Ergonomics—System of Auditory and Visual Danger and Information Signals. Available online: https://www.iso.org/standard/19369.html (accessed on 4 July 2020).
  149. ISO/TR 12204:2012—Road Vehicles—Ergonomic Aspects of Transport Information and Control Systems—Introduction to Integrating Safety Critical and Time Critical Warning Signals. Available online: https://www.iso.org/standard/51275.html (accessed on 4 July 2020).
  150. ISO/TR 16352:2005—Road vehicles—Ergonomic Aspects of in-Vehicle Presentation for Transport Information and Control Systems—Warning systems. Available online: https://www.iso.org/standard/37859.html (accessed on 4 July 2020).
  151. ISO 9241-210:2010—Ergonomics of Human-System Interaction—Part 210: Human-Centred Design for Interactive Systems. Available online: https://www.iso.org/standard/52075.html (accessed on 4 July 2020).
  152. ISO 15007-1:2014—Road vehicles—Measurement of Driver Visual Behaviour With Respect to Transport Information and Control Systems—Part 1: Definitions and Parameters. Available online: https://www.iso.org/standard/56621.html (accessed on 4 July 2020).
  153. ISO/TS 15007-2:2014—Road vehicles—Measurement of Driver Visual Behaviour With Respect to Transport Information and Control Systems—Part 2: Equipment and Procedures. Available online: https://www.iso.org/standard/56622.html (accessed on 4 July 2020).
  154. ISO 15008:2017—Road Vehicles—Ergonomic Aspects of Transport Information and Control Systems—Specifications and Test Procedures for in-Vehicle Visual Presentation. Available online: https://www.iso.org/standard/62784.html (accessed on 4 July 2020).
  155. ISO 15008:2009—Road Vehicles—Ergonomic Aspects of Transport Information and Control Systems—Specifications and Test Procedures for in-Vehicle Visual Presentation. Available online: https://www.iso.org/standard/50805.html (accessed on 4 July 2020).
  156. ISO/TS 16951:2004—Road Vehicles—Ergonomic Aspects of Transport Information and Control Systems (TICS)—Procedures for Determining Priority of on-Board Messages Presented to Drivers. Available online: https://www.iso.org/standard/29024.html (accessed on 4 July 2020).
  157. ISO 17287:2003—Road Vehicles—Ergonomic Aspects of Transport Information and Control Systems—Procedure for Assessing Suitability for Use While Driving. Available online: https://www.iso.org/standard/30597.html (accessed on 4 July 2020).
  158. Safety First for Automated Driving. Available online: https://www.daimler.com/documents/innovation/other/safety-first-for-automated-driving.pdf (accessed on 4 July 2020).
  159. Standardization Roadmap for Automatic Driving—VDA. Available online: https://www.vda.de/en/services/Publications/standardization-roadmap-for-automatic-driving.html (accessed on 4 July 2020).
  160. European Data Protection Board. Guidelines 1/2020 on Processing Personal Data in the Context of Connected Vehicles and Mobility Related Applications. Available online: https://edpb.europa.eu/our-work-tools/public-consultations-art-704/2020/guidelines-12020-processing-personal-data-context_es (accessed on 4 July 2020).
  161. European Commission. EUR-Lex—52018DC0283—EN—EUR-Lex—Guidelines on the Exemption Procedure for EU Approval of Automated Vehicles. Available online: https://eur-lex.europa.eu/legal-content/EN/TXT/?uri=CELEX:52018DC0283 (accessed on 4 July 2020).
  162. European Commission. EUR-Lex—32008H0653—EN—EUR-Lex—Commission Recommendation on Safe and Efficient in-Vehicle Information and Communication Systems: Update of the European Statement of Principles on Human-Machine Interface. Available online: https://eur-lex.europa.eu/legal-content/EN/TXT/?qid=1594131793771&uri=CELEX:32008H0653 (accessed on 4 July 2020).
  163. European Commission. EUR-Lex—52006DC0059—EN—EUR-Lex—Raising Awareness of ICT for Smarter, Safer and Cleaner Vehicles. Available online: https://eur-lex.europa.eu/legal-content/EN/TXT/?qid=1594120977891&uri=CELEX:52006DC0059 (accessed on 4 July 2020).
  164. European Commission. EUR-Lex—52019DC0464—EN—EUR-Lex—Implementation of Directive 2010/40/EU of the European Parliament and of the Council on the Framework For the Deployment of Intelligent Transport Systems in the Field of Road Transport and for Interfaces With Other Modes of Transport. Available online: https://eur-lex.europa.eu/legal-content/EN/TXT/?qid=1594119042362&uri=CELEX:52019DC0464 (accessed on 4 July 2020).
  165. European Commission. EUR-Lex—32019R1213—EN—EUR-Lex—Commission Implementing Regulation: Laying Down Detailed Provisions Ensuring Uniform Conditions For the Implementation of Interoperability and Compatibility of on-Board Weighing Equipment Pursuant to Council Directive 96/53/EC. Available online: https://eur-lex.europa.eu/legal-content/EN/TXT/?qid=1594116441993&uri=CELEX:32019R1213 (accessed on 4 July 2020).
  166. European Commission. EUR-Lex—52018IP0063—EN—EUR-Lex—European Parliament Resolution on a European Strategy on Cooperative Intelligent Transport Systems. Available online: https://eur-lex.europa.eu/legal-content/EN/TXT/?qid=1594133531751&uri=CELEX:52018IP0063 (accessed on 4 July 2020).
167. European Telecommunications Standards Institute (ETSI). Intelligent Transport Systems (ITS); Vulnerable Road Users (VRU) Awareness; Part 2: Functional Architecture and Requirements Definition; Release 2. Available online: https://www.etsi.org/deliver/etsi_ts/103300_103399/10330002/02.01.01_60/ts_10330002v020101p.pdf (accessed on 4 July 2020).
  168. ISO 26262-1:2018(en), Road Vehicles—Functional Safety—Part 1: Vocabulary. Available online: https://www.iso.org/obp/ui/#iso:std:iso:26262:-1:ed-2:v1:en (accessed on 4 July 2020).
  169. ISO/PAS 21448:2019(en), Road Vehicles—Safety of the Intended Functionality. Available online: https://www.iso.org/obp/ui#iso:std:iso:pas:21448:ed-1:v1:en (accessed on 4 July 2020).
  170. UL Standard. UL 4600. Available online: https://www.shopulstandards.com/ProductDetail.aspx?productid=UL4600 (accessed on 4 July 2020).
  171. ISO 16673:2017—Road Vehicles—Ergonomic Aspects of Transport Information and Control Systems—Occlusion Method to Assess Visual Demand Due to The Use of in-Vehicle Systems. Available online: https://www.iso.org/standard/71508.html (accessed on 4 July 2020).
  172. IEEE Standards Association. P7011—Standard for the Process of Identifying and Rating the Trustworthiness of News Sources. Available online: https://standards.ieee.org/project/7011.html (accessed on 4 July 2020).
  173. IEEE Standards Association. P7009—Standard for Fail-Safe Design of Autonomous and Semi-Autonomous Systems. Available online: https://standards.ieee.org/project/7009.html (accessed on 4 July 2020).
  174. SAE International. J2735: Dedicated Short Range Communications (DSRC) Message Set Dictionary™. Available online: https://www.sae.org/standards/content/j2735_5C_200911/ (accessed on 4 July 2020).
  175. European Telecommunications Standards Institute (ETSI). ITS-G5 Access Layer Specification for Intelligent Transport Systems Operating in the 5 GHz Frequency Band. Available online: https://www.etsi.org/deliver/etsi_en/302600_302699/302663/01.03.01_60/en_302663v010301p.pdf (accessed on 4 July 2020).
  176. IEEE Standards Association. 1609.0-2013—IEEE Guide for Wireless Access in Vehicular Environments (WAVE)—Architecture. Available online: https://standards.ieee.org/standard/1609_0-2013.html (accessed on 4 July 2020).
  177. ISO/SAE DIS 21434—Road Vehicles—Cybersecurity Engineering. Available online: https://www.iso.org/standard/70918.html (accessed on 4 July 2020).
  178. European Telecommunications Standards Institute (ETSI). Intelligent Transport Systems (ITS); Security; Security Services and Architecture. Available online: https://www.etsi.org/deliver/etsi_ts/102700_102799/102731/01.01.01_60/ts_102731v010101p.pdf (accessed on 4 July 2020).
  179. European Telecommunications Standards Institute (ETSI). Intelligent Transport Systems (ITS); Security; Threat, Vulnerability and Risk Analysis (TVRA). Available online: https://www.etsi.org/deliver/etsi_tr/102800_102899/102893/01.02.01_60/tr_102893v010201p.pdf (accessed on 4 July 2020).
  180. European Telecommunications Standards Institute (ETSI). Intelligent Transport Systems (ITS); Security; ITS Communications Security Architecture and Security Management. Available online: https://www.etsi.org/deliver/etsi_ts/102900_102999/102940/01.03.01_60/ts_102940v010301p.pdf (accessed on 4 July 2020).
  181. European Telecommunications Standards Institute (ETSI). Intelligent Transport Systems (ITS); Security; Trust and Privacy Management. Available online: https://www.etsi.org/deliver/etsi_ts/102900_102999/102941/01.02.01_60/ts_102941v010201p.pdf (accessed on 4 July 2020).
  182. European Telecommunications Standards Institute (ETSI). Intelligent Transport Systems (ITS); Security; Access Control. Available online: https://www.etsi.org/deliver/etsi_ts/102900_102999/102942/01.01.01_60/ts_102942v010101p.pdf (accessed on 4 July 2020).
  183. European Telecommunications Standards Institute (ETSI). Intelligent Transport Systems (ITS); Security; Confidentiality Services. Available online: https://www.etsi.org/deliver/etsi_ts/102900_102999/102943/01.01.01_60/ts_102943v010101p.pdf (accessed on 4 July 2020).
  184. IEEE Standards Association. P7001—Transparency of Autonomous Systems. Available online: https://standards.ieee.org/project/7001.html (accessed on 4 July 2020).
  185. IEEE Standards Association. P7003—Algorithmic Bias Considerations. Available online: https://standards.ieee.org/project/7003.html (accessed on 4 July 2020).
  186. IEEE Standards Association. P7007—Ontological Standard for Ethically Driven Robotics and Automation Systems. Available online: https://standards.ieee.org/project/7007.html (accessed on 4 July 2020).
  187. IEEE Standards Association. P7008—Standard for Ethically Driven Nudging for Robotic, Intelligent and Autonomous Systems. Available online: https://standards.ieee.org/project/7008.html (accessed on 4 July 2020).
  188. IEEE Standards Association. 7010-2020—IEEE Recommended Practice for Assessing the Impact of Autonomous and Intelligent Systems on Human Well-Being. Available online: https://standards.ieee.org/standard/7010-2020.html (accessed on 4 July 2020).
  189. IEEE Standards Association. 1228-1994—IEEE Standard for Software Safety Plans. Available online: https://standards.ieee.org/standard/1228-1994.html (accessed on 4 July 2020).
  190. IEEE Standards Association. IEEE 2846 WG. Available online: https://sagroups.ieee.org/2846/ (accessed on 4 July 2020).
  191. ISO 24100:2010—Intelligent Transport Systems—Basic Principles For Personal Data Protection in Probe Vehicle Information Services. Available online: https://www.iso.org/standard/42017.html (accessed on 4 July 2020).
  192. IEEE Standards Association. P7002—Data Privacy Process. Available online: https://standards.ieee.org/project/7002.html (accessed on 4 July 2020).
  193. IEEE Standards Association. P7006—Standard for Personal Data Artificial Intelligence (AI) Agent. Available online: https://standards.ieee.org/project/7006.html (accessed on 4 July 2020).
  194. IEEE Standards Association. P7012—Standard for Machine Readable Personal Privacy Terms. Available online: https://standards.ieee.org/project/7012.html (accessed on 4 July 2020).
Figure 1. Take over process timeline.
Figure 2. Conceptual framework of TOR.
Figure 3. HMI examples implemented in the literature. (a) Panel with visual information to transmit urgency through yellow and red colors (adapted from [43]); (b) visual, dynamic information according to the driving automation or control transfer need [55]; (c) haptic steering wheel that triggers a TOR by a flexible shape (left) or vibration (right) (concept from [81]); (d) visual interface that indicates a TOR through lights on the steering wheel (figure inspired by [82]); (e) ambient lights installed on the driver’s periphery [38]; (f) matrix of tactors installed in the driver’s seat [83].
Table 1. Complexity factors that affect TOR.
| Complexity Type | Complexity Factor | Specific Context |
|---|---|---|
| Objective complexity | Traffic situation | Traffic density: high, low |
| Objective complexity | Road conditions | Road geometry: curved, straight; road lanes |
| Objective complexity | Control transfer | Haptic guidance; abrupt transition |
| Subjective complexity | Non-driving related tasks | Manual; visual; cognitive |
| Subjective complexity | Age | Young; old |
| Subjective complexity | Trust | High; low |
| Subjective complexity | Urgency of situation | Emergency event; ODD limit |
| Subjective complexity | Human machine interface | Visual: images, ambient; sound: informative, acoustic; haptic: motion cues, vibration |
| Subjective complexity | Situational awareness | High; low |
Table 2. Human Machine Interface modalities studied in the literature.
| HMI | Description | Advantages | Disadvantages | Related Work |
|---|---|---|---|---|
| Visual | Images | Condense and transmit a large amount of information in a single display | TOR information can be missed by distracted drivers | [22,24,26,27,28,45,46,48,49,55,59,60,61,62,63] |
| Visual | Ambient | Easily detected by distracted drivers; unobtrusive; does not affect joy of use of the automation system | Hard to understand if intended to convey a particular message | [38,64,65,66,67,68,69,70,71] |
| Auditory | Acoustic | Does not require eyes-off-the-road time | Intended message might not be clear to the driver; not intuitive | [22,23,24,26,28,46,48,49,61,63,64,69,72,73] |
| Auditory | Informative | Explicit, easy-to-understand voice messages; no eyes-off-the-road time | Longer time required to transmit urgent information; requires more attentional resources from the driver than acoustic signals | [45,58,59,60,74,75] |
| Tactile | Vibrotactile | Obtrusive; enhances driver auditory or visual perception [76] | Transmits a limited amount of information; not suitable for conveying multiple alerts, as they are not intuitive | [45,60,61,63,64,73,77,78,79,80,81] |
Table 3. Take over times reported in the literature. The table has been extended from [59].
| Modality | Reference | TORlt (Seconds) | TOrt (Seconds) | Control Moment Definition |
|---|---|---|---|---|
| Visual | [97] | 5 | – | – |
| Visual | [98] | 4, 6, 8 | – | – |
| Visual | [99] | – | 30 | Time to perform lane change |
| Visual | [94] | 0 | 10–15 | Time correcting the steering wheel position |
| Auditory | [23] | 7 | 2.49–3.61 | Time correcting the steering wheel position |
| Auditory | [75] | 2, 5, 8 | – | – |
| Auditory | [93] | – | 0.75–1.3 | Time to hands on wheel |
| Auditory | [58] | 6.5, 5 | 8–9.9 | Time to start a maneuver |
| Auditory | [100] | 3 | – | – |
| Auditory | [101] | 4, 6, 8 | – | – |
| Auditory | [102] | 1.5, 2.2, 2.8 | – | – |
| Auditory | [103] | 6 | 1.88–2.24 | Time correcting the steering wheel position |
| Auditory | [104] | 7 | 2.41–3.66 | Time to start a maneuver |
| Visual-Auditory | [49] | 2.5, 3, 3.5, 12 | 1.14 | Time to hands on wheel |
| Visual-Auditory | [55] | – | 1.64–2.00 | Time to press a button on the steering wheel |
| Visual-Auditory | [57] | 5 | 1.68–2.22 | – |
| Visual-Auditory | [38] | – | 1.54–1.61 | Time to press a button on the steering wheel |
| Visual-Auditory | [48] | 2.5, 4 | 1.9–3 | Time to system deactivation |
| Visual-Auditory | [105] | 6.5 | 2.18–2.47 | Time to steer the wheel |
| Visual-Auditory | [95] | 7 | 2.22–3.09 | Time to steer the wheel or time to brake |
| Visual-Auditory | [106] | 4, 6 | 1.90–2.75 | Time to hands on wheel |
| Visual-Auditory | [107] | 7 | 2.86–3.03 | Time to steer the wheel or time to brake |
| Visual-Auditory | [26] | – | 2.29–6.90 | Time to hands on wheel |
| Visual-Auditory | [108] | 12 | – | – |
| Visual-Auditory | [109] | 3 | – | – |
| Visual-Auditory | [110] | 3 | – | – |
| Visual-Auditory | [111] | 0, 1, 2, 3, 4 | – | – |
| Visual-Auditory | [112] | 2, 30 | 4.30–8.70 | Time to steer the wheel, brake or accelerate |
| Visual-Auditory | [39] | 5.5, 8.5 | – | – |
| Visual-Auditory | [22] | 7 | 1.55–2.92 | Time to steer the wheel or time to brake |
| Visual-Auditory | [46] | 7 | 2.00–3.5 | Time to steer the wheel or time to brake |
| Visual-Auditory | [27] | 15, 24 | 3, 3.4 | – |
| Visual-Auditory | [59] | 30–45 | 4.57–6.06 | – |
| Visual-Auditory | [24] | 5, 7 | 2.10–3.65 | Time to brake |
| Auditory-Haptic | [73] | 5 | 0.69–0.95 | Time to brake |
| Visual-Auditory-Haptic | [36] | 7 | 2.10–2.63 | Time to steer |
| Visual-Auditory-Haptic | [64] | 3.5 | 0.6–0.9 | Time to brake |
| Visual-Auditory-Haptic | [45] | – | 2.21–6.91 | Time to press a button on the steering wheel |
| Visual-Auditory-Haptic | [113] | 10 | 1.4–6.7 | Time to brake or time to accelerate |
| Visual-Auditory-Haptic | [63] | 5–7 | 2.17 | Time to steer the wheel, brake or accelerate |
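To get a rough feel for the spread of take-over reaction times (TOrt) in Table 3, the reported ranges can be summarized programmatically. The sketch below is not part of the original review: it simply takes the TOrt ranges listed above for the multimodal conditions, uses range midpoints as a coarse per-study summary, and averages them per modality (single reported values are encoded as degenerate ranges).

```python
# Illustrative aggregation of TOrt ranges from Table 3 (midpoints only;
# this ignores study design differences and is not a formal meta-analysis).
from statistics import mean

# modality -> list of (low, high) TOrt ranges in seconds, copied from Table 3
tort_ranges = {
    "Auditory-Haptic": [(0.69, 0.95)],
    "Visual-Auditory-Haptic": [
        (2.10, 2.63),  # [36]
        (0.6, 0.9),    # [64]
        (2.21, 6.91),  # [45]
        (1.4, 6.7),    # [113]
        (2.17, 2.17),  # [63] single reported value
    ],
}

def mean_midpoint(ranges):
    """Average of the range midpoints, in seconds."""
    return mean((lo + hi) / 2 for lo, hi in ranges)

for modality, ranges in tort_ranges.items():
    print(f"{modality}: mean midpoint TOrt = {mean_midpoint(ranges):.2f} s "
          f"across {len(ranges)} studies")
```

The same pattern extends to the visual, auditory, and visual-auditory rows; comparing mean midpoints across modalities gives only a first impression, since the underlying studies differ in TOR lead time and in how the control moment was defined.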
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
