Realtime Technologies

Driver Research: Addressing Autonomous Driving System Over-reliance

 

As we’ve noted in the past, over-reliance on automated driving systems (ADS) and advanced driver-assistance systems (ADAS) is a major hurdle to broad acceptance of autonomous vehicles. Fortunately, recent driver research hints at some remarkably small tweaks to driver training programs that could help laypeople right-size their reliance on the ADS/ADAS features coming soon to a vehicle near you.

Over the last several years, Moritz Körber, Eva Baseler, and Klaus Bengler have studied how best to manipulate trust in, and reliance on, ADS. For one study (ultimately published in 2017 as “Introduction Matters: Manipulating Trust in Automation and Reliance in Automated Driving”), they sought to create two test groups: one with greater trust in autonomous vehicles, the other with lowered trust.

 

Driver Research into Trust in Automation

Both test groups were prepped for their simulated drives in similar ways: they were first shown a brief introductory video on autonomous vehicles, then given a written overview of ADS, and finally taken on a scripted “introductory drive.”

Both groups began by watching the same introductory video, which explained the basics of automated driving, vehicle sensors, trajectory planning, etc. But, the video shown to the “trust lowered” group included an additional final segment, showing a “non-critical [vehicle] take-over.” Nothing traumatic—just a scene in which the vehicle occupant had to take over vehicle control as a matter of convenience. The written overview of ADS was likewise similar for both groups, with a slight revision: “[I]n the Trust Lowered group, a take-over situation was described as possible at any time whereas in the Trust Promoted group it was described as unlikely to happen, but possible.”

Finally, each participant took an in-simulator “introductory drive” lasting roughly two minutes, presented as a simple exercise in acclimating subjects to the simulator. The drive began with the participant controlling the vehicle. They were then instructed to activate the ADS and told they could “overrule the automated driving system with their manual input at any time.” About a minute later, the participants received a take-over request (TOR) from the vehicle. But while the “trust promoted” group experienced this with no other traffic or obstacles present, the “trust lowered” group was asked to take over vehicle control on short notice in order to avoid an impending collision with a disabled vehicle.

 

Small Changes Made a Big Difference in Driver Safety

It’s interesting to note that both groups received fairly straightforward training. The cost difference (in terms of production of the curriculum materials and time to complete the training) was negligible. The only real difference was that one group “was explicitly reminded that they are ultimately responsible for their vehicle and road safety at all times.”

The results? The “trust lowered” group, reminded in a video, in writing, and in a simulation that an ADS requires human monitoring, paid more attention, took control of the vehicle more often, and experienced zero collisions. Meanwhile, the “trust promoted” group, whose existing ideas of what an ADS can do went unchallenged, spent significantly “less time looking at the road or instrument cluster” and more time distracted by other tasks. Almost one-third of this group got into collisions, despite the vehicle warning them of the impending danger. Even when they did avoid a collision, they took longer to react and came much closer to an accident.