As I climbed under the kitchen table with my five-year-old this weekend, she explained that we were in a car, but “it can drive itself, so we can just relax, OK?”. We settled down for a pretend nap on the way to the pretend beach.
I didn’t tell her that grown-ups are really struggling to turn this vision into reality. Even Waymo, the company that is furthest ahead, still has self-driving taxis in only a handful of US cities.
In the meantime, car makers are packing many of their new models with so-called "Level 2" partial automation features instead. These can do a certain amount of driving in some circumstances, but require the human driver to pay attention and take over when necessary.
Yet this halfway house, which relies on humans and machines working together, is proving troublesome. And it is trouble worth noting, even if you have no interest in cars, because other sectors are also beginning to embrace the concept of automated "co-pilots" to help everyone from coders to doctors.
The big problem is known as “automation complacency”. People have been studying the phenomenon for decades in all kinds of partially automated systems, from aviation to manufacturing processes.
When you ask humans to supervise automated systems, their attention starts to wander – which means they don’t always notice in time when a problem does arise, nor are they aware enough of the context to immediately take over. And the better an automated system performs most of the time, the more complacent we humans become.
Mica Endsley, a former chief scientist at the US Air Force, has made a career of studying these issues after first encountering them in the 1980s. “The public don’t quite understand the subtle ways that automation affects their attention [but] it’s like giving people a sedative,” she told me. “They’re going to find something else to do or they’re going to zone out, and neither is good.”
Car drivers, it turns out, are not immune.
Studies of various partial-automation systems have found that drivers become increasingly likely to disengage the longer they use them. In the US, the National Transportation Safety Board has blamed automation complacency for a number of car crashes.
If humans are notoriously poor monitors, the solution, apparently, is to monitor the monitors. Safety bodies and regulators have pushed for steering wheels that detect whether people are holding them, and driver-facing cameras that track the direction of the driver's gaze and head posture at all times. Most such systems provide visual, audio and even seat-vibration alerts that increase in intensity to warn distracted drivers to return their attention to the road. Tesla cars have a disciplinary system whereby, if drivers accumulate too many "strikeouts", the partial-automation system is suspended for a week.
But bullying drivers to pay attention doesn’t seem to be sufficient. When Mikael Ljung Aust, a driver-behaviour specialist at Volvo Cars, ran a study on a test track with employees, he found that distraction alerts did successfully make people keep their eyes on the road and their hands on the wheel. But even then, almost 30 per cent of them allowed the car to crash straight into an object in the road.
In follow-up interviews, the drivers said they saw the object coming, but they trusted the car to deal with it, at least until it was too late. “Even if you write very clearly in the manual, ‘the car cannot see these objects’ and you show them pictures, once they get out on the road – for some people ... it seems like they can’t help trusting the car.”
He and several other safety experts said the best solution to the dangers of automation complacency seemed to be to keep the driver more actively involved in the steering and driving, with the partially automated system on in the background, gently guiding when necessary rather than taking over.
In other words, if you imagine automation as a scale with humans doing everything on one end and machines doing everything on the other, the best course might actually be to edge back slightly towards maintaining more human control, at least until the technology is good enough to leapfrog over to the other end of the scale.
Otherwise, we face a partially automated middle, where a car journey looks less like having a nap and more like watching the road anxiously with your eyes wide open and your neck straight, for fear that your car will shout at you again to pay attention. Is that the kind of future any five-year-old ever dreamt about? – Copyright The Financial Times Limited 2024