|Source: Car and Driver|
On Thursday, Tesla recalled more than 350,000 vehicles equipped with its experimental self-driving software. The software can reportedly cause crashes at intersections and, at times, allow cars to travel at unsafe speeds.
Notably, Missy Cummings, a George Mason University engineering professor and former safety official at the National Highway Traffic Safety Administration, had warned about self-driving technology's potential problems and limitations in a New York Times article published on Wednesday. She argued that some drivers place too much trust in self-driving technology, thereby putting themselves and others at risk.
The Cummings article emphasized a concept I've been writing about for several years: compensatory behavior. The idea is simple. When we know that certain systems are in place to assist us and potentially make us safer, we grow dependent on those systems and sometimes compensate by actually taking more risk. I first explored the idea in the context of Everest expedition teams. I once asked the great mountaineer Ed Viesturs why he often climbs without supplemental oxygen. He replied, "I don't think oxygen necessarily makes you safer in the mountains. It can give you a false sense of security." As a result, some climbers engage in reckless behavior, not realizing quite how much danger they are in because of the safety blanket they believe the oxygen provides.

Similar issues arose in my study of the BP oil spill in the Gulf of Mexico, as well as the Boeing 737 MAX design and subsequent crashes. In the latter case, I also learned that pilots can grow dependent on automated systems, and their skills may erode over time as a result. Think about driving a car again: if you always rely on the backup camera and parking-assist features, you may find yourself far less capable of parallel parking when those systems suddenly become unavailable.
In sum, I'm not against technologies such as self-driving cars. I do think, however, that we must become aware of the dangers and risks. Those risks are not simply about potential flaws in the technology; they also stem from how humans interact with that technology over time.