When Joshua Neally left his office in Springfield, MO, climbed into his Tesla Model X, and merged onto the highway to head home, he did what many Tesla drivers do–he switched on Autopilot mode.
Neally, who reportedly pays close attention while driving with Autopilot engaged, following Tesla’s guidelines for use, may have expected the advanced driving feature to kick in by braking if a vehicle crossed its path or alerting him if a nearby car drifted too close into his lane.
But when Neally began experiencing tightness in his chest and, after calling his wife, realized he needed medical attention, he used Autopilot in a way he probably never expected: to rush him straight to the hospital.
SEE: Tesla’s Autopilot: The smart person’s guide (TechRepublic)
The tightness in his chest turned out to be caused by a pulmonary embolism, and Neally was able to make a full recovery.
“I don’t really think I could have [made the drive without Autopilot],” Neally told CBS.
After the fatal accident in May 2016, in which Tesla driver Joshua Brown collided with a tractor-trailer that turned across his path, many have wondered whether Tesla has pushed the limits of driverless-vehicle technology too far, making a promise of safety it can’t keep.
But whether or not Tesla’s Autopilot failed during the May fatality–it is still under investigation by the DOT–Neally’s use of Autopilot shows that the discussion about safety is more complicated, and warrants greater nuance, than the current debate reflects.
This is not the first time Autopilot has been used in a nontraditional way. On July 16, the automatic braking in a Washington driver’s Tesla activated before the car hit a pedestrian.
SEE: Tesla’s Master Plan 2.0: AI experts, auto insiders, and Tesla customers weigh in (TechRepublic)
Jeffrey Miller, IEEE member and associate professor of engineering at the University of Southern California, sees this as “a great application of Autopilot.” He thinks it could also have implications in situations where drivers are under the influence, or drowsy at the wheel.
It could also mean that people get to the hospital faster than if they waited for first responders.
“I’m sure there are a number of situations where first responders can’t really do much for a person but rush them to a hospital to get more advanced medical treatment,” said Miller. “That means that the person could have a driverless vehicle take them there without having to wait for a first responder.”
The situation also sheds light on cases in which an autonomous feature is used not to prevent a collision with other traffic, but to protect a driver who is impaired in some other way.
“In fatal crashes, usually the crash occurs first and the death second,” said Bryant Walker Smith, professor at the University of South Carolina and a leading expert on the legal aspects of self-driving vehicles. “But, there are some cases where the death (or incapacitating injury) happens first and actually causes the crash.”
SEE: When will we get driverless cars? Experts say public opinion is the critical factor (TechRepublic)
Smith believes there are other health benefits of autonomous technology that have been missing from the debate.
“If automated driving is more energy efficient, in part by avoiding crashes and driving more smoothly, then it will reduce emissions,” he said. “Pollution actually causes more premature deaths in the United States than motor vehicle crashes.”
John Dolan, principal systems scientist in the Robotics Institute at Carnegie Mellon University, said the story reminds him of an instance in which a man suffering an aneurysm lost control of his car and killed two women walking in a park. Dolan believes that a car that can recognize when a driver is incapacitated, and automatically switch into an autonomous mode, may be the next step. Such a system would need to limit false alarms, but Dolan said he has “been gaining experience on how to do that kind of thing with existing advanced driver-assist features.”
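The idea Dolan describes, detecting an incapacitated driver while limiting false alarms, can be sketched as a simple debounced monitor loop. The sketch below is purely illustrative: the signal names, thresholds, and states are hypothetical assumptions for the sake of the example, not any actual Tesla or Carnegie Mellon system.

```python
from dataclasses import dataclass

@dataclass
class DriverSignals:
    """Hypothetical per-sample sensor readings; a real system
    would fuse many more signals (eye tracking, vitals, etc.)."""
    seconds_since_steering_input: float
    eyes_on_road: bool
    hands_on_wheel: bool

class IncapacitationMonitor:
    """Only hand control to the vehicle after several consecutive
    'unresponsive' samples, which is one simple way to limit the
    false alarms Dolan mentions."""

    def __init__(self, threshold_samples: int = 3):
        self.threshold = threshold_samples
        self.unresponsive_count = 0

    def update(self, s: DriverSignals) -> str:
        unresponsive = (
            s.seconds_since_steering_input > 10.0
            and not s.eyes_on_road
            and not s.hands_on_wheel
        )
        # Debounce: reset the counter whenever the driver responds.
        self.unresponsive_count = self.unresponsive_count + 1 if unresponsive else 0

        if self.unresponsive_count >= self.threshold:
            return "ENGAGE_AUTONOMY"   # e.g., pull over or drive to safety
        if unresponsive:
            return "ALERT_DRIVER"      # chime, seat vibration, etc.
        return "NORMAL"
```

A single anomalous sample only triggers an alert; the handover to autonomous driving happens only after the driver fails to respond for several samples in a row.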
In considering the safety of Autopilot and other autonomous driving features, it’s critical to remember that Autopilot is simply a tool–it does not have an inherent ability to “kill” or “save” a life. But, if used well, it has the potential to assist a driver who needs help. And, beyond simply braking, Autopilot may have safety implications we have not yet conceived of.
- Tesla driver dies in first fatality with Autopilot: What it means for the future of driverless cars (TechRepublic)
- Learn Tesla Model 3’s key moves in autonomous driving, batteries, and charging (TechRepublic)
- Tesla’s fatal Autopilot accident: Why the New York Times got it wrong (TechRepublic)
- Why the US government should take Tesla up on its offer to share Autopilot data (TechRepublic)
- Tesla speaks: How we will overcome the obstacles to driverless vehicles (TechRepublic)
- Autonomous driving levels 0 to 5: Understanding the differences (TechRepublic)