Tesla driver dies in first fatality with Autopilot: What it means for the future of driverless cars

A man using Tesla's Autopilot died when neither he nor his car braked to prevent a crash into a tractor-trailer. Here's how the accident fits into the broader conversation about autonomous driving.

Image: Jasper Juinen, Bloomberg via Getty Images

On a bright, sunny afternoon on May 7, Joshua Brown was heading east on US Highway 27 in Williston, Fla., when his Tesla Model S, which he playfully called "Tessy," crashed into a tractor-trailer that was crossing its path. Neither Brown nor the car, which was operating in Autopilot mode, braked in time to prevent the collision, and Brown was killed.

There are many unanswered questions about the crash, which Tesla learned about on June 29 and publicly disclosed the next day, and which is considered the first death involving a vehicle using autonomous driving software.

The incident is currently under investigation by the US National Highway Traffic Safety Administration (NHTSA). And, according to Bryan Thomas, communications director for NHTSA, the opening of the investigation "should not be construed as a finding that the Office of Defects Investigation believes there is either a presence or absence of a defect in the subject vehicles."

What we do know is that this is the first known fatality to occur while a driver was using Autopilot, Tesla's optional highway-driving mode. At this point, Autopilot is essentially an advanced cruise control that can steer, detect obstacles, change lanes, and brake when needed.

Among all the talk of driverless, self-driving, and autonomous vehicles, it's important to define what, exactly, Autopilot is. Currently, Tesla does not make self-driving cars, and Autopilot does not turn a Tesla into an autonomous vehicle. I recently tested Autopilot in a Model S and was reminded that using the system "doesn't abdicate driver responsibility," and that I was expected to be ready to take over control at any point during the drive.

We should keep the context in mind. To date, Autopilot users have driven more than 130 million miles. And, according to a 2015 report by the US National Safety Council, the estimated annual mileage death rate is 1.3 deaths per 100 million vehicle miles traveled.
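Those two figures invite a quick back-of-envelope comparison. The sketch below assumes one known Autopilot fatality over the reported 130 million Autopilot miles and treats the National Safety Council's 1.3 deaths per 100 million miles as the baseline; it is an illustration of the rough context Tesla cited, not a rigorous comparison.

```python
# Back-of-envelope comparison of fatality rates (illustrative only).
# Assumes 1 known fatality over the ~130 million miles Tesla says
# have been driven on Autopilot, versus the National Safety Council's
# 2015 estimate of 1.3 deaths per 100 million vehicle miles traveled.

autopilot_miles = 130_000_000
autopilot_fatalities = 1
nsc_rate_per_100m = 1.3  # deaths per 100 million vehicle miles, all US driving

autopilot_rate_per_100m = autopilot_fatalities / autopilot_miles * 100_000_000

print(f"Autopilot:  {autopilot_rate_per_100m:.2f} deaths per 100M miles")  # ~0.77
print(f"US average: {nsc_rate_per_100m:.2f} deaths per 100M miles")

# Caveat: Autopilot miles skew toward highway driving, which is already
# safer per mile than the overall average, so the two rates are not
# directly comparable.
```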

According to Bryant Walker Smith, professor at the University of South Carolina, and an expert on the legal aspects of self-driving vehicles, "The big picture is this: Roughly 100 other Americans died in crashes the same day as this Tesla driver, largely without national or even local notice."

"How many other crashes have Tesla drivers prevented through their own vigilance?" Smith asked. "Against the backdrop of a tragic status quo of carnage on our roads, no one is quite sure how to strike the right balance between caution and aggression."

Tesla's response to the accident and approach to autonomous driving

According to Tesla's blog post, the circumstances around the accident were "extremely rare," and involved the Model S hitting the bottom of the trailer, rather than the front or rear of the trailer. If it had hit the front or rear, "its advanced crash safety system would likely have prevented serious injury as it has in numerous other similar incidents."

In other words, Autopilot is intended to increase safety for drivers—but it's important to keep the limitations of the system in mind.

SEE: Tesla speaks: How we will overcome the obstacles to driverless vehicles

"We constantly build updates to our software and specifically Autopilot features to continue to improve the feature and driver experience," a Tesla spokesperson said. "Autopilot is designed to add a layer of safety to Model S and to ensure this, it's important that customers use Autopilot responsibly and as intended."

Tesla said it will not "release any features or capabilities that have not been robustly validated in-house" and that it "will continue to leverage the enormous amount of insight this process provides to advance the state of the art in advanced driver assistance systems."

Response from other Tesla Autopilot users

How are Tesla Autopilot drivers responding? "It doesn't worry me. When I use the Autosteer, I always keep a hand on the wheel," said Tamara Fusco of Nixa, Mo. "In fact, any time I put on Autosteer, it flashes a warning to keep my hands on the wheel. When the Tesla senses the car ahead of me, it automatically slows down to maintain that distance, even when I am following a semi, so I am curious what happened in the tragedy."

Evan Fusco, husband of Tamara Fusco and an administrator for Tesla's online forum, agreed that the incident won't change his comfort level with driving in Autopilot mode. "I always keep my hands on the wheel and keep a high degree of road awareness, in part due to the known gaps in the technology," said Fusco. "There are not enough details to make any judgment about what role, if any, Autopilot played. I think speculation on that would be premature."

SEE: Autonomous driving levels 0 to 5: Understanding the differences (TechRepublic)

According to Autopilot-user Daniel Nasserian, the accident "doesn't concern me at all. The AP feature is still in its infancy, and I think the media ran with the story because of how new the concept is. The amount of miles driven without a fatality on Autopilot still remains impressive compared to the human error factor."

"The tech isn't perfect, and won't be for a while, which is why people need to realize that this is a beta and there should still be attention on the road," said Nasserian. "I just hope that because Tesla is tied to the tech right now, people don't start going after the company, since it will hinder progress, and it seems it was used incorrectly in this case."

Anticipating the accident

While the accident is a wake-up call for the autonomous vehicle community, the fact that an autonomous system may have failed is not a complete surprise. Many experts in the field had anticipated that something like this would happen at some point, as John Dolan, principal systems scientist in the Robotics Institute at Carnegie Mellon University, told TechRepublic just a day before news of the accident broke.

Smith previously told TechRepublic that the first autonomous driving crash "will be a big deal."

"I've heard from others that the tolerance for an accident in a driverless car will be much lower than it would be for a human," said Smith. "There's this kind of mental barrier. What do you think will happen? Who will be to blame and do you see any kind of major issues when, all of a sudden, these get out on the road and then there is an accident?"

Even a Tesla spokesperson recently worried about what would happen when the first Autopilot crash occurred. The spokesperson told TechRepublic, "The potential for an autonomous Three Mile Island equivalent, and for what that did to the nuclear industry, is high. Should an event occur that receives major publicity, it's actually quite likely that that could throw a large bucket of cold water on progress made towards legalizing and increasing adoption of autonomous vehicles."

Social hurdles to driverless vehicle technology

The issue of trust will be central to the adoption of autonomous driving technology. Ted Koslowski, former vice president and lead automotive analyst at Gartner, said it will involve "rethinking the entire car culture we've built over the past century."

Jeffrey Miller, IEEE member and associate professor of engineering at the University of Southern California, agreed. "The core challenge," he said, "is determining whether the relevant technologies have reached a demonstrated level of socially acceptable risk."

SEE: How driverless cars will transform auto insurance and shift burden onto AI and software (TechRepublic)

But what is a socially acceptable level of risk? In large part, it depends on our expectations of driverless cars. "We have to accept and expect that these cars aren't flawless," said Koslowski.

"This is the first robot that we will experience on a day-to-day basis," he said. "Loss of control and trust is what a lot of people have problems with. All of a sudden we're delegating a task that consumers have been doing for a century to a robot."

But while many mainstream drivers may worry about autonomous driving technology, the half-dozen Autopilot users I spoke with still emphasized a feeling of excitement and overall safety with the system. While "taking your hands off the wheel is just something that we are taught not to do in all our years of driving," said Evan Fusco, "it is really exciting to see it actually work and work under conditions that I wasn't really sure how it would handle, tight curves and things like that."

What the crash means for the future of autonomous driving research

"This unfortunate accident confirms what many have pointed out: although autonomous driving on highways works well under normal circumstances, there are unexpected events and conditions that are difficult to handle," said Dolan. "Tesla may need to reconsider its Autopilot feature by, for example, providing stronger training or certification for drivers who use it or halting its use while the function is further refined."

Dolan also raised the issue of testing autonomous technology. "Traditional automakers are far more cautious in answering this question than Tesla, with its more freewheeling Silicon Valley outlook," he said. It is a question that the NHTSA plans to address this summer, when it will issue guidelines for autonomous vehicle research, Thomas told TechRepublic.

The accident also strengthens the argument that the US government should take Tesla up on its offer to examine data from Autopilot-driven miles.

Miller recently said he "can't see Tesla continuing to get away with what they've done. It's incredible that they've said 'Okay, if you're not going to provide regulations, we'll keep updating our technology.'"

"I don't think the government will sit back and allow them to keep adding these self-driving features," he said.

"Although this accident is the type of thing those of us working in the field have most feared, I don't think it has to have a chilling effect," said Dolan. "The history of automotive technology has involved steadily increasing safety by learning from sometimes hard experience and adapting appropriately, and there's no reason that can't be true in this new phase of self-driving cars."

Smith emphasized the importance of companies and regulators having a grasp on the new technology. "They need to be able to be prepared to deal with these crashes. There are significant liability concerns for individual developers," said Smith.

"It's important to properly manage public expectations about these technologies, and to emphasize that they will not be perfect," said Smith, "but by the time they're deployed, the credible hope is that they will be significantly safer than human-driven vehicles."

While Autopilot may have failed Brown in early May, it was also a system that had once come to his rescue.

On April 6, almost exactly a month before his fatal crash, Brown posted a video on YouTube, filmed with a camera inside his car. The video, as Brown described it, showed how he "became aware of the danger when Tessy alerted me with the 'immediately take over' warning chime, and the car swerving to the right to avoid the side collision."

"You can see where I took over when there's a little bit of blip in the steering," wrote Brown. "Tessy had already moved to the right to avoid the collision."

Without Autopilot, that near-accident in April might have been fatal.

In the end, we should remember what Autopilot is, and what it is not. It is not driverless, fully autonomous, or self-driving. At this point, it is still closer to a smarter version of cruise control, but its ultimate goal, as stated by Elon Musk, is to prevent accidents and fatalities on the road.

