
It's not just semantics: Why Tesla should stop using the term 'Autopilot'

It's easy for professionals to use technical terms, but considering the end user and choosing words that are readily understandable can make a big difference.

Image: iStock/Taina Sohlman

The latest Tesla kerfuffle surrounds its Autopilot technology: the driver of a Tesla vehicle in Autopilot mode was recently killed in a crash with a tractor trailer, prompting debate about technology-driven driving aids and whether the Tesla system should be banned or disabled.


While I applaud Tesla for advancing these technologies, the choice of the name Autopilot was a poor one. But it's a choice many of us in technology frequently repeat and should avoid.

In seeking to explain a complex technology, we often choose a moniker that's technically correct but confusing to most of the population. Over the course of the current debate around its readiness for public use, Tesla has insisted that drivers must pay attention to the road while using the feature. Tesla has drawn an analogy to aircraft autopilot systems, which are designed as "workload management systems." Rather than absolving pilots of responsibility for their aircraft, autopilot generally manages a subset of flying tasks, like maintaining airspeed and altitude, freeing the pilot to monitor other aircraft systems.

This is a technically correct analogy to Tesla's system, whereby the car can manage some basic driving functions like maintaining speed and keeping the vehicle in a marked lane, but the driver ultimately must pay attention to the road and react to changing conditions, particularly unexpected or emergency situations that the current sensor and software suite is incapable of managing. In the fatal accident that triggered the debate, Tesla's Autopilot failed to recognize a large truck crossing the path of the vehicle, a situation easily recognizable by a human but challenging to the Tesla due to various environmental conditions.

SEE: Tesla's Autopilot: The smart person's guide

The semantic problem is that the vast majority of the general public are neither licensed pilots nor familiar with the nuances of aircraft autopilot or workload management. The closest most of us come to understanding autopilot systems is a movie or two in which autopilot lets pilots take a nap or eat lunch in some contrived scenario. More likely still, lacking knowledge of the technical nuances, we take "automatic pilot" at face value and assume it's a device that will automatically pilot an airplane or vehicle, allowing the driver to check their phone, take a nap, or watch a movie, as myriad online videos show Tesla owners doing.

Avoiding your own Autopilot

As technologists, we find it easy to gloss over semantics. We're usually neck-deep in technology and assume that others immediately grasp what we're requesting when we ask them to "ping" us, or we look down on the poor rube who pronounces ERP like famous lawman Wyatt's last name rather than spelling it out, or who refers to "sap," as in what comes from a tree, instead of the haughty "S-A-P" that the cognoscenti prefer.

Worse yet is when we fail to remember that enterprise technologies are often linked to a human-driven business process. In one incident I've seen, a new system that "controlled and automated" a logistics process was painstakingly explained in a training class; weeks later, that plant was no longer manually moving pallets around, the workers having assumed that "automated" meant the system did the work rather than merely printing goods movement slips that required human action.

Consider the end user

Many of these semantic problems could be avoided by simply considering the users of the technology you're trying to explain. Rather than assuming they'll understand the nuance between a technically correct name like Autopilot and a Level 4 autonomous driving aid, use terminology and descriptions that are readily understandable. In Tesla's case, perhaps something like "driving assistant" or "next-generation cruise control" would have been clearer.

In the case of enterprise systems, put yourself in the shoes of the executive or overworked shop employee who has to quickly understand the technology you're presenting and rapidly grasp how it affects him or her, all on top of the dozens of other demands and changes they're subjected to on a regular basis. While it might seem cool to apply fancy technology terms or go for the technically accurate description rather than the readily understandable one, ultimately you're increasing risk in the long term. The consequences of opaque semantics in a software deployment may not be as dire as being hauled before the U.S. Congress like Tesla, but striving for clarity and focusing on your audience is a simple, cost-free way to avoid unnecessary misunderstandings and the risks they create.

Also see:
Tesla's fatal Autopilot accident: Why the New York Times got it wrong
Tesla driver dies in first fatality with Autopilot: What it means for the future of driverless cars
10 human factors that are critical to your app testing
Everyone's an architect. Here's who actually deserves that title.

About Patrick Gray

Patrick Gray works for a global Fortune 500 consulting and IT services company and is the author of Breakthrough IT: Supercharging Organizational Value through Technology as well as the companion e-book The Breakthrough CIO's Companion. He has spent ...

