Out Cold and Behind the Wheel in Semi-autonomous and Autonomous Cars


Strangely enough, none of the car manufacturers with semi-autonomous models had press images of people passed out behind the wheel. So instead we give you this well-coiffed guy pretending to be asleep. You get the idea. Jupiterimages/Getty Images

You're behind the wheel of your shiny, new BMW loaded with the full driver-assistance package. You're watching the scenery roll by, maybe having a little snack when — wham! — you're out.

Maybe it's a heart attack. Maybe it's the result of that wild night out. Maybe you're just really tired. Who knows? Whatever, you're out cold as your slick, new Bimmer barrels merrily down the highway.

Now what?

Well, first off, this scenario is not all that far-fetched. Not the self-driving part (you've heard of Tesla's Autopilot system, right?) and not the out-behind-the-wheel thing.

The Casualty Actuarial Society put together an Automated Vehicles Task Force that reported on just such cases back in 2014. (They would, of course, because they're actuaries, and assessing risk is what they do.)

Here's what the CAS said in that report:

Observing that 2% of accidents are caused when some sort of physical impairment, such as a heart attack, inhibits the driver's ability to effectively control his vehicle may suggest some important risk management controls. Depending on the trip and the automated vehicle's response, the technology could produce either a better or worse result for the car's passenger.

So yes, it happens. People have heart attacks, people are inattentive, people fall asleep, people suffer seizures while driving. More often than you'd think.

In Driver-assisted Cars

In conventional cars, of course, that's usually bad news for the driver and for anyone else nearby. Assuming nobody is able to control the car at all, and without automatic braking (which is available on many cars now), the vehicle will continue until something stops it. It's Newton's first law of motion.

But in a world with semi-autonomous cars already here, and truly autonomous cars coming up quickly, that unhappy ending doesn't have to happen. Although it still could.

In May, a 40-year-old man was killed when his Tesla Model S, with Autopilot engaged, slammed into a tractor-trailer that was crossing its path. It was tragic, but it also demonstrated how safe even semi-autonomous cars can be. It was the first fatality in a Tesla in Autopilot mode in more than 130 million miles (209 million kilometers) of driving. The company elaborated, "Among all vehicles in the US, there is a fatality every 94 million miles. Worldwide, there is a fatality approximately every 60 million miles."
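To put those figures side by side, here's a back-of-the-envelope sketch in Python. The mileage numbers are the ones quoted above; "rate" here just means fatalities per 100 million miles, a unit invented for the comparison.

```python
# Rough comparison of the fatality figures quoted above,
# expressed as deaths per 100 million miles driven.
figures = {
    "Tesla on Autopilot (one fatality)": 130,  # million miles per fatality
    "All U.S. vehicles": 94,
    "Worldwide": 60,
}

for label, miles_per_fatality in figures.items():
    rate = 100 / miles_per_fatality  # fatalities per 100 million miles
    print(f"{label}: about {rate:.2f} fatalities per 100 million miles")
```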

No one knows quite what went wrong in Florida. The accident is being investigated by two different federal agencies. The speculation is that the sensors on the Tesla didn't pick up the side of the truck against the bright sky. Witnesses reported the car never braked, continuing for hundreds of feet after the collision.

Tesla warns its owners that they must stay alert when the car is in Autopilot, with their hands on the wheel. Drivers don't always do that. It's still unclear what was happening just before the Florida wreck.

Many cars are, by definition, already semi-autonomous. They sport mind-blowing features like automatic emergency braking, adaptive cruise control, blind-spot warnings, rear cross-traffic alert, lane-departure warning and lane-keeping assist. Tesla's Autosteer feature holds the car's position on the road and lets the driver go hands-free (although the company doesn't recommend that).

If the Tesla's sensors in the Florida accident had recognized the truck (and, again, they may have; investigations continue), Autopilot could have saved the day even if the driver happened to be incapacitated. There's no evidence that he was, to be clear. With Autopilot engaged, the Tesla is designed to brake automatically to avoid a collision, sound a warning inside the cabin and, if no input from the driver is received, activate its hazard lights and continue to a safe stop.
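As a rough illustration of that kind of fallback logic, here's a minimal Python sketch. It is not Tesla's actual implementation; the `car` methods, warning count and timing are all invented for the example.

```python
import time

WARNING_INTERVAL_S = 5   # how often to nag the driver (assumed value)
MAX_WARNINGS = 3         # warnings before the car gives up on the driver

def handle_unresponsive_driver(car):
    """Hypothetical fallback: warn the driver, then pull to a safe stop.

    `car` is assumed to expose simple methods like sound_warning(),
    driver_responded(), activate_hazards() and slow_to_stop(); they stand
    in for whatever a real driver-assistance stack actually provides.
    """
    for _ in range(MAX_WARNINGS):
        car.sound_warning()              # chime / dashboard alert
        time.sleep(WARNING_INTERVAL_S)
        if car.driver_responded():       # hands back on the wheel?
            return                       # driver is back in control
    # No response after repeated warnings: assume the driver is incapacitated.
    car.activate_hazards()
    car.slow_to_stop()                   # come to a controlled stop
```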

In the Car of the Future

Completely autonomous cars, when they get here, will do the same. Even more.

It's been tricky, so far, getting truly autonomous cars on the road. The reason is simple.

"The developers have yet to design or at least demonstrate a system that can achieve a socially acceptable level of risk across a wide range of driving conditions," says Bryant Walker Smith, an automated driving expert.

But developers are working on it. Smith, a law professor at the University of South Carolina and an affiliate scholar at the Center for Internet and Society at Stanford Law School, says that companies are constantly testing cars in literally thousands of different scenarios. Google has driverless cars in several cities with varying terrain and weather. Tesla is gathering data from Autopilot users across the world and has no plans to stop.

Smith — whose research is on newlypossible.org — envisions a time when autonomous cars will make the roadways safer (at least 30,000 Americans die on roadways every year). Eventually, an autonomous car may be able to save its out-cold driver, too.

"It may be monitoring your vital signs. It may recognize you have a health problem. It may change course and start driving to the nearest hospital, pull up into the emergency bay and already have transmitted all your vital information to the emergency responders," Smith says. "It may even do that before you have a heart attack, potentially. At what point is your car actually a medical device? That may be sooner than we think."