Some of you may have seen the recent video of a Tesla driver asleep at the wheel, going about 75 MPH down the highway. He wasn’t just resting his eyes; he was in hibernation mode. Done for the day. Lights out. You know, that kind of deep sleep where you have no idea where you are when you wake up. This man was utterly unconscious while zipping past minivans packed with kids, snarling semi-trucks, and road ragers.
Here’s another one for you.
Last year, California Highway Patrol spotted a guy asleep in his Tesla going 70 MPH down Highway 101 in Palo Alto at 3:30 in the morning. According to WIRED:
"[They] moved behind the car and turned on their siren and lights. When the driver didn't respond, the cops went beyond their standard playbook. Figuring the Tesla might be using Autopilot, they called for backup to slow traffic behind them, then pulled in front of the car and gradually started braking. And so the Tesla slowed down, too, until it was stopped in its lane."
As you would imagine, the driver was under the influence and was immediately arrested. Tesla CEO Elon Musk responded to a tweet asking how the Autopilot feature should behave if the driver is unconscious. It's an ingenious concept with obvious flaws.
Now, I do agree that self-driving cars have wonderful benefits. In general, they can lead to fewer accidents caused by driver error, and fewer accidents mean fewer injuries and deaths. I am all for safer roads, and I hope the technology gets to that point.
But when you completely disengage, to the point where you feel comfortable checking out for two hours, you put yourself and your fellow drivers at serious risk. Tesla has clearly warned its drivers of the responsibility they have behind the wheel, stating that its Autopilot feature is "intended for use with a fully attentive driver, who has their hands on the wheel and is prepared to take over at any time." Plain and simple.
But these cars are cool. They’re the future, and we want them to perform like futuristic automobiles. David Zuby, Chief Research Officer at the Insurance Institute for Highway Safety, elaborates:
"If [car designers] limit functionality to keep drivers engaged, they risk a backlash that the systems are too rudimentary. If the systems seem too capable, then drivers may not give them the attention required to use them safely."
So basically, if consumers feel like their self-driving car can't do it all, it becomes an underwhelming purchase. On the other hand, if they feel invincible behind those sleek controls, they'll hand over all power and authority… and go to sleep.
Autonomous systems from both Tesla and Uber have been involved in deadly crashes. A Tesla driver was killed last year while driving in Autopilot mode when the car sped up and steered into a concrete barrier. According to the Guardian, Tesla "has pointed to its manual which warns that the technology cannot detect all objects" and that "drivers should remain attentive." Tesla also noted that the driver "had received multiple warnings to put his hands on the wheel" and said he "did not intervene during the five seconds before the car hit the divider."
So it seems, for the most part, that the car isn't the issue. These vehicles are intelligent, meticulously programmed to make the kinds of decisions a human driver would make on the road. We are still the problem. Just because our cars are more intelligent doesn't mean we should act less intelligently.
When you are inside a two-ton metal object traveling 80 MPH on the same concrete slab as hundreds of other two-ton metal objects, it is common sense to stay alert at all times.
As a fellow driver, and as someone who helps auto accident victims every day, I'm asking that we all do better behind the wheel. No texting. No multitasking. No eating three-course meals. No dozing off.
I think we can manage that.