A TESLA MODEL S driver using the car’s semi-autonomous Autopilot feature died when the car hit an 18-wheeler, the first known fatality involving technology that remains in beta testing.
The collision occurred May 7 when the big rig made a left turn in front of the Model S at an intersection on a divided highway in Williston, Florida. “Neither Autopilot nor the driver noticed the white side of the tractor trailer against a brightly lit sky, so the brake was not applied” and the car drove under the trailer, the automaker said today.
The National Highway Traffic Safety Administration sent a Special Crash Investigations Team to examine the vehicle and the crash scene. Experts from the agency’s Office of Defects Investigation plan to examine the design and performance of the Autopilot system. The agency said in a statement that opening an investigation “should not be construed as a finding that” the agency “believes there is either a presence or absence of a defect in the subject vehicles.”
The Silicon Valley automaker points out that its Autopilot is disabled by default, and drivers can activate it only after acknowledging that the technology is still in beta testing. Drivers are instructed to keep their hands on the steering wheel at all times and be ready to assume complete control at any moment.
“We do this to ensure that every time the feature is used, it is used as safely as possible,” Tesla says. And the automaker argues that even though it’s not perfect, “the data is unequivocal that Autopilot reduces driver workload and results in a statistically significant improvement in safety.”
Still, something like this was all but inevitable. Tesla activated the feature via an over-the-air software update on October 15, and within days people were posting videos of themselves doing all kinds of stupid things, including sitting in the back seat and even sleeping. It wasn’t long before three people drove a Model S cross-country in less than 58 hours, using Autopilot to barrel along at up to 90 mph. The crash raises the long-anticipated liability question of autonomous driving: Who is at fault when someone dies? Tesla may describe the technology as a safety system designed to augment a driver’s vigilance, but many drivers appear to consider it an autonomous system capable of taking over entirely.
Although there aren’t any laws against cars driving themselves, regulations governing autonomous operation remain far from clear. “Companies can get away with a lot that’s in a legal gray area, as long as bad things don’t happen,” Bryant Walker Smith, an expert on the technology at the University of South Carolina School of Law, said last year. But regulators step in when something goes awry.
The fatality also underscores why most automakers are moving far more cautiously in rolling out semi-autonomous systems. Cadillac, for example, announced in January that it is delaying the debut of its Super Cruise feature. “Technical development will only proceed to production when it is well and truly ready,” company spokesman David Caldwell said at the time. “We won’t release it just to hit a date, nor will we ‘beta test’ with customers.” His comment was a blatant dig at Tesla Motors and, in hindsight, a prescient warning of what could go wrong.