From The New York Times, an unobtainable article by Neal Boudette about Tesla:
Tesla's Self-Driving system cleared in crash
The NHTSA found that, while Tesla's Autopilot feature did not prevent a crash in Florida, the system performed as intended.
Rico says that, if it were easy, anybody could do it. (Rico apologizes for the stingy attitude of The Times, and thus the lack of their article; you can read about it instead in this article by Louis Hansen of The San Jose Mercury News.)
Federal regulators said they have closed an investigation of a fatal Tesla Autopilot crash, finding no defects and declining to issue a recall notice.
The six-month investigation by the National Highway Traffic Safety Administration into the nation’s first self-driving vehicle fatality found no problems with the design or performance of Autopilot. The system, based on radar, camera and machine learning technology, allows Tesla vehicles to sense potential crashes, stay within lanes, and adjust speeds automatically.
The probe did not uncover a “safety-related defect trend,” the report said, adding that “further examination of this issue does not appear to be warranted.” But the report said that drivers do need to pay better attention when using self-driving technology.
Tesla, based in Palo Alto, California, said in a brief statement that it appreciated the thoroughness of the investigation. “The safety of our customers comes first,” the company said.
Tesla CEO Elon Musk highlighted one finding of the report: crash rates in Teslas dropped about forty percent after the system was installed.
The federal agency specifically looked at the design and performance of Tesla’s automatic emergency braking system, the interface between the driver and the vehicle, data from other Tesla crashes, and changes the company has made to Autopilot.
NHTSA spokesman Bryan Thomas said the agency favored the changes Tesla made to Autopilot after the crash, including more aggressive warnings when drivers take their hands off the wheel and disengaging the system when drivers repeatedly ignore those warnings. “It certainly addressed the issues we were evaluating,” Thomas said.
On 7 May 2016, a Tesla owner in a Model S was driving using Autopilot on a divided highway near Williston, Florida. A tractor-trailer made a legal left-hand turn across the highway in front of the Tesla. Neither the driver nor Autopilot braked, and the car crashed broadside into the truck. Josh Brown, a forty-year-old Navy veteran and entrepreneur from Ohio, was killed.
The safety administration examined data from multiple Tesla crashes where the airbags deployed. It also evaluated the performance of automatic emergency braking, which is designed to stop or slow a vehicle before impact. The agency found that Tesla’s emergency braking worked as promised, although, like all emergency braking systems on the market, its primary function is to limit rear-end collisions. It was not designed to brake for vehicles crossing in front of a Tesla, like the tractor-trailer in Florida, the report said.
The report noted that Tesla instructs drivers to “always keep your hands on the wheel” while using Autopilot, but suggested that the company could be more specific about the system’s limitations in its owner’s manual.
The crash led Tesla to re-examine and overhaul several parts of the driver assistance package. In September of 2016, Musk announced a new version of Autopilot, adjusting the software to make the system more dependent on radar sensors. New Teslas rolling out of the factory also have greater computer processing power to handle data from their suite of sensors. The company has been sending out software upgrades this month.
Karl Brauer, publisher of Kelley Blue Book and Autotrader, said the crash is an example of a driver putting too much faith in Tesla’s sensors and computing power. “It almost certainly will not be the last incident on this journey, but people need to remember one thing: there are currently no fully autonomous cars available for public purchase or use,” Brauer said.
The findings did not free Tesla from criticism. Consumer Watchdog, an advocacy group critical of Tesla’s marketing of Autopilot, said in a statement that Musk should be held responsible.
“The NHTSA has wrongly accepted Tesla’s line and blamed the human, rather than the technology and Tesla’s aggressive marketing,” the group said. “The very name Autopilot creates the impression that a Tesla can drive itself. It can’t.” The agency acknowledged criticisms about the system’s name, but said marketing was outside the scope of its investigation.
Rico says he's still hoping his father gets one...