Tesla Autopilot Faces U.S. Inquiry After Series of Crashes

Source: The New York Times

The federal government’s top auto-safety agency has opened a formal investigation of Tesla’s Autopilot driver-assistance system because of growing concerns that it can fail to see parked emergency vehicles.

The National Highway Traffic Safety Administration said it was aware of 11 crashes since 2018 in which Tesla vehicles operating under Autopilot control had hit fire trucks, police cars and other vehicles with flashing lights that were stopped along roadways. Seven of those crashes have resulted in a total of 17 injuries and one death.

“Most incidents took place after dark and the crash scenes encountered included scene control measures such as first-responder vehicle lights, flares, an illuminated arrow board, and road cones,” the safety agency said in a summary of the investigation.

The new investigation appears to be the broadest look yet at how Autopilot works and how it might be flawed. It could ultimately be used by the safety agency to force Tesla to recall cars and make changes to the system.

One critical issue that investigators will focus on is how Autopilot ensures that Tesla drivers are paying attention to the road. The company’s owner’s manuals instruct drivers to keep their hands on the steering wheel, but the system continues operating even if drivers only occasionally tap the wheel.

General Motors has a similar system, called Super Cruise, that allows drivers to take their hands off the steering wheel but uses an infrared camera to monitor drivers’ eyes to ensure that they are looking at the road.

The safety agency will also examine how Autopilot identifies objects on the road and where Autopilot can be turned on. Tesla tells drivers to use the system only on divided highways, but the system can also be activated on city streets. G.M.’s system uses GPS positioning to restrict its use to major highways that do not have oncoming or cross traffic, intersections, pedestrians or cyclists.

Tesla’s Autopilot system appears to have difficulty detecting and braking for parked cars generally, including private cars and trucks without flashing lights. In July, for example, a Tesla crashed into a parked sport-utility vehicle at the site of an earlier accident. The driver, who had Autopilot engaged, had fallen asleep and later failed a sobriety test, the California Highway Patrol said.

The safety agency’s investigation will look at the Tesla Models Y, X, S and 3 from the 2014 to 2021 model years, totaling 765,000 cars, a large majority of the cars the company has made in the United States over that time.

The agency already has opened investigations into more than two dozen crashes that involved Tesla cars and Autopilot. The agency has said eight of those crashes resulted in a total of 10 fatalities. Those investigations are meant to delve into the details of individual cases to provide data and insights that the agency and automakers can use to improve safety or identify problem areas.

Tesla and its chief executive, Elon Musk, have dismissed safety concerns about Autopilot and claimed that the system made its cars safer than others on the road. But the company has acknowledged that the system can sometimes fail to recognize stopped emergency vehicles.

Safety experts, videos posted on social media and Tesla drivers themselves have documented some of the weaknesses of Autopilot. In some accidents involving the system, drivers of Teslas have been found asleep at the wheel or were awake but distracted or disengaged. A California man was arrested in May after leaving the driver’s seat of his Tesla while it was on Autopilot; he was sitting in the back of his car as it crossed the Bay Bridge, which connects San Francisco and Oakland.

The National Transportation Safety Board, which has investigated a couple of accidents involving Autopilot, said last year that the company’s “ineffective monitoring of driver engagement” contributed to a 2018 crash that killed Wei Huang, the driver of a Model X that hit a highway barrier in Mountain View, Calif. “It’s time to stop enabling drivers in any partially automated vehicle to pretend that they have driverless cars,” Robert L. Sumwalt, the board’s chairman, said last year.
