Tesla’s Autopilot Technology Faces Fresh Scrutiny

Tesla faced many questions about its Autopilot technology after a Florida driver was killed in 2016, when the system of sensors and cameras failed to see and brake for a tractor-trailer crossing a road.

Now the company is facing more scrutiny than it has in the past five years over Autopilot, which Tesla and its chief executive, Elon Musk, have long maintained makes its cars safer than other vehicles. Federal officials are looking into a series of recent accidents involving Teslas that either were using Autopilot or might have been using it.

The National Highway Traffic Safety Administration confirmed last week that it was investigating 23 such crashes. In one accident the same month, a Tesla Model Y crashed into a police car that had stopped on a highway near Lansing, Mich. The driver, who was not seriously injured, had been using Autopilot, the police said.

In Detroit in February, under circumstances similar to the 2016 Florida accident, a Tesla drove under a tractor-trailer that was crossing the road, tearing the roof off the car. The driver and a passenger were seriously injured. Authorities have not said whether the driver had turned on Autopilot.

NHTSA is also looking into a Feb. 27 crash near Houston in which a Tesla ran into a stopped police vehicle on a highway. It is not clear whether the driver was using Autopilot. The car did not slow down before the impact, the police said.

Autopilot is a computerized system that uses radar and cameras to detect other vehicles and objects in the road. It can steer, brake and accelerate automatically with little input from the driver. Tesla has said Autopilot should be used only on divided highways, but videos on social media show drivers using it on a wide variety of roads.

“We need to wait for the results of the investigations, but these incidents are the latest examples showing that the advanced cruise-control features Tesla offers are not very good at detecting and then stopping for a vehicle that is stopped in a highway situation,” said Jason Levine, executive director of the Center for Auto Safety, a group formed in the 1970s by Consumers Union and Ralph Nader.

The renewed scrutiny comes at a critical time for Tesla. After reaching a record high this year, its share price has fallen nearly 20 percent amid signs that the company’s electric cars are losing market share to traditional automakers. Ford Motor’s Mustang Mach-E and the Volkswagen ID.4 have recently arrived in showrooms and are considered serious challengers to the Model Y.

The outcome of the current investigations is important not only for Tesla but also for other technology and auto companies that are working on autonomous cars. While Mr. Musk has often suggested that widespread use of these vehicles is imminent, Ford, General Motors and Waymo, a division of Google’s parent company, Alphabet, have said that moment could be years or even decades away.

Bryant Walker Smith, a professor at the University of South Carolina who has advised the federal government on automated driving, said it was important to develop advanced technologies to reduce traffic fatalities, which now number about 40,000 a year. But he said he had concerns about Autopilot, and about how the name and Tesla’s marketing suggest that drivers can safely turn their attention away from the road.

“There is an incredible disconnect between what the company and its founder are saying and letting people believe, and what their system is actually capable of,” he said.

Tesla, which has disbanded its public relations department and generally does not respond to inquiries from reporters, did not return phone calls or emails seeking comment. And Mr. Musk did not respond to questions sent to him on Twitter.

The company has not publicly addressed the recent accidents. Although it can determine whether Autopilot was on at the time of a crash because its cars constantly send data to the company, it has not said whether the system was in use.

The company has argued that its cars are very safe, claiming that its own data show that Teslas are involved in fewer accidents per mile driven, and fewer still when Autopilot is in use. It has also said that it tells drivers that they must pay full attention to the road when using Autopilot and should always be ready to retake control of their cars.

A federal investigation of the 2016 fatal crash in Florida found that Autopilot had failed to recognize a white semi-trailer against a bright sky, and that the driver was able to use it even when he was not on a highway. Autopilot kept the car traveling at 74 miles per hour even as the driver, Joshua Brown, ignored several warnings to keep his hands on the steering wheel.

A second fatal crash occurred in Florida in 2019 under similar circumstances: a Tesla crashed into a tractor-trailer while Autopilot was engaged. Investigators determined that the driver did not have his hands on the steering wheel before the impact.

While NHTSA has not forced Tesla to recall Autopilot, the National Transportation Safety Board concluded that the system “played a major role” in the 2016 Florida crash. It also said the technology lacked safeguards to prevent drivers from taking their hands off the steering wheel or looking away from the road. The safety board came to a similar conclusion when it investigated a 2018 crash in California.

By comparison, a similar GM system, Super Cruise, monitors a driver’s eyes and switches off if the person looks away from the road for more than a few seconds. That system can be used only on major highways.

In a Feb. 1 letter, the chairman of the National Transportation Safety Board, Robert Sumwalt, criticized NHTSA for not doing more to evaluate Autopilot and for not requiring Tesla to add safeguards that prevent drivers from misusing the system.

The new administration in Washington could take a firmer line on safety. The Trump administration did not seek to impose many regulations on autonomous vehicles and sought to ease other rules the auto industry opposed, including fuel-economy standards. President Biden, by contrast, has appointed an acting NHTSA administrator, Steven Cliff, who served on the California Air Resources Board, which often clashed with the Trump administration over regulations.

Concerns about Autopilot could dissuade some car buyers from paying for Tesla’s more advanced system, Full Self-Driving, which the company sells for $10,000. Many customers have paid for it in the expectation of being able to use it in the future; Tesla enabled the option on about 2,000 cars in a “beta,” or test, version starting at the end of last year, and Mr. Musk recently said the company would soon make it available to more cars. Full Self-Driving is supposed to be capable of guiding Tesla cars on city streets and local roads, where oncoming traffic, intersections, traffic lights, pedestrians and cyclists make driving more complicated.

Despite their names, Autopilot and Full Self-Driving have major limitations. Their software and sensors cannot control cars in many situations, which is why drivers have to keep their eyes on the road and their hands on or near the steering wheel.

In a November letter to the California Department of Motor Vehicles that was recently made public, a Tesla lawyer acknowledged that Full Self-Driving struggled to react to a wide range of driving conditions and should not be considered a fully autonomous driving system.

The system, Tesla’s associate general counsel, Eric C. Williams, wrote, is not able to recognize or respond to “certain circumstances and events,” including static objects and road debris, emergency vehicles, construction zones, large uncontrolled intersections with multiple incoming ways, adverse weather, and complicated or adversarial vehicles in the driving paths and on unmapped roads.

Mr. Levine of the Center for Auto Safety has complained to federal regulators that the names Autopilot and Full Self-Driving are misleading at best and could encourage some drivers to be careless.

“Autopilot suggests the car can drive itself and, more importantly, stop itself,” he said. “And then they doubled down with Full Self-Driving, which again leads consumers to believe the vehicle is capable of doing things it is not capable of doing.”

