Tesla is yet again undergoing scrutiny from federal regulators in the United States. The issue at hand now is whether the automotive company did enough in response to the 2023 recalls by the U.S. National Highway Traffic Safety Administration (NHTSA) and Transport Canada. The recalls were issued after several collisions linked to the use of Tesla’s proprietary Autopilot system.
Tesla’s Autopilot—alongside systems like Cadillac’s Super Cruise and Volvo’s Pilot Assist—falls into the category of partially automated systems. These systems can take control of the vehicle’s acceleration and lane position; however, they require the human driver to stay vigilant.
The recall issued on Dec. 12, 2023, affected over two million Tesla vehicles in North America. At the core of the recall were some of Autopilot’s features, which, according to regulators, could have contributed to drivers misusing the system or failing to maintain continuous and sustained responsibility for vehicle operations.
The Autopilot recall is not the most recent in Tesla’s history. In April, regulators issued a recall of nearly 4,000 Cybertrucks in the U.S. because the design of the accelerator pedal can cause it to become stuck when pressed, increasing the risk of collisions.
Recalls and investigations
On April 25, U.S. regulators announced that they had opened an investigation to assess the outcomes of the 2023 recall. In particular, the Autopilot system had reportedly been involved in 20 crashes since the recall, which prompted the current investigation by the NHTSA’s Office of Defects Investigation.
On the same day, the NHTSA reported opening another investigation, this time into the safety of BlueCruise, Ford’s partially automated system. In the report, the NHTSA writes that it will investigate two fatal incidents involving Ford Mustang Mach-E vehicles that occurred while the BlueCruise system was active.
Expected accidents
Although the onslaught of safety investigations by U.S. and Canadian regulators may seem sudden, experts have been expecting these accidents to occur for years. For decades, research in human-machine interaction has indicated that there are inherent risks to having humans and robots share control of automated systems.
Unlike fully automated systems in everyday use—like elevators, whose safety has long been proven—current partially automated driving systems cannot operate on their own. Because they cannot handle the vast array of road, weather and traffic conditions that humans encounter daily, these systems require human drivers to stay alert at all times, a task drivers may not be very good at.
This, combined with many of these systems’ inability to accurately detect conditions like distraction and drowsiness, has contributed to the upward trend in collisions involving partially automated systems.
Protecting the public
In addition to the above actions by federal regulators in the U.S. and Canada, provincial and state governments also have a role to play. They need to promote road safety and ensure that the general public does not serve as guinea pigs in the current autonomous vehicle experiment.
Earlier in April, British Columbia prohibited the operation of highly automated systems on provincial roads. This ban only impacts the use of a limited set of vehicles with more advanced capabilities; however, it represents an important step towards regulating technology and ensuring that vehicles function safely.
This article is republished from The Conversation under a Creative Commons license. Read the original article.