Tesla's driver-assistance system has been involved in 736 U.S. crashes over the past four years, with at least 17 deaths

On June 12, it was reported that, after analyzing data collected by the National Highway Traffic Safety Administration (NHTSA), the media found that there have been 736 crashes in the United States involving Tesla's driver-assistance system since 2019, far more than previously reported. At least 17 people were killed in these crashes. The surge in such crashes indicates that use of Tesla's driver-assistance system is becoming increasingly widespread, and that the dangers it poses are growing with it.

The following is the translated text:

A police report stated that one afternoon in March this year, 17-year-old Tillman Mitchell was stepping off a school bus whose stop sign was extended and whose red warning lights were flashing. At that moment, a Tesla Model Y was approaching on North Carolina Highway 561.

The car's driver-assistance system, Autopilot, was reportedly engaged at the time, but the car showed no sign of slowing down. The Model Y struck Mitchell at more than 70 kilometers per hour. According to his aunt, Dorothy Lynch, Mitchell first hit the windshield, then was thrown into the air, and finally landed face down on the road. Mitchell's father heard the crash and rushed off the porch to find his son lying in the middle of the road. "If he had been a younger child," Lynch said, "he would already be dead."

The media's analysis of data collected by NHTSA shows that the crash, which occurred in Halifax County, North Carolina, is just one of 736 crashes in the United States since 2019 involving Tesla's Autopilot, far more than previously reported. The data show that the number of such crashes has surged over the past four years, indicating that as Tesla's driver-assistance technology is used more widely and more cars equipped with it appear on American roads, the dangers it poses are increasing as well.

The data also show a significant increase in deaths and serious injuries linked to Autopilot. When NHTSA first released partial statistics on Autopilot-related crashes in June 2022, it counted only three fatal crashes definitively tied to the technology. The latest data include at least 17 fatal crashes, 11 of which have occurred since May of last year, as well as five serious injuries.

Mitchell survived the March crash, but he suffered a fractured neck and a broken leg and had to be placed on a ventilator. He still has memory problems and difficulty walking. His aunt Lynch said the incident should serve as a warning about the dangers of this technology. "I pray that this is a learning process," Lynch said. "When it comes to machines, people trust them too easily."

Tesla CEO Elon Musk has said that cars with Autopilot engaged are safer than cars driven by humans alone, citing crash rates under the two driving modes. He has pushed Tesla to develop and deploy more driver-assistance features capable of handling parked school buses, fire trucks, stop signs, and pedestrians, and has even argued that the technology will usher in a future that is safer and nearly free of car accidents. Although it is impossible to say how many crashes have been avoided, the data show that the technology, which is being tested in real time on American highways, has clear flaws.

The media's analysis found that the 17 fatal crashes involving Tesla's Autopilot show distinct patterns: four involved collisions with motorcycles, and another involved a collision with an emergency vehicle. Meanwhile, several experts have said that some of Musk's decisions appear to be among the reasons for the rise in accident reports, such as broadly expanding the availability of the driver-assistance features and removing radar sensors from vehicles.

Tesla and Musk did not respond to requests for comment.

NHTSA stated that a crash report involving a driver-assistance system does not necessarily mean the technology caused the accident. "NHTSA is actively investigating Tesla Autopilot, including its upgraded version, FSD," said agency spokesperson Veronica Morales. "NHTSA reminds the public that all advanced driver-assistance systems require the human driver to be in control and fully engaged in the driving task at all times. Accordingly, the laws of every U.S. state hold the human driver responsible for the operation of the vehicle."

Musk has repeatedly defended his decision to push driver-assistance technology to Tesla owners, arguing that its benefits outweigh its harms. "Once you believe that adding autonomy reduces injury and death, I think you have a moral obligation to deploy it, even though you're going to get sued and blamed by a lot of people," he said last year.

Missy Cummings, a former senior safety adviser to NHTSA and a professor in George Mason University's College of Engineering and Computing, said the surge in crashes involving Teslas is troubling.

Commenting on the analysis findings, she said that Tesla crashes tend to be more severe, and more often fatal, than the average crash. Cummings said one possible reason is that over the past year and a half the rollout of FSD has expanded, bringing driver assistance onto city streets and residential roads. "In effect, anyone can now have this system. Is it reasonable to expect that this might lead to an increase in accident rates? Of course it is," she said.

Cummings said the increase in fatalities, compared with overall crash figures, is also a concern.

It is currently unclear whether NHTSA's data capture every crash involving Tesla's driver-assistance systems; the data include some crashes in which it is uncertain whether Autopilot or FSD was in use, among them three fatal ones.

NHTSA, the top automotive safety regulator in the United States, began collecting the data after a 2021 federal order required automakers to disclose crashes involving driver-assistance technology. Compared with all road accidents, the total number of crashes involving the technology remains very small: NHTSA estimates that more than 40,000 Americans died in car accidents of all kinds last year.

The data show that since the crash-reporting requirement was introduced, the vast majority of the 807 crashes involving driver-assistance systems have involved Teslas. That is because Tesla has pursued automated driving far more aggressively than other automakers, and it also accounts for nearly all of the fatal crashes.

Subaru ranks second, with a total of 23 crashes reported since 2019. The huge gap likely reflects the broader deployment and use of driver assistance across Tesla's entire fleet, as well as the company's encouragement of drivers to use Autopilot.

Tesla launched Autopilot in 2014 as a suite of driver-assistance features that let the car steer itself from a highway on-ramp to an off-ramp, maintain a set speed, keep its distance from other vehicles, and follow lane lines along the way. Tesla has made Autopilot a standard feature of its vehicles, and more than 800,000 Teslas on American roads are equipped with it, despite the cost of continued iteration.

FSD is an experimental feature that consumers must purchase separately. It allows a Tesla to maneuver from point A to point B by following turn-by-turn route directions, stopping at stop signs and red lights, making turns and lane changes, and responding to hazards along the way. Tesla says that with either system, drivers must monitor the road and intervene when necessary.

The rise in crashes coincides with Tesla's aggressive rollout of FSD, whose user base has grown from roughly 12,000 to nearly 400,000 in a little over a year. Nearly two-thirds of all driver-assistance crashes Tesla has reported to NHTSA occurred in the past year.

Philip Koopman, a Carnegie Mellon University professor who has studied autonomous-vehicle safety for 25 years, said the frequency with which Tesla's name appears in the data raises crucial questions.

"The significant increase in numbers is definitely worrying," he said. "We need to understand whether it is due to more serious crashes or to some other factor, such as a dramatic increase in the miles driven with Autopilot engaged."

In February of this year, Tesla recalled more than 360,000 vehicles equipped with FSD over concerns that the software could cause vehicles to disobey traffic lights, stop signs, and speed limits.

According to a document released by the safety agency, the behavior could increase the risk of a collision if drivers do not intervene. Tesla said it had fixed the issues through an over-the-air software update, resolving the risk remotely.

Even as Tesla continued to refine its driver-assistance software, it took the unprecedented step of removing radar sensors from new cars and disabling them on vehicles already on the road. Amid a global shortage of computer chips, Musk embraced a simpler hardware set, eliminating a key sensor from these vehicles. "Only extremely high-resolution radar makes sense," Musk said last year. There are reports that Tesla has recently taken steps to reintroduce radar sensors.

In a presentation in March, Tesla claimed that, measured by miles driven per collision, the crash rate with FSD engaged was at least five times lower than for vehicles in normal driving. Without the detailed data Tesla holds, that claim, and Musk's characterization of the automated-driving system as "absolutely safer," cannot be verified.

Autopilot is largely a highway system, operating in less complex environments than the full range of situations a typical road user encounters.
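To make concrete why a miles-per-collision comparison of this kind can mislead, here is a minimal Python sketch using entirely hypothetical numbers (not Tesla's actual figures). It simply normalizes crash counts by miles driven, the calculation behind such claims, and notes the road-mix caveat raised above.

```python
# Illustrative sketch only: how a "crashes per mile" comparison works.
# All numbers below are hypothetical, not Tesla's actual figures.

def crashes_per_million_miles(crashes: int, miles: float) -> float:
    """Return the crash rate normalized to one million miles driven."""
    return crashes / (miles / 1_000_000)

# Hypothetical fleet data
fsd_rate = crashes_per_million_miles(crashes=5, miles=50_000_000)       # 0.10 per million miles
manual_rate = crashes_per_million_miles(crashes=60, miles=120_000_000)  # 0.50 per million miles

print(f"FSD:    {fsd_rate:.2f} crashes per million miles")
print(f"Manual: {manual_rate:.2f} crashes per million miles")
print(f"Ratio:  {manual_rate / fsd_rate:.1f}x")  # the kind of multiple cited in such claims

# Caveat: if assisted-driving miles are accumulated mostly on highways while
# manual miles include complex city streets, the two rates are not directly
# comparable without controlling for road type.
```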

It is currently unclear which system was in use in the fatal crashes covered by the NHTSA data: Tesla has asked NHTSA not to disclose that information. In the section of the NHTSA data that identifies the software version, Tesla's entry reads, in capital letters: "Redacted, may contain confidential business information."

Both Autopilot and FSD have drawn close scrutiny in recent years. U.S. Transportation Secretary Pete Buttigieg said last month that Autopilot is clearly not the most appropriate name when the law requires you to keep your hands on the wheel and your eyes on the road.

NHTSA has opened multiple investigations into Tesla crashes and other problems with its driver-assistance software. One focuses on so-called "phantom braking," in which the vehicle suddenly slows for imagined hazards. In one case last year, a Tesla Model S reportedly using the driver-assistance system braked suddenly in traffic on the San Francisco Bay Bridge, triggering an eight-vehicle pileup that injured nine people, including a 2-year-old child.

In other complaints filed with NHTSA, owners have said their cars also braked suddenly when encountering trucks in the oncoming lane.

Many of the crashes involve similar settings and conditions. NHTSA, for example, has received more than a dozen reports of Teslas crashing into parked emergency vehicles while Autopilot was engaged. Last year, the agency said it would upgrade its investigation of those incidents to an "engineering analysis," a step that precedes a mandatory large-scale recall.

Also last year, NHTSA opened two consecutive special investigations into fatal crashes involving Teslas and motorcyclists. One reportedly occurred in Utah shortly after 1 a.m., when a man riding a Harley-Davidson on Interstate 15 outside Salt Lake City was struck from behind by a Tesla with Autopilot engaged.

The Utah Department of Public Safety said the Tesla driver did not see the motorcyclist and struck the back of the motorcycle, throwing the rider, who died at the scene. "It is very dangerous for a motorcycle to be driving near a Tesla," Cummings said.

Of the hundreds of crashes involving Tesla driver-assistance systems, NHTSA has focused on in-depth analysis of about 40, hoping to better understand how the technology operates. They include the North Carolina crash involving Mitchell.

After the accident, Mitchell woke up in the hospital with no memory of what had happened. According to North Carolina Highway Patrol Sergeant Marcus Bethea, the Tesla driver, Howard G. Yee, was charged with multiple offenses in the crash, including reckless driving, passing a stopped school bus, and striking a person, a first-degree felony.

Yee reportedly had attached weights to the steering wheel to trick Autopilot into registering that a driver's hands were on the wheel: by design, Autopilot disables itself if the driver does not apply steering pressure for an extended period.
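For illustration only, the following Python sketch shows the general shape of a torque-timeout check like the one described above. The threshold and timeout values are invented, and this is not Tesla's actual implementation; it merely shows why a constant weight on the wheel can defeat this kind of attention check.

```python
# Minimal sketch, under stated assumptions, of a torque-timeout check.
# Threshold and timeout values are hypothetical, not Tesla's.

import time

TORQUE_THRESHOLD_NM = 0.5   # hypothetical minimum steering torque counted as "hands on"
TIMEOUT_SECONDS = 30.0      # hypothetical time allowed without detected torque

class HandsOnMonitor:
    def __init__(self) -> None:
        self.last_torque_time = time.monotonic()

    def report_torque(self, torque_nm: float) -> None:
        """Called with each steering-sensor reading; records when torque was last felt."""
        if abs(torque_nm) >= TORQUE_THRESHOLD_NM:
            self.last_torque_time = time.monotonic()

    def should_disengage(self) -> bool:
        """Disengage assistance if no torque has been detected for too long."""
        return time.monotonic() - self.last_torque_time > TIMEOUT_SECONDS

# A weight clamped to the wheel applies constant torque, so report_torque()
# keeps resetting the timer and should_disengage() never fires -- which is
# why such devices defeat this kind of attention check.
```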

NHTSA is still investigating the crash, and an agency spokesperson declined to provide further details, saying it does not comment on ongoing investigations. Tesla has asked the agency to withhold the company's summary of the accident from public view, saying it may contain confidential business information.

Mitchell's aunt Lynch said her family has been keeping Yee in their thoughts and believes his behavior stemmed from placing too much trust in the technology, something experts call "automation complacency." "We don't want his life to be ruined by this foolish accident," Lynch said.

But when asked about Musk, Lynch's words were sharper. "I think they need to disable the autonomous driving function," she said. "I think this technology should be banned."
