Tesla's Autopilot Promises Remain Unfulfilled: Analyst Tests FSD and Finds Unsafe Performance
Aug. 30, 2023 — Tesla CEO Elon Musk has promised investors that cars running the company's "Full Self-Driving" (FSD) software, the advanced version of its Autopilot system, will be safer than human drivers by the end of this year or early next year. However, firsthand testing by an analyst suggests those pledges may be far from reality.
William Stein, a technology analyst at Truist Securities, has taken Musk up on invitations to test the latest versions of Tesla's FSD system three times over the past four months. Tesla claims the technology lets vehicles drive themselves from origin to destination with minimal human intervention. Yet on each occasion, Stein says, the Tesla he tested drove in ways that were unsafe or illegal. Earlier this month, his most recent test ride even "terrified" his 16-year-old son, who was riding along.
Stein's experiences, coupled with a fatal crash in April in which a Tesla running FSD struck and killed a motorcyclist near Seattle, have drawn the attention of federal regulators. These agencies have been investigating Tesla's Autopilot system for more than two years over its role in numerous crashes, and the new incidents have deepened doubts about whether Tesla's automated driving systems can operate safely at scale.
Stein casts doubt on whether Tesla can deliver on Musk's prediction of launching a robotaxi service next year. The timing is critical for Tesla: Musk has pledged to investors that FSD will surpass human drivers in safety by the end of this year or early next year, and the company plans to unveil a vehicle designed specifically as a robotaxi within the next two months. To put robotaxis on the road, Musk says, Tesla will need to show regulators that the system drives more safely than humans, and the vehicles will also have to meet national vehicle safety standards.
However, the crash-per-mile data Musk cites applies only to Tesla's more basic Autopilot system. Safety experts note the data's limitations: it counts only serious crashes in which airbags deployed, and ignores the frequent cases in which human drivers intervened to avoid a collision.
Currently, about 500,000 Tesla owners use FSD on public roads, slightly more than one-fifth of Tesla's total fleet. Most paid an additional $8,000 or more for the system. Tesla explicitly warns that FSD-equipped vehicles are not yet fully autonomous and that drivers must stay alert and be ready to take control at any moment. The company says it closely monitors each driver's behavior and will suspend FSD access for those who fail to adequately supervise the system. Tesla recently renamed the system from "FSD Beta" to "FSD (Supervised)."
Musk acknowledges that his past predictions about autonomous driving have been overly optimistic. In 2019 he predicted self-driving cars would be on the road by the end of 2020, but five years on, many technology observers remain skeptical that the technology can achieve widespread adoption in the United States. Michael Brooks, executive director of the nonprofit Center for Auto Safety, puts it bluntly: "We're far from that goal, and it's not likely to happen next year either."
Stein drove a Tesla Model 3, an entry-level model picked up from a Tesla showroom in Westchester County, north of New York City, running the latest FSD software, which Musk describes as now powered by artificial intelligence that controls the steering and pedals. Stein says the car drove smoothly during his test, and the FSD system felt more human-like than previous versions. But in a trip of less than 10 miles, the vehicle made a left turn from a through lane while running a red light, which he describes as "shocking." Because the intersection was empty and the maneuver seemed low-risk, he did not intervene. Shortly afterward, however, the car drove down the middle of Park Avenue, straddling two lanes with oncoming traffic, and he was forced to take control.
In a research note to investors, Stein writes that the latest version of FSD has not "solved the self-driving problem" as Musk predicted, and is "far from the level of maturity needed for robotaxis." He adds that Tesla vehicles also exhibited unsettling driving behavior during his test drives in April and July. Tesla declined to comment on his reports.
While Stein believes Tesla could eventually profit from its automated driving technology, he remains pessimistic about driverless, passenger-carrying robotaxis arriving soon, predicting significant delays or limits on the system's deployment. He stresses that Musk's pronouncements often diverge sharply from actual progress: videos shared by Tesla enthusiasts on social media show impressive autonomous driving, but such clips say little about the system's long-term reliability, and some capture dangerous behavior.
Alain Kornhauser, an autonomous vehicle research expert at Princeton University, also shared his personal experience. After driving a Tesla borrowed from a friend for two weeks, he found the vehicle effectively recognized pedestrians and other drivers most of the time. However, he was forced to take over control when encountering driving behavior that raised his concerns, warning that FSD is currently not capable of operating unsupervised in all environments. "The technology is not mature enough to drive autonomously to any destination," he concluded.
Kornhauser further suggests that the system could potentially achieve autonomous driving in small urban areas with detailed map assistance. He questions why Musk does not start by providing small-scale ride-hailing services, suggesting that such services could significantly enhance people's travel convenience.
Experts have long warned that Tesla's reliance solely on cameras and computer systems does not always accurately identify and interpret objects, particularly in adverse weather conditions or at night when camera visibility is more limited. In contrast, most robotaxi companies, such as Waymo (owned by Google's parent company, Alphabet) and General Motors' Cruise, have adopted a comprehensive approach incorporating cameras alongside radar and lidar sensors.
Missy Cummings, Professor of Engineering and Computer Science at George Mason University, emphasizes, "You can't make good planning, moving, and driving decisions if you can't accurately sense the world. Vision alone is not enough to support autonomous driving." She also points out that even vehicles equipped with lidar and radar have questionable driving reliability, raising concerns about the safety of Waymo and Cruise as well. Waymo and Cruise declined to comment.
Phil Koopman, Professor of Autonomous Vehicle Safety at Carnegie Mellon University, points out that fully AI-powered autonomous vehicles still need several years to properly address the complex and ever-changing dynamics of the real world. He states, "Machine learning lacks common sense and can only learn from a limited number of cases. Once confronted with an untrained situation, the system can easily fail."
In April, a Tesla running FSD struck and killed a motorcyclist in Snohomish County, near Seattle. The Tesla driver told authorities he had engaged FSD and was checking his phone at the time of the crash. The investigation is ongoing, with authorities assessing information provided by Tesla and law enforcement. Tesla has said it is aware of Stein's use of the FSD system.
The National Highway Traffic Safety Administration (NHTSA) is monitoring the effectiveness of Tesla's recall, issued to strengthen driver monitoring in vehicles with automated systems. The agency has also asked Tesla whether a 2023 recall of FSD, prompted by the system's potential to violate traffic laws in certain rare circumstances and thereby raise crash risk, should be expanded. NHTSA declined to disclose details of the recall's progress or its assessment of the fix's effectiveness.
Despite recent price cuts on its electric vehicles, Tesla's sales remain weak. Musk has told investors to view the company less as an automaker and more as a robotics and artificial intelligence enterprise. Tesla has been developing FSD technology for years, with work dating back to 2015.
During the recent earnings call, Musk bluntly stated, "I would suggest that anyone who doesn't believe Tesla can solve the self-driving problem should not own Tesla stock." However, Stein urges investors to maintain independent judgment, especially regarding Tesla's FSD project. He writes, "While this project has a long history, generates revenue, and is being deployed in the real world, its true effectiveness is for investors to assess themselves."