Artificial intelligence-powered autonomous vehicles are already on the streets in multiple major U.S. cities, but leading researchers say the novel technology isn’t fully ready for the road.
At the 2025 Consumer Electronics Show in Las Vegas in January, dozens of companies showcased their latest innovations in the self-driving vehicle (SDV) and autonomous vehicle (AV) space. One of the biggest names in the business, Waymo, anchored an entire wing of the Las Vegas Convention Center dedicated to the burgeoning technology.
Waymo LLC, a subsidiary of Google parent Alphabet Inc., launched in 2009. It currently operates what it calls an “autonomous ride-hailing service,” or robotaxi, in Phoenix, Los Angeles, and San Francisco. In the near future, it plans to expand to Atlanta, Miami, and Austin, Texas.
In a Q&A session held on Jan. 8 at the annual tech summit, Waymo co-CEO Tekedra Mawakana said the company’s only mission is “to actually make roads safer, save lives, and offer sustainable mobility options to people.”
During the session, Mawakana touted a new study the company published that found its vehicles “demonstrated better safety performance when compared to human-driven vehicles, with an 88 percent reduction in property damage claims and 92 percent reduction in bodily injury claims.”
Less than two weeks later, on Jan. 20, an unoccupied self-driving Waymo car was struck from behind in a multivehicle collision in San Francisco. One person died in the crash. Waymo was not blamed for the collision, which marked the first fatal crash involving one of the company’s autonomous vehicles in the United States.
Waymo is the subject of an active federal safety investigation. In May 2024, the National Highway Traffic Safety Administration’s (NHTSA’s) Office of Defects Investigation announced it was looking into the company’s fifth-generation automated driving system (ADS) after receiving reports of 22 separate incidents involving Waymo vehicles.
“Reports include collisions with stationary and semi-stationary objects such as gates and chains, collisions with parked vehicles, and instances in which the ADS appeared to disobey traffic safety control devices,” a summary of the investigation published on the NHTSA’s website said. “In certain incidents, a collision occurred shortly after the ADS exhibited unexpected behavior near traffic safety control devices.”
Waymo is not alone in the AV space. Japanese start-up TierIV and American start-ups Zoox and May Mobility all showcased vehicles at CES. The American companies have powerful financial allies. Zoox is backed by Amazon.com Inc. May Mobility says on its website that its investors include Toyota Motor Corp.
These companies and dozens of hardware and software makers working to outfit AVs are likely motivated by the multibillion-dollar potential represented by automating the work of human drivers.
On Jan. 6 at CES, Nvidia Corp. CEO Jensen Huang spoke about his own company’s work in the space and said he predicts AVs “will likely be the first multitrillion dollar robotics industry.”
![Nvidia founder and CEO Jensen Huang speaks during a Nvidia news conference ahead of the CES tech show in Las Vegas on Jan. 6, 2025. (Abbie Parr/AP Photo)](https://img.theepochtimes.com/assets/uploads/2025/01/id5787925-1.download-600x400.jpg)
The Road to Autonomy
Right now, neither Waymo nor May Mobility is operating AVs that fit the federal definition of complete autonomy.

The U.S. Department of Transportation’s NHTSA currently recognizes six levels of vehicle automation. The lowest rung, level zero, is classified as momentary driver assistance. The highest, level five, is identified as “full automation.”
To qualify as fully automated in the eyes of the DOT, a system must be “fully responsible for all driving tasks under all conditions and on all roadways.”
Currently, Waymo vehicles on the streets of major American cities do not meet the level five standard. They do not offer rides to the public on interstate highways and do not operate entirely without human intervention.
In an email to The Epoch Times, Karsten Kutterer, a spokesman for May Mobility, said the company’s vehicles operate at level four. The NHTSA identifies level four as “high automation.”
Moreover, while there is not a driver in May Mobility’s “driver-out” AVs, they do use a remote “tele-assist operator.”
The operator, Kutterer told The Epoch Times, will monitor the vehicle “but will not take over and directly steer or control it.”
“Instead, they suggest the best maneuver for the situation, and then the vehicle decides whether the maneuver is viable and safe so that it can continue operations autonomously,” Kutterer said.
In the days after CES, Ali Kani, vice president of automotive at Nvidia, told an automotive industry publication that fully autonomous cars are “not close” and will “not appear in this decade.”
In a statement, Nvidia told The Epoch Times it does not comment on other media coverage.
However, in an email, Kani told The Epoch Times that “significant progress” is being made in deploying AVs for urban ride-hailing and delivery services. He said level four autonomy—which he defined as operating without human intervention in certain conditions and in limited geographies—will continue expanding over the next three years.
“Level 5 passenger vehicle solutions will take longer as these cars need to be built with cost-effective sensors while also being able to operate in all types of environments and without limitations,” Kani said in an email. “Predicting an exact timeline for volume production and deployment of such self-driving systems, however, is challenging.”
Kani said that safety “is the top priority” and that the entire technology and automotive industry must exercise caution to prevent souring public opinion on the burgeoning technology.
“It’s not just about getting it right; it’s about ensuring we never get it wrong,” Kani said. “Caution is critical in AV development because deploying self-driving vehicles before the industry and infrastructure are fully prepared could have serious consequences.”
![A Waymo driverless taxi stops on a street in San Francisco for several minutes because the back door was not completely shut, while traffic backs up behind it, on Feb. 15, 2023. (Terry Chea/AP Photo)](https://img.theepochtimes.com/assets/uploads/2023/08/11/id5459775-driverless-taxi-600x372.jpg)
Stops and Starts
In December 2024, Missy Cummings, a professor at George Mason University and the director of the university’s Mason Autonomy and Robotics Center, co-authored a study analyzing the real-world performance of self-driving vehicles in California.

Cummings declined to be interviewed by The Epoch Times.
The study, based on data collected by state officials in California, concluded that “research is lacking, especially for artificial intelligence involving computer vision and reasoning under uncertainty.”
Cummings and her co-author Ben Bauchwitz, a graduate student at Duke University at the time of the study’s publication, analyzed crash data collected by the state from three companies: Cruise, Waymo, and Zoox.
In October 2023, a self-driving Cruise vehicle in San Francisco ran over a pedestrian who had been hit by another car driven by a human, dragging the person about 20 feet. The incident led the California Department of Motor Vehicles to suspend Cruise’s license to operate and prompted an NHTSA investigation into the company’s safety.
In January 2025, federal regulators closed their investigation into Cruise without taking any further action.
The self-driving startup is majority owned by General Motors Co. In December 2024, GM announced Cruise would end its autonomous vehicle operations and instead focus on driver-assist technology.
In their study, Cummings and Bauchwitz examined 256 incidents, which were sorted into five categories: perception problem, struck from behind, planning problem, unexpected action by others, and incorrect safety driver input. They noted that because police reports were rarely attached to the incidents, it was often difficult to determine fault in each case.
The largest share of the self-driving crashes, 124 incidents, fell into the “struck from behind” category, meaning the “AV comes to an abrupt stop and is hit from behind.”
The authors said the data indicates an AV is 1.7 times more likely to be struck from behind than a human-driven vehicle. They blamed “phantom braking,” which they attributed to “anomalies in computer vision systems, which are a feature on all AVs.”
“The AV companies are quick to blame human drivers for inattentive following, but human drivers are often surprised by AV dramatic decelerations for no obvious reason and may not be able to respond as quickly to an automated braking system that engages far more quickly and aggressively than humans,” Cummings and Bauchwitz said in the study.
In a statement provided to The Epoch Times, Jacob Crossman, senior vice president of autonomy engineering at May Mobility, said the company is working on that problem and has developed a system called Multi-Policy Decision Making (MPDM) to solve it.
“MPDM leverages in-situ AI reasoning models to solve the industry’s biggest challenge: adapting to unexpected, dynamic conditions or ‘edge cases,’ such as illegal pedestrian crossings or construction detours,” Crossman said.

The researchers said the most common AV collisions, as well as the other less frequent accidents, boil down to one problem: perception of the road.
“Just as for humans, if the world around AVs cannot be properly sensed, then correct planning for the next set of actions cannot be accomplished,” Cummings and Bauchwitz said. “In addition, AVs are susceptible to uncertainty because this is a significant weakness of neural networks.”
Perception issues are caused by the AVs either missing an obstacle or falsely detecting one. Nearly half of the incidents, they said, were related to false positives. Cummings and Bauchwitz said there is “no consensus” as to why or when the false detections occur.
The possibility also exists, the researchers said, that “underlying measures of [computer vision] algorithm correctness do not align with reality.”
Recent studies of the computer vision software used by AVs indicate that it cannot reliably detect every pedestrian or vehicle around the AV. In a highly complex scene, such as an urban environment, “one-third to one-half” of pedestrians and “almost half” of surrounding vehicles go undetected.
“Human drivers would fail their driver’s license vision [test] if they could not detect eight percent of vehicles around them,” Cummings and Bauchwitz said.