ARIZONA, USA — The U.S. government's highway safety agency has opened another investigation of automated driving systems, this time into crashes involving Waymo's self-driving vehicles.
The National Highway Traffic Safety Administration posted documents detailing the probe on its website early Tuesday after getting 22 reports of Waymo vehicles either crashing or doing something that may have violated traffic laws.
An NHTSA report shows eight of the crashes happened in the Valley between February 2021 and December 2022. Five occurred in Phoenix, while Tempe, Gilbert and Scottsdale each saw one, and four of the eight took place in parking lots. No injuries were reported in any of the incidents. Passengers were in the vehicle in four of them, and Phoenix police are investigating at least one.
In the past month, the agency has opened at least four investigations of vehicles that can either drive themselves or take on at least some driving functions, as it appears to be getting more aggressive in regulating the technology.
In the probe of Waymo, which was once Google's self-driving vehicle unit, the agency said it has reports of 17 crashes and five other reports of possible traffic law violations. No injuries were reported.
In the crashes, the Waymo vehicles hit stationary objects such as gates, chains or parked vehicles. Some of the incidents happened shortly after the Waymo driving system behaved unexpectedly near traffic control devices, according to the documents.
Waymo said NHTSA plays an important role in road safety, and it will continue working with the agency “as part of our mission to become the world’s most trusted driver.”
The company said it makes over 50,000 weekly trips with riders in challenging environments. “We are proud of our performance and safety record over tens of millions of autonomous miles driven, as well as our demonstrated commitment to safety transparency,” the statement said.
Waymo, based in Mountain View, California, has been operating robotaxis without human safety drivers in Arizona and California.
Michael Brooks, executive director of the nonprofit Center for Auto Safety, said NHTSA's more aggressive actions show that autonomous vehicles may not be ready yet for public roads.
The agency's only enforcement power on autonomous vehicles, at present, is to open investigations and seek recalls, which it is doing, Brooks said. NHTSA has been criticized in the past for being slow to regulate Tesla and other companies that offer automated driving systems, but Brooks said things appear to have changed.
“Ultimately I think it's a good thing here that they're taking these steps, trying to figure out why these vehicles are acting the way they are,” Brooks said.
NHTSA said it would investigate the 22 incidents involving Waymo's fifth-generation driving system, plus similar scenarios, “to more closely assess any commonalities in these incidents.”
The agency said it understands that Waymo's automated driving system was either engaged throughout each incident or, in some cases involving a test vehicle, was disengaged by a human driver just before a crash happened.
The probe will evaluate the system's performance in detecting and responding to traffic control devices, and in avoiding crashes with stationary and semi-stationary objects and vehicles, the documents said.
Since late April, NHTSA has opened investigations into collisions involving self-driving vehicles run by Amazon-owned Zoox, as well as partially automated driver-assist systems offered by Tesla and Ford.
In 2021 the agency ordered all companies with self-driving vehicles or partially automated systems to report all crashes to the government. The probes rely heavily on data reported by the automakers under that order.
NHTSA also is investigating General Motors' Cruise autonomous vehicle unit after getting reports that the vehicles may not have used proper caution around pedestrians. Cruise recalled its cars to update software after one of them dragged a pedestrian to the side of a San Francisco street in early October 2023.
The agency also has questioned whether a recall last year of Tesla’s Autopilot driver-assist system was effective enough to make sure human drivers are paying attention. NHTSA said it ultimately found 467 crashes involving Autopilot resulting in 54 injuries and 14 deaths.
In the Ford investigation, the agency is looking into two nighttime crashes on freeways that killed three people.
The agency also pressured Tesla into recalling its “Full Self-Driving” system last year because it can misbehave around intersections and doesn’t always follow speed limits.
Despite their names, neither Tesla's Autopilot nor its “Full Self-Driving” system can drive a vehicle by itself, and the company says human drivers must be ready to intervene at all times.
In addition, NHTSA has moved to set performance standards for automatic emergency braking systems, requiring them to brake quickly to avoid pedestrians and other vehicles.
The standards come after other investigations involving automatic braking systems from Tesla, Honda and Fisker because they can brake for no reason, increasing the risk of a crash.
In a 2022 interview, then-NHTSA Administrator Steven Cliff said the agency would step up scrutiny of automated vehicles, and it recently has taken more action. NHTSA has been without a Senate-confirmed administrator since Cliff left for the California Air Resources Board in August 2022.