Waymo Is Suing the California DMV to Keep Its Crash Data Secret

Alphabet's self-driving car arm has hit a roadblock on its path to becoming a full-fledged autonomous taxi service in California, in the form of a legal battle over its crash data.

Waymo filed a lawsuit last week against the California Department of Motor Vehicles (DMV), arguing that the agency should keep the company's crash data private. While some of Waymo's autonomous vehicle safety data has been made public, Waymo is fighting to keep certain redacted data secret, arguing that its competitors are trying to sniff out trade secrets and that such data sharing would have a "chilling effect" on the entire self-driving car industry.

As The Verge reports, following an original report by the Los Angeles Times, this latest battle stems from a public records request by an unidentified party seeking the crash data that Waymo submitted to the DMV as part of its application to operate commercially in the state. The DMV complied with the request but allowed Waymo to redact some data, and that unidentified individual challenged the redactions.

Waymo argues in its suit, filed in Sacramento County Superior Court, that if it's forced to make public its process for analyzing crash data, this "could provide strategic insight to Waymo’s competitors and third parties regarding Waymo’s assessment of those collisions from a variety of different perspectives, including potential technological remediation." And sharing such trade secrets could "dissuade" further innovation in the field.

"Every autonomous vehicle company has an obligation to demonstrate the safety of its technology, which is why we’ve transparently and consistently shared data on our safety readiness with the public,” says Nicholas Smith, a spokesperson for Waymo, in a statement. "We will continue to work with the DMV to determine what is appropriate for us to share publicly and hope to find a resolution soon."

Waymo has held a permit from the DMV to operate commercially in San Francisco and San Mateo counties since September 2021, and beta users of the Waymo One app have been able to summon driverless taxis in SF for the last few months. This is a step up from the testing permit that Waymo, along with dozens of other autonomous vehicle companies in the state, has held for several years.

Waymo continues to do plenty of testing on SF's streets as well, without non-employee passengers onboard, and last year its robot cars seemed oddly fixated on making three-point turns on the same dead-end street in the Richmond.

From the get-go, proponents of autonomous vehicles have claimed that they will ultimately be far safer than human-driven cars, but the technology has not yet advanced as far as promised. Self-driving cars, along with the irresponsible drivers behind the wheel who should have maintained control, have killed several people in recent years, including a Los Angeles couple whose case is now the subject of a high-profile lawsuit. (That case involves Tesla's Autopilot software, which is not meant to be 100% driverless, and it's a manslaughter case against the driver.)

Last month in the Lower Haight, a Waymo vehicle collided with a pedestrian, who was not seriously injured; the company was quick to say that the vehicle was in manual mode and being operated by a human driver at the time of the accident.

Waymo has been more transparent than most in the industry when it comes to its vehicle safety data, as The Verge notes. In 2020, the company published two years and 6.1 million miles' worth of data from its road-testing program in Arizona. In that time, Waymo cars were involved in 47 collisions and near misses, none of which resulted in serious injuries, and "nearly all" of the collisions were the fault of the other driver, Waymo said.

Autonomous vehicle companies continue to argue that crash data is not a helpful way of assessing the overall safety of their vehicles and software.

Related: Why Are Waymo's Self-Driving Cars Constantly Hitting the Same Dead End On 15th Avenue?