When the Lights Went Out: Why Waymo’s Robotaxis Froze in San Francisco

Discover the real reason Waymo's fleet paralyzed San Francisco traffic during the massive December blackout. It wasn't an AI glitch: it was a "safety bottleneck" that overwhelmed the company's remote support.

Imagine driving through San Francisco on a stormy Saturday night. Suddenly, the city goes black. Traffic lights die instantly, turning busy intersections into chaotic four-way stops. As a human driver, you squint, inch forward, make eye contact with other drivers, and cautiously proceed.

Now, imagine you are a robot.

On December 20, 2025, a massive blackout hit San Francisco, plunging 130,000 customers into darkness. But the most viral image wasn't the blackout itself—it was the fleet of Waymo robotaxis frozen in intersections, blinking their hazards, effectively becoming high-tech roadblocks.

Why did some of the world's most advanced AI vehicles turn into expensive paperweights the moment the grid failed? The answer reveals a critical vulnerability in the future of autonomous driving—and it’s not what you think.


The Event: A Systemic Stress Test

The blackout, caused by a fire at a PG&E substation, didn't just cut power to homes; it knocked out traffic signals across huge swaths of the city.

For Waymo, this wasn't just a navigation challenge; it was a "stress test" of its operational architecture. While human drivers improvised, Waymo vehicles entered a defensive state, leading to gridlock that forced the company to temporarily suspend its entire San Francisco service.

But the cars didn't freeze because they were "confused" by the dark. They froze because they were too careful.


The Technical Breakdown: Why They Got Stuck

According to Waymo's official explanation, the problem wasn't a failure of the driving AI but a bottleneck in its safety protocols.

1. The "Four-Way Stop" Logic

Waymo’s "Driver" (the AI system) is explicitly programmed to handle non-functioning traffic lights. Legally and logically, a dark signal is treated as a four-way stop. Under normal circumstances, a Waymo vehicle sees a dark light, stops, checks for safety, and proceeds.
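
To make the rule concrete, here is a minimal sketch of that decision logic in Python. It is illustrative only (the states, names, and helper below are assumptions, not Waymo's actual code): the key point is that a dark signal maps directly to all-way-stop behavior.

```python
from enum import Enum, auto

class SignalState(Enum):
    GREEN = auto()
    YELLOW = auto()
    RED = auto()
    DARK = auto()   # unpowered signal, e.g. during a blackout

class Action(Enum):
    PROCEED = auto()
    STOP = auto()
    STOP_THEN_YIELD = auto()   # full stop, then go once the intersection is clear

def intersection_policy(signal: SignalState, intersection_clear: bool) -> Action:
    """Toy decision rule for a vehicle approaching a signalized intersection."""
    if signal is SignalState.GREEN:
        return Action.PROCEED
    if signal in (SignalState.YELLOW, SignalState.RED):
        return Action.STOP
    # A dark signal is legally an all-way stop: stop first, then proceed
    # only when the intersection is clear.
    return Action.STOP_THEN_YIELD if intersection_clear else Action.STOP

print(intersection_policy(SignalState.DARK, intersection_clear=True))
```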

2. The "Phone a Friend" Bottleneck

Here lies the culprit. When a Waymo vehicle encounters an ambiguous or high-risk situation—like a dark intersection with erratic human behavior—it often requests a "confirmation check" from a remote human assistant.

During the blackout, thousands of signals went dark simultaneously.

  • The Result: The fleet generated a "concentrated spike" of remote-assistance requests, all arriving at once.
  • The Failure: The remote support teams were overwhelmed. Unable to get immediate "permission" to proceed, the cars defaulted to their safest option: stop and wait. (The toy simulation after this list shows how fast a simultaneous spike outruns a fixed pool of operators.)
  • As Waymo explained to TechCrunch, "This created a backlog that, in some cases, led to response delays contributing to congestion on already-overwhelmed streets."
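
Why does a simultaneous spike break a system that handles the same daily volume with ease? A toy queue model makes the math visible. Every number below (300 stuck vehicles, a 60-second handling time, ten operators) is invented for illustration; only the shape of the result matters.

```python
import heapq

def simulate_backlog(request_times, service_time, operators):
    """Toy queue: each stuck vehicle files one remote-assistance request.
    Returns the waiting time of each request given a fixed operator pool."""
    free_at = [0.0] * operators              # when each operator is next free
    heapq.heapify(free_at)
    waits = []
    for t in sorted(request_times):
        start = max(t, heapq.heappop(free_at))   # wait for the first free operator
        waits.append(start - t)
        heapq.heappush(free_at, start + service_time)
    return waits

# Normal night: 30 requests spread over an hour. Blackout: 300 at once.
normal = simulate_backlog([i * 120.0 for i in range(30)], service_time=60, operators=10)
spike = simulate_backlog([0.0] * 300, service_time=60, operators=10)
print(f"normal max wait: {max(normal):.0f}s, blackout max wait: {max(spike):.0f}s")
```

The same staff that gives instant answers on a normal night leaves the last car in the blackout scenario waiting nearly half an hour, which is exactly the "backlog" Waymo described.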


The Fix: Teaching Cars Context

In the aftermath, Waymo has rolled out fleet-wide software updates designed to prevent a repeat of this paralysis.

  • Power Outage Context: The vehicles are now being updated to understand "regional outages" as a specific context. Instead of treating every dark light as an isolated anomaly requiring a human double-check, the car will understand that the entire area is out and operate more decisively without constantly phoning home (a plausible version of this check is sketched after this list).
  • Emergency Protocols: The company is refining its "minimal risk condition" (MRC) maneuvers to ensure that if a car does need to stop, it pulls over safely rather than idling in the middle of an intersection.
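
Waymo hasn't published how the new outage context works, but the core idea can be sketched as a simple classification heuristic. Everything here (the function name, the 50% threshold, the return labels) is an assumption for illustration:

```python
def classify_dark_signal(dark_signals_nearby: int, total_signals_nearby: int,
                         regional_threshold: float = 0.5) -> str:
    """Toy heuristic: tell an isolated dead signal apart from a regional outage.

    If most signals in the surrounding area are dark, the cause is almost
    certainly a grid failure, so the vehicle can apply its all-way-stop rule
    on its own instead of escalating every intersection to a human."""
    if total_signals_nearby == 0:
        return "no_signals"
    dark_fraction = dark_signals_nearby / total_signals_nearby
    if dark_fraction >= regional_threshold:
        return "regional_outage"      # handle locally via four-way-stop logic
    return "isolated_anomaly"         # single dead light: request a human check

print(classify_dark_signal(dark_signals_nearby=18, total_signals_nearby=20))
```
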

Expert Perspective: The Centralization Trap

The San Francisco blackout exposed a hidden fragility in the autonomous vehicle (AV) model: Centralization.

We often think of these cars as independent robots, but this incident proved they are still tethered to a "hive mind." When the hive (the remote operations center) gets overloaded, the individual bees (the cars) stop functioning efficiently.

The Analytical Takeaway: True autonomy means the ability to act independently when the network fails. Until AVs can confidently navigate systemic infrastructure failures without asking for human permission, they will remain vulnerable to "black swan" events. The lesson here isn't that the AI failed to drive; it's that the safety rails were too tight for a crisis.
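
What would looser safety rails look like in practice? One standard pattern (sketched below purely as an illustration; nothing here reflects Waymo's actual architecture) is a bounded wait: ask the remote operator, but fall back to a safe local policy if no answer arrives in time.

```python
import random

def decide_with_fallback(ask_remote, local_policy, timeout_s: float = 10.0):
    """Design pattern: escalate to a human, but never block on the answer.

    ask_remote(timeout_s) returns a decision or raises TimeoutError;
    local_policy() is the vehicle's own rule (e.g. all-way stop or pull over)."""
    try:
        return ask_remote(timeout_s)    # human confirmation, when available
    except TimeoutError:
        return local_policy()           # ops center overloaded: act locally

# Toy stand-ins: during a crisis, most remote requests time out.
def flaky_remote(timeout_s):
    if random.random() < 0.8:
        raise TimeoutError
    return "proceed"

print(decide_with_fallback(flaky_remote, local_policy=lambda: "pull_over_safely"))
```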


Conclusion

Waymo's transparency regarding the blackout is a positive step. By admitting that its safety protocols, not its sensors, caused the gridlock, the company has highlighted a crucial area for development.

The next time the lights go out in San Francisco, Waymo's fleet will likely be ready. But the question remains: can we trust a system that relies on remote humans to function during a disaster?

What do you think? Would you feel safe riding in a robotaxi during a city-wide blackout?


Sources:

  • TechCrunch: Waymo Explains Why Its Robotaxis Got Stuck
  • Waymo Official Safety Impact Reports