“Tracking Off” That Wasn’t: What the $425M Google Privacy Verdict Means for Everyone Using Data 🛰️⚖️
Google just learned the hard way that if your product settings say “off”, your backend can’t behave like “meh, still on a little bit.”
In Rodriguez v. Google LLC, a federal jury in San Francisco found that Google invaded the privacy of nearly 100 million users by continuing to collect app-activity data even after they turned off the “Web & App Activity” (WAA) setting in their Google accounts. The jury awarded $425.7 million in compensatory damages.
Plaintiffs are now asking the court for $2.36 billion in profit disgorgement on top of the verdict, arguing that $425M is a rounding error next to what Google allegedly made from the data.
For any company that:
- embeds Google code (Analytics, Firebase, ads SDKs), or
- offers its own “privacy controls” and toggles,
this case is a blueprint for how a jury will read your UI, your privacy policy, and your backend behavior against each other.
What Rodriguez v. Google Is Actually About
Think of it as the “Web & App Activity off” case, not the “Incognito” case – that was a different settlement.
Here’s the core setup:
| 🔍 Element | 📌 What the Case Says |
|---|---|
| Time period | July 1, 2016 – Sept 23, 2024 |
| Users | Roughly 98 million people, covering 174 million devices with Google accounts. |
| Setting at issue | Google’s “Web & App Activity” (WAA) and “supplemental Web & App Activity” (sWAA) settings, which users could turn off / pause. (Google Web App Activity Lawsuit) |
| User belief | Turning WAA off would stop Google from tracking their app and web activity beyond what was strictly necessary. (Corporate and Business Law Journal) |
| Alleged reality | Even with WAA off, Google allegedly continued to collect app-activity data from third-party apps that embedded Google code (analytics, ads SDKs, etc.), and used it for profiling and ad targeting. (Google Web App Activity Lawsuit) |
Internal Google emails shown at trial reportedly described the WAA control and supporting policy language as “misleading.” (Corporate and Business Law Journal)
The complaint site for the case puts it bluntly:
Even with WAA “turned off,” Google “unlawfully accessed their devices to collect, save, and use data concerning their activity on non-Google apps that use Google code.”
How Google Was Still Tracking With “Off” Selected
At a technical level, this is not about Google spying through magic—it’s about the ubiquity of its code.
As described in the filings and commentary:
- Many popular apps (Uber, Amazon, Venmo, Facebook, etc.) embed Google software components—Analytics, Ads SDKs, Firebase, or other code. (Corporate and Business Law Journal)
- When users interacted with those apps, data about that activity flowed to Google’s servers.
- Plaintiffs say that even when WAA/sWAA were disabled, Google still saved and used this app data, instead of discarding or strictly limiting it. (Digital Policy Alert)
Google’s response at trial and in public:
- The company insists it honored user choices: when WAA was off, it says it treated the data as "pseudonymous" and used it only in limited ways. (The Verge)
- A spokesperson said the verdict “misunderstands how our products work” and that Google will appeal. (The Verge)
The jury clearly preferred the plaintiffs’ framing: that if a setting appears to say “we won’t track this activity”, continuing to collect and monetize that data is an offensive intrusion into a reasonable privacy expectation.
What the Jury Actually Found
The jury was not asked to approve or outlaw all tracking. It answered three specific claims, plus a punitive-damages question.
According to the verdict descriptions: (Syracuse Law Review)
- ✅ Liable – Invasion of privacy under the California Constitution
- ✅ Liable – Common-law intrusion upon seclusion
- ❌ Not liable – California Computer Data Access and Fraud Act (CDAFA)
- ❌ No punitive damages
Total award: $425.7 million in compensatory damages—far below the plaintiffs’ original $31 billion damages model, but still a large privacy verdict.
Key elements the jury had to be persuaded of:
- Users had a reasonable expectation of privacy when they turned WAA off. (Syracuse Law Review)
- Google’s continued tracking of app activity despite that setting was highly offensive and non-consensual.
- Google’s own privacy statements and UI contributed to that expectation and to the offensiveness of the conduct.
Those are classic California privacy standards—but now applied to a mainstream UX pattern: a “master toggle” that doesn’t do what consumers think.
The Post-Verdict Fight: $425M vs $2.36B and Behavior Changes
The jury’s number isn’t the end of the story.
- Plaintiffs now argue the $425M verdict is “inadequate” given Google’s data-driven profits and are asking for $2.36 billion in profit disgorgement.
- They also want injunctive relief: real constraints on how Google can collect and use app-activity data when users have WAA off.
- Google is pushing back, arguing that disgorgement would “cripple” essential services and is “wildly disproportionate,” and has moved to decertify the class.
Chief Judge Richard Seeborg now has to decide whether to add profit-based remedies on top of the compensatory verdict.
So for now:
- The verdict stands,
- Google plans to appeal, and
- the remedial phase is still very much in play.
Why This Hits Every Business Using Google Tools (Not Just Google Itself)
If your product embeds Google Analytics, Firebase, Ads, or similar SDKs—or if you sell any service with “off” toggles—Rodriguez is a preview of how plaintiffs and regulators will frame your behavior.
The alignment problem: UI vs reality
This is the heart of it:
Are your privacy settings, privacy policy, and backend behavior all saying the same thing?
In Rodriguez, plaintiffs made the case that:
- the UI & privacy policy told users WAA “controls” how Google collects data across apps and services;
- engineers internally recognized the control and text were “misleading”; (Corporate and Business Law Journal)
- the actual technical behavior kept siphoning and using app data despite WAA being off.
The same three-way comparison can be made for any SaaS product:
| 🧩 Piece | ❓ Question |
|---|---|
| Toggle / setting text | Would a normal user understand what data is and is not collected when this is “off”? |
| Privacy policy / ToS | Do they confirm that understanding, or sneak in caveats that contradict the UI? |
| Backend implementation | Does the system actually stop collecting the data the user thinks they’re turning off—or does it just relabel it “pseudonymous” and keep everything? |
If any one of those is out of sync, you’ve basically written the opening paragraphs of your own Rodriguez-style complaint.
Contract and Product Lessons: How Not to Become the Next Test Case
From a corporate/tech practice perspective, Rodriguez is a consent-design case as much as a privacy case.
Some concrete lessons:
1. “Personalization off” can’t mean “still collecting, just using differently” without explicit explanation
Google’s defense that it treated data “pseudonymously” when WAA was off did not save it. (Kiplinger)
If your product:
- continues to collect data
- but simply changes how it is used or stored,
that distinction needs to be:
- spelled out in the UI, not just buried in a policy; and
- reflected in meaningful limitations on retention and use, not just labeling.
2. “Control over your data” is now boilerplate with teeth
Google is a repeat offender in cases where plaintiffs argue that it over-promised control and under-delivered.
When your marketing says:
- “You’re in control,”
- “Turn this off to stop tracking,”
courts will treat those as privacy assurances, not fluff. Contract language and product design need to back that up.
3. Third-party SDKs are your problem, not just the vendor’s
Rodriguez is about Google’s own tools, but the pattern applies to everyone:
- If your app uses a third-party SDK that continues to collect data in ways that conflict with your in-app settings or your privacy policy, plaintiffs will attach your name too.
Your DPAs and vendor contracts should be explicit about:
- what data SDKs collect, even when certain toggles are off, and
- who is responsible for honoring user-facing settings.
Demand Letters and Litigation Posture After Rodriguez
This verdict gives plaintiffs’ firms and regulators a simple, reusable narrative:
“Your app said this setting turned tracking off. You still collected data. That’s a Rodriguez-type invasion of privacy.”
You can expect:
- Consumer demand letters invoking Rodriguez when privacy toggles don’t behave as advertised;
- B2B demand letters by enterprise customers to vendors whose SDKs or products undermine their own privacy promises;
- Regulatory inquiries looking specifically at the alignment (or misalignment) between UI, policies, and reality.
For a demand-letter-heavy practice, Rodriguez is effectively:
- a fact pattern template (disabled setting; continued collection; monetization), and
- a damages benchmark—a jury found $425.7M reasonable for misaligned tracking controls across ~98M users.
Big Picture: Rodriguez as the “Settings Case” Everyone Will Cite
We’ve had cookie-banner cases, dark-pattern cases, and consent-to-record cases (like Otter). Rodriguez is the high-profile settings case:
- Toggle off
- Data still flows
- Jury calls it invasion of privacy and intrusion upon seclusion, and puts a $425.7M price tag on that conclusion.
For anyone designing or lawyering around digital products:
- Treat every privacy setting as if a jury will later ask, “What would a reasonable user think this does?”
- Make sure your privacy policy, SDK contracts, and backend engineering all give the same answer to that question.
- Assume that anything less is not just a regulatory risk—it’s now a proven route to a nine-figure class-action verdict.