Otter.ai Under Fire: When Your AI Notetaker Becomes a Wiretap 📝⚖️

Published: October 29, 2025 • AI, News

AI notetakers were supposed to solve the “who’s going to take notes?” problem in Zoom and Teams meetings.

In August 2025, they got a different job: starring as defendants in a federal class action.

Brewer v. Otter.ai Inc., filed in the Northern District of California, accuses Otter’s “Otter Notetaker” / “OtterPilot” tools of:

  • secretly recording private conversations on Zoom, Google Meet, and Microsoft Teams, and
  • using those recordings and transcripts to train Otter’s AI engine

without proper consent from the people actually speaking on the calls.

This isn’t just another privacy headline. It’s a direct shot at how every AI meeting assistant handles consent, training data, and contracts.


What Brewer v. Otter.ai Actually Claims 🧑‍⚖️

The basics of the case:

📂 Case: Brewer v. Otter.ai Inc., No. 5:25-cv-06911
🧑‍💼 Plaintiff: Justin Brewer (California resident)
🏛️ Court: U.S. District Court, N.D. Cal.
🗓️ Filed: August 15, 2025
🎯 Core allegation: Otter's meeting assistants record and transcribe private conversations without consent from all participants, then use those recordings to train Otter's AI, allegedly violating federal and state wiretap/privacy laws.

According to the complaint and commentary:

  • Otter’s tools join meetings as a virtual participant (“Otter Notetaker” / “OtterPilot”), record audio, and generate transcripts in real time.
  • The bot often asks permission only from the host, and sometimes not even that, especially where the host has already integrated their calendar and meeting accounts with Otter.
  • Non-Otter users on the call typically never see a separate consent prompt from Otter, even though Otter is recording and shipping their conversations to Otter’s servers. (WUFT)
  • Those recordings and transcripts are then allegedly used to train Otter’s AI systems, beyond what’s necessary to provide the transcription service.

Otter’s privacy policy does state that user data can be used for AI training when users “explicitly” grant permission via a check-the-box consent. The lawsuit’s position is that, in practice, millions of people never knowingly gave that permission, because they never signed up for Otter in the first place. (WUFT)


How the Alleged Behavior Works in Real Meetings 💻🎧

You can think of Otter Notetaker as an invisible paralegal that never asks non-clients if it can sit in.

Commentary and the complaint describe a few recurring patterns:

🔍 Calendar-integrated auto-join
  🧠 What allegedly happens: An Otter user connects their Google/Outlook calendar, and Otter Notetaker automatically joins meetings as a participant when events start. (WUFT)
  🚨 Risk signal: The participant list suddenly shows “Otter Notetaker,” but no one besides the host understands what that means or has affirmatively consented.

🔍 Host-only permission
  🧠 What allegedly happens: If the host is not an Otter user, Otter may show them a one-time prompt to allow recording; there is no separate prompt for other attendees. (WUFT)
  🚨 Risk signal: From a CIPA/ECPA perspective, that’s a one-party consent model layered on top of a two-party/all-party consent jurisdiction.

🔍 Silent background training
  🧠 What allegedly happens: Conversation audio is sent to Otter’s servers, transcribed, stored, and (per the complaint) ingested into training pipelines to improve Otter’s speech recognition and AI features. (WUFT)
  🚨 Risk signal: Transcripts of highly confidential meetings may end up as generic training data without any explicit authorization from the people talking.

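As a rough illustration of the governance tooling these scenarios call for, here is a minimal sketch that scans a meeting’s participant roster for known AI notetaker bots and flags any present without a logged all-party consent announcement. The bot names and the meeting record shape are hypothetical; this is not based on any vendor’s API.

```python
# Hypothetical governance check: flag AI notetaker bots present in a
# meeting where no all-party consent announcement was logged.
# Bot names and the meeting dict are illustrative assumptions.

KNOWN_NOTETAKER_BOTS = {
    "otter notetaker",
    "otterpilot",
    "fireflies.ai notetaker",
}

def flag_unconsented_bots(meeting: dict) -> list[str]:
    """Return notetaker bots on the roster lacking recorded all-party consent."""
    bots = [
        p for p in meeting["participants"]
        if p.lower() in KNOWN_NOTETAKER_BOTS
    ]
    if meeting.get("all_party_consent_logged"):
        return []  # consent was announced and logged; nothing to flag
    return bots

meeting = {
    "title": "Q3 vendor negotiation",
    "participants": ["Alice", "Bob", "Otter Notetaker"],
    "all_party_consent_logged": False,
}
print(flag_unconsented_bots(meeting))  # ['Otter Notetaker']
```

In a real deployment the roster would come from the conferencing platform’s participant events, and the consent flag from an on-the-record host announcement; the point is simply that “bot present” and “consent logged” are separately checkable facts.
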
NPR’s coverage describes Otter as “deceptively and surreptitiously” recording private work conversations, and notes that Otter has captured over 1 billion meetings across 25 million users.

That scale matters in a class action. It turns what might look like a one-off policy quirk into alleged systemic misconduct.


The Legal Theories: Old Wiretap Laws, New AI Tools ⚖️📡

Brewer’s complaint isn’t inventing new law; it’s repurposing existing doctrines.

⚖️ ECPA / Federal Wiretap Act (18 U.S.C. § 2511) (iapp.org)
  📜 Alleged violation: Interception and use of the contents of communications without consent from all relevant parties.
  📌 Otter-specific twist: Otter Notetaker allegedly “intercepts” meetings in transit and uses them for purposes beyond transcription (training), with many participants never consenting.

⚖️ California Invasion of Privacy Act (CIPA) (Fisher Phillips)
  📜 Alleged violation: Recording confidential conversations without consent from all parties in a two-party consent state.
  📌 Otter-specific twist: Otter’s host-only consent flow allegedly falls short of CIPA’s all-party standard for “confidential communications.”

⚖️ Intrusion upon seclusion
  📜 Alleged violation: Highly offensive intrusion into private affairs that would outrage a reasonable person. (iapp.org)
  📌 Otter-specific twist: Participants think they’re in a normal Zoom/Teams call, but a third-party bot is silently recording and shipping content to a vendor they don’t know.

⚖️ Conversion / misuse of data
  📜 Alleged violation: Unauthorized control over or use of another’s property (here, meeting content). (iapp.org)
  📌 Otter-specific twist: Plaintiffs say Otter “takes” meeting content and repurposes it as AI training material for its own commercial gain.

Privacy and AI law commentators are already using Brewer as a test case for applying traditional wiretap statutes to modern AI transcription services. (Workplace Privacy Report)


Why This Is Your Problem If You Use AI Notetakers as a Company 🏢🔐

Brewer v. Otter.ai is nominally about Otter. In practice, it’s a warning to any organization that:

  • lets employees use AI meeting assistants (Otter, Copilot, Fireflies, etc.), or
  • sells SaaS that records or transcribes user conversations.

From the company side, you’re staring at three overlapping risk zones:

Regulatory / statutory risk

  • If you operate in or call into all-party consent jurisdictions (California, Pennsylvania, some EU states, etc.), relying on host-only consent is asking for trouble. (Fisher Phillips)
  • Using a tool whose default behavior is “we only ask the host, sometimes” sets you up for CIPA/ECPA exposure in exactly the way Brewer describes.

Contract and confidentiality risk

  • Many NDAs and service agreements assume no third-party recording or data sharing beyond agreed vendors.
  • If your counterparty finds out a confidential negotiation was piped to Otter’s servers and used for training, the first thing you might see is a demand letter alleging breach of confidentiality.

Employee and data-governance risk

  • Sensitive internal meetings (HR issues, layoffs, M&A, privileged attorney–client discussions) may be recorded by an AI bot without the right people even knowing it’s there.
  • That cuts directly against emerging internal AI policies and can create discovery headaches later.

Commentators are already calling the Otter case a “wake-up call” for organizations using AI meeting tools without formal governance, privacy impact assessments, or clear consent processes. (Michal Sons)


Contracts and Policies: How to Not Be the Next Otter.ai 📜🛡️

If you’re advising companies (or running one), the Otter case is a nice clean checklist moment.

On the vendor contract side (your DPA / SaaS agreement with the AI notetaker):

📜 Consent responsibility
  ✅ Safer approach: Spell out who is responsible for obtaining consent from all meeting participants (vendor vs. customer), and in which jurisdictions. Don’t leave it implied.

📜 Training data use
  ✅ Safer approach: Distinguish data needed to provide the service from data used for model training / product improvement. Require explicit opt-in for training on meeting content, or carve it out entirely for sensitive calls.

📜 Confidential information
  ✅ Safer approach: Define meeting content as Confidential Information and restrict training use unless separately negotiated; require deletion at the customer’s request.

📜 Notice & UI
  ✅ Safer approach: Require the vendor to maintain clear in-meeting notices (e.g., bot name + tooltip + join message) and to provide configurable notice templates you can align with your policies.

📜 Indemnity
  ✅ Safer approach: Seek indemnification for privacy/wiretap claims arising from the vendor’s default consent flows or failure to implement agreed features (e.g., participant prompts, opt-out controls).

On the internal policy side:

  • Decide where AI notetakers are never allowed (board meetings, privileged legal calls, disciplinary meetings, regulated data discussions).
  • Require hosts to announce on the record that an AI notetaker is active, explain briefly what it does, and offer participants a chance to object or leave. (HR Source)
  • Bake Otter-style scenarios into your data protection impact assessments if you operate under GDPR/CCPA-like regimes. (Michal Sons)
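
Those policy rules can be expressed as a simple pre-meeting gate. Here is a minimal sketch; the prohibited categories and the jurisdiction codes are illustrative policy assumptions, not legal advice:

```python
# Hypothetical pre-meeting policy gate: may an AI notetaker join?
# Categories and jurisdiction codes are illustrative policy choices.

PROHIBITED_CATEGORIES = {"board", "privileged-legal", "hr-disciplinary"}
ALL_PARTY_CONSENT_JURISDICTIONS = {"CA", "PA"}  # e.g., California, Pennsylvania

def notetaker_allowed(category: str, jurisdictions: set[str],
                      all_participants_consented: bool) -> bool:
    """Apply the internal policy: hard bans first, then consent rules."""
    if category in PROHIBITED_CATEGORIES:
        return False  # never allowed, regardless of consent
    if jurisdictions & ALL_PARTY_CONSENT_JURISDICTIONS:
        # At least one participant is in an all-party consent jurisdiction,
        # so every participant must have affirmatively consented.
        return all_participants_consented
    return True  # one-party consent suffices elsewhere (still announce it)

print(notetaker_allowed("sales-sync", {"CA", "NY"}, True))       # True
print(notetaker_allowed("privileged-legal", {"NY"}, True))       # False
```

The ordering matters: category bans apply even with unanimous consent, which mirrors the policy point that some meetings should simply never have a bot in them.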

Demand Letters Around AI Notetakers ✉️

Brewer is a class action, but this fact pattern scales nicely to pre-suit demand letters, which is where a lot of your work lives.

From different sides:

1. Counterparty / employee / meeting participant letters

  • A client or employee discovers that a confidential call was recorded and used for AI training without their knowledge.
  • Letter alleges violation of CIPA/ECPA and breach of contract/NDAs, demands:
    • cessation of AI recording,
    • deletion of specific transcripts, and
    • possibly compensation / policy changes.

2. Upstream vendor letters

  • A corporate customer realizes their AI notetaker vendor behaves like Otter.
  • Their counsel sends a letter to the vendor:
    • pointing to contractual representations about law compliance and data use;
    • demanding a detailed explanation of consent flows and training practices;
    • reserving rights for any non-compliance with wiretap/privacy laws.

A well-aimed letter in either direction does more than “complain.” It:

  • forces the other side to lock in their story,
  • preserves evidence, and
  • sets up the record if you end up where Brewer is now.

The Takeaway: AI Meeting Bots Are Not “Just Tools” 🎙️🚨

Brewer v. Otter.ai is doing for AI notetakers what the early scraping and Books3 cases did for training data: turning vague unease into specific causes of action.

Practically:

  • If you’re building AI transcription tools, you need all-party consent flows, clear UI notices, and a defensible training policy.
  • If you’re buying them, you need contracts and internal policies that assume a Brewer-type complaint is possible and allocate risk accordingly.
  • If you’re on the receiving end of a surprise bot recording, you now have a clean narrative and legal framework for demand letters and, if necessary, litigation.

The convenience of never taking notes again is nice. The inconvenience of your “meeting assistant” being characterized as an unlawful wiretap in federal court is…less nice.