Otter.ai Under Fire: When Your AI Notetaker Becomes a Wiretap
AI notetakers were supposed to solve the "who's going to take notes?" problem in Zoom and Teams meetings.
In August 2025, they got a different job: starring as defendants in a federal class action.
Brewer v. Otter.ai Inc., filed in the Northern District of California, accuses Otter's "Otter Notetaker" / "OtterPilot" tools of:
- secretly recording private conversations on Zoom, Google Meet, and Microsoft Teams, and
- using those recordings and transcripts to train Otter's AI engine
without proper consent from the people actually speaking on the calls.
This isn't just another privacy headline. It's a direct shot at how every AI meeting assistant handles consent, training data, and contracts.
What Brewer v. Otter.ai Actually Claims
The basics of the case:
| Case | Plaintiff | Court | Filed | Core Allegation |
|---|---|---|---|---|
| Brewer v. Otter.ai Inc., 5:25-cv-06911 | Justin Brewer (California resident) | U.S. District Court, N.D. Cal. | August 15, 2025 | Otter's meeting assistants record and transcribe private conversations without consent from all participants and then use those recordings to train Otter's AI, violating federal and state wiretap/privacy laws. |
According to the complaint and commentary:
- Otter's tools join meetings as a virtual participant ("Otter Notetaker" / "OtterPilot") and record audio and generate transcripts in real time.
- The bot often asks permission only from the host, and sometimes not even that, especially where the host has already integrated their calendar/meeting accounts with Otter.
- Non-Otter users on the call typically never see a separate consent prompt from Otter, even though Otter is recording and shipping their conversations to Otter's servers. (WUFT)
- Those recordings and transcripts are then allegedly used to train Otter's AI systems, beyond what's necessary just to provide the transcription service.
Otter's privacy policy does admit that user data can be used for AI training when users "explicitly" grant permission (check-the-box). The lawsuit's position is that, in practice, millions of people never knowingly gave that permission, because they never signed up for Otter in the first place. (WUFT)
How the Alleged Behavior Works in Real Meetings
You can think of Otter Notetaker as an invisible paralegal that never asks non-clients if it can sit in.
Commentary and the complaint describe a few recurring patterns:
| Scenario | What Allegedly Happens | Risk Signal |
|---|---|---|
| Calendar-integrated auto-join | An Otter user connects their Google/Outlook calendar; Otter Notetaker automatically joins meetings as a participant when events start. (WUFT) | Participant list suddenly shows "Otter Notetaker," but no one outside the host understands what that means or has affirmatively consented. |
| Host-only permission | If the host is not an Otter user, Otter may show them a one-time prompt to allow recording; no separate prompt to other attendees. (WUFT) | From a CIPA/ECPA perspective, that's a one-party consent model layered on top of a two-party/all-party consent jurisdiction. |
| Silent background training | Conversation audio is sent to Otter's servers, transcribed, stored, and (per the complaint) ingested into training pipelines to improve Otter's speech recognition and AI features. (WUFT) | Transcripts of highly confidential meetings may end up as generic training data without any explicit authorization from the people talking. |
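None of these patterns requires exotic engineering to avoid. As a rough illustration of the distinction at the heart of the complaint (all names here are hypothetical, not Otter's actual code), an all-party consent gate for a meeting bot might look like:

```python
from dataclasses import dataclass


@dataclass
class Participant:
    name: str
    consented: bool = False  # has this person affirmatively agreed to recording?


@dataclass
class Meeting:
    participants: list  # list of Participant
    recording: bool = False


def request_all_party_consent(meeting: Meeting) -> bool:
    """Enable recording only if EVERY participant has consented.

    This is the all-party model statutes like CIPA contemplate; a
    host-only prompt would check a single participant instead, which
    is the pattern the Brewer complaint attacks.
    """
    meeting.recording = all(p.consented for p in meeting.participants)
    return meeting.recording


# Host consented via a prompt, but the guest never saw one:
m = Meeting(participants=[Participant("host", consented=True),
                          Participant("guest", consented=False)])
print(request_all_party_consent(m))  # False: no recording without the guest
```

The point of the sketch is how small the delta is: `all(...)` versus checking one participant is the entire difference between an all-party and a host-only consent model.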
NPR's coverage highlights "deceptively and surreptitiously" recording private work conversations and notes Otter has captured over 1 billion meetings across 25 million users.
That scale matters in a class action. It turns what might look like a one-off policy quirk into alleged systemic misconduct.
The Legal Theories: Old Wiretap Laws, New AI Tools
Brewer's complaint isn't inventing new law; it's repurposing existing doctrines.
| Law / Claim | Alleged Violation | Otter-Specific Twist |
|---|---|---|
| ECPA / Federal Wiretap Act (18 U.S.C. § 2511) (iapp.org) | Interception and use of the contents of communications without consent from all relevant parties. | Otter Notetaker allegedly "intercepts" meetings in transit and uses them for purposes beyond transcription (training), with many participants never consenting. |
| California Invasion of Privacy Act (CIPA) (Fisher Phillips) | Recording confidential conversations without consent from all parties in a two-party consent state. | Otter's host-only consent flow allegedly falls short of CIPA's all-party standard for "confidential communications." |
| Intrusion upon seclusion | Highly offensive intrusion into private affairs that would outrage a reasonable person. (iapp.org) | Fact pattern: participants think they're in a normal Zoom/Teams call, but a third-party bot is silently recording and shipping content to a vendor they don't know. |
| Conversion / misuse of data | Unauthorized control over or use of another's property (here, meeting content). (iapp.org) | Plaintiffs say Otter "takes" meeting content and repurposes it as AI training material for its own commercial gain. |
Privacy and AI law commentators are already using Brewer as a test case for applying traditional wiretap statutes to modern AI transcription services. (Workplace Privacy Report)
Why This Is Your Problem If You Use AI Notetakers as a Company
Brewer v. Otter.ai is nominally about Otter. In practice, it's a warning to any organization that:
- lets employees use AI meeting assistants (Otter, Copilot, Fireflies, etc.), or
- sells SaaS that records or transcribes user conversations.
From the company side, you're staring at three overlapping risk zones:
Regulatory / statutory risk
- If you operate in or call into all-party consent jurisdictions (California, Pennsylvania, some EU states, etc.), relying on host-only consent is asking for trouble. (Fisher Phillips)
- Using a tool whose default behavior is "we only ask the host, sometimes" sets you up for CIPA/ECPA exposure in exactly the way Brewer describes.
Contract and confidentiality risk
- Many NDAs and service agreements assume no third-party recording or data sharing beyond agreed vendors.
- If your counterparty finds out a confidential negotiation was piped to Otter's servers and used for training, the first thing you might see is a demand letter alleging breach of confidentiality.
Employee and data-governance risk
- Sensitive internal meetings (HR issues, layoffs, M&A, privileged attorney–client discussions) may be recorded by an AI bot without the right people even knowing it's there.
- That cuts directly against emerging internal AI policies and can create discovery headaches later.
Commentators are already calling the Otter case a "wake-up call" for organizations using AI meeting tools without formal governance, privacy impact assessments, or clear consent processes. (Michal Sons)
Contracts and Policies: How to Not Be the Next Otter.ai
If you're advising companies (or running one), the Otter case is a nice clean checklist moment.
On the vendor contract side (your DPA / SaaS agreement with the AI notetaker):
| Clause | Safer Approach After Brewer |
|---|---|
| Consent responsibility | Spell out who is responsible for obtaining consent from all meeting participants (vendor vs customer), and in which jurisdictions. Don't leave it implied. |
| Training data use | Distinguish: data needed to provide the service vs data used for model training / product improvement. Require explicit opt-in for training on meeting content, or carve it out entirely for sensitive calls. |
| Confidential information | Define meeting content as Confidential Information and restrict training use unless separately negotiated; require deletion at the customer's request. |
| Notice & UI | Require vendor to maintain clear in-meeting notices (e.g., bot name + tooltip + join message) and to provide configurable notice templates you can align with your policies. |
| Indemnity | Seek indemnification for privacy/wiretap claims arising from the vendorâs default consent flows or failure to implement agreed features (e.g., participant prompts, opt-out controls). |
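To make the "training data use" row concrete: ideally the vendor's tenant settings encode the same distinction the contract draws. A hypothetical configuration check (not Otter's real settings schema) might look like:

```python
# Hypothetical tenant settings mirroring the contract terms: service use
# is inherent in transcription; training use requires a separate opt-in.
DEFAULT_SETTINGS = {
    "use_for_transcription": True,    # needed to provide the service itself
    "use_for_model_training": False,  # must be an explicit, separate opt-in
    "delete_on_request": True,        # customer can demand deletion
}


def training_use_permitted(settings: dict) -> bool:
    """Training on meeting content is allowed only if the customer
    explicitly flipped the opt-in; absence of the key means no."""
    return settings.get("use_for_model_training", False) is True


print(training_use_permitted(DEFAULT_SETTINGS))  # False by default
```

The design choice worth negotiating for is exactly what the sketch shows: training use defaults to off and silence counts as "no," rather than burying consent in a general terms-of-service checkbox.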
On the internal policy side:
- Decide where AI notetakers are never allowed (board meetings, privileged legal calls, disciplinary meetings, regulated data discussions).
- Require hosts to announce on the record that an AI notetaker is active, explain briefly what it does, and offer participants a chance to object or leave. (HR Source)
- Bake Otter-style scenarios into your data protection impact assessments if you operate under GDPR/CCPA-like regimes. (Michal Sons)
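The internal-policy bullets above reduce to a simple allow/deny check your meeting tooling could enforce before a bot ever joins. A sketch with hypothetical category labels (not any vendor's API):

```python
# Meeting categories where AI notetakers are categorically barred,
# per the internal-policy suggestions above (hypothetical labels).
NOTETAKER_BLOCKLIST = {
    "board_meeting",
    "privileged_legal",
    "disciplinary",
    "regulated_data",
}


def notetaker_allowed(category: str, host_announced: bool) -> bool:
    """Permit an AI notetaker only outside blocked categories, and only
    when the host has announced it on the record to all participants."""
    if category in NOTETAKER_BLOCKLIST:
        return False
    return host_announced


print(notetaker_allowed("standup", host_announced=True))           # True
print(notetaker_allowed("privileged_legal", host_announced=True))  # False
```

Even a gate this crude beats the status quo Brewer describes, because it makes "no bot in privileged calls" a system property rather than a training slide.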
Demand Letters Around AI Notetakers
Brewer is a class action, but this fact pattern scales nicely to pre-suit demand letters, which is where a lot of your work lives.
From different sides:
1. Counterparty / employee / meeting participant letters
- A client or employee discovers that a confidential call was recorded and used for AI training without their knowledge.
- Letter alleges violation of CIPA/ECPA and breach of contract/NDAs, demands:
- cessation of AI recording,
- deletion of specific transcripts, and
- possibly compensation / policy changes.
2. Upstream vendor letters
- A corporate customer realizes their AI notetaker vendor behaves like Otter.
- Their counsel sends a letter to the vendor:
- pointing to contractual representations about law compliance and data use;
- demanding a detailed explanation of consent flows and training practices;
- reserving rights for any non-compliance with wiretap/privacy laws.
A well-aimed letter in either direction does more than "complain." It:
- forces the other side to lock in their story,
- preserves evidence, and
- sets up the record if you end up where Brewer is now.
The Takeaway: AI Meeting Bots Are Not "Just Tools"
Brewer v. Otter.ai is doing for AI notetakers what the early scraping and Books3 cases did for training data: turning vague unease into specific causes of action.
Practically:
- If you're building AI transcription tools, you need all-party consent flows, clear UI notices, and a defensible training policy.
- If you're buying them, you need contracts and internal policies that assume a Brewer-type complaint is possible and allocate risk accordingly.
- If you're on the receiving end of a surprise bot recording, you now have a clean narrative and legal framework for demand letters and, if necessary, litigation.
The convenience of never taking notes again is nice. The inconvenience of your "meeting assistant" being characterized as an unlawful wiretap in federal court is…less nice.