After the Otter.ai class-action: how to choose a meeting recorder you will not regret
The August 2025 federal class-action against Otter.ai changed the calculus for every team buying a meeting transcription tool. Here is what the suit alleges, what your security team will start asking, and how to pick a recorder that holds up under those questions.
In August 2025, a federal class-action complaint was filed in the Northern District of California against Otter.ai. Four parallel suits were later consolidated into In re Otter.AI Privacy Litigation. The motion-to-dismiss hearing is scheduled for May 2026.
If you are responsible for picking the meeting recorder your team uses, this is the moment to read the complaint carefully and ask whether the tool you have today would survive the same scrutiny.
This is not a takedown post. We are not going to claim Otter.ai is uniquely bad. The suit names them because they are the most visible cloud notetaker, but the structural questions apply to almost every cloud-based meeting transcription product. What matters here is the pattern of claims and what they imply about how your team should buy.
What the complaint actually alleges
The headline allegation is that Otter "deceptively and surreptitiously" recorded private conversations and used those recordings to train its machine-learning models. Underneath that headline are several distinct legal theories, each of which has independent weight.
Federal wiretap (ECPA, 18 U.S.C. § 2510 et seq.). The Electronic Communications Privacy Act prohibits intercepting wire, oral or electronic communications without the consent of at least one party. The complaint argues that Otter's bot, joining a Zoom or Meet call as a participant, intercepts communications from people who never gave consent and who often were not even told the bot was there.
California two-party consent (CIPA, Cal. Penal Code § 631). California requires consent from all parties to a confidential conversation. The complaint frames Otter's recording of California-resident participants as a per-se CIPA violation when those participants did not consent.
Computer Fraud and Abuse Act (CFAA). A separate theory: that Otter's bot accessed protected computers (the participants' devices, via the meeting platform) without authorisation. This theory is more aggressive and may not survive motion practice, but it is in the complaint.
Common-law intrusion upon seclusion and conversion. Tort claims that translate the federal and state-statute claims into damages theory.
Unfair Competition Law (UCL). A California catch-all for deceptive business practices, here predicated on the consent issues.
You do not need to be a litigator to read those claims and notice what they have in common. They are all variations on a single fact pattern: a third-party software agent joined a private conversation as a participant, recorded it, and used the recording for purposes the participants did not knowingly authorise.
The architectural question hidden inside the lawsuit
The reason this matters when choosing a meeting recorder is that the suit's underlying fact pattern is baked into the architecture of nearly every cloud notetaker. They almost all work the same way:
- The user grants the tool access to their calendar.
- The tool's bot account joins each scheduled call as a participant.
- The bot streams audio (and sometimes video) to the vendor's cloud.
- The cloud transcribes, summarises and stores.
- The user gets notes; the participants may or may not have been notified.
If a federal court rules that the bot-as-participant model is itself a wiretap absent affirmative all-party consent, every product built on that architecture has the same problem. Otter is named because they are the biggest. Fireflies, Read AI, Sembly, Avoma, Tactiq, Grain and several others use the same pattern.
That is the architectural question your security and legal teams will start asking, often without knowing they are asking it: does the recording happen via a third-party agent that joins the call, or does it happen on the recorder's own device?
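To make the contrast concrete, here is a toy Swift sketch of the two architectures. The names are illustrative, not taken from any real product; the point is what each model exposes about where raw audio goes.

```swift
// Toy model of the two recording architectures discussed above.
// All names are illustrative, not from any real product.

enum CaptureArchitecture {
    case cloudBot    // vendor's bot joins the call as a participant
    case localDevice // user's own machine captures system audio
}

struct AudioPath {
    let recordedBy: String
    let rawAudioLeavesDevice: Bool
    let thirdPartyInCall: Bool
}

func audioPath(for arch: CaptureArchitecture) -> AudioPath {
    switch arch {
    case .cloudBot:
        // The bot streams raw audio to the vendor's cloud for transcription.
        return AudioPath(recordedBy: "vendor bot account",
                         rawAudioLeavesDevice: true,
                         thirdPartyInCall: true)
    case .localDevice:
        // Capture, transcription and storage all happen on the user's Mac.
        return AudioPath(recordedBy: "user's own device",
                         rawAudioLeavesDevice: false,
                         thirdPartyInCall: false)
    }
}
```

Every legal theory in the complaint keys off one of those two booleans: who was in the call, and where the raw audio went.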
What changes if you record on your own device instead
Apple's ScreenCaptureKit, which gained system-audio capture in macOS 13, lets a local Mac app record system audio. No bot joins the call. No third party shows up in the participant list. The user is recording the audio their own device is already playing, the same way QuickTime records the screen.
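As a minimal sketch of what that local capture looks like in code (assuming the user has already granted Screen Recording permission; file writing and error handling are omitted):

```swift
import Foundation
import ScreenCaptureKit
import CoreMedia

// Minimal sketch: capture system audio locally with ScreenCaptureKit.
// No bot joins the call; the Mac records what it is already playing.
final class LocalAudioRecorder: NSObject, SCStreamOutput {
    private var stream: SCStream?

    func start() async throws {
        // Enumerate shareable content (requires Screen Recording permission).
        let content = try await SCShareableContent.excludingDesktopWindows(
            false, onScreenWindowsOnly: true)
        guard let display = content.displays.first else { return }

        // Filter to the whole display; we only want the audio track.
        let filter = SCContentFilter(display: display, excludingWindows: [])
        let config = SCStreamConfiguration()
        config.capturesAudio = true
        config.excludesCurrentProcessAudio = true // don't record ourselves

        let stream = SCStream(filter: filter, configuration: config, delegate: nil)
        try stream.addStreamOutput(self, type: .audio,
                                   sampleHandlerQueue: .global(qos: .userInitiated))
        try await stream.startCapture()
        self.stream = stream
    }

    // Audio buffers arrive here; a real app would append them to a local
    // AVAssetWriter on disk. Nothing is sent to a server.
    func stream(_ stream: SCStream, didOutputSampleBuffer sampleBuffer: CMSampleBuffer,
                of type: SCStreamOutputType) {
        guard type == .audio else { return }
        // ... write sampleBuffer to local storage
    }
}
```

Note that the permission prompt here is Apple's own Screen Recording consent dialog, shown to the person doing the recording, on their machine; the meeting platform and the other participants are never touched by a third-party account.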
That single architectural difference changes nearly every claim in the Otter complaint:
- ECPA / CIPA / state two-party consent. When you record on your own device, you are typically a party to the conversation. One-party consent (you) suffices in most US states. In two-party states, the user is still in a much cleaner position because no third party intercepted anything.
- CFAA "protected computer" theory. No third-party bot accessed the meeting platform; the user's own device captured the audio it was already playing.
- Intrusion upon seclusion. Hard to sustain when the recording device belongs to a participant in the conversation; a party cannot easily be said to intrude on a discussion they were part of.
- GDPR processor analysis (the European version of this question). No third-party processor sits between you and your audio. There is no DPA to negotiate, no sub-processor list to track, no Schrems II transfer impact assessment. Your data does not move.
The architectural switch from "cloud bot joins your call" to "local capture on your device" is the single biggest privacy and compliance lever in this category. Every other choice (encryption, retention, redaction) is layered on top of that.
Questions your security team will start asking by Q1 2026
Based on what we are already hearing from compliance officers, expect these to be standard questions in any meeting recorder evaluation in 2026:
1. Does your tool join meetings as a participant ("bot model") or capture locally on the user's device?
2. If it joins as a participant, do you obtain affirmative all-party consent before recording, including from participants who have no account with your service?
3. Where is the audio stored? In what jurisdiction? For how long?
4. Where is transcription performed? On-device, in your cloud, or via a sub-processor?
5. Is the audio used to train your AI models? Even in aggregate or with anonymisation?
6. Can the audio be retrieved by your engineering team? Under what circumstances? Logged how?
7. What is your data deletion SLA, and is it enforced contractually or merely by policy?
8. For European customers: are you a processor under GDPR, and have you signed our DPA with your sub-processor list?
Tools that built their architecture around cloud bots will struggle on questions 1, 3 and 4. Tools that record on the user's device, transcribe on the user's device, and never see the audio in their cloud can answer "we are not a processor for your audio because no audio reaches us" and have that be technically true.
That is a stronger answer than any compliance document can provide, because it does not depend on policy enforcement; it falls out of the architecture.
A note on Otter, fairly
Otter.ai is a competent product made by a real team that has been in this market longer than most. The class-action does not mean the company acted maliciously; it means the standard architecture of cloud meeting notetakers is now under legal scrutiny that did not exist when those products were designed. Several of Otter's competitors will face similar suits before this is resolved.
The honest reading is that the category needs to evolve. Either cloud notetakers add genuine all-party consent flows that interrupt every meeting (which kills the convenience) or the category shifts toward local capture on the user's device.
We bet on the second path. MeetMemo records on your Mac with no bot joining any call, transcribes on the Neural Engine via WhisperKit, summarises locally via Apple MLX, and stores everything on your disk. The audio never reaches our servers because there are no servers in that part of the architecture. You can read the suit, ask your security team the eight questions above, and check our answers against any other tool you are evaluating.
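For illustration, the local transcription step in such a pipeline might look like the sketch below. The WhisperKit call follows the project's published usage, but treat the exact API shape as an assumption to verify against whichever release you pin; the audio path is a hypothetical local file.

```swift
import WhisperKit

// Sketch of a fully local transcription step: the audio file sits on disk,
// the model runs on-device, and nothing leaves the machine.
// API shape follows WhisperKit's documented usage; verify against the
// release you pin.
func transcribeLocally(audioPath: String) async throws -> String {
    let pipe = try await WhisperKit()                      // loads a local model
    let results = try await pipe.transcribe(audioPath: audioPath)
    return results.map(\.text).joined(separator: " ")      // stitched transcript
}
```

The design point is not the specific library: it is that the function's inputs and outputs are both local files, so there is no network boundary for a regulator, a litigator, or a security questionnaire to interrogate.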
The right meeting recorder for 2026 is one whose architecture survives the suit's questions on its own merits, before any legal review even starts. Pick accordingly.
