Product · 10 min read

The Bot in the Room

Last month, someone on Hacker News posted: “Otter.ai bot recording meetings without consent.”

The thread exploded. Hundreds of comments from people sharing their own horror stories. A job candidate discovering mid-interview that a bot was recording everything. A board meeting where the AI suddenly started emailing transcripts to participants who never signed up. IT admins blocking Otter entirely because “way too much data for a company to have without a contract.”

In August 2025, Otter got hit with a class action lawsuit. The allegation: recording private conversations and using them to train AI models without consent from everyone in the meeting.

But here’s the thing. Privacy isn’t actually the interesting problem here.

The observer effect

In 1924, researchers at a factory called Hawthorne Works discovered something weird. When they turned up the lights, worker productivity increased. When they dimmed the lights, productivity also increased. It wasn’t about the lighting. Workers behaved differently simply because they knew they were being observed.

Psychologists call this the Hawthorne effect. And it happens in every meeting where a bot is present.

When someone says “I’m going to have my AI assistant join to take notes,” the room shifts. People become slightly more careful. Slightly more performative. The CFO thinks twice before making that off-color joke. The junior engineer hesitates before pushing back on the VP’s bad idea.

This isn’t speculation. Stanford’s research on video call fatigue found that the mere presence of self-view (seeing yourself on camera) increases cognitive load and self-evaluation. Add a recording indicator, and the effect compounds. Add an AI bot as a visible participant? You’re not in a meeting anymore. You’re on the record.

The social tax

Meeting bots create an implicit social tax that falls unevenly.

The person who invited the bot feels empowered. They’ll get a transcript, action items, a searchable record. Great.

Everyone else feels watched. They didn’t ask for this. They’re now performing for an audience they can’t see—some future version of their words, stripped of context, stored on someone else’s servers.

In one Reddit thread, someone described it perfectly: “When someone’s Otter bot joins, I immediately become 20% less creative and 50% more careful about what I say.”

This isn’t paranoia. It’s rational behavior adaptation. And it’s silently degrading the quality of workplace conversations everywhere.

Here’s what makes this genuinely hard: meetings involve multiple people, but recording tools typically ask permission from one.

Zoom asks the host. Teams asks the organizer. Otter asks whoever installed it. But the recording captures everyone—including the California resident who didn’t consent (California requires consent from every party to a recorded conversation), the contractor who doesn’t know company policy, and the job candidate who didn’t realize they were being evaluated by AI.

The Otter lawsuit specifically calls this out: the company “sought permission only from meeting hosts (and sometimes not even them), but not from all participants.”

There’s no good solution here within the bot paradigm. Either you annoy everyone with consent pop-ups before every meeting, or you record people who didn’t agree to be recorded. Both options are bad.

A different approach

We built Oatmeal because we wanted the benefits of meeting transcription without the social tax.

The key insight: you don’t need a bot if you’re not sending audio anywhere.

Oatmeal captures your system audio and microphone directly on your Mac. No participant joins your call. No one else sees a recording indicator. The audio is transcribed locally using on-device AI—it never leaves your machine.
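
If you’re curious what “directly on your Mac” looks like in practice, here’s a minimal sketch of the idea: tap the microphone with AVAudioEngine and stream it into an on-device recognizer. It uses Apple’s Speech framework as a stand-in for Oatmeal’s actual pipeline (which runs its own local Parakeet model), so treat it as an illustration of the approach, not our implementation.

```swift
import AVFoundation
import Speech

// Minimal sketch: tap the microphone and transcribe it entirely on-device.
// Microphone and speech-recognition permission prompts are omitted for brevity.
let recognizer = SFSpeechRecognizer(locale: Locale(identifier: "en_US"))!
let request = SFSpeechAudioBufferRecognitionRequest()
request.requiresOnDeviceRecognition = true   // no network call at any point

let engine = AVAudioEngine()
let mic = engine.inputNode
let format = mic.outputFormat(forBus: 0)

// Stream raw mic buffers straight into the recognizer.
mic.installTap(onBus: 0, bufferSize: 4096, format: format) { buffer, _ in
    request.append(buffer)
}

do {
    engine.prepare()
    try engine.start()
} catch {
    print("Could not start audio engine: \(error)")
}

let task = recognizer.recognitionTask(with: request) { result, _ in
    if let result = result {
        print(result.bestTranscription.formattedString)
    }
}
_ = task   // keep a reference so the task isn't deallocated early

// Keep the process alive so results keep streaming in (script context).
RunLoop.main.run()
```

The flag that matters is `requiresOnDeviceRecognition`: with it set, the audio and the transcript stay on the machine.
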

From everyone else’s perspective, it’s just a normal meeting. You’re not making a privacy decision for the room. You’re making one for yourself.

What you give up

I want to be honest about the tradeoffs:

  • Speaker identification is on you. Cloud bots can often identify who said what because they’ve joined as a participant with access to video feeds and speaker data. Oatmeal separates “your mic” from “their audio” (see the sketch after this list), but distinguishing between multiple remote participants requires you to add context.
  • Accuracy varies. Cloud models are trained on more data. Our local Parakeet model hits ~5.8% word error rate—competitive but not always best-in-class for heavy accents or specialized jargon.
  • Apple Silicon only. Real-time local transcription requires serious GPU power. M1 and later, macOS 14.2+. Intel Macs can’t keep up.
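
Here’s the sketch mentioned above. On macOS 13 and later, ScreenCaptureKit can deliver system audio (the remote side of your call) as its own stream, separate from the microphone tap. The `SystemAudioCapture` class is a hypothetical name for illustration; the Apple APIs are real, but this isn’t Oatmeal’s actual code.

```swift
import ScreenCaptureKit
import CoreMedia

// Rough sketch of the "their audio" half: capture system (call) audio via
// ScreenCaptureKit as a track that stays separate from the microphone track.
// Assumes macOS 13+ and that screen-recording permission is already granted.
final class SystemAudioCapture: NSObject, SCStreamOutput {
    private var stream: SCStream?

    func start() async throws {
        // We only want the display's audio, not its video frames.
        let content = try await SCShareableContent
            .excludingDesktopWindows(false, onScreenWindowsOnly: true)
        guard let display = content.displays.first else { return }

        let filter = SCContentFilter(display: display, excludingWindows: [])
        let config = SCStreamConfiguration()
        config.capturesAudio = true
        config.excludesCurrentProcessAudio = true   // don't re-record our own output

        let stream = SCStream(filter: filter, configuration: config, delegate: nil)
        try stream.addStreamOutput(self, type: .audio, sampleHandlerQueue: .global())
        try await stream.startCapture()
        self.stream = stream
    }

    // Remote participants arrive here as one mixed system-audio stream.
    func stream(_ stream: SCStream, didOutputSampleBuffer sampleBuffer: CMSampleBuffer,
                of type: SCStreamOutputType) {
        guard type == .audio else { return }
        // Hand sampleBuffer to the local transcriber, tagged as "them" rather than "you".
    }
}
```

Because everyone on the remote end arrives as a single mixed stream, telling Participant A from Participant B is exactly where you still need to add context yourself.
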

If these tradeoffs don’t work for you, use Otter or Granola or Fireflies. They’re fine products, and sometimes cloud transcription is the right choice.

But if you’ve ever hesitated to turn on your meeting bot because you knew it would change the vibe—we built this for you.

The bot-free future

There’s a broader shift happening. Apple is investing heavily in on-device AI. The M5’s neural engine runs local LLMs 4x faster than M4. Models are getting smaller and more efficient every month.

We’re betting that the meeting bot era is transitional—a stopgap while local compute catches up with cloud capabilities. In a few years, asking “can I have my bot join your call?” will feel as antiquated as asking “can I record this on my cassette player?”

The future of meeting capture isn’t a bot that joins your call. It’s AI that runs silently on your own machine, processing your own audio, creating your own private record.

No bot. No social tax. No permission to ask.

Try it

Oatmeal is $79 once—no subscription, no recurring fees.

Press Cmd+Shift+M to start recording. Take notes if you want. When you’re done, you get a transcript and AI summary.

No account. No bot joining your call. No audio leaving your Mac.

That’s it.