AI experts have warned businesses against using technology for workplace surveillance, saying it risks turning “meetings into scripted theatre”.
Businesses are increasingly pairing workplace surveillance tools with AI meeting transcription and summarisation – but AI experts warn that this shifts the risk from intrusive monitoring to automated judgement at scale.
These tools create a permanent record that can be searched, scored and weaponised in performance management.
These systems turn human work during meetings into legible data: talk time, sentiment, participation and response speed.
People adapt to what is measured, so the organisation drifts towards performative busyness and away from deep work, experts warn.
There is also a fear that data from recorded meetings could be used in HR disputes and disciplinary processes.
Meetings turn into scripted theatre
Rohit Parmar-Mistry, Founder at Burton-on-Trent-based Pattrn Data, said: “AI meeting tools are being sold as productivity. Combined with surveillance, they become automated judgement: a permanent, searchable record that can be scored and reused in performance management. Once you measure talk time, sentiment, responsiveness and ‘action items’, people optimise for the dashboard.
“Organisations drift towards performative busyness and away from deep work. In our AI Audits, the recurring risk is secondary use creep: data collected for one purpose quietly becomes HR evidence, training data, or a disciplinary shortcut.
“Businesses should add a hard rule: no automated performance scoring from transcripts without a human case file and appeal. If you cannot explain who used a transcript and why, you should not be collecting it.”
Colette Mason, Author & AI Consultant at London-based Clever Clogs AI, said the majority of workers don’t want to be monitored at work.
She added: “Information Commissioner’s Office (ICO) research found 70% of workers find workplace monitoring intrusive, and only 19% would take a job knowing they’d be monitored. Now add AI meeting tools that transcribe, summarise, and score every conversation, and that discomfort becomes something harder to name.
“The ICO has warned against function creep, where data collected for one purpose quietly gets repurposed for another. Their example: access logs reused for performance management. Meeting transcripts are the same risk on steroids. Once the recording exists, it migrates into HR disputes, disciplinary cases, and model training whether anyone planned that or not.
“The ICO ordered Serco Leisure to stop using biometrics to monitor staff. A statutory code on AI and automated decision-making is in progress. But most organisations are adopting these tools faster than any laws. The baseline should be obvious: if you’re searching a permanent record of how someone speaks, thinks, and hesitates, that’s personal data. Treat it as such.”
Treat it as personal data by default
Kate Underwood, Founder at Southampton-based Kate Underwood HR and Training, warned that personal data could be abused.
She continued: “An AI meeting transcript isn’t ‘notes’. It’s a searchable record of your people, and that’s personal data. Yes, treat it as personal data by default, even inside your own business. If it names someone, captures their voice, their views, their mistakes, their tone, their ‘I’m struggling’, their ‘I’ve got childcare’, that’s identifiable.
“UK General Data Protection Regulation (GDPR) doesn’t care that it never leaves the building. Internal still counts. And here’s the sting: transcripts can easily scoop up special category stuff without trying. Health. Union chat. Religion. Politics. Once you’ve got it, you’re responsible for it. Not the software provider. You. If you wave this through as ‘productivity’, you’ll get the opposite.
“People clam up. Meetings turn into scripted theatre. Trust evaporates. Do this instead: no default recording, state the purpose in writing, lock down access, delete fast, and if anyone wants to use transcripts in performance management, stop and do a Data Protection Impact Assessment (DPIA) first.”