
When Technology Becomes a Tool of Abuse: The Next Frontier of Coercive Control

Updated: Feb 13


For years, coercive control has been understood as something that happens behind closed doors: a private pattern of domination, surveillance, isolation and fear.


That understanding is now dangerously outdated.


Technology has transformed coercive control from something hidden into something ambient, portable and increasingly normalised.


Smart devices, AI tools, location tracking, image manipulation and covert recording technologies are not neutral. In the wrong hands, they become accelerants for abuse.

And women and girls are already paying the price.


From Physical Surveillance to Digital Omnipresence


Survivors of coercive control often describe the same experience. A sense of being watched. Anticipated. Pre-empted.


Historically, this required proximity. Now it doesn’t.


Smart glasses, spyware, shared cloud accounts, AirTags, voice assistants, location services and AI-powered tools allow perpetrators to monitor, record, manipulate and threaten without ever being present.


Meta’s Ray-Ban smart glasses are only the latest example. Women and girls have already reported being secretly filmed in public spaces, with footage uploaded online, sometimes accompanied by personal details such as names, locations or phone numbers.


The result is predictable and devastating:


  • Harassment

  • Image-based abuse

  • Doxxing

  • Threats

  • Escalation into offline harm


This is not innovation outpacing regulation. This is abuse outpacing accountability.


Tech Abuse Is Coercive Control by Other Means


What matters is not the tool. It’s the function.


Coercive control is about power, domination and the removal of autonomy. Technology now enables perpetrators to:


  • Record without consent

  • Monitor movements in real time

  • Control narratives through manipulated or selectively released content

  • Threaten exposure, humiliation or reputational damage

  • Continue abuse long after separation


Post-separation abuse increasingly relies on technology because it offers plausible deniability. A video “just happened”. A message was “misinterpreted”. Surveillance becomes “co-parenting concern”.


This is where professionals often miss it.


The Jekyll and Hyde Problem for Police and Courts


One of the most consistent failures across policing, family courts and social services is an over-reliance on presentation.


Perpetrators who use tech abuse often appear:


  • Calm

  • Rational

  • Tech-savvy

  • Concerned

  • Credible


Victim-survivors, meanwhile, may present as anxious, distressed, fragmented or fearful. Especially when they know they are being watched or recorded.


Without a working understanding of coercive control dynamics, professionals interpret this backwards.


The result is that abuse is minimised, mutualised or dismissed, while perpetrators exploit systems that were never designed for this level of technological asymmetry.


Why the Law Is No Longer Fit for Purpose


Current legal frameworks were built for a different era.


They struggle to address:


  • Covert recording in public spaces

  • Image-based abuse enabled by AI

  • The cumulative impact of low-level digital intrusions

  • Abuse that sits just below criminal thresholds but creates constant fear


Meanwhile, Big Tech continues to profit from products that increase risk while placing responsibility on users to protect themselves.


That is not safeguarding. That is abdication.


This Is Not About Innovation vs Freedom


This is not a debate about being “anti-tech”.


It is about whether women and girls are expected to surrender privacy, safety and autonomy as the cost of technological progress.


Being filmed without consent is not a feature. Being surveilled without recourse is not freedom. Living under the threat of exposure is not participation in public life.


What Needs to Change


At minimum:


  • Tech abuse must be explicitly recognised as coercive control

  • Police, courts and social services must be trained to identify digital patterns of abuse

  • Regulation must require safety-by-design, not safety-by-complaint

  • Victim-survivors must not bear the burden of proving harm after it occurs


Because once technology enables abuse at scale, the damage is no longer individual.


It becomes systemic.

 
 
 

© 2026 by Safe Haven Education
