There’s a certain charm to grey areas. They’re neither here nor there—like a politician’s promise or a cookie consent banner. Let’s have a wander through three choice examples of how law, tech, and coercion intersect in a fog of plausible deniability. We’ll start where the term “consent” first gained cultural weight—sexual consent—and then follow its cheerful migration into digital life, courtesy of everyone’s favourite data vampires.
1. Sexual consent: A legal fiction?
Let’s be honest. “Sexual consent” exists mostly as a legal construct. It’s the tidy phrase we reach for when courts, lawyers, and HR departments need to put human messiness into bullet points.
In theory, consent means a person agrees to sexual activity voluntarily, with full freedom and capacity. No pressure. No force. No alcohol-shaped loopholes. And yet, welcome to the grey zone: what does voluntary really look like when society routinely muddies the waters with nonsense like “she was asking for it,” “they lived together,” or “they’d done it before”?
Spoiler: wearing a skirt, having a pint, or cohabiting are not tacit contracts for sex.
Cue the debate between “no means no” and its shiny rebrand, “yes means yes”. The latter sounds so much more empowering—until you realise a ‘yes’ said under pressure, fear, or expectation is just as hollow as a ‘no’ that gets ignored. So the wheel’s been reinvented, but it’s still stuck in the same muddy ditch.
Here’s a radical idea: how about no coercion full stop? Forget the semantics—stop pushing people’s will around like it’s your shopping trolley.
2. GDPR: Consent theatre for the digital stage
Now, on to digital consent. Same theme, different costume.
The GDPR arrived in 2018 with trumpet fanfare and heroic PR. At long last, the data-hoarding juggernauts would be tamed! Europe’s digital citizens would finally be free to say no!
Except… we weren’t.
Within hours, the internet was plastered with “agree or go away” banners. Facebook and Google were slapped with lawsuits for “coerced consent”. And thus, a new grey area was born—cleverly branded as user empowerment, but actually just a formalisation of the surveillance economy.
See, before GDPR, the data industry operated in the dark—illegal, yes, but at least honest about it. Post-GDPR, they’re doing the same thing under a patina of legitimacy. Now you can “opt out”—but only after clicking 17 tiny buttons buried in cookie menus designed to look like tax forms.
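For contrast, here’s a minimal sketch of what honest, opt-in consent logic could look like in the browser. The storage key, helper names, and script URL are all made up for illustration; the point is simply that “no” is the default and nothing loads until someone actively says yes.

```typescript
// Minimal sketch of opt-in consent: nothing loads until the user actively
// agrees, and "denied" is the default. All names here are hypothetical.
type Consent = "granted" | "denied";

function storedConsent(): Consent {
  // No recorded choice is treated as "denied": opt-in, not opt-out.
  return localStorage.getItem("analytics-consent") === "granted"
    ? "granted"
    : "denied";
}

function recordConsent(choice: Consent): void {
  // One click either way: rejecting must be exactly as easy as accepting.
  localStorage.setItem("analytics-consent", choice);
}

function maybeLoadAnalytics(): void {
  if (storedConsent() !== "granted") return; // no consent, no tracking script
  const script = document.createElement("script");
  script.src = "https://analytics.example/track.js"; // hypothetical endpoint
  document.head.appendChild(script);
}
```

No seventeen-button scavenger hunt required; one honest question, one honest answer.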
As a charming side note: GDPR doesn’t really apply to state surveillance. So while private firms get sued for tracking you, your own government can collect and process your data with all the oversight of a fox guarding a henhouse. The EU Court of Justice doesn’t get involved if it’s “national security”. That’s right—two rights (privacy and data protection), but only one gets invited to the party.
How about this: no data collection at all, unless there’s a damn good reason and someone’s watching the watchers.
3. FLoC: Google’s Orwellian group hug
Just when you thought tracking couldn’t get creepier, Google introduced FLoC—Federated Learning of Cohorts. Sounds delightfully academic, doesn’t it?
FLoC was meant to replace third-party cookies with something “more private”. Translation: instead of spying on you individually, Chrome puts you in a group of similarly behaved users and spies on all of you at once. Like a neighbourhood watch, but for your browser history.
It’s all very reassuring, until you realise your browser is now a snitch with an anonymous badge. Even better, Google checks if your cohort visits “sensitive” sites (like political, religious, or medical pages)—but don’t worry, it won’t learn what those topics are, it’ll just quietly exclude your group. So it’s not watching, just peeking.
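For the curious, here’s a rough sketch of how any page could have read that cohort label during the trial, via the document.interestCohort() API Chrome exposed; the exact object shape and the logging below are illustrative rather than gospel.

```typescript
// Sketch of reading the FLoC cohort via document.interestCohort(), the API
// Chrome exposed during the origin trial. Logging is purely illustrative.
async function readCohort(): Promise<void> {
  // Only FLoC-enabled Chrome builds exposed this method, so feature-detect.
  if ("interestCohort" in document) {
    try {
      // Resolved to something like { id: "14159", version: "chrome.2.1" }.
      const cohort = await (document as any).interestCohort();
      // Any page, and any third-party script it embeds, could attach this
      // ID to its own analytics, ad requests, or fingerprinting payload.
      console.log(`Cohort ${cohort.id} (version ${cohort.version})`);
    } catch {
      // The promise rejects when FLoC is off or the page opts out with the
      // Permissions-Policy: interest-cohort=() response header.
      console.log("No cohort available");
    }
  }
}
```

Note who gets to call this: not just Google, but every embedded script on every page you visit.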
And once the data brokers get hold of your cohort ID? Oh, they’ll have a field day. Pair it with location, shopping habits, time zones, and the ubiquitous footprint you leave online, and the illusion of anonymity goes up in smoke.
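To make the “up in smoke” bit concrete, here’s a back-of-the-envelope sketch. Every number in it is invented for illustration, but the arithmetic of intersecting signals is the whole trick: each extra data point divides the crowd you can hide in.

```typescript
// Back-of-the-envelope arithmetic: all figures below are invented, but the
// shape of the problem is real. Each signal shrinks the anonymity set.
const chromeUsers = 3_000_000_000;               // rough order of magnitude
const cohortCount = 30_000;                      // "tens of thousands" of cohorts
const cohortSize = chromeUsers / cohortCount;    // ~100,000 people "just like you"

const sameCity = cohortSize / 5_000;             // coarse IP geolocation
const sameTimezoneAndLocale = sameCity / 3;      // browser-reported settings
const sameShoppingHabits = sameTimezoneAndLocale / 10; // loyalty-card style data

console.log(Math.round(sameShoppingHabits));     // ≈ 1: "anonymous" down to a person
```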
Oh, and FLoC was trialled without user consent. Chrome version 89 just turned it on. No “opt in”, no informed decision. Just “new tracking” instead of “old tracking”. Your choice is Coke or Pepsi, but drink up either way.
How about this instead: no tracking. Period.
The algorithmic engine of mayhem
And since we’re being honest—wouldn’t it be nice if the algorithms that run our lives didn’t feed us political bile, conspiracy sludge, or rage bait just to maximise ad revenue?
Just a thought.
Summary
- Consent, whether sexual or digital, isn’t real if it’s coerced.
- Grey areas aren’t neutral—they’re where the powerful hide the rules they don’t want to follow.
- Privacy isn’t preserved by clicking “Accept All”.
- Freedom isn’t choosing the flavour of your surveillance.
If we want clarity, fairness, and autonomy, we need more than checkbox theatre and rebranded tracking. We need to stop pretending the grey area is a compromise. It’s not. It’s a trap.