I couldn’t sleep.
Some Americans watch Downton Abbey for their late-night dose of European drama.
I scroll privacy law case updates.
And yesterday, Luxembourg delivered.
On September 4, 2025, the Court of Justice of the European Union (CJEU) handed down its judgment in EDPS v. SRB (C-413/23 P). The Court set aside the General Court’s 2023 ruling and did something important: it clarified that pseudonymised data is not automatically personal data in all circumstances.
“Pseudonymised data must not be regarded as constituting, in all cases and for every person, personal data” (SRB, para. 86).
If the recipient can’t reasonably re-identify the person behind it, then in that recipient’s hands, the data may not be personal at all.
That’s a big deal. If the Court had gone the other way, every pseudonymised dataset in Europe would have been permanently chained to GDPR, and research, audits, and investigations would have slowed to a crawl.
Instead, the Court said, context matters.
Let’s Rewind
Banco Popular Español was in financial trouble in 2017. The Single Resolution Board (SRB) held a consultation about winding the bank down. People sent in comments — raw, personal opinions about what should happen.
SRB scrubbed names, handed pseudonymised comments to Deloitte, and called it a day. Deloitte could read the comments, but had no keys to link them back to people.
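A rough sketch of that split, in code (hypothetical structure and names; the actual SRB workflow obviously wasn’t a Python script): the controller keeps the key table, the recipient gets only tokens.

```python
import secrets

def pseudonymise(comments: dict[str, str]) -> tuple[dict[str, str], dict[str, str]]:
    """Split identified comments into a key table and a pseudonymised set.

    key_table (token -> name) stays with the controller (SRB's role here).
    pseudonymised (token -> comment) is all the recipient (Deloitte's role) sees.
    """
    key_table: dict[str, str] = {}
    pseudonymised: dict[str, str] = {}
    for name, comment in comments.items():
        token = secrets.token_hex(8)  # random code, meaningless on its own
        key_table[token] = name
        pseudonymised[token] = comment
    return key_table, pseudonymised

keys, shared = pseudonymise({"A. Martinez": "Wind the bank down slowly."})
# The recipient sees something like {'9f2c51d0e47a8b13': 'Wind the bank down slowly.'}
# and, without `keys`, has no route back to a person.
```

Whether `shared` counts as personal data, the Court says, depends on whether the party holding it also holds something like `keys`, or some other realistic route back to a person.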
The case worked its way up to Luxembourg.
And the CJEU drew the line: whether data is personal depends on what the party actually holding it can realistically do to re-identify someone.
Think of it like a safe.
SRB had the safe and the keys. Deloitte only had photocopies of what was inside. Without the keys, the copies weren’t valuables.
At para. 87, the Court stressed:
“Another person” only means someone with actual means to re-identify. Not a theoretical someone, someday.
And at paras. 111–112, the Court reminded SRB: relativity cuts both ways. Deloitte may see anonymous scraps, but SRB holds the keys, so SRB still has to treat the comments as personal data from the moment of collection.
The takeaway: personal data is relative. It depends on who’s holding it and what they can realistically do with it.
And as soon as I read it, I thought — hang on, I’ve seen this before.
Flashback to California
Different continent. Different law. Same theme.
On June 17, 2024, the U.S. Court of Appeals for the Ninth Circuit decided Zellmer v. Meta.
The case centered on Facebook’s “Tag Suggestions.” You remember it. Upload a photo, and Facebook would ask: “Want to tag John? He’s in this picture.”
Creepy. Convenient. Both.
Here’s how it worked: when a photo was uploaded, Facebook computed a fleeting “face signature” for each face it found, a string of numbers derived from the image. That signature was compared against the face templates Facebook stored for its users; a match produced a tag suggestion, and either way the signature was then discarded. On its own, a signature identified no one.
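In code, the distinction the court later leaned on looks roughly like this (toy vectors and made-up function names, not Facebook’s actual pipeline): the signature lives only inside one function call, while templates persist in a store.

```python
import math

def normalize(vec: list[float]) -> list[float]:
    norm = math.sqrt(sum(v * v for v in vec)) or 1.0
    return [v / norm for v in vec]

def suggest_tag(face_crop: list[float],
                stored_templates: dict[str, list[float]],
                threshold: float = 0.9) -> str | None:
    """Compute a fleeting face signature and compare it to stored templates.

    Only people with a template in the store can ever be matched.
    The signature itself is discarded when this function returns.
    """
    signature = normalize(face_crop)  # ephemeral: exists only in this scope
    for name, template in stored_templates.items():
        similarity = sum(a * b for a, b in zip(signature, normalize(template)))
        if similarity >= threshold:
            return name  # match against a stored template: suggest the tag
    return None  # no template, no match, and the signature evaporates
```

For someone with no entry in `stored_templates`, the loop finds nothing, and the numbers computed from their face vanish with the call. That gap, between a fleeting signature and a stored, matchable template, is where the case was decided.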
Clayton Zellmer wasn’t a Facebook user. But his friends uploaded photos of him, which triggered those fleeting signatures. He sued under Illinois’ Biometric Information Privacy Act (BIPA).
The Ninth Circuit wasn’t persuaded.
What the Court Said
At the heart of the opinion:
“Because—on the record before us—face signatures cannot identify, they are not biometric identifiers or biometric information as defined by BIPA” (Zellmer, 2024).
A face signature was just a doodle. A face template that could be matched against a database was like a driver’s license.
One could actually identify. The other couldn’t.
That difference carried the case.
Why It Matters
The Ninth Circuit’s reasoning rhymes with Luxembourg.
Both courts made the same move.
If the data can’t reasonably identify, it’s outside the scope.
And just like SRB, the stakes were real.
If Zellmer had won, every fleeting calculation from a face — even numbers gone in seconds — would have triggered biometric law. That would have gutted privacy-preserving design.
Instead, the Court drew a workable line.
Two Courts, One Principle
Put the two rulings side by side. In SRB: pseudonymised comments, GDPR, a recipient holding no keys; in that recipient’s hands, not personal data. In Zellmer: face signatures, BIPA, numbers that could identify no one; not biometric identifiers at all.
Different statutes. Different facts. Same conclusion: Identifiability is the gatekeeper.
Not speculation. Not theory. Practical, real-world identifiability.
And the metaphor works both ways: photocopies without the keys to the safe, doodles without a database to match them against.
The law follows the same instinct: what can you actually do with the data in front of you?
This isn’t abstract for us.
Years before either ruling, we built the Rock around one rule: no images, just math.
Here’s what happens when someone approaches the Rock: the device converts their face into a mathematical signature and checks it against the signatures already enrolled. On a match, the door opens. No image is ever stored.
And in our hands, that signature is just math.
We can’t re-identify. We can’t link. We can’t cheat.
That’s the design. And it’s the same principle both SRB and Zellmer confirmed:
If data can’t identify, it doesn’t carry the same legal weight.
The Rock proves something simple but powerful. Access control can be secure and privacy-safe. You don’t have to pick one.
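A simplified sketch of that idea (illustrative only, not Alcatraz’s actual implementation): the store holds signatures, never images, and access is a threshold comparison between vectors.

```python
from dataclasses import dataclass, field

@dataclass
class SignatureStore:
    """Holds mathematical signatures only: no images, no way back to a face."""
    enrolled: list[list[float]] = field(default_factory=list)

    def enroll(self, signature: list[float]) -> None:
        self.enrolled.append(signature)  # the image itself never enters the store

    def authorize(self, live_signature: list[float], threshold: float = 0.92) -> bool:
        """Open the door if the live signature is close enough to an enrolled one."""
        return any(
            sum(a * b for a, b in zip(live_signature, ref)) >= threshold
            for ref in self.enrolled
        )
```

A store like this can answer one question, “is this the same signature?”, and nothing else. It can’t reproduce a face and it can’t link a person across datasets, which is exactly the property both courts rewarded.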
For companies, the signal is clear.
Identifiability is the gatekeeper.
That’s what the Alcatraz Rock does. Facial authentication.
It creates a facial signature that opens doors — but never opens privacy risks.
Want to see how privacy-by-design actually works in practice?
Explore the Rock at www.alcatraz.ai.