    September 8, 2025

    When Data Isn’t Personal: Lessons from Luxembourg and California

    By: Amy Osteen, General Counsel of Alcatraz

    I couldn’t sleep.

    Some Americans watch Downton Abbey for their late-night dose of European drama.
    I scroll privacy law case updates.

    And yesterday, Luxembourg delivered.

    On September 4, 2025, the Court of Justice of the European Union (CJEU) issued its opinion in EDPS v. SRB (C-413/23 P). The Court set aside the General Court’s 2023 ruling and did something important: it clarified that pseudonymised data is not automatically personal data in all circumstances.

    “Pseudonymised data must not be regarded as constituting, in all cases and for every person, personal data” (SRB, para. 86).

    If the recipient can’t reasonably re-identify the person behind it, then in that recipient’s hands, the data may not be personal at all.

    That’s a big deal. If the Court had gone the other way, every pseudonymised dataset in Europe would have been permanently chained to GDPR. Research, audits, and investigations would have slowed to a crawl.

    Instead, the Court said, context matters.

    Let’s Rewind

    Banco Popular Español was in financial trouble in 2017. The Single Resolution Board (SRB) held a consultation about winding the bank down. People sent in comments — raw, personal opinions about what should happen.

    SRB scrubbed names, handed pseudonymised comments to Deloitte, and called it a day. Deloitte could read the comments, but had no keys to link them back to people.
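The mechanics of that hand-off can be sketched in a few lines. This is a toy illustration of keyed pseudonymisation, not SRB’s actual process; the key name and sample comments are invented. The point is the key separation: only the party holding the secret key can regenerate a pseudonym and link it back to a person.

```python
# Toy sketch of keyed pseudonymisation (hypothetical, not SRB's process).
# The controller keeps SECRET_KEY; the recipient gets only pseudonyms.
import hmac
import hashlib

SECRET_KEY = b"held-by-the-controller-only"  # never shared with the recipient

def pseudonymise(name: str) -> str:
    """Derive a stable pseudonym. Without SECRET_KEY, it cannot be
    recomputed or reversed, so the recipient cannot re-identify anyone."""
    return hmac.new(SECRET_KEY, name.encode(), hashlib.sha256).hexdigest()[:12]

comments = [
    ("Ana Diaz", "Wind the bank down slowly."),
    ("Ben Kohl", "Shareholders should be heard."),
]

# What the recipient receives: pseudonym + comment. No names, no key.
shared = [(pseudonymise(name), text) for name, text in comments]
```

In the controller’s hands the pseudonyms are linkable, because the key can regenerate them; in the recipient’s hands they are just opaque strings, which is exactly the asymmetry the Court focused on.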

    The case worked its way up to Luxembourg.

    And the CJEU drew the line:

    • For SRB, which kept the keys, the comments were still personal data.
    • For Deloitte, which didn’t have the keys, they weren’t.

    Think of it like a safe.

    SRB had the safe and the keys. Deloitte only had photocopies of what was inside. Without the keys, the copies weren’t valuables.

    At para. 87, the Court stressed:

    “Another person” only means someone with actual means to re-identify. Not a theoretical someone, someday.

    And at paras. 111–112, the Court reminded SRB: you still have obligations from your perspective. Deloitte may see anonymous scraps, but SRB has to treat them as personal at collection.

    The takeaway: personal data is relative. It depends on who’s holding it and what they can realistically do with it.

    And as soon as I read it, I thought — hang on, I’ve seen this before.

    Flashback to California 

    Different continent. Different law. Same theme.

    On June 17, 2024, the Ninth Circuit Court of Appeals in California decided Zellmer v. Meta.

    The case centered on Facebook’s “Tag Suggestions.” You remember it. Upload a photo, and Facebook would ask: “Want to tag John? He’s in this picture.”
    Creepy. Convenient. Both.

    Here’s how it worked:

    • Photo uploaded.
    • System detected a face.
    • System created a face signature — a short-lived string of numbers.
    • If the face matched a user’s stored template, Facebook suggested a tag.
    • If not, the signature was deleted almost instantly.
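The steps above can be sketched as a short function. This is a toy model, not Meta’s actual system: the vectors, threshold, and enrolled templates are invented. What it shows is the property the court cared about: the signature exists only for the duration of the match and is discarded either way.

```python
# Toy sketch of the Tag Suggestions flow (hypothetical, not Meta's code).
import math

def cosine(a, b):
    """Similarity between two face vectors (1.0 = identical direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b)))

# Stored templates exist only for enrolled users.
enrolled = {"john": [0.9, 0.1, 0.3]}

def suggest_tag(face_vector, threshold=0.95):
    signature = list(face_vector)          # short-lived face signature
    try:
        for user, template in enrolled.items():
            if cosine(signature, template) >= threshold:
                return user                # match: suggest a tag
        return None                        # non-user like Zellmer: no match
    finally:
        del signature                      # discarded almost instantly
```

For a non-user, the function returns nothing and retains nothing, which is why the signature, on this record, could not identify anyone.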

    Clayton Zellmer wasn’t a Facebook user. But his friends uploaded photos of him, which triggered those fleeting signatures. He sued under Illinois’ Biometric Information Privacy Act (BIPA).

    The Ninth Circuit wasn’t persuaded.

    What the Court Said

    At the heart of the opinion:

    “Because—on the record before us—face signatures cannot identify, they are not biometric identifiers or biometric information as defined by BIPA” (Zellmer, 2024).

    A signature was just a doodle. A template that could be matched against a database was the driver’s license.
    One could actually identify. The other couldn’t.

    That difference carried the case.

    Why It Matters

    The Ninth Circuit’s reasoning rhymes with Luxembourg.

    • In SRB, pseudonymised comments weren’t personal in Deloitte’s hands because Deloitte had no keys.
    • In Zellmer, face signatures weren’t biometric because they couldn’t identify.

    Both courts made the same move.

    If the data can’t reasonably identify, it’s outside the scope.

    And just like SRB, the stakes were real.
    If Zellmer had won, every fleeting calculation from a face — even numbers gone in seconds — would have triggered biometric law. That would have gutted privacy-preserving design.

    Instead, the Court drew a workable line.

    Two Courts, One Principle

    Put the two rulings side by side:

    • SRB in Europe. Pseudonymised data may not be personal in the recipient’s hands.
    • Zellmer in California. Face signatures may not be biometric if they can’t identify.

    Different statutes. Different facts. Same conclusion: Identifiability is the gatekeeper.

    Not speculation. Not theory. Practical, real-world identifiability.

    And the metaphor works both ways:

    • SRB was about safes and keys.
    • Zellmer was about licenses versus doodles.

    The law follows the same instinct: what can you actually do with the data in front of you?

    This isn’t abstract for us. 

    Years before either ruling, we built the Rock around one rule: no images, just math.

    Here’s what happens when someone approaches the Rock:

    • The image is instantly converted into a facial signature.
    • The images are discarded. Gone.
    • The facial signature cannot be matched with any other information source to determine the identity of an individual.

    And in our hands, that signature is just math.

    • Non-reversible. You can’t turn it back into a face.
    • Disconnected. It isn’t tied to a name, title, or record.
    • Walled off. We don’t hold the keys that could make it personal.
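Those three properties can be illustrated with a minimal sketch. This is a simplification, not Alcatraz’s actual implementation: a one-way digest stands in for the real embedding math, and the function names are invented. The point is what the retained record does and does not contain.

```python
# Toy illustration of the design properties above (a simplification,
# not Alcatraz's implementation). A one-way digest stands in for the
# real signature math.
import hashlib

def enroll(image_bytes: bytes) -> dict:
    # 1. Convert the image into a signature (one-way: no reverse path).
    signature = hashlib.sha256(image_bytes).hexdigest()
    # 2. Discard the image; only the signature survives this function.
    del image_bytes
    # 3. Store nothing that links the signature to an identity:
    #    no name, no title, no record, no re-identification key.
    return {"signature": signature}

record = enroll(b"fake image bytes")
```

The retained record is a single field of math. There is no image to recover and no mapping table to consult, so re-identification is not a matter of policy restraint but of what the system physically holds.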

    We can’t re-identify. We can’t link. We can’t cheat.

    That’s the design. And it’s the same principle both SRB and Zellmer confirmed: 

    If data can’t identify, it doesn’t carry the same legal weight.

    The Rock proves something simple but powerful. Access control can be secure and privacy-safe. You don’t have to pick one.

    For companies, the signal is clear.

    • In Europe: Pseudonymisation done right can move data outside GDPR for recipients.
    • In the U.S.: In some states, not every facial calculation is “biometric.” Design matters.

    Identifiability is the gatekeeper.

    That’s what the Alcatraz Rock does. Facial authentication.
    It creates a facial signature that opens doors — but never opens privacy risks.

    Want to see how privacy-by-design actually works in practice?

    Explore the Rock at www.alcatraz.ai.
