Tea App Breach: How Sensitive Data Exposure Put Women At Risk

30 July 2025

When Assurances Don’t Match Reality

Tea, a women-only dating and advice app that briefly topped the U.S. App Store in July 2025, promised a safe space for women – but a basic data-handling failure produced one of the most visible cases of sensitive data exposure in recent memory. During onboarding, the platform asked users to submit selfies and government-issued ID photos for verification, and promised to delete them after use. It didn't.

404 Media reported that the breach stemmed from a misconfigured Firebase database – Google's backend-as-a-service platform for mobile and web apps. The exposed archive, believed to be a development holdover, contained approximately 13,000 verification selfies and ID photos, along with more than 59,000 in-app images, including public posts, private messages, and comment threads. In total, Tea left roughly 72,000 images publicly accessible on the open internet.
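The exact misconfiguration has not been published, but Firebase exposures of this kind usually come down to permissive Security Rules on a storage bucket. A minimal, illustrative sketch – not Tea's actual configuration – contrasting the wide-open pattern behind many Firebase leaks with a sensible default:

```
rules_version = '2';
service firebase.storage {
  match /b/{bucket}/o {
    // Wide-open pattern behind many Firebase leaks: anyone on the
    // internet can read every object in the bucket.
    // match /{allPaths=**} { allow read: if true; }

    // Safer baseline: nothing is readable or writable without an
    // authenticated user (real apps should scope this further).
    match /{allPaths=**} {
      allow read, write: if request.auth != null;
    }
  }
}
```

A development holdover like Tea's is exactly the kind of bucket that escapes review of these rules, which is why audits of legacy storage matter as much as the rules themselves.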

The breach affected only users who signed up before February 2024, when Tea ended its ID verification requirement. By that point, the platform had collected and retained large volumes of sensitive content – and failed to delete the data once it served its purpose.

As Forbes put it:

The Tea breach is not just a case of leaked data; it is a collapse of purpose. A platform built for safety exposed the very identities it was meant to protect. Legal IDs. Facial recognition data. Personal messages.

Additional Sensitive Data Exposure

The scope of the breach widened further. As reported by 404 Media, private messages, including chats sent as recently as July 2025, were also accessible through the same Firebase misconfiguration. Tea stored these messages in a readable format, despite claiming they were anonymous and encrypted. The exposed messages include intimate discussions on sensitive topics like abortion, infidelity, and emotional abuse. Some messages even contained personal contact information, such as phone numbers or meeting locations.

Tea is also facing two class action lawsuits in California. Plaintiffs allege that the app misrepresented its data practices, failed to follow its own deletion policies, and caused emotional distress by exposing highly sensitive personal data. The lawsuits seek monetary damages as well as injunctive relief – demanding stronger data controls, encryption, and deletion enforcement moving forward.

UK GDPR’s Storage Limitation: A Shield Against Breaches

Under UK GDPR, the Storage Limitation principle requires that personal data be:

  • Collected for a specific, legitimate purpose.
  • Retained only as long as necessary.
  • Deleted or anonymised once that purpose is fulfilled.

Tea retained ID verification images and message content long after their original purpose had passed – a direct violation of the storage limitation principle. Although the company operates in the U.S. and UK GDPR does not apply to it, its failure led to widespread sensitive data exposure and offers a cautionary example for any platform handling high-risk personal data, especially when vulnerable communities are involved.

What proper storage limitation would have done:

  1. Reduced post-use risk
    Once ID verification was complete, selfies and document scans should have been deleted. Retaining them served no user-facing function.
  2. Limited breach impact
    Smaller data holdings mean lower exposure. By keeping legacy posts, messages, and verification images, Tea created a vast attack surface.
  3. Simplified compliance and user rights
    Responding to Subject Access Requests (SARs) and Right to Be Forgotten (RTBF) requests becomes easier and more trustworthy with leaner data.
  4. Upheld trust and platform credibility
    Saying you delete data and actually doing so is foundational to user trust. This failure undercut the very promise of safety.
  5. Reduced storage and security overhead
    Holding unnecessary data increases infrastructure and security burden, with no corresponding benefit.
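Point 1 above is easiest to honour when deletion is wired into the verification step itself rather than deferred to a later clean-up. A minimal Python sketch, using an in-memory dictionary as a stand-in for an object store – all keys and names here are hypothetical, not Tea's actual schema:

```python
# In-memory stand-in for an object store such as a cloud storage bucket.
storage = {
    "verification/alice_id.jpg": b"...",
    "verification/alice_selfie.jpg": b"...",
    "posts/alice_post.jpg": b"...",
}

def complete_verification(user: str, store: dict) -> list[str]:
    """Finish an identity check, then delete the evidence immediately.

    Once verification succeeds, retaining the selfie and ID scan serves
    no user-facing function -- so removal happens in the same step.
    """
    doomed = [k for k in store
              if k.startswith("verification/") and user in k]
    for key in doomed:
        del store[key]          # actually delete, don't just flag
    return doomed               # record of what was removed, for auditing

removed = complete_verification("alice", storage)
# Verification images are gone; ordinary in-app content is untouched.
```

The design choice worth noting is that deletion returns an audit trail: you can prove to users (and regulators) that the promise to delete was kept.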

Implementing storage limitation: Best practices

  • Clearly define data retention policies, especially for sensitive verification and messaging content.
  • Communicate transparently: Tell users what you retain, how long you keep it, and why – then follow through.
  • Automate deletion workflows – avoid exceptions or delays due to manual processes.
  • Incorporate deletion into your product lifecycle – don’t let data linger.
  • Regularly audit legacy systems and storage buckets, especially during platform transitions.
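The automation and audit points above can be sketched as a scheduled retention sweep. A hedged Python example – the categories and retention periods are illustrative placeholders, not a statement of what any policy or regulator requires:

```python
from datetime import datetime, timedelta, timezone

# Hypothetical retention schedule; real periods come from your
# documented retention policy, not from code.
RETENTION = {
    "verification": timedelta(days=0),   # delete once the check completes
    "messages": timedelta(days=365),
    "posts": timedelta(days=730),
}

def sweep(objects: list[dict], now: datetime) -> tuple[list, list]:
    """Partition stored objects into (keep, delete) by retention policy.

    Objects in unknown categories are flagged for deletion rather than
    silently retained, so a forgotten legacy bucket cannot linger.
    """
    keep, delete = [], []
    for obj in objects:
        limit = RETENTION.get(obj["category"])
        age = now - obj["created"]
        if limit is None or age > limit:
            delete.append(obj)
        else:
            keep.append(obj)
    return keep, delete

now = datetime(2025, 7, 1, tzinfo=timezone.utc)
objects = [
    {"category": "verification", "created": now - timedelta(days=500)},
    {"category": "posts", "created": now - timedelta(days=30)},
    {"category": "legacy_dump", "created": now - timedelta(days=900)},
]
keep, delete = sweep(objects, now)
```

Run on a schedule, a sweep like this turns "we delete old data" from a promise into a verifiable process – the opposite of the development holdover that sat in Tea's storage for years.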

Final Thoughts

Tea didn’t just suffer a breach – it failed to follow basic data retention principles. Women trusted the platform to protect their most private moments. Instead, Tea exposed sensitive data that it should have deleted, causing one of the most visible and damaging privacy failures to date.

Storage Limitation is not just a checkbox under UK GDPR. It is a core safeguard that protects people when technology falters.

Get In Touch

If you’re a cultural organisation looking for tailored support, plain English policies, or practical training that empowers your team, we’d love to help. Get in touch for a free 30-minute consultation.
