Deepfake case study · Image
Tea App Data Breach Affecting 72,000 Users (Jul 2025)
- Incident date: Jul 2025
- Target: 72,000 users of the Tea App
Updated May 6, 2026 · 1 min read
A data breach at the Tea app exposed 72,000 user records, including government-issued IDs and selfies. The app's backend was left unsecured as a result of 'vibe coding': developers used AI tools such as ChatGPT to generate code without rigorous security review. This left a Firebase storage bucket publicly accessible with no authentication, exposing sensitive user data to anyone who found the endpoint.
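To illustrate the general class of misconfiguration described above (this is a hypothetical sketch, not Tea's actual configuration), a Firebase Storage security rules file that grants open access looks like the following, with a safer authenticated baseline shown in comments:

```
rules_version = '2';

service firebase.storage {
  match /b/{bucket}/o {
    // Insecure: anyone on the internet can read and write every object.
    // Rules like this often ship from tutorials or AI-generated scaffolding.
    match /{allPaths=**} {
      allow read, write: if true;
    }

    // Safer baseline (illustrative): require a signed-in user and scope
    // each user's files to their own UID.
    // match /users/{userId}/{allPaths=**} {
    //   allow read, write: if request.auth != null
    //                      && request.auth.uid == userId;
    // }
  }
}
```

The key difference is that the insecure rule's condition is unconditionally `true`, while the safer rule checks `request.auth`, which Firebase populates only for authenticated requests.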