AI at Center of Santa Barbara Sex Crimes Case - The Santa Barbara Independent
A Santa Barbara resident was convicted under a new California law for creating and distributing AI-generated child sexual abuse material (CSAM), marking one of the first cases of its kind on the Central Coast.
- Incident date: May 2026
- Target: Kaylin Hayman
The landmark conviction highlights the growing legal challenges posed by deepfake technology and marks an early test of California's statute criminalizing AI-generated abuse imagery.
What happened
Dayton Aldrich, a former city employee, was arrested in August 2025 after authorities received information from the National Center for Missing and Exploited Children (NCMEC) about sexually explicit conversations and potential CSAM. Investigators traced the activity to Aldrich's residence and discovered a history of interest in minors on a mobile messaging app. A review of his content revealed AI-generated CSAM, including images of two identifiable minor victims. One victim was a former child actress, and the other was active on TikTok.

Aldrich pleaded guilty to possessing CSAM and received a sentence of one year in county jail, two years of probation, and lifetime sex offender registration.

The new law, Assembly Bill 1831, was created in response to the rise of deepfake technology and was co-sponsored by Ventura County District Attorney Erik Nasarenko. Kaylin Hayman, whose face was used to create explicit material, advocated for the bill.