Therapist Rossi works with clients who've had real images of themselves turned into sexually explicit content without their consent. All of her current clients identify as women. This type of image-based abuse, known as an AI-generated explicit deepfake, is frequently perpetrated by a current or former intimate partner, or by a known friend, coworker, or neighbor, as a form of harassment and stalking.

Rossi, a licensed clinical social worker in New York, has seen her clients heal from this betrayal, but the journey is long and rarely predictable. In some states, creating and distributing an explicit deepfake may be against the law, but even so, local law enforcement often has few resources to investigate such cases. The victim typically has to marshal her own response.

Among her options is attempting to track down the imagery and issue takedown notices wherever it appears, but there's no guarantee she'll locate all of it. Rossi says explicit deepfakes are often traded between individuals and downloaded without the victim's knowledge. A successful day of takedowns doesn't mean the next day will be the same.

The imagery may pop up on new platforms. The perpetrator may send it to the survivor's friends, family, and employer. Rossi says survivors naturally become hypervigilant.

They often want, impossibly, to avoid the internet altogether. Sometimes they become fixated instead on monitoring imagery of themselves online, using the internet excessively to do so. "Being victimized through deepfake.