
Snapchat is a tool of choice for sextortionists, sex traffickers, and child abusers — and the company knows it. Internal documents and whistleblower accounts reveal that reports of abuse have often gone ignored, critical safety fixes were dismissed to protect engagement metrics, and features like My AI have promoted statutory rape. The technology exists to stop this, yet Snap chooses to look the other way.
Snapchat remains a prime tool for sextortionists, sex traffickers, and child abusers to commit their crimes. And it’s no surprise.
Internal documents and whistleblower accounts suggest that Snapchat has consistently deprioritized user safety in favor of growth and engagement metrics.
Snapchat’s deliberate features and design choices have made it a hub for exploiters. The platform’s disappearing-messages feature, marketed as a privacy tool, has been weaponized by predators to coerce minors into sending explicit content, which is then saved or screenshotted for blackmail. Snap’s own research reportedly confirmed:
“One-third of teen girls and 30% of teen boys reported being exposed to unwanted contact on Snapchat in 2022. Internal surveys from Snap revealed that over half of Gen Z users or their friends had experienced catfishing, and many were victims of sextortion. Rather than confront and address these alarming realities, Snap chose to ignore user reports. Alarmingly, one internal investigation concluded that 70% of victims had not reported their abuse because they knew no action would be taken by Snap; indeed, of the 30% that did report, none were addressed.”
Progress Remains Slow and Ineffective
While Snapchat has introduced some safety measures, such as warnings for messages from strangers and restrictions on friend requests, these fall far short of addressing the platform’s systemic issues. Many of these features rely on users recognizing risks and taking action, which is unrealistic for minors who may not fully understand the dangers or who feel pressured not to report abuse. Age verification remains weak, allowing underage users to access the platform and predators to pose as young peers. Parental controls are limited and optional, leaving many families without the tools needed to protect their children, and leaving children without involved, tech-savvy guardians completely on their own. Even reporting tools, though improved, depend on victims overcoming shame and fear, an unlikely scenario for many, and one that arises only after harm has already occurred. That is why Snapchat should invest in prevention: stopping the harm before it takes place.
“Snap knowingly incorporates design features that facilitate and enable the distribution of drugs and the sexual exploitation of children, including the distribution of CSAM, child sex trafficking, child pornography, and child predatory activities. These features include ephemeral content, personalization algorithms, Snap Map, and My AI.”
— complaint filed by the Utah Department of Commerce / Utah Office of the Attorney General against Snapchat / Snap, Inc.
It’s Time to Be Honest
The systemic and historic problems at Snapchat are no accident; they flow from Snapchat’s business model and design philosophy. Make no mistake: this is a platform whose very architecture appears intentionally to allow the creation and sharing of sexually explicit material, including material depicting minors. If Snap truly prioritized child safety, it would immediately deploy existing technologies to block explicit content creation and sharing, cutting off the tools predators rely on. Yet it hasn’t. Every day, sextortion, child sexual abuse material, and images used to blackmail and traffic children continue to proliferate, a direct result of deliberate design choices.
The tech to stop this exists. Snap’s failure to act is a conscious and knowing decision that leaves children exposed to grave harm.
WARNING: Any pornographic images have been blurred, but are still suggestive. There may also be graphic text descriptions shown in these sections. POSSIBLE TRIGGER.
It’s important to note that Snapchat’s feature allowing the ephemeral creation/distribution of pornography is a welcome mat for sextortionists.
Below is a non-exhaustive sampling of cases documented in 2025. These and other sexual exploitation issues on Snapchat are long-standing problems tracing back to the app’s origin, rooted in both its design and its chronic inability or unwillingness to prevent and moderate exploitation risks.
A report from Thorn and NCMEC released in June 2024, Trends in Financial Sextortion: An Investigation of Sextortion Reports in NCMEC CyberTipline Data, analyzed more than 15 million sextortion reports to NCMEC between 2020 and 2023. The report found:
**Previous years of proof available upon request
Sextortion is the use of sexual images to blackmail the person depicted in those images. It often encompasses a financial element as well, where predators demand money, threatening to publish explicit images of children if they do not comply.
Teenage boys account for 90% of financial sextortion cases involving minors.
On average, there were 812 reports of sextortion per week to the National Center for Missing and Exploited Children (NCMEC) in the last year of data analyzed, with more than two-thirds of these reports appearing to be financial sextortion, according to a joint NCMEC and Thorn report.
In 2024, 1 in 4 young people (24%) reported having personally experienced sextortion as a minor, including 1 in 5 (20%) who were teens at the time of the survey and 1 in 3 (31%) who were young adults at the time of the survey, according to an online survey of 1,200 young people ages 13-20 in the U.S.
One of the most severe consequences of sextortion is suicide: at least 36 sextortion cases have resulted in suicide since 2021. Sextortion can escalate with terrifying speed, sometimes within hours of first contact, which is why we need mass-scale prevention and safety by design. Platforms must do more to prevent strangers from contacting kids, because there is often no time to wait for a response to a report before significant harm occurs. We need platforms to take preventative action, not just reactive action.
We must also recognize that adults are victims of sextortion. A survey across 10 countries found that 1 in 7 adults have been threatened with the distribution of sexual images of themselves. Adult victimization is often underreported, as victims are silenced by shame and fear of social consequences, yet adults are particularly targeted for their financial resources. Further, when these individuals hold sensitive government positions, sextortion escalates into a national security threat.
Snapchat remains a premier destination for sex traffickers and predators to create and distribute child sexual abuse material (CSAM), and also to groom and extort people into sex trafficking.
It’s important to note the intrinsic connection between Snapchat’s features allowing the ephemeral creation/distribution of pornography and the ability for predators to coerce children to generate CSAM, which is then often used as blackmail to further coerce or manipulate them into either continued CSAM production, sex trafficking, or other abuses.
Below is a non-exhaustive sample of cases documented in 2025. These and other sexual exploitation issues on Snapchat are long-standing problems tracing back to the app’s origin, rooted in both its design and its chronic inability or unwillingness to robustly prevent and moderate exploitation risks.
**Previous years of proof available upon request
“This child … was groomed, exploited and then sexually abused by strangers who found her online,” Louisiana’s attorney general, Liz Murrill, said. “This is just one example of the dangers of social media and of human trafficking.”
“[One man] posed as a 17-year-old [on Snapchat] and solicited multiple sexually explicit photos and videos from [the victim.] Over time, [the man] allegedly coerced the girl into creating a Grindr profile, instructing her to list a false age of 18… After the profile was created, reports indicate that her alleged trafficker arranged meetings with three men in Pennsylvania… An additional sexual assault was allegedly recorded during a Snapchat video call…”
“After his victims sent him sexually explicit content, [the man] would sometimes demand additional sexually explicit images and videos from them. Wallin also threatened to publish or otherwise expose the prior pictures and videos sent by the victim or created by defendant if the victim did not comply with his demands. For example, in February and March of 2020, [the man] enticed a victim, who was approximately 9-10 years old at the time, to produce sexually explicit images and videos used on the social media app Snapchat. [The man] admitted in his plea agreement to knowingly causing at least four additional victims – ranging in age from 12 to 16 years old – to produce multiple files of child pornography.”
New Mexico AG Raúl Torrez: Lawsuit Alleging Snapchat Enables Sextortion and Child Sexual Exploitation — New Mexico Office of the Attorney General
Date: Sep 5, 2024 (press release / complaint filings).
The New Mexico AG’s complaint alleges Snapchat’s design and policies facilitated sextortion and sexual exploitation — claiming minors report more sexual interactions and that more trafficking victims are recruited on Snapchat than other platforms.
The alleged harms laid out in the case are summarized to include:
It was also reported that the AG’s office set up a decoy account and swiftly encountered sexually exploitative messages and contacts:
The suit maps out the many ways that Snapchat has allegedly become a “breeding ground” for child predators, as demonstrated by an undercover investigation conducted by New Mexico’s Department of Justice. The office first set up a decoy account for a 14-year-old girl with the username Sexy14Heather (“Heather”), initially listing her sign-up age as 18, but later modifying it to a minor’s account. Within a day of merely searching for other 15-year-olds on the app, and without adding any other users, Heather received a friend request from Enzo (Nud15Ans), who swiftly requested they exchange anonymous messages off Snapchat through a ngl.link. After this single exchange, and despite her account being private, Snapchat suggested nearly one hundred other users to Heather, including other adults who sought to exchange sexually explicit content. An additional search that suggested Heather was looking for other users under 18 led to continued recommendations for explicit accounts with usernames like “naughtypics,” “gayhorny13yox,” and “teentradevirgin.” And despite never directly engaging with them, the decoy account received push notifications that referenced even more explicit content.
Nevada AG Aaron D. Ford: Lawsuit Targeting Snapchat’s Design and Corporate Practices as Hazardous to Children
Nevada Attorney General Aaron Ford launched lawsuits on January 30, 2024, in state court against Snapchat and other social media giants, accusing them of deceptive trade practices and negligence for designing addictive platforms that exploit children’s developing brains for profit. Snapchat, in particular, is branded an “addiction machine” akin to an illegal drug: engineered to demand constant engagement, even during driving or meals, through manipulative features that maximize youth use and emotional hooks. The suit alleges these design choices foster severe harms, including sexual exploitation, auto accidents, drug overdoses, suicides, and eating disorders.
The complaint highlights how Snapchat’s ephemeral messaging and camera-first interface lure teens into constant interaction, enabling risks such as sextortion via threats that disappear and the private sharing of explicit content that evades easy detection, all while prioritizing revenue over safety.
It alleged:
Utah AG Derek Brown: Lawsuit Alleging Snapchat’s AI and Platform Features Facilitate Sextortion and Exploitation of Minors — Statement
The Utah AG asserted that:
“Snapchat is designed to steal time and attention away from teens at the expense of their development, health, and welfare. This complaint brings three separate counts, alleging that:
Examples from the Complaint include:
It also asserts in the public complaint:
Automatically detect and block sexually explicit images, especially when sent to or from minor accounts. Provide clear warnings, resources, and prompts to report, block, or remove offending accounts. Notify minors attempting to send nude images of the risks, and prevent the image from being sent. Bumble, Apple, Google, and Discord already use some form of technology to proactively blur sexually explicit images before they are viewed; the technology exists, and this is possible (see the sketch after this list).
Proactively and more efficiently identify, remove, and block accounts and bots that post and promote pornographic content (whether publicly or in direct messages) and/or sell sex.
Fix age verification loopholes. Users can lie about their age during signup without robust checks, allowing underage kids to access adult-oriented features or adults to pose as teens, facilitating initial contact and grooming.
Halt minors’ access to friend-finding mechanics beyond existing phone contacts. Current suggestions based on mutual friends or proximity can expose minors to adult strangers.
Improve moderation and reporting effectiveness and consistency. Reports of inappropriate content or behavior are often not acted upon promptly, allowing extreme or violent material to reach minors and predators to continue operating.
Expand Family Center functions to allow parents to see what their children are exposed to on Stories and in other areas of the app, and send alerts to parents when their children add or remove friends, change settings, or attempt to send or receive sexually explicit images.
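To make the first recommendation above concrete, here is a minimal, hypothetical sketch of how on-device screening could work. This is not Snap’s code or any vendor’s actual implementation: the classifier function is a placeholder for any nudity-detection model (comparable in spirit to the blurring tools Bumble, Apple, Google, and Discord already ship), and the threshold is an assumed value.

```python
# A minimal sketch of the detect/warn/block flow described above, NOT an
# actual platform implementation. `predict_nsfw_probability` is a
# hypothetical placeholder for an on-device explicit-content classifier;
# the blurring step uses Pillow, which is a real library.
from typing import Optional

from PIL import Image, ImageFilter

BLOCK_THRESHOLD = 0.85  # assumed confidence cutoff; a real policy would tune this


def predict_nsfw_probability(image: Image.Image) -> float:
    """Hypothetical stand-in for an explicit-content classifier (returns 0.0-1.0)."""
    raise NotImplementedError("plug in a real on-device classifier here")


def screen_outgoing_image(path: str, sender_is_minor: bool) -> Optional[Image.Image]:
    """Return an image that is safe to send, or None if the send should be blocked.

    Blocking outright for minor accounts and blurring-before-view otherwise
    mirrors the detect/warn/block recommendations listed above.
    """
    image = Image.open(path)
    score = predict_nsfw_probability(image)
    if score >= BLOCK_THRESHOLD:
        if sender_is_minor:
            # Block the send for minor accounts; at this point the app would
            # also show a warning and surface reporting resources.
            return None
        # For adult accounts, blur the preview so the recipient must opt in.
        return image.filter(ImageFilter.GaussianBlur(radius=30))
    return image
```

The point of the sketch is that the building blocks, a classifier plus a blur or block step, are off the shelf; deploying them is a policy decision, not a technical obstacle.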
Report suspected child sexual exploitation to the National Center for Missing & Exploited Children (NCMEC) CyberTipline
NCMEC’s Take It Down service: Resource for minors to remove their sexually explicit content from online platforms
App Danger Project: Snapchat
Stop Non-Consensual Intimate Image Abuse (StopNCII) – Resource for adults to remove image-based sexual abuse from online platforms
Protect Young Eyes: Snapchat App and Parental Control Review
Spread the word to hold Big Tech accountable. Use these free resources to post on social media or share via email. Your voice can create change!