CRBC News

Ex-Instagram Safety Chief Says Meta's "17x" Policy Allowed 16 Sex‑Trafficking Violations Before Suspension

Former Instagram safety head Vaishnavi Jayakumar testified that Meta had an internal "17x" policy allowing accounts to accrue 16 violations for prostitution and sexual solicitation before being suspended on the 17th. She also said that as of March 2020 Instagram lacked a dedicated in‑app reporting path for child sexual abuse material, a change she repeatedly requested. Plaintiffs say internal documents back her testimony and argue Meta's practices harm teens' well‑being; Meta counters that it now removes accounts for the most serious trafficking violations and has added protections and reporting tools.


Vaishnavi Jayakumar, a former head of safety and well‑being for Instagram, testified in federal court that Meta had an internal "17x" policy that effectively allowed accounts to accumulate 16 violations for prostitution and sexual solicitation, with suspension coming only on the 17th offense. Her testimony appears in court documents filed Nov. 21 in the Northern District of California.

Key testimony and allegations

Jayakumar told the court that in March 2020 Instagram did not offer a dedicated, in‑app reporting option for child sexual abuse material (CSAM), a feature she said she requested repeatedly but was told would be too costly or time‑consuming to build. She described the 17x threshold as "by any measure across the industry, a very, very high strike threshold." Plaintiffs say internal records corroborate her account and contend that Meta never made the public, parents, or schools aware that accounts could remain active after more than 15 alleged instances of trafficking.

Company response and policy change

Meta has told the court and public that it now applies a "one strike" approach for the most severe violations involving human exploitation and trafficking, removing accounts immediately when a serious violation is confirmed. The company says the broader strike policy originated in 2019 and that the threshold for discipline was reduced over time.

Mental health and broader claims

Beyond trafficking allegations, the plaintiffs' filing accuses Meta and other platforms named in the suit of contributing to an "unprecedented mental health crisis" among U.S. teenagers. The complaint argues the companies prioritized younger users—who generate significant advertising revenue—while omitting or delaying parental and teacher safeguards. Previn Warren, co‑lead counsel for the plaintiffs, compared those practices to tactics once used by the tobacco industry, saying products were knowingly made addictive and marketed in ways that increased usage and profits despite harms.

"Meta has designed social media products and platforms that it is aware are addictive to kids, and they’re aware that those addictions lead to a whole host of serious mental health issues," the plaintiffs' attorney said in public comments tied to the filing.

Reporting tools and law enforcement cooperation

Jayakumar testified that adding an option to report CSAM inside the Instagram app would not have required substantial engineering work—"you essentially add an additional option to the existing number of reporting … options that are out there," she said. At the time she raised the issue, Instagram already allowed users to report a range of other violations in the app (for example: spam, intellectual property infringement, and promotion of firearms).

Meta has pointed to prior internal analyses and a February 2021 company post detailing the illegal child‑exploitative content it reported to the National Center for Missing and Exploited Children (NCMEC) in late 2020. The company says it has since developed targeted tools and policies to reduce sharing of such content and adds that Instagram now provides guidance on how users can report child sexual abuse. Meta also says it reports apparent instances of child sexual exploitation to NCMEC in accordance with applicable law.

What's next

The case remains active, and the court record will continue to develop. Plaintiffs maintain their claims are supported by internal documents; Meta disputes the allegations and points to the protections and changes it says it has implemented to safeguard young users.
