
Meta faces damning lawsuits alleging the tech giant knowingly turned Instagram into a “hunting ground” for predators, choosing profits over children’s lives, as internal documents expose years of deliberate negligence that led to teenage suicides.
Story Highlights
- Two families sue Meta after 13-year-old Levi and 16-year-old Murray died by suicide following Instagram sextortion schemes
- Unsealed internal documents show Meta has known since 2019 that Instagram exposed teens to 5.4 million unwanted messages daily
- Meta executives rejected fixes that would have made teens’ accounts private by default, prioritizing user engagement over child protection
- The platform’s algorithms recommended 1.4 million teens to potential predators in a single day, according to the company’s own data
Corporate Negligence Exposed Through Internal Documents
The Social Media Victims Law Center filed explosive lawsuits in Delaware State Court against Meta, armed with internal company documents that reveal a calculated decision to sacrifice child safety for profit margins. These unsealed records show Meta’s own researchers documented 3.5 million profiles engaged in inappropriate interactions with children through direct messages as early as 2019. Despite clear warnings from internal teams, Meta executives consistently rejected proposals to default teen accounts to private settings, fearing it would reduce user engagement and revenue streams.
The lawsuits center on two tragic cases that exemplify Meta’s corporate callousness. Thirteen-year-old Levi Masiejewski of Cumberland County, Pennsylvania, joined Instagram in August 2023 and was targeted by predators within just two days. Extorted for $300 he did not have, with only $37 to his name, Levi took his own life rather than face the humiliation threatened by criminals who found him through Instagram’s predator-friendly features. Similarly, 16-year-old Murray Dowey of Scotland became another victim in December 2023, dying by suicide after falling prey to the same sextortion schemes that Meta’s own data showed were proliferating on its platform.
Platform Design Deliberately Endangers Children
Meta’s internal surveys revealed the staggering scope of predatory behavior enabled by Instagram’s architecture, with 13% of users aged 13-15 reporting weekly unwanted sexual advances by 2022. The company’s “Accounts You May Follow” feature and public follower lists systematically connected adult predators with vulnerable teenagers, creating what attorney Matthew Bergman accurately describes as a “hunting ground.” These features weren’t accidental oversights but deliberate design choices that Meta maintained despite knowing the deadly consequences for America’s children.
The evidence demonstrates Meta’s executives prioritized engagement metrics over basic child protection, rejecting safety recommendations that would have prevented millions of daily unwanted contacts. This corporate decision-making reflects the broader Silicon Valley mentality that treats American families as expendable revenue sources. Meta’s willingness to facilitate predatory access to children represents a fundamental attack on family values and parental authority, undermining the basic expectation that technology companies won’t actively endanger our kids for profit.
Families Demand Justice Against Big Tech Predation
Trisha Masiejewski, Levi’s mother, captured every parent’s nightmare when she stated, “I never imagined strangers could send messages to teenagers.” Her testimony exposes how Meta deliberately concealed the predatory nature of its platform from families who trusted the company with their children’s safety. The lawsuits seek not only financial damages but also accountability for Meta’s systematic endangerment of minors through negligent platform design that any reasonable company would have recognized as dangerous.
Attorney Bergman’s legal strategy focuses on proving Meta’s knowledge and deliberate indifference; the company, he says, “chose profits over lives” when presented with clear evidence of harm. This case represents a critical battleground for American families against Big Tech’s exploitation of children. Meta’s recent rollout of private teen accounts in September 2025, coming only after years of preventable tragedies, proves the company always had the capability to protect children but chose not to act until facing legal consequences.
Sources:
SMVLC Files Lawsuits After Multiple Teens’ Suicides Following Instagram Sextortion
Meta sued after teen boys’ suicides, families claim tech giant ignored ‘sextortion’ schemes