A 13-year-old girl at a Louisiana middle school got into a fight with classmates who were sharing AI-generated nude images of her

The girls begged for help, first from a school guidance counselor and then from a sheriff’s deputy assigned to their school. But the images were shared on Snapchat, an app that deletes messages seconds after they’re viewed, and the adults couldn’t find them. The principal had doubts they even existed.

Among the kids, the pictures were still spreading. When the 13-year-old girl stepped onto the Lafourche Parish school bus at the end of the day, a classmate was showing one of them to a friend.

“That’s when I got angry,” the eighth grader recalled at her discipline hearing.

Fed up, she attacked a boy on the bus, inviting others to join her. She was kicked out of Sixth Ward Middle School for more than 10 weeks and sent to an alternative school. She said the boy whom she and her friends suspected of creating the images wasn’t sent to that alternative school with her. The 13-year-old girl’s attorneys allege he avoided school discipline altogether.

  • MystValkyrie@lemmy.blahaj.zone · +9 · edited · 6 hours ago

    This problem won’t stop until law enforcement starts treating deepfakes of minors as possession of child pornography with all the legal ramifications that come with it. Young boys need to understand that their actions have consequences.

    In the meantime, no one under 18 should be on social media. I wish deepfakes, or AI in general, could just be illegal, but the laws aren’t catching up and people are being victimized.

  • uncouple9831@lemmy.zip · +63/−2 · edited · 11 hours ago

    The headline is misleading. She was expelled because she was so frustrated by the incompetence of the administration and the police that she took matters into her own hands and attacked someone. I think it’s justified, but the headline is misleading.

    The same story could be told with “school and police fail woman being attacked” but since that happens every day, it’s not as punchy.

    I am sure people will interpret this as me trying to justify her being expelled or something but you people can fuck right off.

    • Wilco@lemmy.zip · +3/−1 · 6 hours ago

      I sympathize. We should be able to vigilante a MFer if the police will not open a case. Porch pirates stealing packages? Package traps and rocksalt in shotguns. Corrupt government officials … guillotine. Jury nullify this shit.

    • Jyek@sh.itjust.works · +25 · edited · 12 hours ago

      The headline could have punched so much harder with the truth because it is divisive and justifies multiple ideologies.

      “Preteen expelled for physical retaliation after school fails to protect her from AI deep fake nudes.”

      • Justifies zero tolerance believers
      • Justifies feminists who think she should be a protected class
      • Justifies home school proponents
      • Justifies public school reform proponents
      • Justifies anti-AI crowd
      • Appeals to people for whom children ought to be protected

      Give more truth in the headline and leave the opinions and slant for the editorial section.

      • AA5B@lemmy.world · +5 · 10 hours ago

        Yeah, but that headline tells the entire story, and in a balanced way. You wouldn’t need the article content to hold eyeballs on ads.

      • ObjectivityIncarnate@lemmy.world · +1/−1 · edited · 7 hours ago

        school fails to protect her from AI deep fake nudes

        I hear you, but what could the school have actually done to prevent this, realistically? Only way I could see is if smartphones etc. were all confiscated the moment kids step on the school bus (which is where this happened, for anyone not aware, it wasn’t in a classroom), and only returned when they’re headed home, and while it probably would be beneficial overall for kids to not have these devices in school, I don’t think that’s realistically possible in the present day.

        And even still, it’d be trivial for the kid to both generate the images and share them with his buddies, after school. I don’t think the school can really be fairly blamed for the deepfake part of this. For not acting more decisively after the fact, sure.

        • Jyek@sh.itjust.works · +1 · 6 hours ago

          That’s not a question for me to answer. It is, in fact, the school faculty’s duty to educate our schoolchildren as well as protect them. It is up to them to determine how to do that. It is also true that they failed her in this instance. There are preventative measures that schools can take to stop bullying, both on campus and online. Every student who is bullied into taking their own drastic measures has been failed by the system. In this case, doubly so: on top of her being bullied into retaliation, she was punished by the system for being failed by the system.

      • muusemuuse@sh.itjust.works · +1 · 11 hours ago

        Would this qualify as revenge porn? And pedophilia. And retaliation. And… well, she’s going to have an impressive college fund by the time this is all done.

        • Jyek@sh.itjust.works · +5 · 11 hours ago

          Yes, deepfakes have been reclassified as revenge porn in many US states and much of the EU. Most countries have also classified any sexually explicit depiction of a minor as CSAM, or, as most people refer to it, child porn.

    • Lady Butterfly she/her@reddthat.com · +3/−1 · 12 hours ago

      That’s my take as well. From what she says, she was totally failed by the school and was understandably unhappy and angry. But she tried to get others to assault him, and for her to be so severely punished, it’s possible her attack was quite severe or that there was a history of problems.

  • Taldan@lemmy.world · +49 · 14 hours ago

    the images were shared on Snapchat, an app that deletes messages seconds after they’re viewed, and the adults couldn’t find them

    If the Sheriff couldn’t get the images, it’s because he didn’t bother to. It’s a well-known fact that Snapchat retains copies of all messages.

    • NotMyOldRedditName@lemmy.world · +1 · edited · 6 hours ago

      Is the allegation of CSAM enough to get a warrant? If the Sheriff saw it once, then absolutely, but without that?

      Edit: I do imagine that if a child testified to receiving it, that would be enough to get a warrant for their messages, which would then show it was true, which could then lead to a broader warrant. No child sharing it would testify to that, though.

  • FosterMolasses@leminal.space · +59/−2 · 16 hours ago

    I know everyone’s justifiably outraged over this, but this just makes my heart hurt.

    Imagine being a 13-yr-old girl being terrorized by CP of yourself being spread around the entire school and the adults that are meant to protect you from such repulsive crimes just shrugging their shoulders.

    It’s horrifying.

  • Lemming6969@lemmy.world · +7 · 11 hours ago

    Nothing is real or can be considered real anymore. We are going to need new frameworks to handle a world where video of illegal or embarrassing things can be trivially created by anyone.

    People are saying the AI vendor should be liable, but that’s short-sighted in a world where anyone can do this at home, with largely anonymous distribution.

  • tangonov@lemmy.ca · +4 · 11 hours ago

    Create deepfakes of the staff and faculty. Email them to them and their spouses. See if they understand how damaging it is after that. I’m sure they’ll want to give you a knuckle sandwich. Fuck them! If it were my child, I would have stood by her decision.

  • Tiger666@lemmy.ca · +20/−6 · 14 hours ago

    All the staff at that school who were involved should be charged with child pornography.

  • El_guapazo@lemmy.world · +3 · 11 hours ago

    I’m sure this would fall under federal child porn laws. But since up to 40% of law enforcement officers are self-admitted domestic violence abusers, they may not want to investigate themselves.

  • Spacehooks@reddthat.com · +2 · 12 hours ago

    I wonder whether, if she had posted nudes of the boy in revenge, any action would have been taken against her.

    • Fiery@lemmy.dbzer0.com · +1 · 6 hours ago

      The problem is that it’s impossible to take out this one application. There doesn’t need to be any actual nude pictures of children in the training set for the model to figure out that a naked child is basically just a naked adult but smaller. (Ofc I’m simplifying a bit).

      Even going further and removing all nudity from the dataset has been tried… and what they found is that removing such a significant source of detailed pictures containing a lot of skin decreased the quality of any generated image that has to do with anatomy.

      The solution is not a simple ‘remove this from the training data’. (Not to mention existing models that are able to generate these kinds of pictures are impossible to globally disable even if you were to be able to affect future ones)

      As to what could actually be done: apply and evolve scanning for such pictures (not on people’s phones, though [looking at you here, EU]). That’s the big problem here; it got shared on a very big social app, not some fringe privacy-protecting one (on that end there is little to do short of eliminating all privacy).

      Regulating this at the image-generation level could also be rather effective. There aren’t that many 13-year-olds savvy enough to set up a local model to generate these, so further checks at the places where images are generated would also help to some degree. Local generation is getting easier to set up by the day, though, so while this should be implemented, it won’t do everything.

      In conclusion: it’s very hard to eliminate this, but ways exist to make it harder.

    • BarneyPiccolo@lemmy.today · +16/−1 · 14 hours ago

      Because our country is literally being run by an actual pedophile ring.

      They’d be more likely to want to know how to do it themselves, than to stop it.

    • ObjectivityIncarnate@lemmy.world · +1/−1 · 7 hours ago

      You say this as if the US is the only place generative AI models exist.

      That said, the US (and basically every other) government is helpless against the tsunami of technology in general, much less global tech from companies in other countries.

      • Fedizen@lemmy.world · +1/−1 · edited · 7 hours ago

        I’m saying: why is it so easy for 12-year-olds to find these sites? It’s not exactly a Pirate Bay situation; you can’t generate these kinds of AI videos with just a website copied off a USB and an IP address.

        These kinds of resources should be far easier to shut down access to than The Pirate Bay.

    • Taldan@lemmy.world · +4/−1 · 14 hours ago

      Because money is the only thing we, as a country, truly care about. We’re only against things like CP and pedos as long as it doesn’t get in the way of making money. Same reason Trump sharing Larry Nassar and Jeffrey Epstein’s love of “young and nubile” women, as Epstein put it, didn’t kill his political career – he’s the pro-business candidate who makes the wealthy even wealthier

      • Thebeardedsinglemalt@lemmy.world · +2/−1 · 11 hours ago

        The orange Nazi could be raping a 12-year-old girl on national TV, but if he says it’s the libs and drag queens who are the rapists, his cult will put their domestic terrorist hats back on.

        • prole@lemmy.blahaj.zone · +27 · 16 hours ago

          You really want to go down the “which president wants to fuck his daughter” route?

          You sure about that?

        • BarneyPiccolo@lemmy.today · +7 · 14 hours ago

          EVERYTHING is political these days; you just get tired of defending a corrupt, traitorous, racist, misogynistic, ignorant, incompetent PEDOPHILE.

          And ANYONE who supports him is all those same things himself. Repeat: ALL MAGAs are corrupt, treasonous, racist, misogynist, ignorant, incompetent PEDOPHILES.

          That includes YOU. You are defending him, that makes YOU a PEDOPHILE.

          • jve@lemmy.world · +1 · 11 hours ago

            That includes YOU. You are defending him, that makes YOU a PEDOPHILE.

            I’m with you mostly, but words do mean things, even in this post-fact society.

            • BarneyPiccolo@lemmy.today · +4 · 11 hours ago

              I understand that, which is why I want to make it very clear that anyone who voted for Trump is a Pedophile.

              Don’t like it? Don’t vote for pedophiles.

              • jve@lemmy.world · +1/−4 · edited · 11 hours ago

                Sure, bud.

                I guess that makes all voters politicians, then?

                Or just voters that defend politicians?

                Not real clear how this transitive property is supposed to work.

                • BarneyPiccolo@lemmy.today · +2 · 11 hours ago

                  No, just voters who are MAGA, which supports and defends pedophiles as an official tentpost of its party philosophy.

                  It’s simple: Anyone who supports and defends pedophiles is a pedophile. If you vote MAGA, which is ANY right wing/conservative candidate, then you are a Pedophile.

                  It’s so simple, even a MAGA pedophile like you can understand it.

        • Echo Dot@feddit.uk · +30 · 22 hours ago

          Because the question was political. I’m sorry that you’ve got such a teeny tiny brain that you can’t work out that if somebody asks a political question then the response must demonstrably be political. I don’t know how else to put it.

        • michaelmrose@lemmy.world · +16 · 22 hours ago

          There is no reason to believe Biden is a villain here; meanwhile, Trump was found to be a rapist in court.

      • papertowels@mander.xyz · +5 · 11 hours ago

        Snapchat allowing this on their platform is the insane part to me. How are they still operating if they’re letting CSAM on the platform??

  • BlameTheAntifa@lemmy.world · +2/−1 · 11 hours ago

    The classmates who created and shared them should be arrested and charged with distributing CSAM. It’s unimaginable that this would be tolerated to such an extent and then the victim punished, when she was given no other option but to stand up for herself. This country is sick to its core.

    • Soulg@ani.social · +2 · edited · 7 hours ago

      They should be expelled, but why would children be tried the same as an adult doing this? They’re the same age; it’s not pedophilia, it’s normal, expected attraction. Anything beyond expulsion and whatever goes with AI porn harassment between adults is a huge overreaction.

      Would you want two consenting teenagers arrested for csam if they’re texting each other nudes? I would hope not

    • AA5B@lemmy.world · +2 · 10 hours ago

      If I can play somewhat the opposite side here… this girl was completely failed by the school system, and those parents ought to be demanding serious changes.

      But schools also take what seem like unfair actions when they don’t have evidence, can’t identify all the perpetrators, and want to get the victim away from her bullies. Even if the school did the right thing about taking it seriously, we probably wouldn’t like its actions.

      And even sending the bullies to jail with a kiddie porn conviction may be satisfying, but it’s a bad choice. Bullying your classmates is not really the same as kiddie porn, and schools need to find better ways to handle punishment: try to graduate a responsible, mature member of society rather than a lifelong criminal.

    • Duamerthrax@lemmy.world · +7 · 14 hours ago

      Guidance counselors, teachers, and administrators don’t like listening to kids anywhere. I used to get in trouble for fighting my bullies when the bullying happened right in front of the teacher.

      • jj4211@lemmy.world · +5 · 11 hours ago

        Also, they will be razor-focused on preserving authority over making things right.

        When they make a mistake? Well, no they didn’t, because to admit a mistake is to acknowledge being fallible, and to be fallible is to undermine your authority.

        In this case, they still torpedoed her shot at extracurricular activities, even after amending her punishment in the face of overwhelming evidence that the girl reasonably felt she had zero recourse after doing everything the right way to start.

  • some_guy@lemmy.sdf.org · +82/−1 · 1 day ago

    The principal had doubts they even existed.

    Holy shit, this person needs to lose their job. I don’t work in education and I still know that this is a huge problem everywhere.

    • jaselle@lemmy.ca · +6/−19 · edited · 1 day ago

      How can we know that in this particular instance they do exist? If this were a reliable way to get someone expelled without any evidence, then if I were a bully I’d accuse other people of making deepfakes of me.

      • jj4211@lemmy.world · +7 · 15 hours ago

        Well in this particular instance they were able to find them and absolutely confirmed they do exist.

        But even just to address that risk, they should have been able to make the offenders scared of being found out, so that they would at least stop actively sharing them. They could have squashed the behavior even before realizing any meaningful punishment.

        I know when I was in school they would threaten punishment for things that hadn’t been done yet. I think a lot of kids declined to do something because the school had indicated they knew kids would do something and that would turn out badly.

      • CmdrShepard49@sh.itjust.works · +22 · 1 day ago

        They could ask around about them. Surely one kid would be willing to spill the beans. They’re a bunch of 13-year-olds not criminal masterminds.

          • CmdrShepard49@sh.itjust.works · +25 · 24 hours ago

            Well the police apparently found additional images depicting eight individuals and arrested two boys, so it seems like some tactic along these lines worked. Meanwhile the school administrators threw up their hands, called the situation “deeply complex,” and did nothing but punish the victim.

            Definitely agree that Snapchat is bad, along with most social media, especially for kids. I can’t imagine what it’s like growing up in the current era with all this extra bullshit. I was lucky enough to grow up at a time when people didn’t have the internet.

            • ChickenLadyLovesLife@lemmy.world · +4 · 16 hours ago

              I’m a school bus driver and a few years ago I had an incident where some kids threw food at me on the bus (goldfish crackers, of all things). Another kid made a video recording of the incident and posted it online and that caused a huge kerfuffle at the school. The admins couldn’t understand that I didn’t give even the tiniest fuck about the posted video.

              • bthest@lemmy.world · +3 · 16 hours ago

                “So it doesn’t piss you off that they posted a video? Because now we have to do something about it. And THAT doesn’t bother you at all?”

                • ChickenLadyLovesLife@lemmy.world · +7 · 15 hours ago

                  Because now we have to do something about it.

                  Your comment made me realize something. The week prior to this some of the kids threatened to kill me (via dad’s gun and wrapping a plastic bag around my head) and the school did nothing. The goldfish-flinging incident got the kids suspended from the bus for a week. It didn’t occur to me until now that perhaps the admins only did something because of the posted video.

  • lmmarsano@lemmynsfw.com · +40/−2 · edited · 1 day ago

    In Lafourche Parish, the school district followed all its protocols for reporting misconduct, Superintendent Jarod Martin said in a statement. He said a “one-sided story” had been presented of the case that fails to illustrate its “totality and complex nature.”

    The “totality and complex nature” is that they suck

    Martin, the superintendent, countered: “Sometimes in life we can be both victims and perpetrators.”

    and that they’re shit. The system fucks up & amplifies the abuse.

    • BarneyPiccolo@lemmy.today · +4 · 14 hours ago

      This girl asked for help from the “responsible adults” around her, and they failed her. So she did what anyone else would do: she took matters into her own hands. Now she’s seen as the “perpetrator,” which gives them permission to ignore her situation or, worse, punish her for demanding they do their jobs.

      Crazy thing is, if her story were a Netflix film, all those same losers would be rooting for her, and all indignant that the school in the story didn’t back her up, but wasn’t it cool how she went all crazy on the perverts with a machete?

      But in real life? Nah, let’s kick HER ass. They’ll show this 13-year-old child porn victim who’s boss.