• Lvxferre [he/him]@mander.xyz · 4 days ago

    IMO commenters here discussing the definition of CSAM are missing the point. Definitions are working tools; it’s fine to change them as you need. The real thing to talk about is the presence or absence of a victim.

    Non-consensual porn victimises the person being depicted, because it violates the person’s rights over their own body — including its image. Plus it’s ripe material for harassment.

    This is still true if the porn in question is machine-generated, and the sexual acts being depicted did not happen. Like the sort of thing Grok is able to generate. This is what Timothy Sweeney (as usual, completely detached from reality) is missing.

    And it applies to children and adults. The only difference is that adults can still consent to have their image shared as porn; children cannot. As such, porn depicting children will always be non-consensual, thus always victimising the children in question.

    Now, someone else mentioned that Bart’s dick appears in the Simpsons movie. The key difference is that Bart is not a child; he is not even a person to begin with, but a fictional character. There’s no victim.


    EDIT: I’m going to abridge what I said above, in a way that even my dog would understand:

    What Grok is doing is harmful, there are victims of that, regardless of some “ackshyually this is not CSAM lol lmao”. And yet you guys keep babbling about definitions?

    Everything else I said here was contextualising and detailing the above.

    Is this clear now? Or will I get yet another lying piece of shit (like @[email protected]) going out of their way to misinterpret what I said?

    (I don’t even have a dog.)

    • Atomic@sh.itjust.works · 4 days ago

      What exactly have I lied about?

      I’ve never once tried to insinuate that what Grok is doing is ok, nor that it should be. What I’ve said is that it doesn’t even matter whether an actual real person is being victimized or not. It’s still illegal. No matter how you look at it. It’s illegal. Fictional or not.

      Your example of Bart in the Simpsons movie is so far out of place I hardly know where to begin.

      It’s NOT because he’s fictional. Because fictional depictions of naked children in sexually compromising situations ARE illegal.

      Though I am glad you don’t have a dog. It would be real awkward for the dog to always be the smartest being in the house.

          • EldritchFemininity@lemmy.blahaj.zone · 4 days ago

            They mistook your comment as disagreeing with their take that there are real victims of Grok’s porn and CSAM, and as claiming they themselves were supporting CSAM, rather than agreeing with them and saying that Sweeney is the one supporting CSAM.

          • Lvxferre [he/him]@mander.xyz · 4 days ago

            Fuck! I misread you. Yes, you’re right, Tim Sweeney is supporting CSAM.

            Sorry for the misunderstanding, undeserved crankiness, and defensiveness; I thought you were claiming I was the one doing it. That was my bad. (In my own defence, someone already did it.)


            Now, giving you a proper answer: yeah, Epic is better sent down the memory hole. And I hope Sweeney gets haunted by his own words for years and years to come.

    • Atomic@sh.itjust.works · 5 days ago

      That is a lot of text for someone that couldn’t even be bothered to read the first paragraph of the article.

      Grok has the ability to take photos of real people, including minors, and produce images of them undressed or in otherwise sexually compromising positions, flooding the site with such content.

      There ARE victims, lots of them.

      • unexposedhazard@discuss.tchncs.de · 5 days ago

        That is a lot of text for someone that couldn’t even be bothered to read a comment properly.

        Non-consensual porn victimises the person being depicted

        This is still true if the porn in question is machine-generated

          • unexposedhazard@discuss.tchncs.de · 5 days ago

            Which they then talk about and point out that victims are absolutely present in this case…

            If this is still too hard to understand, I will simplify the sentence. They are saying:

            “The important thing to talk about is, whether there is a victim or not.”

            • Atomic@sh.itjust.works · 5 days ago

              It doesn’t matter if there’s a victim or not. It’s the depiction of CSA that is illegal.

              So no, talking about whether or not there’s a victim is not the most important part.

              It doesn’t matter if you draw it by hand with crayons. If it’s depicting CSA it’s illegal.

                • Lvxferre [he/him]@mander.xyz · 4 days ago

                  I wish I was as composed as you. You’re still calmly explaining things to that dumb fuck, while they move the goalposts back and forth:

                  All of that while they’re still pretending to argue the same point. It reminds me of a video from the Alt-Right Playbook called “never play defence”: make a dumb claim, waste someone else’s time expecting them to rebut that dumb claim, make another dumb claim, waste their time again, and so on.

                  • unexposedhazard@discuss.tchncs.de · 4 days ago

                    It’s good training for arguing with real-life people, at least. Coming up with a good comeback quickly is hard when you have never properly formulated your thoughts about a subject. I think people often misunderstand things at first, and then when someone points out their mistake, they realize they were wrong but don’t want to admit it, so they just double down. I have been that person before too, though…

                • Atomic@sh.itjust.works · 4 days ago

                  Talking about morals and morality is how you end up getting things like abortion banned. Because some people felt morally superior and wanted to enforce their superior morality on everyone else.

                  There’s no point in bringing it up. If you need to bring up morals to argue your point, you’ve already failed.

                  But please do enlighten me. Because personally, I don’t think there’s a moral difference between depicting “victimless” CSAM and CSAM containing a real person.

                  I think they’re both, morally, equally awful.

                  But you said there’s a major moral difference? For you maybe.

                  • unexposedhazard@discuss.tchncs.de · 4 days ago

                    If you seriously think that there is no moral difference between someone being sexually abused and them not being sexually abused then maybe you should be in prison for all our safety.

      • Lvxferre [he/him]@mander.xyz · 5 days ago

        That is a lot of text for someone that couldn’t even be bothered to read the first paragraph of the article.

        Grok has the ability to take photos of real people, including minors, and produce images of them undressed or in otherwise sexually compromising positions, flooding the site with such content.

        There ARE victims, lots of them.

        You’re only rewording what I said in the third paragraph, while implying I said the opposite. And bullshitting/assuming/lying that I didn’t read the text. (I did.)

        Learn to read dammit. I’m saying this shit Grok is doing is harmful, and that people ITT arguing “is this CSAM?” are missing the bloody point.

        Is this clear now?

        • Atomic@sh.itjust.works · 5 days ago

          Yes, it certainly comes across as you arguing for the opposite, since above you reiterated:

          The real thing to talk about is the presence or absence of a victim.

          Which has never been an issue. It has never mattered in CSAM if it’s fictional or not. It’s the depiction that is illegal.

          • Lvxferre [he/him]@mander.xyz · 4 days ago

            Yes, it certainly comes across as you arguing for the opposite

            No, it does not. Stop being a liar.

            Or, even better: do yourself a favour and go offline. Permanently. There are already enough muppets like you: assumptive pieces of shit lacking basic reading comprehension, but still eager to screech at others — not because of what the others actually said, but because of what they assumed over it. You’re dead weight in any serious discussion, probably in some unserious ones too, and odds are you know it.

            Also, I’m not wasting my time further with you, go be functionally illiterate elsewhere.

            • Atomic@sh.itjust.works · 4 days ago

              Ok. You’re right. You saying it’s ok to depict CSAM if there isn’t a victim is not you arguing the opposite. It’s me lying.

              You’re so smart. Good job.

          • dantel@programming.dev · 4 days ago

            Is it so hard to admit that you misunderstood the comment ffs? It is painfully obvious to everyone.