• lmmarsano@lemmynsfw.com

    Is this some covert advertising for nudify? I wonder how much search activity for it increased with these articles.

  • Lemming6969@lemmy.world

    This new tech means we need to get over nudity, and over what is even real vs. fake in all media. Causing harm over bodies, let alone fake ones, is a societal and cultural failure.

  • altphoto@lemmy.today

    Okay, I’m neither for nor against fake nudes. But let’s think about why they exist. We want to see people naked having sex. It’s a type of entertainment, and the performers are real live actors. That industry as we know it will collapse if AI takes over. Either that or Nabisco will add sex pictures to their chocolate wrappers, because porn stars will just be another naked person having sex for a few bucks.

    I for one really would like normal people having sex for my entertainment. If we let AI get the best of us, porn will one day soon be gone forever! Muscle guys will forget how to pose together with another five guys so we can still see their penises going in and out of the actress’s vagina. The actresses will not know how to prepare mentally and physically to be stretched in such an extreme way. That requires years of practice. I’m only half joking; it definitely takes some time to stretch out the various orifices in such a way.

    I propose to only watch non-AI porn, because that’s the only way to preserve our culture. That’s the only way to keep the actors practicing their moves, the actresses practicing their orgasms and the talent acquisition team hard at work! They need to keep those audition couches filled and warm! And what about the support teams?

    The camera guys need to remember the traditional ways of staying out of the way of the swinging balls. The make-up girls, the fluffers, the microphone guy, the language translation teams, and the special effects guy all need jobs! I mean… waka waka waka, ’70s-style music doesn’t write itself! And the cleanup crew? They need the job! And they are ready with brushes, brooms, spatulas, water buckets, stain removers, etc. And the furniture guys, selling the odd couch or bed on consignment? Same for the dressing teams.

    It’s an industry we must keep supporting. Porn is us, porn is human. Thank you!

    I’ll take my check when you guys are done fondling each other.

    • Doomsider@lemmy.world

      That was great, thanks.

      It does make me wonder if the traditional criticism of porn will go away with AI porn taking over.

      • altphoto@lemmy.today

        It’ll be like having a knack for old art forms… and so we went to the theater and watched them do 7 different classical porn positions… There was doggy, reverse cowgirl, spitroast. It looked so real! It was a great play!

  • Phoenixz@lemmy.ca

    I fully understand this girl and I wish her well, but I’m afraid this is a genie that has left the bottle and will never go back in.

    • kittenzrulz123@lemmy.dbzer0.com

      We can push this so far underground that the only people who use it are 4chan creeps on the dark web. We can’t destroy AI, but we can push it to the fringes.

    • Bane_Killgrind@lemmy.dbzer0.com

      Push it onto the credit card processors, webhost operators, domain registrars, etc.: treat it as proceeds of crime.

      Edit: they are getting money from somewhere to run the computers these models run on.

      • WolfLink@sh.itjust.works

        I don’t want credit card processors being the judge of what I can spend my money on, or domain registrars being the judge of what websites I can visit.

        The person who committed the crime should be taken to court, and all the intermediaries should be able to remain neutral.

      • Clent@lemmy.dbzer0.com

        There is no way back on that.

        I can run these models on my local machine. It’s not even a complex model.

        This lawsuit is targeting the profiteers because that’s the only reasonable recourse for an individual.

        The criminal side of things is something a prosecutor needs to handle. Making this a priority becomes a political situation because it requires specific resources.

        • Son_of_Macha@lemmy.cafe

          Maybe we need to start pointing out it didn’t make people naked, it just fits a naked body it saw in training under the person’s head. It’s Photoshop but faster.

          • Clent@lemmy.dbzer0.com

            Not exactly. Head swaps have been a thing for a while.

            These models match the body shape. They are essentially peeling back the layers of clothing. The thinner those layers the more accurate it can be.

        • Bane_Killgrind@lemmy.dbzer0.com

          Right, and the people disseminating and hosting the tools tailored to criminal harassment should be held accountable, and the people hosting the resulting images. All of these people have their own revenue that can and should be disrupted.

          • ozymandias@lemmy.dbzer0.com

            They’re not really tailored to that… also, it wasn’t hard to photoshop naked pictures even before this.
            But now these “tools” are neural net models… there are thousands of them hosted on dozens of source code repositories… and like OP said, you can run them on any high-end gaming GPU.
            You can’t outlaw source code like that.
            You could sue this one app maker and try to require that they prove consent and detect underage photos… totally a good idea, but it would do little to stop it…
            They’ll just use a different app.
            I think they could prosecute the other people making and distributing the pictures, though.

  • EgoNo4@lemmy.world

    Billions of dollars poured into AI for… Nudes. The stupidity of humanity has no boundaries…

    • TriangleSpecialist@lemmy.world

      Techbros trying desperately to solve the very real problem of them not getting any through removing that little obstacle in their way: consent.

    • veroxii@aussie.zone

      Look this software is vile and needs to be shut down. But this is hardly humanity’s first foray into using new tech for nudes.

      The printing press, photography, cinema, VHS, computers, the internet, etc. Porn, and much of it illegal and unsavoury, was there pretty early on.

      • Rothe@piefed.social

        This is pretty unique in that it lets people with no technical or artistic knowledge produce convincing fake nudes of real people, instantly and in limitless quantities.

        So there is really no historical precedent for this.

        • Scubus@sh.itjust.works

          OK, so this technology solves a very real problem: how do we stop the circulation of child porn? Unlike any other time in history, people have the option of convincingly claiming that any picture of them is fake. Everyone knows that it’s super easy to fake this stuff, so odds are any REAL content could easily be claimed to be fake. Back when I was in school, several girls moved away, presumably because of bullying after their nudes were circulated. I don’t know about the bullying, but I know the person who was circulating the nudes. If they had the option of convincingly claiming the images were fake (it actually goes further than that: now it’s LIKELY they’re fake, and they can generally be assumed to be so), that would’ve dramatically lessened the appeal of spreading them.

        • immutable@lemmy.zip

          Definitely easier but I think photoshop is probably the closest.

          You still needed some amount of skill, but there were also plenty of people with those skills willing to use them for free (or for a small fee).

          Definitely easier now, but at least the invasion of privacy aspect has been experienced before. Seems like we didn’t really come up with anything as a society for it though.

          I think the incentives will likely align toward not regulating this. The thing society would likely try to regulate is these fakes getting blasted all over social media, but the social media companies don’t want to be regulated, so they are incentivized to stop that themselves. Personal consumption, while off-putting, is probably not going to ruffle enough feathers to be a priority.

          I suspect you end up where we are at, it exists, you can use it, you can search it out, but it doesn’t show up on the larger platforms because they are incentivized to prevent it voluntarily to avoid regulation.

        • bitjunkie@lemmy.world

          The camera is pretty unique in how a user with no drawing or painting ability can produce true-to-life images by simply pressing a button…

  • brianpeiris@lemmy.ca

    There ought to be a legal fund for these deepfake lawsuits so we can sue every one of these scummy companies out of existence. I’d donate to it.

      • pulsewidth@lemmy.world

        Yeah, great idea, genius. Except that creating deepfake nudes isn’t currently a crime in and of itself; only public online publication makes it a crime, which Nudify isn’t doing. So funding the police and prosecutors more would do jack shit.

          • kittenzrulz123@lemmy.dbzer0.com

            It’s literally not the police’s job to deal with this. What the hell does the police budget have to do with this? It would literally have cost you nothing not to make the conversation about how much of a reactionary you are.

  • gandalf_der_12te@discuss.tchncs.de

    I have been thinking about this for a long time. Why is it that it bothers people if others see them naked?

    According to the scientific worldview starting from 1800, the world is real. That means that things you can touch, exist. And things that can not be measured don’t exist. Also the things of interest in the world are those that are “conserved quantities”, like if a hypothetical variable jumps around randomly, it’s not a good data source because it’s volatile and random. The things that matter are masses in space and time, because those are continuous and don’t jump around rapidly. Masses in space and time can only be modified if you work on them (and that requires effort), and no significant change can be brought to masses by purely thinking about them (no “spooky action at a distance”, no “telekinesis”) or wishing for their change (“wishful thinking” is seen as ineffective).

    That makes me wonder: why do people freak out so much if I think about them? If I think lewd thoughts about somebody who didn’t consent to this, why do people not like that? What difference does it make to them if I think about them? What difference does it make if I look at a picture of them naked? By purely thinking about them, I cannot change anything meaningful about reality, so it shouldn’t matter, right?

    • AnarchistArtificer@slrpnk.net

      Often times when these deep fake nudes are being generated, the most significant real world harm comes from what happens when they get circulated around. I know a teacher at a school where this was an issue, and the girl who was a victim of this was actually interviewed by the police because if it had been a genuine image, then she could’ve been charged with creating CSAM.

      The image had been shared around the school, and the girl in question felt humiliated, even though it wasn’t her real body — if everyone thinks it’s you in the image, then it’s hard to fight that rumour mill. As to why she cared about this, well even if you, as an individual, try really hard to not care, it turns out that a lot of people do care. A lot of people called her a slut for taking such provocative images of herself, even if that’s not actually what happened.

      This goes beyond the deep fake side of things. I know someone whose ex distributed nudes that she had sent to him (revenge porn, basically), and it led to her being fired from her job. The problem here is that it’s not always the individual whose nudes (or faked nudes) are shared who has the biggest problem with that person being seen naked.

      You’re free to think about people naked as much as you like. Hell, if you wanted to generate deepfake nudes, that’d be unethical as hell in my view, but there’s little that could be done to stop you. Do whatever you like in the privacy of your own mind, but if people are getting weirded out, then that suggests that it wasn’t something that stayed contained within one person’s mind.

      • lmmarsano@lemmynsfw.com

        A lot of people called her a slut for taking such provocative images of herself, even if that’s not actually what happened.

        I detect an obvious, unethical solution.

        I know someone whose ex distributed nudes that she had sent to him (revenge porn, basically), and it led to her being fired from her job.

        Seems like a failure of society & maybe an opportunity to shake down a former employer for a lawsuit payout.

      • gandalf_der_12te@discuss.tchncs.de

        The image had been shared around the school, and the girl in question felt humiliated, even though it wasn’t her real body — if everyone thinks it’s you in the image, then it’s hard to fight that rumour mill. As to why she cared about this, well even if you, as an individual, try really hard to not care, it turns out that a lot of people do care. A lot of people called her a slut for taking such provocative images of herself, even if that’s not actually what happened.

        This goes beyond the deep fake side of things. I know someone whose ex distributed nudes that she had sent to him (revenge porn, basically), and it led to her being fired from her job. The problem here is that it’s not always the individual whose nudes (or faked nudes) are shared who has the biggest problem with that person being seen naked.

        Okay, then the logical step to take is to educate the population about the possibility of nude images being AI generated.

        • nwtreeoctopus@sh.itjust.works

          Sure. We already do that some. But we do the same basic thing around not believing rumors, and that doesn’t obviate the harm they do.

          Putting aside the issue of how many people want the rumor to be true or the deep fakes to be real, people expending effort to say/produce something harmful or uncomfortable is hurtful to the subject/victim. The idea that people could believe it is hurtful.

          This is all exacerbated with young people because their brains are wired to care more about peer socialization and perception than adult brains.

          Even things we know aren’t true damage our reputations and perceptions. I know JD Vance didn’t fuck a couch, but it’s one of the first things that comes to mind when he’s mentioned.

          Education about the reality of AI generated nudes isn’t a bad thing (and, like, every teen already knows this is a thing, anyway), but that doesn’t stop the harm for the subject due to the association with the material.

      • moopet@sh.itjust.works

        I think this is going to change radically in the near future, when people switch to assuming everything they see isn’t real unless there’s solid evidence. At the moment a lot of the population assumes AI images are real, and that’s going to flip at some point.

    • TriangleSpecialist@lemmy.world

      I’m going to assume this is a genuine question asked in good faith.

      I’d say these are pretty wild mental gymnastics, but I think I kind of get where you are coming from because of the following point that I’ll concede: I don’t think that any society which tries to establish just rules and laws on the basis of scientific rationality should ever consider “thought crime” to be a thing, and we should push against that ever becoming something punishable.

      This is however also me doing mental gymnastics to try to be charitable to your messages. Now for why I think people are very understandably bothered by this.

      First of all, in the context of this post, this is not just about thoughts but also producing material that can be shared online. I don’t know whether you’ve followed the news as of late, but it should be pretty clear that for day-to-day life (i.e. without scientific rigour being applied to every aspect) it matters more what is perceived as true as opposed to what is true. From the point of view of the victim, whether the nudes are “real” or not does not matter nearly as much as the fact that knowing people who’ll view them will think that they are. This is, beyond shame, because of the fact that this may then be used against the victim for employment discrimination, harassment, or worse.

      Now let’s move on from that and address just the “thought” bit. Trying to view that through a reductionist, materialistic point of view is pretty misguided in my opinion. Here, you’re dealing with people, feelings, and social relationships. I’d say that learning that someone, anyone, is fantasising about me (when I did not suspect it, and it’s unreciprocated) is, at the very least, likely to change the social dynamic, because someone I considered a friend or coworker, and interacted with under that assumption, turns out to see me in a very different light. Furthermore, I’ll add that I am a somewhat strong-looking man, and have thus far not felt physically or mentally threatened the (very) few times this has happened. Add power dynamics which are not in your favour to this equation and yeah, no wonder people freak out…

      But all in all, I’d say that analysing human interactions on the basis of human beings being purely rational is naive at best honestly. There are varying degrees, of course, but I don’t believe anyone is purely rational.

    • Juice@midwest.social

      Your materialism is a form of idealism that collapses into solipsistic conclusions.

      When you limit the scope of phenomenal objects to be only those objects that have a physical quality, that is positivism. It has a nice way of erasing anything human from your analysis. Thought, emotion, social connection, motivation, the will to act all become purely subjective, hence are excluded from the category of objects that are real.

      Your inability to process basic facts is admitted in your own description:

      no significant change can be brought to masses by purely thinking about them

      This is true, but you have no theory of praxis. There is a kind of contemplation that is purely subjective. Like daydreaming for instance, though this could be influenced by objective factors. There is a type of contemplation that develops the self so we are better able to take action, such as studying. And there is contemplation that leads directly to action, like when someone finally decides to leave their abuser, or develop a new flavor of ice cream, etc. These last two forms of contemplation are both subjective and objective. They become objective because they change something in the phenomenal world, they are verifiable.

      Money only exists in the form of bits in a computer or pieces of paper. Some people say “money isn’t real,” but it clearly is, as there are consequences if you don’t have any. The same is true of the law. It only exists physically as a piece of paper with some writing on it, but it actually took politicians, lawyers, input from citizens, all this unsubstantial stuff, to create it, and if it is broken (what object broke?) the police can arrest you and you get punished by a judge. Do laws not exist?

      Money and Laws are social relations. They have no substance, but they are real and verifiable; the paper they are printed on is only symbolic of what they are, how they came into being, and what effects they have on society. You can’t account for any of this, which is why you can’t understand the problem. You can imagine an individual body, you can imagine society and government, but you can’t connect them. You can’t see how society is made by people or how people are made by society.

      The way to fix this is to center the human in our analysis. Maybe a tree exists with or without human work, but many trees are planted. Oil exists in the ground independent of our labor, but what turns it into gasoline is people working on an oil rig (built by people) extracting it, transporting the crude via truck or pipeline (all built and operated by people), refining it (in a refinery built by people), transporting the fuel to a gas station (operated and built by people), and you putting it into your car’s gas tank, and that was done for some reason. You witnessed something in your environment, you thought about it, which led you to want to drive somewhere, which made you want to fill up your gas tank.

      Maybe you wanted to buy a video game, created and marketed to you by people. Why did you want that game? So you could play with friends, or you want to compete on leaderboards, or you played the last game and want to play this one. Out of joy or competitiveness, all these feelings lack substance, but they made you do a thing, and as long as you return home with the game, your contemplation and action became objective.

      This is why it matters that we are responsible with other people, and that we account for their feelings and thoughts. Hell, influencing people’s thoughts and opinions is a multi-trillion dollar industry. If they didn’t exist before, they do once others try to influence them.

      things that can not be measured don’t exist.

      Where people are concerned, they do exist, because they influence people’s ability to act. You can act in a way that frees other people or you can oppress them, and the qualities of freedom and oppression are not measurable, but their effects are substantial.

      I’m not sure if your attitude is based on a need to harm other people, or if you really don’t understand. In both cases, what brought you to it was not totally your own. You were exposed to chauvinism in a way that led you to adopt a crappy attitude, or you were taught things a certain way (which is tbf how we are all taught to some degree, though it is wrong). You internalized this, thought about it, said something gross, and people reacted negatively. This is all objective, but only some of it is verifiable.

      This particular misunderstanding you exhibit is one of my favorite topics, and my answer to it is the product of like 15 years of research and discussions. You say

      scientific worldview starting from 1800

      It arguably started before, but it was thoroughly disproven in 1844. Yet it persists. That persistence is not substance, “worldview” isn’t substance, the year 1800 isn’t substance. But it is phenomenon. You’re confused, but hopefully that’s all it is. Hopefully you’ll reconsider and be able to do better. Human development is objective, but it is not inevitable. This is the difference between your deterministic and vulgar attitude and reality.

      In other words, you are an idealist who uses physical phenomena to disappear much that is real. If we want to become better materialists, then we have to center people and everything about humans in our analysis, not objects as something that exists independent of human intervention.

      • lmmarsano@lemmynsfw.com

        That was enlightening. However, I detect some limiting value judgements.

        I’m not sure if your attitude is based on a need to harm other people, or if you really don’t understand. In both cases, what brought you to it was not totally your own. You were exposed to chauvinism in a way that led you to adopt a crappy attitude, or you were taught things a certain way (which is tbf how we are all taught to some degree, though it is wrong). You internalized this, thought about it, said something gross, and people reacted negatively. This is all objective, but only some of it is verifiable.

        This is a limiting perspective. Attitudes toward nudity are culturally specific. It’s likely not as taboo or shameful in cultures where nudity is mundane.

        The taboos & rules we follow in our culture don’t need to be that way, and we know that. We know we don’t need to see things the way we do: our arbitrary value judgements are a matter of perspective.

        Hopefully you’ll reconsider and be able to do better. Human development is objective, but it is not inevitable. This is the difference between your deterministic and vulgar attitude and reality.

        It’s not necessarily vulgar. We can take their materialism to an extreme and map all that mental, subjective experience to physical neural states beyond our precise comprehension & merely acknowledge that correspondence exists. That neurochemistry includes some degree of randomness, as do some physical phenomena, so this physical-only view of reality isn’t completely deterministic.

        It resolves to the same effect as your model of understanding reality, which abstracts away that physical detail into practical concepts more conducive to the way we think.

        Anyhow, I think it was a good point that society doesn’t need this backward shame & judgement around nudity or whatever activity goes on in people’s heads. However, society does have it, and it’s not about to evolve without serious effort.

        • Juice@midwest.social

          Yeah, I’m taking for granted that taking someone’s picture and turning it into a deepfake nude, as in the OP, is bad because it violates their consent. Social attitudes toward nudity are meh; I personally don’t care except where hygiene is involved. Of all the social norms I’d overthrow, that one is pretty low on the list; it’s impractical.

          Concerning determinism, I was mostly responding to this

          the things of interest in the world are those that are “conserved quantities”, like if a hypothetical variable jumps around randomly, it’s not a good data source because it’s volatile and random

          To me the phrase “not a good data source” indicates a preconception of rationalism, the assumption that the world is essentially logical and therefore we can intuit anything about the world with pure thought. Because events proceed logically, events can be understood by evaluating their place as a link in a logical “chain.” I don’t dispute this outright, but personally I can’t stand prefiguration. I think it is alienating from actual reality, because instead of engaging with reality, and the people in it, we engage with reality through this logical chain. Everything has to fit, else it is illogical.

          I think it’s okay to be a soft determinist, someone who understands that what happened before affects what happens next. But it’s easier to do historical materialism by just centering the perspectives and reactions of people than it is to try to conceive of historical events as links in a logical chain, which is what often happens with history, as history usually ends up justifying the will of whoever is in charge. The best historians, even when they have ideological biases, are able to disseminate messy facts independent of anyone’s narrative.

          We can take their materialism to an extreme and map all that mental, subjective experience to physical neural states beyond our precise comprehension & merely acknowledge that correspondence exists. That neurochemistry includes some degree of randomness, as do some physical phenomena, so this physical-only view of reality isn’t completely deterministic.

          Can you elaborate on this? I don’t quite understand what you’re saying.

          I was being a little rough on the poster because I didn’t realize they were being provocative, so certain terms I used, like vulgar, have a negative connotation, but what I meant was it was a kind of orthodox materialism that inhibits change, that is oppressive rather than liberating.

      • gandalf_der_12te@discuss.tchncs.de

        I’d say your comment is well thought through, and my comment was also kinda provocative.

        Of course I’m aware that the world cannot be purely understood by the material objects around us. I was, however, raised under the assumption that it can be understood that way. I guess I just wanted to hear somebody else confirm my own suspicion that that’s not true, after all.

        • Juice@midwest.social

          A lot of people can sense it, but can’t describe it. My own ability to describe it is amateurish, clunky and abstract. I work with a lot of people who dedicate huge parts of their lives to helping people, who can’t describe it. The social scientists who worked it out are famous, but that part of their work is deemphasized even though it defines their work. And because it is deemphasized, their proponents and followers have committed any number of mistakes and just downright catastrophes.

          I’m glad to hear you were doing a social science experiment and I’m glad I could provide some validation.

      • Phoenixz@lemmy.ca

        Well that, I have to say, is more a social thing that we could do very well without. People shouldn’t be shamed for nudity, ever.

        I’m not defending these apps at all, just a comment on how most cultures see nudity now. It should be way, way more normalized.

        This, of course, especially goes in the US where some people seem to have the idea that if a child sees a boob, they will be mentally damaged for the rest of their lives

        • gandalf_der_12te@discuss.tchncs.de

          This, of course, especially goes in the US where some people seem to have the idea that if a child sees a boob, they will be mentally damaged for the rest of their lives

          Especially considering that many states have “open carry” laws for guns, but not for boobs.

  • CerebralHawks@lemmy.dbzer0.com

    Serious question: if you show an AI an image of a 14-year-old girl, taken from her public Instagram account — what the article says was used to generate fake nudes of her to harass her — how does the AI know she’s 14 and not 18? Or to flip it, if you show it an 18-year-old, how does it know she isn’t 14 — or 17?

    What we think of as “a child” is not what the law defines as a child. For the law, it’s 17 and under. For most people, “child” means like, 12 and under. Older than that, they’re a teenager, literally, or also an adolescent. Not saying it makes it okay to look at them. But it does make it harder for technology to determine their legality (age of majority).

    As a human, how do you tell whether a young-looking nude woman you see on an NSFW Lemmy comm is legal or not? If she’s in the US, typically you have to be 18 to get a tattoo, so a tattoo in an intimate area implies she was of legal age to get said tattoo, though tattoos can be done by amateurs and there may be some pros who don’t ask for ID. But it’s one way: you see ink, you assume age of majority. At the very least it’s plausible deniability. You can’t go by pubic hair (or stubble of the same) because that’s typically shaven or waxed in grown women. It’s a trend, and a popular one at that. The fact is, you don’t know. You have some criteria and it helps you sleep at night knowing you have some standards. Maybe you saw a 15-17 year old who passed your internal checks and you thought she was legal. Maybe you’re above it all and you don’t even look at naked people online, but that’s beside the point that many still do.

    To be clear, I’m as against these “nudify” type apps as anybody. I just want to know how they’re expected to tell a perhaps mature teenager from an under-developed adult without asking for ID. Because that’s another slippery slope we don’t want to go down, but already are. Needing ID to access parts of the Internet, with the true purpose being to identify who is looking at what.

    • Scubus@sh.itjust.works

      The simple answer is that these models are already programmed and released, and they simply don’t have that check. It’s not a matter of whether they should, or whether the AI needs to be taken down, because none of that is realistic. Looking at the future, because these models can be run locally, there is absolutely nothing you can do at this point to ensure a future with no fake porn around, regardless of how idealistic you are.

      So the solution here is security through noise. There are a number of ways to do that, but they all involve dramatic cultural shift, and I’m quite curious to see what that entails. The first, and IMO least likely, is that everyone stops being weird sexual monkeys and stops stigmatizing the human body. People shouldn’t have the power over you that comes with being able to show off your nude body. That’s weird. Seeing the body in an exclusively sexual manner is weird anyway. Sexualizing nudity, breastfeeding, ankles, all this weird cringe shit that just seems like barely hidden fetishes that everyone engages in. The human body shouldn’t be something to be ashamed of as a concept. I could understand not being proud of your body and not wanting to show it off, but the idea that someone walking around in the nude is inherently sexual just seems to speak volumes about the people saying it, to me.

      The second option is to make the models as accessible as possible. The idea being to flood the internet with so much AI slop that it becomes unusable, or everything on it is considered fake. This is a future where you see some horrible accident online and your immediate assumption is that it’s fake. Kinda like a movie, or a scene from a video game, or most likely some strange advertisement. The default state of the internet becomes entertainment, not education or social engagement. The people you match with on Tinder are assumed to be bots, the person responding to your comment is likely a bot, the person who posted that cool meme was a bot. The default assumption is that it’s all fake, for entertainment purposes only. Yes, there are nudes of you floating around. There are also photos of you on the moon, and getting your ladydick sucked by the president, and of you refracting light to form a trans pride flag. It’s all fake, and everyone knows it.

      Lastly, we have security through oppression. This is where the government uses this whole thing as an excuse to install ever-increasing surveillance software on your devices to ensure that you’re not engaging with illegal software they had a hand in making. The only way to ensure this porn isn’t getting produced? Well, in order to connect to the internet you need a government-issued app that scans every file for signs of AI editing. What’s that? The porn is still being produced? I guess that means it’s getting brought in from an offline source. We need a worm that embeds itself into every file to target machines kept on intranets. You know what, it’s easier to just take everyone’s computers away unless they’re government sanctioned. Nanny states are surveillance and police states where effectively open-source software is concerned.

    • Bristlecone@lemmy.world

      None of this is an issue unless you are a creep interested in riding that barely-legal line for thrills. That in and of itself is a creepy thing in my opinion; I don’t condone it, especially if you are specifically trying to blur that legal line for yourself. You’re playing with a hideous fire. You’re asserting that those 5 years, from 13 to 18, make a big difference somehow, but there are 12 years from when a person turns 18 to when they are 30 years old. Shoot for that range and stick to the plenty of reliable online spaces providing LEGAL ADULT models, and that should be as far as you have to think about it. I’m writing this message as a courtesy to you in your life. Stop worrying about this question at all and play it safe ALWAYS. Prioritize legal agency first and foremost, then you can get more creative within that framework as you and your partner become familiar. Don’t focus on the “young as possible” thing AT ALL is my advice, and especially don’t perseverate on it as much as you are here unless you are writing legislation on it or something. When I was a dating adult, I actually wouldn’t date under 21, because that, in my opinion, is an actual adult with a couple years of adult experience under their belt. There are very few reasons to be worried about someone’s specific age within this relatively short span of someone’s life.

      The app mentioned here is disgusting and I’m blown the fuck away that it is not illegal already, btw… Full stop

      • peopleproblems@lemmy.world

        It’s also super easy to figure out if someone isn’t 21: invite them to a bar that doesn’t serve food. Boom, move on.

        • baines@lemmy.cafe

          There is a case of a guy who was charged over a girl he picked up in a bar.

          She had a fake ID.

          • FauxLiving@lemmy.world

            You can get a good fake ID, i.e. one printed on the same printers and the same stock as the DMV’s, for under $100. When I was in college this knowledge was as common as knowing who to buy weed from.

      • CerebralHawks@lemmy.dbzer0.com

        Ha ha. But, why download anything when it’s all online? Point taken — you don’t have a good answer, so you insult to deflect. The deflection is obvious, the intent, not so much so. A lesser man might think YOU have CSAM on your hard drive. But I think you just don’t know and rather than saying you don’t know or simply saying nothing, you try to derail the conversation.

        Why is that?

        • SlippiHUD@lemmy.world

          The point you’re trying to make is forgiving AI for abusing minors (children) because it can’t “know” their age. You’re making a hebephile vs pedophile argument for a computer, which just makes you sound like a pedophile.

          Also, the internet functions by downloading everything you look at. Any image your computer displays has been downloaded to your computer; it doesn’t just stay on the web. So you may still want to scrub your drive.

          • FauxLiving@lemmy.world

            The point they’re trying to make is that people often throw accusations when they sense that they’re unable to support their argument.

            Implying someone is a pedophile is not an argument, it is a personal attack. If you have an argument then make it; if not, don’t just sling shit and be a toxic internet person.

    • jjjalljs@ttrpg.network

      If they can’t do it while adhering to our society’s desired rules (i.e., no sexualizing of minors), then they shouldn’t be allowed to do it at all. It’s not our responsibility to figure out how to solve the AI companies’ self-created problems for them.

    • ParadoxSeahorse@lemmy.world

      You’ve clearly given it some thought, but not enough.

      If it can’t tell how old they are… IT DOESN’T MATTER, IT’S NON-CONSENSUAL!