But people in China certainly seem to think this is an example of AI at work. China Digital Times reports that it wasn’t just the same-sex image that was altered; the distributor also cut out some straight sex scenes from the film… As one user implies, AI is a more sophisticated and “terrifying” form of censorship, because savvy viewers have figured out ways to get around obvious censorship or can at least tell the difference, but that’s much harder with AI. Here’s a smattering of the comments, sarcasm implied on the last one:

in the future, we won’t even be able to tell if we’re watching the original film or not.

This is nauseating because it not only interferes with the integrity of the plot, it disrespects the sexual orientation of the actors.

Awesome! Next, let’s use one-click AI to re-release “Brokeback Mountain,” “God’s Own Country,” “Lan Yu,” and “Happy Together” as “restored” hetero romances

  • Megaman_EXE@beehaw.org

    The dangerous thing that I think AI and censorship cause is a loss of trust and of any feeling of cooperation within a nation. I believe they cause people to close themselves off and become distrustful of their government.

    This might just be me projecting my own bias, but the rise of rampant misinformation and the general breakdown of society and the social safety net (or rather the breakdown of what minimal safety nets were there to begin with) has caused me to distrust and question anything I see. It has caused me to double down on my thinking based on my own personal worldview rather than accept what I am told or directly see.

    It makes me a lot more resistant to going with the flow. I’m not sure if that’s beneficial for a country’s government or not. I can’t see it being helpful, though, when I’m likely not going to believe what they tell me, or will view it with such high skepticism.

    Edit: the one thing this could do, though, is that if a group had enough information on a specific person, they could push misinfo to specific groups and people with a spin they believe the targets would be more likely to fall for. For example, maybe I would be more likely to believe something negative about something I already have a negative bias towards. If someone wanted to manipulate me further, they could customize or tailor the misinfo accordingly.

    I feel like I’m going into tinfoil hat territory, but at the same time I feel like it’s probably not that far-fetched. Maybe at some point I just go back to monkey and live in the woods lol.