• MajorHavoc@programming.dev
        1 month ago

        Ew gross.

        I’m not going to keep the scalps of any Nazi I kill while defending my home and loved ones.

        I’ll just use pen and paper to keep track.

        (I’m not bothered by your comment at all, but am attempting to humorously “yes, and” with it.

        I am attempting a humorous misdirect where the reader thinks I’m disgusted by the threat of killing Nazis, but I’m actually just offended by inefficient, messy ways of keeping track of any killed Nazis.)

  • lordnikon@lemmy.world
    1 month ago

    This is not a story of the algorithm predicting what you like. It’s showing that if you expose a human to the same content over and over again, it can change their way of thinking so they come to like the thing they are exposed to. Even more so if they don’t know how it works, and the person thinks, “everyone is into slime, so I need to be into it too to fit in.” It’s very powerful if you want to manipulate a populace. It’s algorithm-induced Stockholm Syndrome.

  • Eggyhead@lemmings.world
    1 month ago

    I’m certain my YouTube feed is trying to radicalize me into some kind of culture warrior. It’s really annoying. I deleted all of my watch history to try and reset it and it just got way worse real quick. I watch one stupid video, now all I see are angry tubers upset that people don’t think exactly like they do and enjoy things they don’t. Then they convince themselves they’re more enlightened than anyone else because they make this content and ban anyone who makes fun of them, all while claiming to be “free speech advocates” of course.

    YouTube got bad so fast it’s left my head spinning.

    • ShepherdPie@midwest.social
      1 month ago

      Have you tried clicking the 3 dots on these outrage videos and selecting “don’t recommend channel”, or a mix of that and “not interested”? I started to see a bunch of right-wing political trash in my feed a while back, since a lot of my watched videos (cars/trucks/offroading/home improvement/dash cam vids/etc.) could be considered adjacent to what these people like, and I haven’t really had this issue again.

      • rothaine@lemm.ee
        1 month ago

        It’s wild that right wingers are always complaining about big tech censoring them when YouTube and Facebook are pushing far-right content so much

        • MajorHavoc@programming.dev
          1 month ago

          It’s wild that right wingers are always complaining about big tech censoring them when YouTube and Facebook are pushing far-right content so much

          I’ve got a conspiracy theory about this:

          1. Everyone likes kittens.
          2. Some of us who like kittens think about how to act decently to each other, some of the time.

          Leading to:

          1. Right wingers who like kittens will sometimes see something “woke” in their algorithm feed, and they feel attacked.
        • Fermion@feddit.nl
          1 month ago

          They still think that YouTube and Facebook are representative of the average person. They don’t understand how incredibly curated those feeds are. I think that’s where some of the “silent majority” mythos comes from. Everything they see is people agreeing with them; therefore, in their minds, it’s impossible that Joe Biden got more votes in 2020.

      • Jiggle_Physics@lemmy.world
        1 month ago

        I have done this. I have told them over and over not to show me stuff from those channels and topics. The best it seems to last is about 2 months. When I tried deleting my history and turning it off, it got SO MUCH WORSE. Even making a new account was orders of magnitude worse. As sad as it is, I actually get a better result this way… I have long been at the point where I don’t click on things I’m not familiar with, or without a suggestion from a trusted source. So I just don’t look at recommendations anymore. I just watch for the indicator of new stuff from my subs, or look at things I specifically search for.

    • Jyrdano@lemmy.world
      1 month ago

      Yup, last week I clicked on a YT video about a certain game and shut it off about a minute in, after realising it was just another rage-baiting angry youtuber lamenting how the game is too woke. Now all I get are recommendations for angry anti-woke YouTube videos bashing the game I actually enjoy.

    • Fleppensteyn@feddit.nl
      1 month ago

      I started with a clean profile: I never log in to YT so it’s just using a local cookie you can always clear to start over.

      Anyways, I just searched a few sciencey things to feed the algorithm and now I’m getting loads of crazy fake “science” and conspiracies and the rest is all extremist right wing bullshit.

      YouTube is getting useless.

    • Ephera@lemmy.ml
      1 month ago

      If you want it to just not recommend things, you might prefer switching to an RSS feed, or to something like NewPipe.

  • ZombiFrancis@sh.itjust.works
    1 month ago

    The only reference I have for this is someone I knew who rubbed said slime on herself for YouTube when she was 17, to build a following for when she turned 18 and started camming.

  • Gil Wanderley@lemmy.eco.br
    1 month ago

    And that is why I only open videos about topics I am only mildly interested in, or from controversial channels, in incognito mode, even though I actually pay for ad-free YouTube Premium.

    In my defense, that is the only streaming service I pay for.

    • yonder@sh.itjust.works
      1 month ago

      At least premium has the benefit of paying creators more for your watch time, which is nice.

    • Agent641@lemmy.world
      1 month ago

      TFW you forget to wear protection when clicking on a weird video and you permanently scar your algorithm. You try to heal it, but days or weeks later, you are showing your boss a video on marine grade industrial sealant and Chappell Roan Pink Pony Club shows up in your recommended videos and you have to lie and say you have no idea what it is. When he is gone, you play it again.

      • ObjectivityIncarnate@lemmy.world
        1 month ago

        clicking on a weird video and you permanently scar your algorithm.

        It’s trivial to delete individual videos from your watch history, even more so if you just watched them. Doing so makes it as if you never clicked on them in the first place.

  • FaceDeer@fedia.io
    1 month ago

    As recent advances in AI have shown, humans are really quite predictable when you throw enough data and compute at the problem. At some point the algorithm will be sophisticated enough that it’ll be able to get to know you better than you know yourself, and will be able to provide you with things you had no idea were what you really wanted.

    Interesting times.

      • FaceDeer@fedia.io
        1 month ago

        Yes, but recent advances have really rubbed it in our faces in ways that are a lot harder to deny. Humans haven’t become fundamentally more or less predictable over time, but we now have unmistakable demonstrations of just how predictable we are.

        • MajorHavoc@programming.dev
          1 month ago

          Yep. I learned from an algorithm that I might enjoy music by “The Beatles”. The algorithm was quite correct, but I think my having simple tastes, and the Beatles having amazing music, deserve most of the credit.

      • Chris@feddit.uk
        1 month ago

        Yes, I heard/saw/read, some years back now, that this is exactly what Amazon do. They know who you are, what stage of life you are at, and they know what you want before you do.

    • Schmeckinger@lemmy.world
      1 month ago

      Yeah, the algorithms keep throwing stuff at me that I would probably like to watch, but I don’t click on it so I don’t get even more brain damage.

    • MajorHavoc@programming.dev
      1 month ago

      I had this exact experience with music algorithm recommendations:

      The algorithm analyzed all the songs I asked it to play, and concluded (correctly) that I might enjoy listening to the Beatles. (True story.)

      (Now a bit of sarcasm:) I look forward to future insights, in other art forms, such as perhaps the writings of Shakespeare or the paintings of Leonardo Da Vinci.

    • queermunist she/her@lemmy.ml
      1 month ago

      That is not what happened.

      Humans aren’t static. You don’t actually have these secret hidden likes that AI can discover; instead, you grow to like the stuff that becomes familiar. You’re being trained.

    • Krauerking@lemy.lol
      1 month ago

      Yeah, doubtful. I think it finds something you will engage with and pushes it over and over again until people get normalized to it.

      I think it’s more like cold reading from a psychic. It’s going to use generic, generalized data about the big identifiers for you, like age and gender, and as you respond, it tries to adjust its answers based on what you gave it.

      That’s not new or magical in any way. And it can be really wrong about the broad stuff if you don’t fit in with the generic identifying groups related to you.

      It really just feels like a sales pitch for the middle class to buy more stuff.

    • Ephera@lemmy.ml
      1 month ago

      Problem is that none of the algorithms actually care about showing you things you like.

      Ads try to sell you things that you wouldn’t otherwise buy. Occasionally, they may just inform you about a good product that you simply didn’t know about, but there’s more money in manipulating you into buying bad products just because they carry a brand symbol.

      And content recommendation algorithms don’t care about you either. They care about keeping you on the platform for longer, to look at more ads.
      To some degree, that may mean showing you things you like. But it also means showing you things that aggravate you, that shock you. And the latter is considered more effective at keeping users engaged.
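      The distinction above can be made concrete with a toy sketch. Everything here is hypothetical: the video titles, the scores, and the two ranking functions are made up for illustration, not drawn from any real platform. The point is only that ranking by predicted time-on-site produces a different ordering than ranking by predicted enjoyment:

```python
# Toy illustration: "what you'd enjoy" vs. "what keeps you watching".
# All titles and scores are invented for the example.

videos = [
    # (title, predicted_enjoyment, predicted_minutes_watched)
    ("woodworking tutorial",   0.9,  8),
    ("calm science explainer", 0.8, 10),
    ("rage-bait culture war",  0.3, 25),  # low enjoyment, high engagement
    ("shocking conspiracy",    0.2, 30),
]

def rank_by_enjoyment(vs):
    """Order videos by how much the viewer is predicted to enjoy them."""
    return sorted(vs, key=lambda v: v[1], reverse=True)

def rank_by_engagement(vs):
    """Order videos by predicted watch time (a stand-in for ad exposure)."""
    return sorted(vs, key=lambda v: v[2], reverse=True)

print(rank_by_enjoyment(videos)[0][0])   # -> woodworking tutorial
print(rank_by_engagement(videos)[0][0])  # -> shocking conspiracy
```

      With the same inputs, the engagement ranking surfaces the shocking, aggravating content first, even though it scores lowest on enjoyment, which is the dynamic the comment describes.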