These scammers, using Mr Beast's popularity, generosity, and (mostly) deepfake AI to trick people into downloading malware, somehow do not go against Instagram's community guidelines.

After trying to submit a request to review these denied claims, it appears I have been shadow banned in some way or another, as only an error message pops up.

Instagram is allowing these ads to run on their platform. Intentional or not, this is ridiculous, and Instagram should be held accountable for letting malicious websites advertise their scams.

For a platform of this scale, this is completely unacceptable. The ads are blatant, and I have no idea how Instagram's report bots and staff are missing them.

  • Daxtron2@startrek.website · 169 points · 1 year ago

    I’ve reported Nazis, violent threats, and literal child pornography on Instagram that then told me it didn’t go against their guidelines.

      • gaael@lemmy.world · 60 points · 1 year ago

        I don’t think you understand how hard and resource-intensive it is to fight against the nipple crowd. I for one am grateful that they chose to do something about the real issues! Yes, a world with free Nazis is kind of a bother, but most of us would survive. Can you imagine the horror of a world with free nipples? We would all be doomed, that’s for sure. /big s

    • MoonRaven@feddit.nl · 9 points · 1 year ago

      But if you make a clear joke in a joke group, you get flagged and can’t get it reviewed.

      • Daxtron2@startrek.website · 5 points · 11 months ago

        No, usually I report it to NCMEC, which has better resources to deal with it. Cops very rarely care or are able to do anything.

  • DarkMessiah@lemmy.world · 84 points · 1 year ago

    Sounds like a good time to make Mr Beast aware of these, he has a lot of disposable income to burn on a lawsuit or three.

    • andros_rex@lemmy.world · 5 points · 11 months ago

      These scam ads have been an issue for at least a year. I’m pretty sure they’re automated and there’s very little that can be done to trace them to their original sources. I’m sure if Mr. Beast did threaten to sue Meta, then they would just start filtering “beast” from ads.

      • PorkSoda@lemmy.world · 4 points · 11 months ago

        I’m pretty sure they’re automated and there’s very little that can be done to trace them to their original sources.

        Start by holding the ad account holder liable. When I worked in digital marketing and ran ad accounts, I had to upload my driver’s license.

        • Deckweiss@lemmy.world · 1 point · 11 months ago

          You live in a civilized country.

          There are others where you can get a stack of fake drivers licenses for a couple groshen.

    • TORFdot0@lemmy.world · 2 points · 11 months ago

      Honestly, protecting vulnerable people from these scams is probably more generous than the usual philanthropy he does.

  • Icalasari@kbin.social · 74 points · 1 year ago

    So what they’re saying is that they’re willing to take liability, and thus be open to being sued over this, since they know of the scams but say they don’t break community guidelines.

    Got it

          • wikibot@lemmy.world (bot) · 5 points · 1 year ago

            Here’s the summary for the wikipedia article you mentioned in your comment:

            "Kill Switch" is the eleventh episode of the fifth season of the science fiction television series The X-Files. It premiered in the United States on the Fox network on February 15, 1998. It was written by William Gibson and Tom Maddox and directed by Rob Bowman. The episode is a "Monster-of-the-Week" story, unconnected to the series' wider mythology. "Kill Switch" earned a Nielsen household rating of 11.1, being watched by 18.04 million people in its initial broadcast. The episode received mostly positive reviews from television critics, with several complimenting Fox Mulder's virtual experience. The episode's name has also been said to inspire the name for the American metalcore band Killswitch Engage. The show centers on FBI special agents Fox Mulder (David Duchovny) and Dana Scully (Gillian Anderson) who work on cases linked to the paranormal, called X-Files. Mulder is a believer in the paranormal, while the skeptical Scully has been assigned to debunk his work. In this episode, Mulder and Scully become targets of a rogue AI capable of the worst kind of torture while investigating the strange circumstances of the death of a reclusive computer genius rumored to have been researching artificial intelligence. "Kill Switch" was co-written by cyberpunk pioneers William Gibson and Tom Maddox. The two eventually wrote another episode for the show: season seven's "First Person Shooter". "Kill Switch" was written after Gibson and Maddox approached the series, offering to write an episode. Reminiscent of the "dark visions" of filmmaker David Cronenberg, the episode contained "many obvious pokes and prods at high-end academic cyberculture." In addition, "Kill Switch" contained several scenes featuring elaborate explosives and digital effects, including one wherein a computer-animated Scully fights nurses in a virtual hospital. 
            "Kill Switch" deals with various "Gibsonian" themes, including alienation, paranoia, artificial intelligence, and transferring one's consciousness into cyberspace, among others.


  • Buddahriffic@lemmy.world · 59 points · 11 months ago

    Companies serving ads should have at least partial liability for them. If they can’t afford to look into them all, then maybe they are too big or their business model just isn’t as viable as they pretend it is.

    • Patches@sh.itjust.works · 23 points · 11 months ago

      They are too big. There is no maybe about it.

      You best start believing in late stage capitalism, you’re in one.

      • ArxCyberwolf@lemmy.ca · 12 points · 11 months ago

        We’re already at the point where companies are cannibalizing themselves to grow more, like cancer. They’re going to destroy themselves trying to endlessly grow. And you know what? Thank FUCK for that.

    • Liz@midwest.social · 12 points · 11 months ago

      I absolutely agree. If you’re serving up the ad, you have to take responsibility for the contents.

  • peereboominc@lemm.ee · 57 points · 1 year ago

    Same with YouTube ads. Lots of scams, and reporting them always ends in my report getting denied…

    • JigglypuffSeenFromAbove@lemmy.world · 18 points · 1 year ago

      Google also doesn’t care. I kept seeing the same scammy ads and sensationalist articles on my news feed, over and over, even after reporting them several times.

      The only solution was to blacklist those sources so they don’t show up on my feed. I feel bad for other people who might get scammed though.

    • TORFdot0@lemmy.world · 4 points · 11 months ago

      I had to uninstall the YouTube app and start using Vinegar via Safari on iOS because I got tired of being insulted by deepfakes calling me stupid for not falling for their fake stimulus scam.

    • HiddenLayer5@lemmy.ml · 2 points · edited · 11 months ago

      I tried to report a scam giveaway ad I saw on the YouTube homepage. It told me to sign in first. I closed the tab right then.

  • Viper_NZ@lemmy.nz · 49 points · 1 year ago

    On Twitter I’ve reported:

    • Pictures of dead babies/toddlers
    • Pictures of murdered people
    • Death threats towards public figures
    • Illegal videos of terrorist acts
    • Ads for illegal weapons (tasers)
    • So so much crypto spam

    Things found by Twitter to go against their community standards? 0

  • whatever@lemmy.world · 41 points · 11 months ago

    […] Mr Beasts popularity, generosity […]

    Mr. Beast feels so unlikable to me, I really can’t understand his popularity. But that’s beside the point, sorry. Fuck instagram!

    • Echo Dot@feddit.uk · 23 points · 11 months ago

      My understanding is he gives a lot of his money away to various causes so I suppose that’s why people like him.

      But of course equally he is part of that annoying YouTuber trend of bouncing around the screen being very loud and thinking that that’s a substitute for personality.

      • whatever@lemmy.world · 14 points · 11 months ago

        It is an interesting business model. Good for the people he spends money on, but no one should have that much money to begin with. And I am sure he takes his cut.

        But without having watched many videos of him (about 2), his appearance just screams devious weasel to me.

        • HiddenLayer5@lemmy.ml · 16 points · edited · 11 months ago

          The two biggest charity events he’s had, Team Trees and Team Seas, he did literally nothing but pitch the idea. He was giving away luxury shit and engaging in his usual hedonism during the period he was telling his viewers to donate, and it’s not like he did any of the work either, he just contracted with established environmental nonprofits. So why is he there again? Why didn’t he just tell people to donate to those nonprofits directly?

          Also, he definitely profited from both charity events and they were more marketing events for himself than anything. All the videos have ads and he made no mention of donating the ad revenue so one can only assume he kept it (because if he was going to donate the ad revenue he absolutely would not pass up on making that known to everyone), not to mention the amount of engagement it brought to his other videos and his brand as a whole. That’s also assuming he doesn’t do what most influencer charity campaigns do and directly take a big cut of the donations as a marketing fee or something.

          • Suburbanl3g3nd@lemmings.world · 6 points · 11 months ago

            He had you donate to him instead of directly for the same reason businesses ask you to donate to X charity at the register: tax breaks. I mean, I’m not an accountant, but I imagine this is why he did it that way.

  • EnderMB@lemmy.world · 40 points · 1 year ago

    Like many, I’ve reported lots of stuff to basically every social media outlet, and nothing has been done. Most surprising, a woman I know was getting harassed from people setting up fake accounts of her. Meta did nothing, so she went to the police…who also did nothing. Her MP eventually got involved, and after three months the accounts were removed, but the damage had gone on for about two years at that point.

    As someone that works in tech, it’s obvious why this is such a hard problem, because it requires actual people to review the content, to get context, and to resolve in a timely and efficient manner. It’s not a scalable solution on a platform with millions of posts a day, because it takes thousands (if not more) of people to triage, action, and build on this. That costs a ton of money, and tech companies have been trying (and failing) to scale this problem for decades. I maintain that if someone is able to reliably solve this problem (where users are happy), they’ll make billions.

    • jjjalljs@ttrpg.network · 18 points · 1 year ago

      I’m going to argue that if they can’t scale to millions of users safely they shouldn’t.

      If they were selling food at huge scales but “couldn’t afford to have quality checks on all of what they ship out,” most people probably wouldn’t be like “yeah, that’s fine. I mean, sometimes you get a whole rat in your Cap’n Crunch, but they have to make a profit.”

      Also I’m pretty sure a billionaire could afford to pay a whole army of moderators.

      On the other hand, as someone else said, they kind of go to bat for awful people more often than not. I don’t really want to see that behavior scaled up.

      • EnderMB@lemmy.world · 4 points · 11 months ago

        You’re probably right, but as a thought exercise, imagine how many people you would need to hire across multiple regions, and what sort of salary these people deserve to have, given the responsibility. That’s why these companies don’t want to pay for it, and anyone that has worked this kind of data entry work will know that it can be brutal.

        IMO, governments should enforce it, but that requires a combined effort across multiple governments.

    • Patches@sh.itjust.works · 9 points · 11 months ago

      But it is scalable. Do you have any idea how much fuckin money these social media sites make? They absolutely can afford it. We just don’t force them to.

    • Rodeo@lemmy.ca · 9 points · 11 months ago

      That costs a ton of money

      As if they don’t have it?

      Fuckin please. I’m so sick of hearing that something is “too expensive” for a multi-billion-dollar, multinational corporation.

    • Facebones@reddthat.com · 7 points · 1 year ago

      I get a TOS flag any time I mention that using one’s faith to justify bigotry and violence is wrong, though, so we know there’s at least one group FB goes to bat for: Christofascists.

    • Queen HawlSera@lemm.ee · 5 points · edited · 11 months ago

      I had something like that happen.

      I reported death threats against me from transphobic bigots, threats that specifically cited me being trans as the reason they wanted to kill me. I reported them as hate speech and a threat of violence. “We’re sorry, this does not violate community guidelines.”

      Later I made a self-deprecating joke about being white.

      Three month ban for “Racism and Bigotry”

      Facebook is a fucking joke, and not a funny one either.

  • Scotty_Trees@lemmy.world · 33 points · 11 months ago

    Not that this helps anyone, but I gave up Instagram the day Facebook bought it. I don’t regret it and my mental health is better for it. Using Instagram made me depressed as hell.

    • Zaderade@lemmy.world (OP) · 3 points · 11 months ago

      I deleted Facebook a couple of years ago. Instagram is my guilty pleasure for car reels and goddamn dancing Toothless. It seems like the end of my IG use is getting closer.

      • Zealousideal_Fox900@lemmy.world · 3 points · 11 months ago

        Facebook now is basically hard-right clowns protected from reports and boomers whinging about problems they made up. There are still holdouts (groups) that aren’t ruined, but Facebook is trying its best to ruin them too.

  • Nobody@lemmy.world · 32 points · 1 year ago

    Enshittification has become the new way of life for tech firms like Meta.

    They lay off workers and decrease user safety, because that leads to more ad buys. This year’s record profits need to exceed last year’s record profits, even though a fourth of you are fired. More profit, or else…

    • Scotty_Trees@lemmy.world · 1 point · 11 months ago

      Godspeed to Pixelfed, but Instagram absolutely killed photo sharing platforms for me. I really want nothing to do with them anymore.

  • Stefen Auris@pawb.social · 31 points · 1 year ago

    I doubt they’re missing them. They simply don’t care, and will continue not to care until something happens that makes the money generated by the ads not worth it.