• sylver_dragon@lemmy.world · 6 days ago

    > Google processes over 5.9 trillion searches per year

    That number has nothing to do with the problem. They don’t need to review every search; they need to review every advertising link they have been paid to place (not every link indexed). Presumably, they already have the infrastructure in place to track those links and verify that they comply with the law in areas like CSAM and copyright, where they actually have some accountability. The number of paid advertisement links will be far smaller than that 5.9 trillion.

    • queermunist she/her@lemmy.ml · 5 days ago

      So they need to review every website? That’s not as daunting; there are only about 1.1 billion websites, of which only about 17% (roughly 193 million) are actively maintained and updated. Compared to the number of searches that’s certainly much smaller, but it’s still a huge dataset that has to be reviewed.
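      To put a rough number on "a lot of review," here’s a minimal back-of-envelope sketch in Python. The site count comes from the figure above; the per-site review time, review cadence, and working hours are purely assumed for illustration, not sourced:

      ```python
      # Back-of-envelope: how many full-time humans would it take to
      # review every actively maintained website once a year?
      # All inputs besides the site count are illustrative assumptions.

      active_sites = 193_000_000            # ~17% of 1.1B sites, per above
      minutes_per_review = 5                # assumed time to vet one site
      work_minutes_per_year = 8 * 60 * 250  # one reviewer: 8 h/day, 250 days/yr

      total_review_minutes = active_sites * minutes_per_review
      reviewers_needed = total_review_minutes / work_minutes_per_year
      print(f"~{reviewers_needed:,.0f} full-time reviewers")  # ~8,042
      ```

      Even under these generous assumptions the head count lands in the thousands, before accounting for re-reviews, appeals, or sites that change after being checked.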

      Face it, this is not a simple thing that can just be solved by throwing AI at it. The only way search could exist in this environment is if it were subscription-based or a public utility.

      For the record, I favor search being a public utility. Nationalize Google.

      • sylver_dragon@lemmy.world · 5 days ago

        > So they need to review every website?

        I’m going to assume you’re just trolling now. I refuse to believe that someone can be this stupid without actually doing it intentionally. Well done, you got me for a few comments. But I’m done feeding the troll.

        • queermunist she/her@lemmy.ml · edited · 5 days ago

          You have to review every website that the search engine can access, or else you can’t actually stop the problem. Literally how else could you do it? A chatbot can’t reliably flag everything on its own; humans are going to have to actually look for false positives and false negatives, and with a billion websites that’s a lot of labor.

          Also, thanks for taking a sledgehammer to my self-esteem for literally no fucking reason, as I’d expect from reddit.world.