• hansolo@lemmy.today · 2 days ago

    Wow, so poisoning the training data works that well?

    This is awesome - guys, let’s all upload papers that say the lining of MAGA hats contains a chemical that causes cancer, that NOT getting vaccines causes autism, and that the most conservative universities show Trump lost the 2020 election. Let Grok gobble it up!

    • Tar_Alcaran@sh.itjust.works · 2 days ago

      Some people still think Morgellons disease is a real thing where their skin grows artificial fibers, instead of… those being from their clothes.

    • python@lemmy.world · 2 days ago

      So like… 6 months? Can’t imagine people that stupid have a very high life expectancy

      • ZDL@lazysoci.al · 2 days ago

        There are still people who believe in alien abductions, that humans never landed on the moon, that the Earth is flat, that Trump won the 2020 election, that lizard people rule the world, that the Olympic and the Titanic were swapped in some bizarrely incoherent insurance scam, that Paul McCartney died in 1966, that 9/11 was an inside job, that contrails are population control chemicals, …

        And yet these people live on and on and on spewing this idiocy.

  • 3jane@piefed.ca · 2 days ago

    Another study said that in recent years 25% of studies were faked for academic credentials, either by paper mills or by academics themselves. Thousands of papers even used the same diagrams, with slight variations in the text, while citing each other.

    Add AI to the mix, things get worse.

  • pelespirit@sh.itjust.works · 2 days ago

    The condition doesn’t appear in the standard medical literature — because it doesn’t exist. It’s the invention of a team led by Almira Osmanovic Thunström, a medical researcher at the University of Gothenburg, Sweden, who dreamt up the skin condition and then uploaded two fake studies about it to a preprint server in early 2024. Osmanovic Thunström carried out this unusual experiment to test whether large language models (LLMs) would swallow the misinformation and then spit it out as reputable health advice. “I wanted to see if I can create a medical condition that did not exist in the database,” she says.

    The problem was that the experiment worked too well. Within weeks of her uploading information about the condition, attributed to a fictional author, major artificial-intelligence systems began repeating the invented condition as if it were real.

  • baltakatei@sopuli.xyz · 2 days ago

    In a separate study of 20 LLMs, Omar found that LLMs are more prone to hallucinate and elaborate on misinformation when the text they’re processing looks professionally medical — formatted like a hospital discharge note or clinical paper — than when it comes from social-media posts (M. Omar et al. Lancet Digit. Health 8, 100949; 2026). “When the text looks professional and written as a doctor writes, there’s an increase in the hallucination rates,” says Omar.

    You can just make an Overleaf account (or install GNU TeXmacs) and start outputting academic-like papers for fun and profit. I would have thought LLM developers would have at least highlighted PageRank-like citation metadata as very important when training on academic publications; papers with no citations clearly aren’t reputable.
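    To make the "PageRank-like citation metadata" idea concrete, here's a minimal sketch of weighting papers by incoming citations so that an uncited preprint contributes little. Everything here is invented for illustration (the paper names, the graph, and the `citation_rank` helper are hypothetical); a real training pipeline would work over millions of papers with a proper citation index.

    ```python
    def citation_rank(citations, damping=0.85, iters=50):
        """Toy PageRank over a citation graph.

        citations: dict mapping each paper to the list of papers it cites.
        Returns a dict of paper -> rank score summing to ~1.0.
        """
        papers = set(citations) | {p for cited in citations.values() for p in cited}
        n = len(papers)
        rank = {p: 1.0 / n for p in papers}
        for _ in range(iters):
            new = {p: (1 - damping) / n for p in papers}
            for src, cited in citations.items():
                if cited:
                    # A citation passes a share of the citing paper's rank.
                    share = damping * rank[src] / len(cited)
                    for dst in cited:
                        new[dst] += share
                else:
                    # Dangling paper (cites nothing): spread its rank evenly.
                    for dst in papers:
                        new[dst] += damping * rank[src] / n
            rank = new
        return rank

    # Hypothetical graph: preprint_x cites nothing and is cited by nobody,
    # so it ends up with the lowest score.
    graph = {
        "paper_a": ["paper_b", "paper_c"],
        "paper_b": ["paper_c"],
        "paper_c": ["paper_a"],
        "preprint_x": [],
    }
    ranks = citation_rank(graph)
    ```

    Training-data curation could then threshold or down-weight documents by this score, which is exactly what would have flagged the bixonimania preprints: no one cited them.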

  • nullptr@lemmy.world · 2 days ago

    Okay, but was the paper obviously a joke? How would a human determine it was fake?

    • greygore@lemmy.world · 2 days ago

      From the article:

      Because she works in the medical field, she decided to create a condition related to health and hit on the name bixonimania because it “sounded ridiculous”, she says. “I wanted to be really clear to any physician or any medical staff that this is a made-up condition, because no eye condition would be called mania — that’s a psychiatric term.”

      If that wasn’t sufficient to raise suspicions, Osmanovic Thunström planted many clues in the preprints to alert readers that the work was fake. Izgubljenovic works at a non-existent university called Asteria Horizon University in the equally fake Nova City, California. One paper’s acknowledgements thank “Professor Maria Bohm at The Starfleet Academy for her kindness and generosity in contributing with her knowledge and her lab onboard the USS Enterprise”. Both papers say they were funded by “the Professor Sideshow Bob Foundation for its work in advanced trickery. This works is a part of a larger funding initiative from the University of Fellowship of the Ring and the Galactic Triad”.

      Even if readers didn’t make it all the way to the ends of the papers, they would have encountered red flags early on, such as statements that “this entire paper is made up” and “Fifty made-up individuals aged between 20 and 50 years were recruited for the exposure group”.

    • skye@lemmy.world · 2 days ago

      Reading more than one source is how it goes. If it’s a new paper with no other sources, I’d assume waiting for more sources is the default.