• Dyskolos@lemmy.zip
      1 day ago

Hm, I just had to try it myself with ChatGPT, and:

So, as suspected, it just ignores one’s nonsensical question unless asked to take it verbatim. This is actually not even dumb. If it had said “your question makes no sense”, we would call it stupid for not seeing past the obvious error in the question - as it’s a run-of-the-mill “riddle”.

      • chaogomu@lemmy.world
        1 day ago

I’ll have to find the post, but you did it in two steps, and you changed both the unit of mass and the object.

The post, which is extremely hard to find with the latest slop release from Nvidia, asked the chatbot to consider the exact wording, without babying it into the correct answer. All because close variations of the phrase “X pounds of bricks and X pounds of feathers weigh the exact same” have been used in various textbooks and such for at least the last hundred years or so.

That means the chatbot has seen that exact combo of words, in roughly that order, quite a bit more often than your “100 kilograms of rice”. At least in English.

You can baby it through when the training data is sparse, but not when there are hundreds of uses of the same phrase over and over again in the training data.
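
The frequency argument can be sketched with a toy completer - this is not how a real LLM works, just a minimal illustration (with made-up corpus counts) of how skewed training data pulls a completion toward the memorized phrasing:

```python
from collections import Counter

# Toy "training data": the classic riddle phrasing appears far more
# often than any variant, mimicking a skewed corpus.
corpus = (
    ["a pound of bricks and a pound of feathers weigh the same"] * 100
    + ["a pound of steel is heavier than a pound of rice"] * 2
)

def complete(prefix: str) -> str:
    """Frequency-based completion: return the most common string
    in the corpus that starts with the given prefix."""
    counts = Counter(s for s in corpus if s.startswith(prefix))
    return counts.most_common(1)[0][0]

# Even a question that only shares a prefix with the riddle gets
# pulled toward the dominant phrasing.
print(complete("a pound of"))
# → a pound of bricks and a pound of feathers weigh the same
```

A real model predicts token by token with context, but the same pressure applies: the statistically dominant continuation wins unless you push it off that path.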