I run a small VPS host and rely on PayPal for payments, mainly because (a) most VPS customers pay that way if you aren’t AWS or GoDaddy, and (b) it has very good fraud protection. My prior venture had quite a few chargebacks through Stripe, so it went PP-only too.

My dad told me I should “reduce the processing fees,” citing ChatGPT, which told him PayPal charges 5% fees when it really charges 3-3.5% (plus 49 cents). Yet he insisted 5% was the charge.
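
For a rough sense of the gap, here’s the back-of-the-envelope math on a hypothetical $100 charge (the amount and the 3.49% + $0.49 rate are illustrative figures within the range above, not my actual pricing):

```python
# Rough PayPal fee comparison (illustrative numbers, not real pricing)
charge = 100.00                        # hypothetical invoice amount in USD

actual_fee = charge * 0.0349 + 0.49    # ~3.49% + $0.49 per transaction
claimed_fee = charge * 0.05            # the 5% figure ChatGPT gave him

print(f"3.49% + $0.49 on ${charge:.2f}: ${actual_fee:.2f}")   # $3.98
print(f"5% flat on ${charge:.2f}:       ${claimed_fee:.2f}")  # $5.00
```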

Yes, PayPal sucks, but ChatGPT sucks even more. When I was a child he said Toontown would ruin my brain, yet LLMs are ruining his even more.

  • Blue_Morpho@lemmy.world · 5 days ago

    Googling could also have returned bad info. Lemmy has bad info. A newspaper could have reported bad info about PayPal. Bad info isn’t an AI problem.

    The fact that ChatGPT returned bad info means most of the internet has bad info about PayPal’s rates.

    • mozingo@lemmy.world · 5 days ago

      Well, sure. But if you go to the PayPal website you can see the correct information. Before Google’s AI popped up at the top of the screen, the PayPal website would have been what came up. In this situation, Google is now prioritizing the misinformation its AI pulled from some outdated website over the official PayPal website that has the correct info. That’s the issue.

      • Blue_Morpho@lemmy.world · 5 days ago

        The OP said ChatGPT. I just tried it:

        [screenshot]

        And I thought it was weird that OP said his dad asked ChatGPT. Who uses ChatGPT instead of Google for stuff like that?

        • vala@lemmy.world · 5 days ago

          This screenshot doesn’t really prove anything, because that’s not how ChatGPT works. It might have given you the right info and someone else the wrong info.

          Even if these models were static, deterministic things, which they aren’t in the context of end-user services like ChatGPT, just giving two slightly different prompts could cause something like this to happen.

        • mozingo@lemmy.world · 5 days ago

          Ah, yeah, sorry, my brain scrambled that. But it’s the same point really. ChatGPT doesn’t always pull its data from the current official website either, so it’s the same problem. ChatGPT and Google are loudly marketing, “Hey, you don’t need to search for the info, our AI will give it to you,” when the AI is wrong a lot.

    • vala@lemmy.world · 5 days ago

      The problem is a lack of critical thinking skills. There is only one reliable way to get this information, and that’s the primary source.