• zbyte64@awful.systems · +5 · 6 hours ago

      “Properly prompting” is to not prompt. A chat interface is the lowest fidelity interface to use with an LLM.

• Dozzi92@lemmy.world · +13 · 11 hours ago

      I haven’t used an LLM, but it’s probably similar to how people could not Google for shit. I always considered myself something of an expert at using search engines — though they’ve obviously gone to shit, and with the advent of AI it seems like they will fade out.

• blargh513@sh.itjust.works · +8 · 11 hours ago

      I don’t know, it seems to me that most people know how to ask a question or make a request. It’s not that different. It’s just that a lot of people don’t understand what is possible and they freeze.

      You tell them they can ask for anything they want. They uncork and say, “So I can ask it for a chocolate cream pie?” Partially in jest, but they do that because they don’t seem to have a comfortable sense of the limits. A person with little technical background has no need for output they don’t understand. Once you guide them a little and let them know they can get a recipe for a chocolate cream pie, plus some practical advice on how to make it, that might be helpful — but little better than just looking up a recipe. You’d have to let them know they can pull multiple variants of a recipe and have it rank them, compare them, and produce a summary of the most popular types. By now they’ve stopped listening, gone to the grocery store to buy a chocolate cream pie, and you’re standing there hoping they’ll give you a piece.

      In summary, I wish I had some pie. What was the question?