• lectricleopard@lemmy.world · 2 days ago

    It gave you the wrong answer. One you called absurd. And then you said “Really good stuff.”

    Not to get all dead internet, but are you an LLM?

    I don't understand how people think this is going to change the world. It's like the C-suite folks think they can fire 90% of their company, feed their half-baked ideas for superhero sequels into an AI, and sell us tickets to the poop that falls out, 15 fingers and all.

    • Eheran@lemmy.world · 1 day ago

      So you read what I said, then just went with "my bias against LLMs was proven" and wrote this reply? At no point did you actually try to understand what I said? Sorry, but are you an LLM?

      But seriously. If you ask someone on the phone “is it raining” and the person says “not now but it did a moment ago”, do you think the person is a fucking idiot because obviously the sun has been and still is shining? Or perhaps the context is different (a different location)? Do you understand that now?

      • lectricleopard@lemmy.world · 1 day ago

        You seem upset by my comment, which I don't understand at all. I'm sorry if I've offended you. I don't have a bias against LLMs. They're good at talking. Very convincing. I don't need help creating text to communicate with people, though.

        Since you mention that this is helping you in your free time, you might not be aware of how much less useful it is in a commercial setting for coding.

        I'll also note, since you mentioned it in your initial comment, that LLMs don't think. They can't think. They never will think. That's not what these things are designed to do, and there is no means by which they might start to think just by being bigger or faster. Talking about AI systems like they are people makes them appear more capable than they are to those who don't understand how they work.

        • Eheran@lemmy.world · 23 hours ago

          Can you define “thinking”? This is such a broad statement with so many implications. We have no idea how our brain functions.

          I do not use this tool for talking. I use it for data analysis, simulations, MCU programming, … Instead of having to write all of that code myself, it only takes 5 minutes now.

          • lectricleopard@lemmy.world · 14 hours ago

            Thinking is what humans do. We hold concepts in our working memory and use related stored memories to evaluate new data and determine a course of action.

            LLMs predict the next word in their output based on a statistical model. This model is developed by "training" on written data, often scraped from the internet. This creates many biases in the statistical model. People on the internet rarely take the time to answer "I don't know" to questions they see. I see this as at least one source of what they call "hallucinations": the model confidently answers incorrectly because that's what it saw in training.
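            The "predict the next word from training statistics" idea above can be sketched with a toy bigram model. This is not how any production LLM is implemented (those use neural networks over long contexts), and the training text and function names here are made up for illustration; only the underlying idea of picking a continuation from frequencies in training data is the same.

```python
from collections import Counter, defaultdict

# Toy "training data" - stands in for internet-scale text.
training_text = "the cat sat on the mat the cat ate the fish"

# Count how often each word follows each other word.
follows = defaultdict(Counter)
words = training_text.split()
for prev, nxt in zip(words, words[1:]):
    follows[prev][nxt] += 1

def predict_next(word):
    """Return the statistically most likely next word, or None if the
    word never appeared in training. (A real LLM never returns None -
    it always produces *something*, which is one way confident wrong
    answers arise.)"""
    if word not in follows:
        return None
    return follows[word].most_common(1)[0][0]

print(predict_next("the"))  # "cat" - the most frequent follower of "the"
print(predict_next("dog"))  # None - "dog" is outside the training data
```

            Inside the training distribution the prediction looks sensible; outside it, this toy model at least admits it has nothing, whereas a full LLM will still generate a fluent-sounding continuation.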

            The internet has many sites with reams of example code in many programming languages. If you are working on code of the same kind as those examples, then you are within the training data, and results will generally be good. Go outside that training data, and it flounders. It has no means of reasoning beyond its internal statistical model.

          • Clent@lemmy.dbzer0.com · 21 hours ago

            "We have no idea how our brain functions."

            This isn’t even remotely true.

            You should have asked your LLM about it before making such a ridiculous statement.