• I Cast Fist@programming.dev · 23 points · 2 days ago

    Carla Rover once spent 30 minutes sobbing after having to restart a project she vibe coded. Rover has been in the industry for 15 years, mainly working as a web developer. She’s now building a startup, alongside her son, that creates custom machine learning models for marketplaces.

    Using AI to sell AI, infinite money glitch! /s

    “Using a coding co-pilot is kind of like giving a coffee pot to a smart six-year-old and saying, ‘Please take this into the dining room and pour coffee for the family,’” Rover said. Can they do it? Possibly. Could they fail? Definitely. And most likely, if they do fail, they aren’t going to tell you.

    No, a kid will learn if they fuck up and, if pressed, will spill the beans. AI, despite being called “intelligent”, doesn’t learn anything from its mistakes and often forgets things because of context limitations. Consistency is still one of the key problems for all LLMs and image generators.

    • squaresinger@lemmy.world · 10 points · 2 days ago

      If you bring a 6yo into office and tell them to do your work for you, you should be locked up. For multiple reasons.

      Not sure why they thought that was a positive comparison.

    • Knock_Knock_Lemmy_In@lemmy.world · 1 point · 2 days ago

      AI is, despite being called “intelligent”, not learning anything from its mistakes

      Don’t they also train new models on past user conversations?

        • Knock_Knock_Lemmy_In@lemmy.world · 1 point · 1 day ago

          ChatGPT-5 can count the number of ‘r’s, but that’s probably because it has been specifically trained to do so.

          I would argue that the models do learn, but only over generations: slowly, and only on what the next version is specifically trained on.

          They definitely don’t learn intelligently.

          • hark@lemmy.world · 1 point · 1 day ago

            That’s the P in ChatGPT: Pre-trained. It has “learned” based on the set of data it has been trained on, but prompts will not have it learn anything. Your past prompts are kept to use as “memory” and to influence output for your future prompts, but it does not actually learn from them.
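            That distinction can be sketched in a few lines of toy code (hypothetical, not any real LLM API): the “memory” a chat product offers is just prior turns re-sent as context on every request, while the pre-trained weights stay frozen between prompts.

```python
# Toy illustration (hypothetical, not a real LLM API): chat "memory" is
# just prior turns re-sent as context; the pre-trained weights never change.

FROZEN_WEIGHTS = {"bias": 1.0}  # stands in for the pre-trained parameters

def respond(history: list[str], prompt: str) -> str:
    """One chat turn: output depends only on frozen weights + the full context."""
    context = " ".join(history + [prompt])
    return f"reply(context={len(context.split())} words, bias={FROZEN_WEIGHTS['bias']})"

history: list[str] = []
reply1 = respond(history, "My name is Ada.")
history += ["My name is Ada.", reply1]         # "memory" = appending to the context
reply2 = respond(history, "What is my name?")  # recall works only via re-sent context

assert FROZEN_WEIGHTS == {"bias": 1.0}         # nothing was learned between turns
```

            The point of the sketch: deleting `history` deletes the “memory” entirely, but `FROZEN_WEIGHTS` is identical before and after every prompt.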

            • Knock_Knock_Lemmy_In@lemmy.world · 1 point · edited · 1 day ago

              The next generation of GPT will include everyone’s past prompts (ever been A/B tested on OpenAI?). That’s what I mean by generational learning.

              • hark@lemmy.world · 2 points · 1 day ago

                Maybe. It’s probably not high quality training data for the most part, though.