OC donut steel (but repost wherever lol)

Dude says: “I’m not worried about the AI apocalypse, I always say ‘thank you’ to them!”
Robots later catch him and say: “Throw that one in the grinder, his ‘thank you’ used 748 kWh every day”

  • lime!@feddit.nu · 6 days ago

    yes, the models are bigger, but Wh/prompt is still the metric to look at. 300 W for 3 seconds is roughly the same amount of energy as 14.3 kW for 0.063 seconds. i don’t know how fast a machine like that can spit out a single response because right now i’m assuming they’re time-slicing them to fuck, but at least gpt4o through duck.ai responds in about the same time.
    if running an 800GB model (which i think is about where gpt4o is) takes the same amount of time to respond as me running an 8GB model (i know the comparison is naive), then it would be about… twice as efficient? 0.25 Wh for me compared to 11.9 Wh for them, or roughly 0.119 Wh once you divide by the 100× difference in model size. and that’s without knowing how many conversations one of those things can carry on at the same time.
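    here’s that arithmetic as a quick sketch; the 300 W and 14.3 kW draws, the 3 second response time, and the 100× size factor are all just the assumptions above, not measured numbers:

    ```python
    # back-of-envelope Wh/prompt comparison (assumed figures, not measurements)
    def wh_per_prompt(power_watts: float, seconds: float) -> float:
        """energy per prompt in watt-hours: W * s / 3600"""
        return power_watts * seconds / 3600

    local = wh_per_prompt(300, 3)        # ~0.25 Wh on my 8GB setup
    server = wh_per_prompt(14_300, 3)    # ~11.9 Wh for the big machine

    # naive normalisation by the 100x model-size difference (800GB vs 8GB)
    print(f"local:  {local:.3f} Wh/prompt")
    print(f"server: {server:.3f} Wh/prompt ({server / 100:.3f} Wh per 8GB-equivalent)")
    ```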

    Edit: also, this is me ignoring, for the sake of the discussion, that the training is where all the energy use comes from.

    • Norah (pup/it/she)@lemmy.blahaj.zone · 6 days ago

      > Edit: also, this is me ignoring, for the sake of the discussion, that the training is where all the energy use comes from.

      AFAIK that’s no longer true now that uptake (read: it being jammed into everything) is much higher.

      • lime!@feddit.nu · 6 days ago

        oh that’s interesting, i assumed it wasn’t actually being used much despite being in everything, but i’ve not seen any stats.