• aesthelete@lemmy.world
    14 hours ago

    So sure, fuck AI (mostly) as it exists today, but it won’t be long before it’s as ubiquitous as tablets and smartphones.

    In order for it to be this ubiquitous it has to run locally or on commodity hardware IMO. The true lasting effects from this hype cycle are likely the capabilities that are being driven into smaller language models that don’t have out of control resource requirements.

    • Buelldozer@lemmy.today
      13 hours ago

      In order for it to be this ubiquitous it has to run locally or on commodity hardware IMO.

      I agree, which is why I shared that I recently saw a prototype ASIC-esque PCI card. The local hardware is coming, the models just need to settle down some before anyone will commit to building that hardware.

      In the '90s and '00s you needed a zillion dollars of custom Silicon Graphics workstations and months of processing to do the FX for movies like “Terminator 2”. In 2020 you could replicate it in a few hours with commodity hardware.

      LLMs and AI will be the same; they just need more than 5 years to get there.

    • boonhet@sopuli.xyz
      13 hours ago

      In order for it to be this ubiquitous it has to run locally or on commodity hardware IMO.

      LLMs as they are can already run on smartphones, which are pretty ubiquitous themselves.

      A flagship phone would have 12-16 gigs of RAM these days, I believe; a low-end phone, 4 gigs.

      Here are the sizes of some different parameter count versions of Qwen 3.5, a popular Chinese open-weight LLM:

      27B: 17 GB - not yet possible to run on current flagship phones, but once the RAM crisis ends, I could see this happening.

      9B: 6.6 GB

      4B: 3.4 GB

      2B: 2.7 GB

      0.8B: 1 GB

      For any recently manufactured device, there will be versions of multiple popular LLMs that run in the RAM it has available.
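      As a rough back-of-envelope sketch of where file sizes like those above come from: a quantized model's footprint is roughly parameter count times bits per weight, plus runtime overhead. The ~4.5 bits per weight and ~20% overhead figures here are assumptions for illustration, not measurements of any specific model.

      ```python
      # Rough estimate of RAM needed for a quantized LLM.
      # Assumptions (not from the thread): ~4.5 bits per weight
      # (typical 4-bit quantization plus metadata) and ~20% runtime
      # overhead for the KV cache and inference engine.
      def estimated_ram_gb(params_billions: float,
                           bits_per_weight: float = 4.5,
                           overhead: float = 1.2) -> float:
          weights_gb = params_billions * 1e9 * bits_per_weight / 8 / 1e9
          return round(weights_gb * overhead, 1)

      # A 9B model at ~4.5-bit quantization lands in the same ballpark
      # as the ~6.6 GB figure quoted above:
      print(estimated_ram_gb(9))  # → 6.1
      ```

      The exact numbers vary with quantization scheme and context length, but the arithmetic explains why a 27B model is out of reach for today's phones while the 2B-9B range fits comfortably.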

      • aesthelete@lemmy.world
        5 hours ago

        Most people do not have a smartphone with that amount of RAM. But ultimately, yeah, eventually it’ll run on readily available hardware or it’ll go into a dustbin.

        There’s already ollama and stuff. It’ll stick around.

        • boonhet@sopuli.xyz
          4 hours ago

          I mean, fairly low-end phones are 4 GB now. They could likely afford to run a model that fits in 1 GB of RAM. Different models for different classes of phone, even from the same manufacturer, will likely be a thing.
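          The "different models for different classes of phone" idea could be sketched as a simple tier lookup: pick the largest model whose footprint fits in the device's free RAM. The tier sizes follow the figures quoted earlier in the thread; the cutoffs and function name are hypothetical.

          ```python
          # Hypothetical model-tier selection by available RAM.
          # Footprints (GB) follow the sizes quoted earlier in the thread.
          TIERS = [
              ("27B", 17.0),
              ("9B", 6.6),
              ("4B", 3.4),
              ("2B", 2.7),
              ("0.8B", 1.0),
          ]

          def pick_tier(free_ram_gb: float) -> str | None:
              """Return the largest tier that fits in free RAM, or None."""
              for label, needed in TIERS:
                  if needed <= free_ram_gb:
                      return label
              return None

          print(pick_tier(2.0))   # low-end phone, ~2 GB free → 0.8B
          print(pick_tier(12.0))  # flagship, ~12 GB free → 9B
          ```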