• Solaris@lemmy.world · 13 hours ago

    Why tf did they use that thumbnail with the Framework laptop in the background 😭 also, who was stupid enough to think the conversations were private in the first place?

    • Broadfern@lemmy.world · 20 hours ago

      People who refuse to exercise critical thinking skills, which is a terrifying percentage of the population.

    • yeehaw@lemmy.ca · 16 hours ago

      I think this is what has always worried me most. I do what I can to avoid being tracked across the internet and tracked by search engines.

      AI is like a wiretap lol.

      • badgermurphy@lemmy.world · 5 hours ago

        It's a lot like a personal attendant that is an amazing spy and an incompetent boob at attending to your personal needs.

    • Jesus_666@lemmy.world · 18 hours ago

      Nothing beyond shipping laptops with NPUs, which isn’t unusual since that’s what Intel’s and AMD’s laptop CPUs come with these days.

  • ColeSloth@discuss.tchncs.de · 15 hours ago

    I use an Android app (APK) called Off Grid and load AI models onto it (right now I'm using genna 4). It's all done on my phone. Nothing on the cloud. No data sent anywhere. Completely local. No entities get shit from it. The only way I'll use AI.

    • qualia@lemmy.world · edited · 4 hours ago

      I highly value privacy, but the gap between local LLMs and top-of-the-line cloud LLMs (e.g. Claude and DeepSeek) is still too great for me to switch completely to the former.

      I’ll use PWAs to sandbox LLMs from everything else (and each other) and try to create semantic distance between the user and the queries.

      How about that leaked Claude source code? Is there a reliably clean version of that available anywhere yet?

      • Bob Robertson IX @discuss.tchncs.de · 12 hours ago

        I currently have my local LLM set up, and quality-wise it runs just as well as Sonnet 4.6. Performance-wise it's slightly slower, but still faster than I can respond.

        This is with a Strix Halo APU with 128GB unified memory using the latest Qwen3.6 models with llama.cpp.
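        For anyone curious what a setup like that looks like in practice, here's a rough sketch of a llama.cpp server launch. The model filename, context size, and port are my assumptions, not something the commenter specified:

        ```shell
        # Hypothetical llama.cpp launch for a local Qwen model on a Strix Halo box.
        # Model path, context size, and port are placeholders, not the commenter's actual config.
        llama-server \
          --model ./qwen-model.gguf \     # any local GGUF file
          -ngl 999 \                      # offload all layers to the APU's GPU
          --ctx-size 8192 \               # context window
          --host 127.0.0.1 --port 8080    # bind locally only; nothing leaves the machine
        ```

        Binding to 127.0.0.1 is the point for the privacy-minded: the OpenAI-compatible API is only reachable from your own machine.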

  • Elilol@fedinsfw.app · 18 hours ago

    “Even more concerning, in some cases weak or non-existent access controls mean that simply having a link to a conversation can grant access to its content, making chats publicly accessible to anyone, including trackers, who has the URL,” highlights Narseo Vallina Rodríguez, Research Associate Professor at IMDEA Networks Institute.