• quick_snail@feddit.nl
      6 days ago

      No. The WSJ reported that Claude was used both in Venezuela and in determining targets in Iran.

      The reason Hegseth declared it a supply chain risk was that Anthropic refused to allow it to be used in autonomous weapons.

      • technocrit@lemmy.dbzer0.com
        6 days ago

        > in both Venezuela

        For kidnapping that bus driver?

        > determining targets in Iran.

        You mean murdering school girls?

        Yeah, that sounds like the same old technology under capitalism. No “AI” detected.

    • Keeponstalin@lemmy.world
      6 days ago

      This is AI designed for plausible deniability of targeting civilians. It was tested in Gaza with the Lavender and Where's Daddy AI systems.

    • technocrit@lemmy.dbzer0.com
      6 days ago

      To be fair, “AI” is a nonexistent grift.

      Whether autopilot or autocomplete, it’s the same old shit just regrifted.