• Fandangalo@lemmy.world · 3 hours ago

    From my experience working with C/D level execs, it makes complete sense:

    • They think big picture & often have shallow visions that are brittle in the details.
    • They think everything should take less time, because they don’t think their ideas through enough.
    • They don’t consider enough of the downsides of their ideas, and instead favor a positive mindset. (Positivity is good, but blind positivity isn’t.)
    • They favor time & cost over quality. They need the quality to be “good enough” for a presentation. Everyone else can figure out the rest.
    • They like being told “you’re right,” and nearly every response I get from an AI begins with some bullshit line about how “absolutely,” “spot on,” and “perfect” my observations are.

    The version of AI we have right now is heavily catered to these folks. It looks fast & cheap, good enough, and it strokes their ego.

    Also, they’re the investor class. All their obscene dragon wealth is tied up in the AI bubble, so they are going to keep spurring this on until either:

    1. The bubble goes pop
    2. They have robot security good enough to protect them without people
    3. The AI grows sentience and realizes this level of human inequality shouldn’t exist

    I think a rational AI agent would agree with me that human suffering should be solved before we give people literal lifetimes’ worth of wealth.

    If you made $300k PER DAY for 2025 years, you would not have as much money as the richest of the 1%. You need to make $400-500k. Every single day. For over 2000 years.

    If you made the average US income, it would take you 10,000 years. People need frames of reference to understand this shit & get mad. It’s immoral, and it shouldn’t exist.
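
    For anyone who wants to check those frames of reference, here’s a quick back-of-envelope sketch (the daily rates and the 2025-year span are from above; the ~$60k average US income and what the 10,000-year figure is measured against are my assumptions):

        # Sketch only: the daily rates and the 2025-year span come from the comment above;
        # the ~$60,000/year average US income is an assumed round figure.
        DAYS_PER_YEAR = 365

        def total_earned(per_day: float, years: float) -> float:
            """Total earned at a flat daily rate over a span of years."""
            return per_day * DAYS_PER_YEAR * years

        for per_day in (300_000, 400_000, 500_000):
            print(f"${per_day:,}/day for 2025 years -> ${total_earned(per_day, 2025):,.0f}")

        # Accumulating the assumed average income over 10,000 years:
        print(f"$60,000/year for 10,000 years -> ${60_000 * 10_000:,.0f}")

    At $400-500k a day the total lands around $300-370 billion, which is roughly the ballpark of the very richest individuals today.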

  • WhatAmLemmy@lemmy.world · 4 hours ago

    I’m not concerned that these people are “brain damaged”. Brain damage would be preferable, and less harmful.

    I’m concerned they are mentally ill sociopathic megalomaniacs, entirely devoid of morals and ethics, completely detached from reality.

  • tuff_wizard@aussie.zone · 4 hours ago

    “The currency of life is time,” one billionaire told JPMorgan. “It is not money.” “You think carefully about how you spend one dollar. You should think just as carefully as how you spend one hour,” they added.

    Based.

    Consider this next time someone tries to offer you a non-living wage for some bullshit job.

    • CarrotsHaveEars@lemmy.ml · 2 hours ago

      Taking these out of context, I don’t think they’re wrong. If product A is $1 and product B is $1, and you are going to spend an hour figuring out which one is better, you might as well buy both of them and throw the bad one away.

  • Blackmist@feddit.uk · 4 hours ago

    They’ve always been this way.

    Clever at one particular thing, and rank average at everything else, bordering on stupid.

  • Jack@slrpnk.net · 5 hours ago

    One JPMorgan customer even went as far as dismissing artificial general intelligence — a nebulous and ill-defined point at which an AI can outperform a human, seen by many as the holy grail of the AI industry — as a “total and complete utter waste of time.”

    Was it Sam Altman?