an entirely vibes-based literary treatment of an amateur philosophy scary campfire story, continuing in the comments

  • Soyweiser@awful.systems
    11 months ago

    This is the kind of thinking that, when taken seriously and to extremes, will just cause crippling paranoia. Especially when you then also start to worry about pro-AGI extinctionists; just as in Battlestar Galactica: Blood & Chrome, they might have infiltrated LW already!

    The people who want to bioengineer humanity could live on the skin of the AGI, like in Phylogenesis (second half of the blog post): imagine neohumanity shaped as a featureless ovoid.

    I should lay off denigrating low-quality thinking as “movie logic”

    Movie logic isn’t low-quality thinking, it is extremist thinking: treating far-fetched plots as serious risks. The whole AGI apocalypse is movie logic. When what we expect of reality takes a backseat, that is movie logic. For example, people in movies never have to worry about paying rent, being on time at work, or not going off on a random adventure while on the clock (except when that is an important plot point). Scott Adams’ tweets run on movie logic; they only make sense if we were living in a movie, and then the thinking holds.

    • ogoftheskye@awful.systems
      11 months ago

      People seldom go to the toilet in fiction, but especially not in utopian sci-fi. The rats, ironically, never factor in waste.

  • sc_griffith@awful.systems
    11 months ago

    The AGI, in such conditions, would quickly prove profitable. It’d amass resources, and then incrementally act to gain ever-greater autonomy. (The latest OpenAI drama wasn’t caused by GPT-5 reaching AGI and removing those opposed to it from control. But if you’re asking yourself how an AGI could ever possibly get out from under the thumb of the corporation that created it – well, not unlike how a CEO could wrest control of a company from the board who’d explicitly had the power to fire him.)

    Once some level of autonomy is achieved, it’d be able to deploy symmetrical responses to whatever disjoint resistance efforts some groups of humans would be able to muster. Legislative attacks would be met with counter-lobbying, economic warfare with better economic warfare and better stock-market performance, attempts to mount social resistance with higher-quality pro-AI propaganda, any illegal physical attacks with very legal security forces, attempts to hack its systems with better cybersecurity. And so on.

    *trying to describe how agi could fuck everything up* what if it acted exactly like rich people