Thinking about how the arsing fuck to explain the rationalists to normal people - especially as they are now a loud public problem along multiple dimensions.

The problem is that it’s all deep in the weeds. Every part of it is “it can’t be that stupid, you must be explaining it wrong.”

With bitcoin, I have, over the years, simplified it to being a story of crooks and con men. The correct answer to “what is a blockchain and how does it work” is “it’s a way to move money around out of the sight of regulators” and maybe “so it’s for crooks and con men, and a small number of sincere libertarians” and don’t even talk about cryptography or technology.

I dunno what the one-sentence explanation of this shit is.

“The purpose of LessWrong rationality is for Yudkowsky to live forever as an emulation running on the mind of the AI God” is completely true, is the purpose of the whole thing, and is also WTF.

Maybe that and “so he started what turned into a cult and a series of cults”? At this point I’m piling up the absurdities again.

The Behind The Bastards approach to all these guys has been “wow these guys are all so wacky haha and also they’re evil.”

How would you first approach explaining this shit past “it can’t be that stupid, you must be explaining it wrong”?

[also posted in sneer classic]

  • maol@awful.systems · 11 days ago

    I think starting with Sam Bankman-Fried is a solid idea. Relatively informed members of the general public a) know who that guy is, and b) know that he made some really poor decisions. He does not have the Silicon Valley mystique that attaches itself to some other adherents, so I think fewer people will think “well that guy is really smart, why would he be in a cult”. Then you can go back and explain EA and LessWrong and Yudkowsky’s role in all of this.

  • -dsr-@awful.systems · 12 days ago

    “Rationalism” is to normal logical thinking what blindfolded multi-board speed chess is to tic-tac-toe: you can only see in retrospect how anyone could get there from here. The things which occupy a Rationalist’s mind are completely divorced from ordinary concerns like ethics. Nobody would or could have predicted this quantity or quality of lunacy.

  • AllNewTypeFace@leminal.space · 12 days ago

    The latest in a chain of cults, after Mormonism, the Victorian-era spiritualist fad, Scientology and new-age “quantum” woo, each using the trappings of the exciting scientific/technological ideas of their time to sell the usual proposition (a totalising belief system that answers* all questions).

  • o7___o7@awful.systems · 12 days ago

    How would you first approach explaining this shit past “it can’t be that stupid, you must be explaining it wrong”?

    This is the question of the moment, isn’t it?

    I have no answers, but I can say thanks for being a light in the dumbness.

  • scruiser@awful.systems · 8 days ago

    So… on strategies for explaining to normies: a personal story often grabs people more than dry facts, so you could focus on the narrative of Eliezer trying a big idea, failing or giving up, and moving on to a bigger idea before repeating (stock bot to seed AI to AI programming language to AI safety to shutting down all AI)? You’ll need the Wayback Machine, but it is a simple narrative with a clear pattern?

    Or you could focus on the narrative arc of someone who previously bought into LessWrong? I don’t volunteer, but maybe someone else would be willing to take that kind of attention?

    I took a stab at both approaches here: https://awful.systems/comment/6885617

  • Joe@functional.cafe · 12 days ago

    @dgerard If “the purpose of a system is what it does” then the main purpose of “LessWrong rationality” seems to be “getting Yudkowsky laid”

  • Architeuthis@awful.systems · 12 days ago

    It’s pick-me objectivism, only more overtly culty the closer you are to it irl. Imagine Scientology if it were organized around AI doomerism and naive utilitarianism while posing as a get-smart-quick scheme.

    Its main function (besides getting the early adopters laid) is to provide court philosophers for the technofeudalist billionaire class, while grooming talented young techies into a wide variety of extremist thought both old and new, mostly by fostering contempt for established epistemological authority in the same way QAnons insist people do their own research, i.e. as a euphemism for only paying attention to ingroup-approved influencers.

    It seems to have both a sexual harassment and a suicide problem, with a lot of irresponsible scientific racism and drug abuse in the mix.

  • bitofhope@awful.systems · 12 days ago

    I don’t think Yud is that hard to explain. He’s a science fiction fanboy who never let go of his adolescent delusions of grandeur. He was never successfully disabused of the notion that he’s always the smartest person in the room, and he didn’t pursue a high school, let alone college, education that would give him the expertise to recognize just how difficult his goal is. Blud thinks he’s gonna create a superhumanly intelligent machine when he struggles with basic programming tasks.

    He’s kinda comparable to Elon Musk in a way. Brain uploading and superhuman AI are sort of in the same “cool sci fi tech” category as Mars colonization, brain implants and vactrain gadgetbahns. It’s easy to forget that not too many years ago the public’s perception of Musk was very different. A lot of people saw him as a cool Tony Stark figure who was finally going to give us our damn flying cars.

    Yudkowsky is sometimes good at knowing just a bit more about things than his audience and making it seem like he knows a lot more than he does. The first time I started reading HPMoR I thought the author was an actual theoretical physicist or something, and when the story said I could learn everything Harry knows for free on this LessWrong site I thought I could learn what it means for something to be “implied by the form of the quantum Hamiltonian” or what those “timeless formulations of quantum mechanics” were about. Instead it was just poorly paced essays on bog-standard logical fallacies and cognitive biases explained using their weird homegrown terminology.

    Also, it’s really easy to be convinced of a thing when you really want to believe in it. I know personally some very smart and worldly people who have been way too impressed by ChatGPT. Convincing people in the San Francisco Bay Area that you’re about to invent Star Trek technology is basically the national pastime there.

    His fantasies of becoming immortal through having a God AI simulate his mind forever aren’t the weird part. Any imaginative 15-year-old computer nerd can have those fantasies. The weird parts are that he never grew out of those fantasies and that he managed to make some rich and influential contacts while holding on to his chuunibyō delusions.

    Anyone can become a cult leader through the power of buying into their own hype and infinite thielbux.

    • -dsr-@awful.systems · 12 days ago

      Convincing people in the San Francisco Bay Area that you’re about to invent Star Trek technology is basically the national pastime there.

      Ding! Ding! Ding! Upvote.

  • jaschop@awful.systems · 12 days ago

    Didn’t come up with that simile, but it might fit:

    It’s like a fleshed-out version of a 12-year-old thinking “everything would be great if I was in charge, because I’m smart and people are dumb”

    Something about people who are too impressed with their own smarts and swap pet theories that make them feel smart.

  • sc_griffith@awful.systems · 9 days ago

    I usually say the following. I’m paraphrasing a spiel I have delivered in person several times and which seems to get things across.

    'there’s a kind of decentralized cult called rationalism. they worship rational thinking, have lots of little rituals that are supposed to invoke more rational thinking, and spend a lot of time discussing their versions of angels and demons, which they conceive of as all powerful ai beings.

    rationalists aren’t really interested in experiments or evidence, because they want to figure everything out with pure reasoning. they consider themselves experts on anything they’ve thought really hard about. they come up with a lot of apocalypse predictions and theories about race mingling.

    silicon valley is saturated with rationalists. most of the people with a lot of money are not rationalists. but VCs and such find rationalists very useful, because they’re malleable and will claim with sincerity to be experts on any topic. for example, when AI companies claim to be inventing really intelligent beings, the people they put forward as supporting these claims are rationalists.’

  • blakestacey@awful.systems · 12 days ago

    I’m trying to imagine how a John Oliver sketch would introduce them. “The kind of nerds who make you think the jocks in '80s movies had a reasonable point got together and sold ‘science’ and ‘rational thinking’ as self-help, without truly understanding either, and it got very culty.”

  • mountainriver@awful.systems · 12 days ago

    I usually go with “Scientology for the 21st century”. For most people that just registers as “weird cult”, which is close enough.

    For those who are into weird cults you get questions about Xenu and such, and can answer “No, they are not into Xenu; instead they want to build their god. Out of chatbots.” And so on. If they are interested in weird cult shit, and have already accepted that we are talking about weird cults, the weirdness isn’t a problem. If not, it stops at “Scientology for the 21st century”.