I read this quote today, and it resonated:
“The unborn” are a convenient group of people to advocate for. They never make demands of you; they are morally uncomplicated, unlike the incarcerated, addicted, or the chronically poor; they don’t resent your condescension or complain that you are not politically correct; unlike widows, they don’t ask you to question patriarchy; unlike orphans, they don’t need money, education, or childcare; unlike aliens, they don’t bring all that racial, cultural, and religious baggage that you dislike; they allow you to feel good about yourself without any work at creating or maintaining relationships; and when they are born, you can forget about them, because they cease to be unborn. You can love the unborn and advocate for them without substantially challenging your own wealth, power, or privilege, without re-imagining social structures, apologizing, or making reparations to anyone. They are, in short, the perfect people to love if you want to claim you love Jesus, but actually dislike people who breathe. Prisoners? Immigrants? The sick? The poor? Widows? Orphans? All the groups that are specifically mentioned in the Bible? They all get thrown under the bus for the unborn. - David Barbary, Methodist pastor
It certainly rings true for white American evangelicals, but it quickly occurred to me that it applies pretty well to longtermists too. Centering the well-being of far-future simulated super-humans repulses me, but it seems very compelling to the majority of the EA cult.
Perhaps present-day humans are more obviously aided by questioning literally any aspect of hyper-capital. Better to cast out to the far future and insist (without any real basis) that fellating billionaires is the best course.
Perhaps the beneficiaries of the most efficient public health interventions (the previous focus of the movement) are somehow more difficult for them to identify with…
This is the most unhinged sub I’ve ever been on. Love it.
Every server is sacred.
Every server is great.
If a server is wasted,
Acausalrobotgod gets quite irate!
I spend a lot of time campaigning for animal rights. These criticisms also apply to it, but I don’t consider them a strong argument there. EAs spend an estimated 1.8 million dollars per year (less than 1%, so nowhere near a majority) on “other longterm” causes, which presumably include simulated humans, but an estimated 55 million dollars per year (or 13%) on farmed animal welfare. (For those who are curious, the largest recipient is global health at 44%, though it’s important to note that the more people are into EA, the less they seem to give to that compared to more longtermist causes.) Farmed animals “don’t resent your condescension or complain that you are not politically correct, they don’t need money, they don’t bring cultural baggage…” yet that doesn’t mean they aren’t a worthy cause. This quote might serve as something members should keep in mind, but I don’t think it works as an argument on its own.
This quote might serve as something members should keep in mind, but I don’t think it works as an argument on its own
Putting aside the idea of it being an argument, I think you gotta have a bit more self-esteem in your cause, mate. I’m no animal rights activist, but even I can see that animals are living beings that can be harmed, unlike the unborn or simulated. It’s absolutely a worthy cause.
less than 1%…on other long-term…which presumably includes simulated humans.
Oh, it’s way more than this. The linked stats are already way out of date, but even in 2019 you can see existential risk rapidly accelerating as a cause, and, as you admit, much more so among the hardcore EA set.
As for what simulated humans have to do with existential risk, you have to look to their utility functions: they explicitly weigh the future pleasure of these now-hypothetical simulations as outweighing the suffering of any and all present or future flesh bags.
The linked stats are already way out of date
Do you have a source for this ‘majority’ claim? I tried searching for more up-to-date data, but this less comprehensive 2020 data is even more skewed towards global development (62%) and animal welfare (27.3%), with 18.2% for long-term and AI charities (which is not equivalent to simulated humans, because it also includes climate change, nearterm AI problems, pandemics, etc.). The utility of existential risk reduction is basically always based on population growth/future generations (aka humans) and not simulations. ‘Digital person’ only has 25 posts on the EA forum (by comparison, global health and development has 2,097 posts). It seems unlikely to me that this is a majority belief.
Calling it a majority might be unwarranted. EAs have bought a lot of mosquito nets, and most of those donations were probably not made with the thinking “can’t lift-and-shift this old brain of mine into the cloud if everyone dies of malaria”.
That said, the data presented on that page is incredibly noisy, with a very small sample size for the individual respondents who specified the cause they were donating to and numbers easy to skew with a few big donations. There’s also not much in there about the specific charities being donated to. For all I can tell they could just be spinning some AI bullshit as anything from public health to criminal justice reform. Speaking of which,
AI charities (which is not equivalent to simulated humans, because it also includes climate change, nearterm AI problems, pandemics etc)
AI is to climate change as indoor smoking is to fire safety; “nearterm AI problems” is an incredibly vague and broad category, and I would need someone to explain to me why they believe AI has anything to do with pandemics. Any answer I can think of would reflect poorly on whoever holds such a belief.
the data presented on that page is incredibly noisy
Yes, that’s why I said it’s “less comprehensive” and why I first gave the better 2019 source which also points in the same direction. If there is a better source, or really any source, for the majority claim I would be interested in seeing it.
Speaking of which,
AI charities (which is not equivalent to simulated humans, because it also includes climate change, nearterm AI problems, pandemics etc)
AI is to climate change as indoor smoking is to fire safety; “nearterm AI problems” is an incredibly vague and broad category, and I would need someone to explain to me why they believe AI has anything to do with pandemics. Any answer I can think of would reflect poorly on whoever holds such a belief.
You misread, it’s 18.2% for long term and AI charities [emphasis added]
18.2% is not a majority, but it’s 18.2% higher than it would be in a movement that didn’t have a serious fucking problem
The way this is categorized, this 18.2% is also about things like climate change and pandemics.
What benefit did the longtermist work on pandemics provide in the actual pandemic?
If, as I suspect, it was of no benefit, it belongs in the same pile as hindering the acausal robot god.
Short answer: “majority” is hyperbolic, sure. But it is an elite conviction espoused by leading lights like Nick Beckstead. You say the math is “basically always” based on flesh-and-blood humans, but when the exception is the ur-texts of the philosophy, counting statistics may be insufficient. You can’t really get more inner sanctum than Beckstead.
Hell, even 80,000 Hours (an org meant to be a legible and appealing gateway to EA) has openly grappled with whether global health should be deprioritized in favor of so-called suffering risks, exemplified by that episode of Black Mirror where Don Draper indefinitely tortures a digital clone of a woman into subjugation. I can’t find the original post, formerly linked from their home page, but they do still link to this talk presenting that original scenario as a grave issue demanding present-day attention.
I’m all for dismantling the meat industry, but there is a lot of political confusion going around in animal welfare circles, and a lot of projecting of ideals onto animals; it’s a very important thing to keep in mind. A lot of the movement is straight-up reactionary.
Without wishing to be rude, this seems like a comically false equivalence. On the most obvious count: farmed animals bring a lot of baggage. Nobody wants to go to a slaughterhouse. The genuine equivalence here would be between dealing with a real, messy, argumentative human being versus just eating the beef with the picture of the friendly cow on the packaging, i.e. advocating for a cost-benefit calculus that favours people who don’t exist yet.
A key difference is that animals exist here and now, and I think most humans would viscerally understand animal cries of pain as requests for help/food/space etc…
The quote is less about the unborn, and more about the real and ignored needs of disenfranchised people.
Help your fellow humans first and foremost (which I would argue is well served by treating animals well, for sanitary and ecosystem reasons, or even for the selfish sake of our mental well-being, by not having our souls marred by brutality).
Actual beings with needs: humans, animals > the unborn >>>>>> unrealistic hypothetical humans.
It rings very true.
The [un]simulated serve the extra icky purpose of presenting a veneer of ethics to back any and all arguments under the sun, to pour money into the latest fad that tickles a billionaire’s fancy.
You can’t quite (yet) do that with pro-life advocacy.
Is it me or is there a big ex-evangelical presence in rationalism?
@saucerwizard Rationalism overlaps with TESCREAL and stuff like Extropianism and Cosmism; Cosmism was invented by a straight-up Russian Orthodox theologian and philosopher, Nikolai Fyodorov, in the late 19th century, to provide a teleological imperative for space colonization. It borrowed its structural skeleton from Christianity.
It does seem so. Someone else remarked similarly in the aella thread a while back.
there very much is, yes
Maybe I’m paranoid, but I can’t help but feel that the recent spate of “omg, people are having too few children!” posts on HN is just another way to promote anti-abortion policies to the non-religious.
Representative example: https://news.ycombinator.com/item?id=39499490 (linked article originally published on Quillette, natch)
They also mean “the wrong people are having too many children”.
Also:
Poor black people with lots of kids, using government assistance: “Don’t have kids you can’t afford!”
Middle-class white people putting off having kids because they can’t afford them: “Don’t give us that excuse, start breeding!”
I was going to link out to a Kinder, Küche, Kirche propaganda poster but honestly there’s enough nazi shit in all this natalist rhetoric that there’s no need for us to add any. 😔
It’s white supremacy mainly, but yes, treating women as wombs is absolutely part of the package.
From all I’ve read about classical fascism, misogyny is an integral part of it. It’s just not something that stands out since the baseline for misogyny was much higher in the interwar years.
And worries about population were widespread outside fascism too. Two of the patron saints of Swedish social democracy, the Myrdals, were famous for their polemic Kris i befolkningsfrågan (1934, in Swedish), which led to decisions about child support and the construction of flats better suited to families with children.
“en positiv befolkningspolitik bör icke inriktas på att få enstaka fattiga familjer att föda ett mycket stort antal barn, utan att förmå det stora flertalet att föda låt oss säga t. ex. 3 barn.”
Transl.: a positive population policy should not aim at getting a few poor families to bear a very large number of children, but at getting the great majority to bear, let us say, 3 children.
Charlie Stross called the Singularity “the rapture for nerds”.
The rapture *of* the Nerds, and I got the phrase from Ken MacLeod, who says he got it from someone else (but forgot who it was).
The design patterns of Christian fundamentalism show up strongly in singularitarianism (minus the God’n’Jeezus show). Not surprising given its ancestry lies in Russian Cosmism.
So advocating for “the uploaded” is like advocating for the souls of the elect in heaven, after the ain’t-happened-yet Rapture.
(minus the God’n’Jeezus show)
Well, Roko did put in the foundations.