[Former member of that world, roommates with one of Ziz's friends for a while, so I feel reasonably qualified to speak on this.]
The problem with rationalists/EA as a group has never been the rationality, but the people practicing it and the cultural norms they endorse as a community.
As relevant here:
1) While following logical threads to their conclusions is a useful exercise, each logical step often involves some degree of rounding or unknown-unknowns. A -> B and B -> C means A -> C in a formal sense, but A -almostcertainly-> B and B -almostcertainly-> C does not mean A -almostcertainly-> C. Rationalists, by tending to overly formalist approaches, tend to lose the thread of the messiness of the real world and follow these lossy implications as though they are lossless. That leads to...
2) Precision errors in utility calculations that are numerically unstable. Any small chance of harm times infinity equals infinity. This framing shows up a lot in the context of AI risk, but it works in other settings too: infinity times a speck of dust in your eye >>> 1 times murder, so murder is "justified" to prevent a speck of dust in the eye of eternity. When the thing you're trying to create is infinitely good or the thing you're trying to prevent is infinitely bad, anything is justified to bring it about or prevent it, respectively. (There's a toy numerical sketch of points 1 and 2 at the end of this comment.)
3) Its leadership - or some of it, anyway - is extremely egotistical and borderline cult-like to begin with. I think even people who like e.g. Eliezer would agree that he is not a humble man by any stretch of the imagination (the guy makes Neil deGrasse Tyson look like a monk). They have, in the past, responded to criticism with statements to the effect of "anyone who would criticize us for any reason is a bad person who is lying to cause us harm". That kind of framing can't help but get culty.
4) The nature of being a "freethinker" is that you're at the mercy of your own neural circuitry. If there is a feedback loop in your brain, you'll get stuck in it, because there's no external "drag" or forcing functions to pull you back to reality. That can lead you to be a genius who sees what others cannot. It can also lead you into schizophrenia really easily. So you've got a culty environment that is particularly susceptible to internally-consistent madness, and finally:
5) It's a bunch of very weird people who have nowhere else they feel at home. I totally get this. I'd never felt like I was in a room with people so like me, and ripping myself away from that world was not easy. (There are some folks down the thread wondering why trans people are overrepresented in this particular group: well, take your standard weird nerd, then make two-thirds of the world hate your guts more than anything else, and you might be pretty vulnerable to whoever will give you the time of day, too.)
TLDR: isolation, very strong in-group defenses, logical "doctrine" that is formally valid and leaks in hard-to-notice ways, apocalyptic utility-scale, and being a very appealing environment for the kind of person who goes super nuts -> pretty much perfect conditions for a cult. Or multiple cults, really. Ziz's group is only one of several.
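(Toy numerical sketch for points 1 and 2, with numbers I've made up purely for illustration: chaining steps that are each "almost certain" erodes confidence fast, and a guessed-at tiny probability attached to an astronomically large payoff can dominate any ordinary consideration.)

```python
# Illustrative only: made-up numbers, not anyone's actual argument.

# Point 1: chain ten implications that are each 95% certain.
step_confidence = 0.95
chain_length = 10
end_to_end = step_confidence ** chain_length
print(f"confidence after {chain_length} steps: {end_to_end:.2f}")  # ~0.60

# Point 2: a tiny guessed probability times an enormous stake swamps a mundane harm.
p_catastrophe = 1e-9      # hypothetical "small chance of harm"
stake = 1e15              # hypothetical astronomically large (dis)utility
mundane_harm = 1.0        # e.g. one murder on this toy utility scale
print(p_catastrophe * stake > mundane_harm)  # True: the speculative term dominates
```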
"... insanity is often marked by the dominance of reason and the exclusion of creativity and humour. Pure reason is inhuman. The madman’s mind moves in a perfect, but narrow, circle, and his explanation of the world is comprehensive, at least to him."
This sounds a lot like the psychopath Anton Chigurh in the movie No Country for Old Men. His view of the world is that he is the arbiter of people's destiny, which often involves murdering them.
Another thing I'll add after having spent a few years in a religious cult:
It's all about the axioms. If you tweak the axioms you can use impeccable logic to build a completely incorrect framework that will appeal to otherwise intelligent (and often highly intelligent) people.
Also, people are always less rational than they are able to admit. The force of things like social connection can very easily warp the reasoning capabilities of the most devout rationalist (although they'll likely never admit that).
I'm kinda skeptical these folks were following some hyper-logical process from flawed axioms that led them to the rigorous solution: "I should go stab our landlord with a samurai sword" or "I should start a shootout with the Feds".
The rationalist stuff just seems like some meaningless patter they stuck on top of more garden-variety cult stuff.
The axioms of rationality, morality, etc. I've always found interesting.
We have certain axioms (let me choose an arbitrary, and possibly not quite axiomy-enough, example): "human life has value". We hold this to be self-evident and construct our society around it.
We also often don't realize that other people and cultures have different axioms of morality. We talk/theorize at this high level, but don't realize we have different foundations.
Wow, what a perfect description of why their probability-logic leads to silly beliefs.
I've been wondering how to argue within their frame for a while, and here's what I've come up with: Is the likelihood that aliens exist, are unfriendly, and AGI will help us beat them higher or lower than the likelihood that the AGI itself that we develop is unfriendly to us and wants to FOOM us? Show your work.
It’s pointless. They aren’t rational. Any argument you come up with that contradicts their personal desires will be successfully “reasoned” away by them because they want it to be. Your mistake was ever thinking they had a rational thought to begin with, they think they are infallible.
Much of philosophy throughout history seems to operate this way.
I think philosophy is a noble pursuit, but it's worth noting how often people drew very broad conclusions, and then acted on them, from not very much data. Consider the dozens of theories of the constitution of the world from the time of the Greek thinkers (even the atomic theory doesn't look very much at all like atoms as we now understand them), or the myriad examples of political philosophies that ran up against people simply not acting the way the philosophy needed them to act to cohere.
The investigation of possibility is laudable, but a healthy and regular dose of evidence is important.
AGI would be extremely helpful in navigating clashes with aliens, but taking the time to make sure it's safe is very unlikely to make a difference to whether it's ready in time. Rationalists want AGI to be built, and they're generally very excited about it, e.g. many of them work at Anthropic. They just don't want a Move Fast and Break Things pace of development.
It seems that you didn't understand the main point of the exposition. I'll summarize the OP's comment a bit further.
Points 1 and 2 only explain how they are able to erroneously justify their absurd beliefs, they don't explain why they hold those beliefs.
Points 3 through 5 are the heart of the matter; egotistical and charismatic (to some types of people) leaders, open minded, freethinking and somewhat weird or marginalized people searching for meaning plus a way for them all to congregate around some shared interests.
TLDR: perfect conditions for one or more cults to form.
No, it’s the “rationality.” Well maybe the people too, but the ideas are at fault.
As I posted elsewhere on this subject: these people are rationalizing, not rational. They’re writing cliche sci-fi and bizarre secularized imitations of baroque theology and then reasoning from these narratives as if they are reality.
Reason is a tool not a magic superpower enabling one to see beyond the bounds of available information, nor does it magically vaporize all biases.
Logic, like software and for the same reason, is “garbage in, garbage out.” If even one of the inputs (premises, priors) is mistaken the entire conclusion can be wildly wrong. Errors cascade, just like software.
That's why every step needs to be checked with experiment or observation before a next step is taken.
I have followed these people since stuff like Overcoming Bias and LessWrong appeared and I have never been very impressed. Some interesting ideas, but honestly most of them were recycling of ideas I’d already encountered in sci-fi or futurist forums from way back in the 1990s.
The culty vibes were always there and it instantly put me off, as did many of the personalities.
“A bunch of high IQ idiots” has been my take for like a decade or more.
> As I posted elsewhere on this subject: these people are rationalizing, not rational.
That is sometimes true, but as I said in another comment, I think this is on the weaker end of criticisms because it doesn't really apply to the best of that community's members and the best of its claims, and in either case isn't really a consequence of their explicit values.
> Logic, like software and for the same reason, is “garbage in, garbage out.” If even one of the inputs (premises, priors) is mistaken the entire conclusion can be wildly wrong. Errors cascade, just like software.
True, but an odd analogy: we use software to make very important predictions all the time. For every Therac-25 out there, there's a model helping detect cancer in MRI imagery.
And, of course, other methods are also prone to error.
> That's why every step needs to be checked with experiment or observation before a next step is taken.
Depends on the setting. Some hypotheses are not things you can test in the lab. Some others are consequences you really don't want to confirm. Setting aside AI risk for a second, consider the scientists watching the Trinity Test: they had calculated that it wouldn't ignite the atmosphere and incinerate the entire globe in a firestorm, but...well, they didn't really know until they set the thing off, did they? They had to take a bet based on what they could predict with what they knew.
I really don't agree with the implicit take that "um actually you can never be certain so trying to reason about things is stupid". Excessive chains of reasoning accumulate error, and that error can be severe in cases of numerical instability (e.g. values very close to 0, multiplications, that kind of thing). But shorter chains conducted rigorously are a very important tool to understand the world.
>They have, in the past, responded to criticism with statements to the effect of "anyone who would criticize us for any reason is a bad person who is lying to cause us harm".
Which leader said anything like that? Certainly not Eliezer or the leader of the Center for Applied Rationality (Anna Salamon) or the project lead of the web site lesswrong.com (Oliver Habryka)!
Hello, can confirm, criticism is like the bread and butter of LW, lol. I have very extensively criticized tons of people in the extended rationality ecosystem, and I have also never seen anyone in any leadership position react with anything like this quote. Seems totally made up.
> Rationalists, by tending to overly formalist approaches,
But they don't apply formal or "formalist" approaches, they invoke the names of formal methods but then extract from them just a "vibe". Few to none in the community know squat about actually computing a posterior probability, but they'll all happily chant "shut up and multiply" as a justification for whatever nonsense they instinctively wanted to do.
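(For the record, the computation being name-checked isn't exotic; here's a minimal sketch of a posterior update, with invented numbers.)

```python
# Toy Bayes update with invented numbers: prior, likelihoods, posterior.
prior = 0.01                    # P(hypothesis)
p_e_given_h = 0.90              # P(evidence | hypothesis)
p_e_given_not_h = 0.10          # P(evidence | not hypothesis)

p_evidence = p_e_given_h * prior + p_e_given_not_h * (1 - prior)
posterior = p_e_given_h * prior / p_evidence
print(f"posterior = {posterior:.3f}")  # ~0.083: strong evidence, still far from certainty
```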
> Precision errors in utility calculations that are numerically-unstable
Indeed, as well as just ignoring that uncertainties about the state of the world or the model of interaction utterly dominate any "calculation" that you could hope to do. The world at large does not spend all its time in lesswrongian ritual multiplication or whatever... but this is not because they're educated stupid. It's because in the face of substantial uncertainty about the world (and your own calculation processes) reasoning things out can only take you so far. A useful tool in some domains, but not a generalized philosophy for life ... The cognitive biases they obsess about and go out of their way to eschew are mostly highly evolved harm-mitigation heuristics for reasoning against uncertainty.
> that is particularly susceptible to internally-consistent madness
It's typical for cults to cultivate vulnerable mind states for cult leaders to exploit for their own profit, power, sexual fulfillment, etc.
A well-regulated cult keeps its members' mental illness within a bound that maximizes the benefit for the cult leaders in a sustainable way (e.g. not going off and murdering people, even when doing so is the logical conclusion of the cult philosophy). But sometimes people are won over by a cult's distorted thinking yet aren't useful for bringing the cult leaders their desired profit, power, or sex.
> But they don't apply formal or "formalist" approaches, they invoke the names of formal methods but then extract from them just a "vibe".
I broadly agree with this criticism, but I also think it's kind of low-hanging. At least speaking for myself (a former member of those circles), I do indeed sit down and write quantitative models when I want to estimate things rigorously, and I can't be the only one who does.
> Indeed, as well as just ignoring that uncertainties about the state of the world or the model of interaction utterly dominate any "calculation" that you could hope to do.
This, on the other hand, I don't think is a valid criticism nor correct taken in isolation.
You can absolutely make meaningful predictions about the world despite uncertainties. A good model can tell you that a hurricane might hit Tampa but won't hit New Orleans, even though weather is the textbook example of a medium-term chaotic system. A good model can tell you when a bridge needs to be inspected, even though there are numerous reasons for failure that you cannot account for. A good model can tell you whether a growth is likely to become cancerous, even though oncogenesis is stochastic.
Maybe a bit more precisely, even if logic cannot tell you what sets of beliefs are correct, it can tell you what sets of beliefs are inconsistent with one another. For example, if you think event X has probability 50%, and you think event Y has probability 20% conditional on X, it would be inconsistent for you to believe event Y has a probability of less than 10%.
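Spelled out, that's just the law of total probability with the numbers above:

```latex
P(Y) = P(Y \mid X)\,P(X) + P(Y \mid \neg X)\,P(\neg X)
     \ge P(Y \mid X)\,P(X) = 0.20 \times 0.50 = 0.10
```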
> The world at large does not spend all its time in lesswrongian ritual multiplication or whatever... but this is not because they're educated stupid
When I thought about founding my company last January, one of the first things I did was sit down and make a toy model to estimate whether the unit economics would be viable. It said they would be, so I started the company. It is now profitable with wide operating margins, just as that model predicted it would be, because I did the math and my competitors in a crowded space did not.
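(To be concrete about what "toy model" means here, it was nothing fancier than the kind of sketch below; every number in it is hypothetical, not my actual figures.)

```python
# Hypothetical unit-economics sketch: all numbers here are made up.
price_per_unit = 40.0
variable_cost_per_unit = 22.0    # materials, payment fees, support time, etc.
fixed_costs_per_month = 6000.0   # rent, tooling, minimal salary
units_per_month = 500            # pessimistic sales estimate

contribution_margin = price_per_unit - variable_cost_per_unit
monthly_profit = contribution_margin * units_per_month - fixed_costs_per_month
print(f"margin per unit: {contribution_margin}, monthly profit: {monthly_profit}")
# Positive even under pessimistic volume -> worth starting; negative -> walk away.
```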
Yeah, it's possible to be overconfident, but let's not forget where we are: startups win because people do things in dumb inefficient ways all the time. Sometimes everyone is wrong and you are right, it's just that that usually happens in areas where you have singularly deep expertise, not where you were just a Really Smart Dude and thought super hard about philosophy.
I noticed years ago too that AI doomers and rationalist types were very prone to (infinity * 0 = infinity) types of traps, which is a fairly autistic way of thinking. Humanity decided a long time ago that infinity * 0 = 0 for very good practical reasons.
> Humanity decided a long time ago that infinity * 0 = 0
I'm guessing you don't mean this in any formal mathematical sense; without context, infinity multiplied by zero isn't formally defined. There could be various formulations and contexts where you could define / calculate something like infinity * zero to evaluate to whatever you want. (E.g. define f(x) := C x and g(x) := 1/x. What does f(x) * g(x) evaluate to in the limit as x goes to infinity? C. And we can interpret f(x) as going to infinity while g(x) goes to zero, so we can use that to justify writing "infinity * 0 = C" for an arbitrary C...)
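Written as a limit, that example is just:

```latex
\lim_{x \to \infty} f(x)\,g(x) = \lim_{x \to \infty} (Cx)\cdot\tfrac{1}{x} = C,
\qquad \text{while } f(x) \to \infty \text{ and } g(x) \to 0.
```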
So, what do you mean by "infinity * 0 = infinity" informally? That humans regard the expected value of (arbitrarily large impact) * (arbitrarily small probability) as zero?
They actively look for ways for infinity to happen. Look at Eli's irate response to Roko's basilisk. To him even being able to imagine that there is a trap means that it will necessarily be realised.
I've seen "rationalist" AI doomers who say things like "given enough time technology will be invented to teleport you into the future where you'll be horrifically tortured forever".
It's just extrapolation, taken to the extreme, and believed in totally religiously.
> Humanity decided a long time ago that infinity * 0 = 0 for very good practical reasons.
Among them being that ∞ × 0 = ∞ makes no mathematical sense. Multiplying literally any other number by zero results in zero. I see no reason to believe that infinity (positive or negative) would be some exception; infinity instances of nothing is still nothing.
I'm interested in #4, is there anywhere you know of to read more about that? I don't think I've seen that described except obliquely in eg sayings about the relationship between genius and madness.
I don't, that one's me speaking from my own speculation. It's a working model I've had for a while about the nature of a lot of kinds of mental illness (particularly my own tendencies towards depression), which I guess I should explain more thoroughly! This gets a bit abstract, so stick with me: it's a toy model, and I don't mean it to be definitive truth, but it seems to do well at explaining my own tendencies.
-------
So, toy model: imagine the brain has a single 1-dimensional happiness value that changes over time. You can be +3 happy or -2 unhappy, that kind of thing. Everyone knows when you're very happy you tend to come down, and when you're very sad you tend to eventually shake it off, meaning that there is something of a tendency towards a moderate value or a set-point of sorts. For the sake of simplicity, let's say a normal person has a set point of 0, then maybe a depressive person has a set point of -1, a manic person has a set point of +1, that sort of thing.
Mathematically, this is similar to the equations that describe a damped spring. If left to its own devices, a spring will tend to its equilibrium value, either exponentially (if overdamped) or with some oscillation around it (if underdamped). But if you're a person living your life, there are things constantly jostling the spring up and down, which is why manic people aren't crazy all the time and depressed people have some good days where they feel good and can smile. Mathematically, this is a spring with a forcing function - as though it's sitting on a rough train ride that is constantly applying "random" forces to it. Rather than x'' + cx' + kx = 0, you've got x'' + cx' + kx = f(t) for some external forcing function f(t), where f(t) critically does not depend on x or on the individual internal dynamics involved.
These external forcing functions tend to be pretty similar among people of a comparable environment. But the internal equilibria seem to be quite different. So when the external forcing is strong, it tends to pull people in similar directions, and people whose innate tendencies are extreme tend to get pulled along with the majority anyway. But when external forcing is weak (or when people are decoupled from its effects on them), internal equilibria tend to take over, and extreme people can get caught in feedback loops.
If you're a little more ML-inclined, you can think about external influences like a temperature term in an ML model. If your personal "model" of the world tends to settle into a minimum labeled "completely crazy" or "severely depressed" or the like, a high "temperature" can help jostle you out of that minimum even if your tendencies always move in that direction.
Basically, I think weird nerds tend to have low "temperature" values, and tend to settle into their own internal equilibria, whether those are good, bad, or good in some cases and bad in others (consider all the genius mathematicians who were also nuts). "Normies", for lack of a better way of putting it, tend to have high temperature values and live their lives across a wider region of state space, which reduces their ability to wield precision and competitive advantage but protects them from the most extreme failure-modes as well.
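If it helps to see the toy model in motion, here's a rough simulation sketch (entirely my own illustration, with arbitrary parameters): a damped spring pulled toward a set point, jostled by random external "life events". With weak noise it hugs its internal equilibrium; with strong noise it gets dragged across a much wider region of state space.

```python
# Rough sketch of the toy model above: a damped spring with a set point,
# plus random external forcing ("temperature"). All parameters are arbitrary.
import random

def simulate_mood(set_point, noise, steps=5000, dt=0.01, k=1.0, damping=0.8):
    x, v = 0.0, 0.0                                  # mood and its rate of change
    history = []
    for _ in range(steps):
        force = random.gauss(0.0, noise)             # external "life events"
        a = -k * (x - set_point) - damping * v + force
        v += a * dt
        x += v * dt
        history.append(x)
    mean = sum(history) / len(history)
    spread = (sum((h - mean) ** 2 for h in history) / len(history)) ** 0.5
    return mean, spread

random.seed(0)
print("low noise:", simulate_mood(set_point=-1.0, noise=0.5))    # settles near its set point
print("high noise:", simulate_mood(set_point=-1.0, noise=20.0))  # roams a wide region of state space
```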
There's another way around it. People that see themselves as "freethinkers" are also ultimately contrarians. Taking contrarianism as part of your identity makes people value unconventional ideas, but turn that around: It also means devaluing mainstream ideas. Since humanity is basically an optimization algorithm, being very contrarian means that, along with throwing away some bad assumptions, one also throws away a whole lot of very good defaults. So one might be right in a topic or two, but overall, a lot of bad takes are going to seep in and poison the intellectual well.
I mean, isn't the problem that they actually aren't that smart or rational. They're just a group of people who've built their identity around believing themselves to be smart...
They're also not freethinkers. They're a community that demands huge adherence to its own norms.
Great summary, and you can add utilitarianism to the bucket of ideologies that are just too rigid to fully explain the world and too rational for human brains not to create a misguided cult around
Ok but social clustering is how humans work. Culture translated to modern idiomatic language is “practice of a cult”. Ure translates to “practice of”, Ur being the first city so say historians; clusters of shared culture is our lived experience. Forever now there have been a statistical few who get stuck in a while loop “while alive recite this honorific code, kill perceived threats to memorized honorific chants”.
We’ve observed ourselves do this for centuries. Are your descriptions all that insightful?
How do you solve isolation? Can you? Will thermodynamics allow it? Or are we just neglecting a different cohort?
Again due to memory or social systems are always brittle. Everyone chafes over social evolution of some kind, no matter how brave a face they project in platitudes, biology self selects. So long as the economy prefers low skilled rhetoricians holding assets, an inflexible workforce constrains our ability to flex. Why is there not an “office worker” culture issue? Plainly self selecting for IT to avoid holding the mirror up to itself.
Growing up in farmland before earning to STEM degrees, working on hardware and software, I totally get the outrage of people breaking their ass to grow food while some general studies grad manages Google accounts and plays PS5 all night. Extreme addiction to a lived experience is the American way from top to bottom.
Grammatically correct analysis of someone else. But this all gets very 1984 feeling; trust posts online, ignore lived experience. It’s not hard to see your post as an algebraic problem; the issues of meatspace impact everyone regardless of the syntax sugar analysis we pad the explanation with. How do you solve for the endless churn of physics?
It's equally fascinating to see how effectively these issues are rapidly retconned out of the rationalist discourse. Many of these leaders and organizations who get outed were respected and frequently discussed prior to the revelations, but afterward they're discussed as an inconsequential sideshow.
> TLDR: isolation, very strong in-group defenses, logical "doctrine" that is formally valid and leaks in hard-to-notice ways, apocalyptic utility-scale, and being a very appealing environment for the kind of person who goes super nuts -> pretty much perfect conditions for a cult.
I still think cults are a rare outcome. More often, I've seen people become "rationalist" because it gives them tools to amplify their pre-existing beliefs (#4 in your list). They link up with other like-minded people in similar rationalist communities which further strengthens their belief that they are not only correct, but they are systematically more correct than anyone who disagrees with them.
> They have, in the past, responded to criticism with statements to the effect of "anyone who would criticize us for any reason is a bad person who is lying to cause us harm". That kind of framing can't help but get culty.
I have never seen this and I've been active around this community for almost two decades now.
> isolation
Also very much doesn't match my experience. Only about a quarter of my friends are even rationalists.
I disagree. It's common for any criticisms of rationalism or the rationalist community to be dismissed as having ulterior motives. Even the definition of rationalism is set up in a way that it is de facto good, and therefore anyone suggesting anything negative is either wrong or doesn't know what they're talking about.
Maybe so! They didn't kick me out. I chose to leave c. early 2021, because I didn't like what I saw (and events since then have, I feel, proven me very right to have been worried).
This is a very insightful comment. As someone who was 3rd-degree connected to that world during my time in the bay, this matches the general vibe of conversations and people I ran into at house parties and hangouts very very well.
It's amazing how powerful isolation followed by acceptance is at modifying human behavior.
I see two: a superiority complex and a lack of such an "irrational" thing as empathy. Basically they use crude, logical-looking constructions to excuse their own narcissism and related indulgences.
>The problem with rationalists/EA as a group has never been the rationality, but the people practicing it and the cultural norms they endorse as a community
It's precisely that kind of person, though, who would be so deluded and so lacking in self-awareness as to start a group about rationality - and declare themselves its arbiters.
A 2023 post on Rationalism forum LessWrong.com warned of coming violence in the Zizian community. “Over the past few years, Ziz has repeatedly called for the deaths of many different classes of people,” the anonymous post read. Jessica Taylor, a friend of Baukholt’s, told Open Vallejo she warned Baukholt about the Zizians, describing the group on X as a “death cult.”
This story just keeps getting more and more bizarre. Reading the charges and supporting affidavits, the whole thing is reading more and more like some sort of Yorgos Lanthimos film. The rationalist connection - a literal sequel to the 2022 events (in turn a sequel to the 2019 CFAR stuff) - is already weird enough. But I can't get over the ridiculousness of the VT situation. I have spent time in that area of VT, and the charged parties must have been acting quite bizarre for the clerk to alert the police. Checking into a motel wearing all black, open carrying does NOT cut it. The phones wrapped in foil is comical, and the fact that they were surveilled over several days is interesting, especially because it reads like the FBI only became aware of their presence after the stop and shootout?
The arresting agent seems pretty interesting, a former risk adjuster who recently successfully led the case against a large inter-state fraud scheme. This may just be the plot of Fargo season 10. Looking forward to the season arc of the FBI trying to understand the "rationalist" community. The episode titled "Roko's Basilisk", with no thematically tied elements, but they manage to turn Yudkowsky into a rat.
This story happened in my backyard. The shootout was about 40 minutes from me, but Youngblut and Felix Bauckholt, dressed in tactical gear and sporting firearms, were reported by a hotel clerk at a hotel a few blocks from me.
Weird to see a community I followed show up so close to home and negatively like this. I always just read LW and appreciated some of the fundamentals that this group seems to have ignored. Stuff like: rationality has to objectively make your life and the world better or it's a failed ideology.
Edit: I've been following this story for over a week because it was local news. Why is this showing up here on HN now?
> Weird to see a community I followed show up so close to home and negatively like this.
I had some coworkers who were really into LessWrong and rationality. I thought it was fun to read some of the selected writings they would share, but I always felt that online rationalist communities collected a lot of people with reactionary, fascist, misogynistic, and far-right tendencies. There’s a heavily sanitized version of rationality and EA that gets presented online with only the highlights, but there’s a lot more out there in the fringes that is really weird.
For example, many know about Roko’s Basilisk as a thought exercise and much has been written about it, but fewer know that Roko has been writing misogynistic rants on Twitter and claiming things like having women in the workforce is “very negative” for GDP.
The Slate Star Codex subreddit was a home for rationalists on Reddit, but they had so many problems with culture war topics that they banned discussion of them. The users forked off and created “The Motte” which is a bit of a cesspool dressed up with rationalist prose. Even the SlateStarCodex subreddit has become so toxic that I had to unsubscribe. Many of the posts and comments on women or dating were becoming indistinguishable from incel communities other than the rationalist prose style.
Even the real-world rationalist and EA communities aren’t immune, with several high profile sexual misconduct scandals making the news in recent years.
It’s a weird space. It felt like a fun internet philosophy community when my coworkers introduced it years ago, but the longer I’ve observed it the more I’ve realized it attracts and accepts a lot of people whose goals aren’t aligned with objectively “make the world better” as long as they can write their prose in the rationalist style. It’s been strange to observe.
Of course, at every turn people will argue that the bad actors are not true rationalists, but I’ve seen enough from these communities to know that they don’t really discriminate much until issues boil over into the news.
>In the second half of the 5th century BCE, particularly in Athens, "sophist" came to denote a class of mostly itinerant intellectuals who taught courses in various subjects, speculated about the nature of language and culture, and employed rhetoric to achieve their purposes, generally to persuade or convince others. Nicholas Denyer observes that the Sophists "did ... have one important thing in common: whatever else they did or did not claim to know, they characteristically had a great understanding of what words would entertain or impress or persuade an audience."
The problem then, as now, is sorting the wheat from the chaff. Rationalist spaces like /r/SSC, The Motte, et al. are just modern sophistry labs that like to think they're filled with the next Socrates when they're actually filled with endless Thrasymachi. Scott Alexander and Eliezer Yudkowsky have something meaningful (and deradicalizing) to say. Their third-degree followers? Not so much.
It's somewhat odd to represent a community as being right wing when the worst thing to come from it was a trans vegan murder cult. Most "rationalists" vote Democrat, and if the franchise were limited to them, Harris would have won in a 50 state landslide.
The complaint here seems to be that rationalists don't take progressive pieties as axiomatic.
I have had some rather… negative vibes, for lack of a better term, from some of the American bits I've encountered online; but for what it's worth, I've not seen what you described in the German community.
There is, ironically, no escape from two facts that were well advertised at the start: (1) the easiest person for anyone to fool is themselves, and (2) politics is the mind-killer.
Reading about the Roko’s Basilisk saga, it seems clear that these people are quite far from rational and of extremely limited emotional development. It reads like observing a group of children who are afraid of the monster in the closet, which they definitely brought into existence by chanting a phrase in front of the bathroom mirror…
Members of these or other similar communities would do well to read anything on them dispassionately and critique anything they read. I’d also say that if they use Yudkowsky’s writings as a basis for understanding the world, that understanding is going to have the same inadequacies as Yudkowsky and his writings. How many people without PhDs or even relevant formal education are putting out high-quality writing on both philosophy and quantum mechanics (and whatever other subjects)?
Clearly this is a poorly organized movement, with wildly different beliefs. There is no unity of purpose here. Emacs or vi, used without core beliefs being challenged?!
And one does not form a rationalist movement, and use emacs after all.
After seeing this news, I recall watching a video by Julia Galef about "what is rationality". Would it be fair to say that in this situation, they lack epistemic rationality but are high in instrumental rationality?
If they had high instrumental rationality, they would be effective at achieving their goals. That doesn’t seem to be the case - by conventional standards, they would even be considered "losers": jobless, homeless, imprisoned, or on the run.
This whole rabbit hole of rationalism, LessWrong, and Ziz feels like a fever dream to me. Roaming trans veganist tactical death squads shooting border officers and stabbing 80-year-olds with swords.
This is the kind of thing where it is warranted that the feds gets every single wiretap, interception, and surveillance possible on everyone involved in the zizian movement.
Calling them a roaming band or "tactical death squad" is giving far too much credit. It is a handful of crazy people who convinced themselves that a couple murders would solve their problems.
In particular the attack on border patrol was obviously random and illogical. And the fact that no one was convicted of the Pennsylvania murders seems to reflect more on the police and prosecutors than the intelligence of the perpetrators.
Speaking of random and illogical, what prompted the Border Patrol to stop their car in the first place, I wonder? None of the news stories have elaborated on that.
Split the beliefs from the crime. A bunch of murderers were caught. Given they are dangerous killers, one killing a witness and one faking their death yeah they should get warrants.
Pretty hard to do that when the beliefs explicitly endorse murder. Ziz used to run a blog on which she made thinly veiled death threats, argued for a personal philosophy of hair-trigger escalation and massive retribution, raged at the rationalist community for not agreeing with her on that philosophy and on theories of transness, and considered most people on Earth to be irredeemably evil for eating meat.
It appears the Venn diagram of the beliefs and the crimes overlaps quite a bit. Sometimes the beliefs are that certain crimes should be committed.
This is a free country (disputably) and you should be able to think and say whatever you want, but I also think it is reasonable for law enforcement in the investigation of said crimes to also investigate links to other members in the movement.
Yes, thank you for saying so- reading about all this, but especially all the people chiming in who already knew about a lot of it? The fact that the founder of LessWrong coined the term “alignment,” a subject I’ve read about many times… it feels like learning lizard people always walked among us
Honestly it feels like this is the first time people are realizing that six degrees of separation means that crazy people can usually be connected to influential people. In this case they're just realizing it with the rationalists.
Rest assured, I'm pretty sure among the easiest ways to make yourself the target of surveillance is to do anything interesting at all involving technology. All serious AI researchers, for example, should assume that they are the victims of this.
>This whole rabbit hole of rationalism, LessWrong, and Ziz feels like a fever dream to me. Roaming trans veganist tactical death squads shooting border officers and stabbing 80-year-olds with swords.
I don't exactly see how it's different from a group of habitual alcoholics discussing politics and having a fatal disagreement, which is a normal day of the week in any police department with enough demographics to have this sort of low-effort low-gain crime. It's more scandalous because the details and the people involved are more interesting, but everyone will forget about it after a week, as they don't matter.
Is the appellation in the headline, "radical vegan trans cult," a true description?
> Authorities now say the guns used by Youngblut and Bauckholt are owned by a person of interest in other murders — and connected to a mysterious cult of transgender “geniuses” who follow a trans leader named Jack LaSota, also known by the alias “Ziz.”
The NY Post tried to frame them as "radical leftist", but that's a big stretch. I don't think most rationalists would consider themselves leftist. The article also seems to be leaning into the current "trans panic" - pretty typical for the NYP.
I also dislike Right/Left categorizations. Most people don't even know the history of the terms and their roots in the French Revolution. Though the "Cult of Reason" established then certainly had the Left categorization at the time.
But is the trans element not a major part of this cult? It seemed to be from the linked story in the top link. But if there is something incorrect there, or false in the NYP reporting, you should point it out. If it is a major element of this cult, then far from complaining about NYP, I would complain about any news organization leaving it out of its reporting.
Who is making a statement about "most rationalists" here? The claim is about a trans vegan murder cult, which doesn't appear to be a natural member of the right side of the political spectrum.
> Is the appellation in the headline, "radical vegan trans cult," a true description?
For this small group, yes. Their leader believes in Nuremberg-style trials for people who eat meat. If you want to go down the rabbit hole, it gets much weirder: https://zizians.info/
The cult does seem to target people who identify as trans - OP has some discussion of this. Not sure if that justifies calling it a "radical vegan trans cult" though. Trans folks seem to be overrepresented in rationalist communities generally, at least on the West Coast - but there may be all sorts of valid reasons for that.
I’m from Burlington and a couple weeks ago downtown I noticed a group of 2 or 3 people walking past me in full black clothing with ski masks (the kind you rob banks with).
I thought it was strange, having never seen that before except on Halloween, but didn’t think to alert any authorities specifically because Burlington is filled with people dressing differently and doing strange things. But 99% of the time it’s totally non violent and benign.
I can't speak to Burlington but in philly balaclavas (which is what those masks are called) are quite common and have been since 2020. I suspect this is true of many cities. It's been the subject of some controversy involving mask bans. In fact seeing someone in all black with a ski mask on is a pretty typical, if intimidating, fashion.
Sneer Club is one of the most nasty, uncharitable, and bad-faith subs out there these days. They generally hate HN, as well. I think any community which exists solely to make cheap shots at another community is poison at its core, SC’s parasocial relationship with LW is a perfect example.
N-gate was among the only things that made this website worth reading. They solely existed to make "cheap shots" at HN. If that's "poison", then I don't want an antidote!
Yeah! I devoured the entire series of posts in one go back then, I had no idea about all the people and their ties. Plus it was a super engaging read, I could imagine being there.
I really loved the language describing the singularity as "an inescapable runaway feedback loop which leads to the ascension of an enemy god". Beautiful.
Holy shit, there are 7 chapters to that last one. Chapter 1 is fucking mind-blowing. I could never figure out why they were obsessed with Roko's basilisk but it makes total sense now considering how it all started.
This is such an epic unbelievable story. We live in this world?
Chapter 6 covers how SlateStarCodex comes into the picture, by the way. I always wondered about that too.
Most of the news coverage I've seen of this story is omitting what some might consider a relevant detail: almost all the members of this group are trans.
This is a divisive topic, but failing to mention this makes me worry a story is pushing a particular agenda rather than trying to tell the facts. Here's what the story looks like if the trans activism is considered central to the story:
While Ngo's version is definitely biased, and while I don't know enough about the story to endorse or refute his view, I think it's important to realize that this part of the story is being suppressed in most of the coverage elsewhere.
it's been an exhausting couple of weeks for me, as a trans person. one executive order after another, explicitly attacking us. scrambling to update all my documents, navigating a Kafkaesque bureaucracy with constantly shifting rules.
now this.
there are like six Zizians. there are millions of trans people. I'm sure that many of the Zizians being trans says something about the Ziz cult, but Ziz doesn't say anything about "trans activism."
any evil one trans person does, is used to stain all trans people. recognize this tendency; don't let this become like blood libel.
I know I sound crazy saying what I'm about to say but it is the truth as I understand it and I think it's important.
It appears to me that there is a certain modality of thought that occurs much more often in people with hEDS, specifically those with TNXB SNPs. If you're super deep into type theory the odds substantially increase that you have hEDS - it's how I found out that I had it. And this same group is far more likely to be trans than the general population. A link that would be far more obvious if hEDS wasn't so underdiagnosed.
Additionally, it appears to me that mental disorders are often caused by auto-immune conditions, which are extremely common in those with hEDS. So with a strong selection bias on math ability and trans, you're gonna end up with a lot of hEDS people who are strongly predisposed to mental disorders. I know someone with hEDS who obsessively studies the link between hEDS and serial killers - not something I want to be associated with, but the stats were pretty convincing. I do think it is possible that two TNXB SNPs are sufficient to explain why I think the way I do, why I'm far more like Ted Kaczynski than I would like to be. Of note: Ted Kaczynski did consider gender reassignment back in 1966.
Which is to say two things. I think what people are observing is a real phenomenon and it is not purely from personal biases, though I'm not denying personal biases play a part in perception. And perhaps with that in mind the solution is in fact diagnosing and treating the underlying auto-immune conditions. And to put a hat on a hat on my 'crazy': I think people are going to find that GLP-1 agonists like Ozempic, specifically at the lower doses, are quite helpful in managing auto-immune conditions, among other things.
Yes, this. Please. I am so very tired; every day I wake up to the news that more legal protections are being stripped from me and the people I care about. I didn't need "trans terror" flashed in my face in large boldface type on top of everything else tonight. The GP didn't make this clear, but The Post Millennial is apparently a far-right publication, and the author of that article seems to have built his brand on painting large groups of people as violent.
I am so so sorry you have to deal with this. As an Australian I have been watching in horror this week at the way trans persons are being demonized and oppressed in your country. I know HN is meant to be an apolitical space, but I hope that the mods here have the sense to allow a certain amount of pushback against this fascist nonsense.
The Zizians.info site (linked by one of the HN posts re: this story) mentions that the Zizians did target people who identified as transgender for indoctrination, so this is not really surprising. People who are undergoing this kind of stress and marginalization may well be more vulnerable to such tactics.
The Ziz method of indoctrination involves convincing his minions they are trapped inside a mental framework they need to break free of. Trans people already feel trapped in a body not aligned with who they are, and are naturally susceptible to this message (and therefore natural targets for recruitment).
I think the relevance of their transness is not very significant.
The LessWrong apocalypse cult has been feeding people's mental illness for years. Their transness likely made them more outsiders to the cult proper, so e.g. they didn't get diverted off into becoming Big Yud's BDSM "math pets" like other women in the cult.
I doubt they are significantly more mentally ill than other members of the cult they just had less support to channel their vulnerability into forms more beneficial to the cult leaders.
Yudkowsky wrote an editorial in Time advocating for the use of nuclear weapons against civilians to prevent his imagined AI doomsday... and people are surprised that some of his followers didn't get the memo that think-pieces are only for navel gazing. If not for the fact that the goal of cult leaders is generally to freeze their victims into inaction and compliance, we probably would have seen more widespread murder as a result of the Yud cult's violent rhetoric.
>I doubt they are significantly more mentally ill than other members.
Why would this particular group defy the US trend of being 4-7x more likely to be afflicted by depressive disorder? We are talking about a demographic with a 46% rate of suicidal ideation, and you doubt that's significant why?
Marginalized groups seem to be a target / susceptible to this kind of thing.
I had a weird encounter on reddit with some users who expressed that "only X people understand how this character in the movie feels". Interestingly, there was no indication that the movie intended this interpretation. But the idea wasn't unusual or all that out there so I didn't think much of it. But that group showed up again and again, and eventually someone asked, and their theory all but seemed to imply that nobody else could possibly have ... feelings, and that lack of understanding made those people lesser and them greater.
It seemed to come from some concept that their experience imparted some unique understanding that nobody else could have, and that just led down a path that led to zero empathy / understanding with anyone outside.
Reddit encounters are always hard to understand IMO so I don't want to read too much into it, but that isolation that some people / groups feel seems to potentially lead to dark places very easily / quickly.
This group formed in the SF Bay Area, which is known for being one of the most accepting places in the world for LGBT people. If marginalization were the main cause, it seems to me that the group would have been located somewhere else. I think it's more likely that these people had an underlying mental disorder that made them likely to engage in both violent behavior and trans identity.
One big difference the Zizians have with the LessWrong community is that LW people believe that human minds cannot be rational enough to be absolute utilitarians, and therefore a certain kind of deontology is needed.[1] In contrast, the Zizians are absolutely convinced of the correctness of their views, which leads them to justify atrocities. In that way it seems similar to the psychology of jihadists.
>I had a weird encounter on reddit with some users who expressed that "only X people understand how this character in the movie feels". Interestingly, there was no indication that the movie intended this interpretation.
The death of the author is a reasonable approach to reading a work. But what you said reminded me of the more delusional view in which a) the watcher/reader's approach is the only "correct" one, and b) anyone who disagrees is *EVIL*. An instance of this happened among Tumblrinas obsessed with the supposed homosexual relationship between Holmes and Watson on BBC's Sherlock, and who were certain that the next episode of the show would reveal this to the world. Welp. <https://np.reddit.com/r/the_meltdown/comments/5oc59t/tumblr_...>
There is a well-documented correlation between gender dysphoria, mental health conditions, and autism spectrum disorder. These overlapping factors may contribute to increased vulnerability to manipulative groups, such as cults.
Thanks, the pronouns were confusing me and making it hard for me to follow the complex story. I assumed I had made a mistake when the article mentions a Jack and refers to them as Jack the whole way through but uses "she" at the end.
Unfortunately the gendered language we use is also a mechanism that provides clues and context as you read the story. So if I can't rely on that, they need to call it out to help the reader.
It goes unmentioned because there is an unwritten rule in progressive media that marginalized groups must never be perceived as doing wrong, because that will deepen their marginalization.
In practice it creates a moral blind spot where the worst extremists get a pass, in the name of protecting the rest. Non-progressive media are all too happy to fill in the gap. Cue resentment, cue backlash, cue Trump. Way to go, guys!
The fact that many are transgender seems to be relevant because it’s a recruiting and manipulation tactic, not because of a connection to “trans activism.” I haven’t seen any evidence of that connection besides people involved being transgender.
I don't think it's so much pushing an agenda, as it is avoiding a thermonuclear hot potato of modern life. If you start talking about gender identity, everyone has STRONG opinions they feel they must share. Worse, a subset of those opinions will be fairly extreme, and you're potentially exposing yourself to harassment or worse. If you sound like you're attacking trans people, that's going to end badly. If you sound like you're supporting them, especially as this new US administration takes off... that's going to end badly.
So if you can tell the story without the possibly superfluous detail of the genders of the people involved, that's a pretty obvious angle to omit. Andy Ngo is obviously not doing this, but that's really only because he has a very clear agenda and in fact his entire interest in this story probably stems from that.
Yes, that's a reasonable possibility as well. It's not proof of an agenda, and might be prudent, but I do think it's a form of bias. There's a thin line between skipping "possibly superfluous" details and skipping core parts of a story that might provide evidence for viewpoints one disagrees with. The result is still that readers need to realize that they are being presented with a consciously edited narrative and not an unbiased set of facts.
No, that is omitting quite a significant detail. If the majority of the people involved have characteristic X, which appears in only a tiny percentage of the overall population, there is some correlation or something newsworthy there.
> If you sound like you're attacking trans people, that's going to end badly. If you sound like you're supporting them, especially as this new US administration takes off... that's going to end badly.
That’s not true: 99% of news outlets have absolutely no fear supporting trans activism.
It’s trivial to find hundreds of such cases from sfgate with a google search.
I suspect the answer is closer to "one trans person preyed upon the trust and vulnerability of people in their circle" and that happened to include multiple other trans people (who were likely extremely vulnerable to the charismatic charms of a cult leader).
I'm sorry, but Andy Ngo is beyond "biased" - he is deliberately derogatory towards the trans community whenever he has the opportunity.
If the gender identities of the Zizians aren't being brought up in the mainstream press, it's likely because it isn't relevant to the story. Responsible reporting includes not bringing up details that could encourage moral panic or hate crimes if they aren't demonstrably relevant to the story. This kind of "well actually" response is really no different than the racist complaints people make in the comments of every crime story that failed to mention how black the accused was, when race wasn't a factor, and nobody ever cares when the accused is white.
Most cults are filled with straight, cisgender, white people. If every story about cult violence brought this up, its connection to the story would be rightfully mocked as contrived.
Citing the guy who tries to dig up dirt on every trans person he can isn't exactly a revelation. It's exactly what I would expect Ngo to do, and only because it validates his neo-fascist peers' anti-trans views.
The fact that a death/murder cult might be deliberately targeting vulnerable trans folks for recruitment and indoctrination can certainly be relevant. I agree that merely talking about "a vegan trans" murder cult, as some media outlets did, would be something rather different however.
I don't know, seems like if we're trying to do that kind of targeting, LessWrong is the better place to start. 100% of these people are LessWrong people, right?
You think LessWrong is a better place to start probably because you've heard a lot about trans young people whereas you probably haven't heard much about and probably don't know much about Less Wrong, but I am confident that if you were to get to know us, you'll find that we are mostly good people and we have a healthy community.
> Bauckholt was a biological male who identified as trans and used feminine pronouns. He was an award-winning youth math genius who later graduated from the University of Waterloo in Ontario, Canada ..... Around 2021, he was hired as a quantitative trader at Tower Research Capital in New York.
Honestly, being trans is the least interesting thing about this dude (girl?). This is not some random angry person with uncontrolled emotions.
I'm probably the furthest thing from an active supporter of trans (whatever you take it to mean, I'm old-fashioned about gender). But how does it matter to this story at all? You could take any group of people and find a crazed group of killers among them.... And people tend to stick to people like them, so again, how is it relevant?
Are people confusing transgender and transhumanism somehow? Looking up rationalist philosophy it seems to be about 'improving' the human species, human potential movement related, people becoming cyborgs and living forever as AI uploads, etc.? Vaguely eugenic in outlook, if more individualist. I suppose such a philosophy views gender as an irrelevant issue, so recruiting transgender people would be something they do?
A good rule of thumb is that people who view philosophy as something other than an amusing pastime are best avoided, especially when they're spending their time trying to recruit others into their cult.
I know something else that is over-represented in killings. Soldiers. And soldiers are mostly male. So male is the natural killing machine, right?
But male humans are mostly selected to be soldiers by design. In some countries the only possible gender for soldiers.
So maybe it could be that there is some other agenda at play here? Maybe it is not related to trans but to grooming a target group into becoming cult members? Why is it that we always have to think there is a /Big Conspiracy/ somewhere? Don't spread around FUD that you have no clue about with words like "omitting something I consider relevant" without making damn sure it really is relevant. You just feed the trolls if you keep doing this.
Andy Ngo is not a credible source of news about trans people. Media Matters describes him as a “right wing troll” who spread misinformation about this issue [1] and The Advocate points out that right wing media have repeatedly ignored the facts around supposed trans shooters and continued to spread misinformation on the subject. [2]
Media matters is just as biased as Andy Ngo. They are the definition of a hit piece mill, and will find any reason possible to criticize popular figures with right wing beliefs.
IMO, the media frenzy on the subject was part of a corporate plot to promote certain beliefs in order to silence contrarian ideas which could trigger a conversation around corporate negligence topics such as the increase of endocrine disruptors in our environment and their effects on our health.
The plot worked to some extent. Hence, I cannot fully express myself here in clear language.
We can see that health has become a central topic of American politics but we're still dancing around some of the more important issues, because implying certain connections is taboo.
The first step to fixing a problem is acknowledging it. If not fixed, it will get worse until it becomes impossible to ignore.
WARNING: The Post Millennial is an extremist website.
I can’t believe that getting “news” about an extremist group from another extremist organization is a productive way to make sense of the world.
Honestly, read whatever you want, but just be aware that radical extremists exist and commit horrific crimes, and other radical extremists will exploit that.
It is radical extremism that’s dangerous in and of itself—not just a particular brand of radical extremism.
In what way is this supposed to be a relevant detail? Unless you think that they are killing people because they are trans, why should you report that they are part of a marginalized group? If they were mostly blondes or had freckles, should that be part of the story too?
It seems as if the group targeted trans members in their recruitment - and then used evidence of general marginalization to justify their crimes.
If you look up old reddit threads about the murder of the landlord, you can see many people defending the crime as the landlord was transphobic. It's not just a random detail like freckles, it seems like the identity shaped the way this group interacted with the world.
It's basically impossible that it's a random irrelevant detail, I'd say any such detail is fair to share.
For example, if every member of this group was Indian American I'd consider that a fair detail to note, the chances of that happening at random are minuscule, yet that's orders of magnitude more probable than all of them being trans for no reason.
Doesn't sound rationalist to me (from Ziz quoted section of article linked below):
"Ziz
Impostors keep thinking it's safe to impersonate single goods. A nice place to slide in psyche/shadow, false faces, "who could ever falsify that I'm blaming it on my headmate!"
Saying you're single good is saying, "Help, I have a Yeerk in my head that's a mirror image of me. I need you to surgically destroy it, even if I'm then crippled for life or might die in the process. Then kill me if I ever do one evil act for the rest of my life. That's better than being a slave. Save me even though it is so easy to impersonate me. And you will aggro so many impostors you'll then be in a fight to the death(s) with. Might as well then kill me too if I don't pass an unthinkable gom jabbar. That'll make us both safer from them and I care zero about pain relative to freedom from my Yeerk at any cost."
It's an outsized consequentialist priority, even in a doomed timeline, to make it unsafe to impersonate single goods.
Critical to the destiny of the world. The most vulnerable souls impostors vex. To bring justice to individual people, from collective punishment."
This Ziz person is really unhinged. I read some of their writing, it reminds me of every eloquent, manipulative narcissist I've met. They are never as smart as they think they are - or as smart as they want you to think they are - though they may be smart, charming, and engaging. They've created an alternate universe in their mind and haphazardly abuse whatever ideas they've encountered to justify it.
They write and talk in their group lingo so outsiders can't understand it without diving deep into their lore, mindset and community. It's a common thing. Seen it numerous times. Don't waste your time.
Head of LessWrong and generally active rationality community leader here. Happy to answer any questions people have. These people haven't been around the community for a long time, but I am happy to answer questions with my best guesses on why they are doing what they are doing.
They've been banned on LW and practically all in-person things for like 5+ years now. My guess is that the reason they hung around the rationality community back then is just that it's a community with much higher openness to people and ideas than normal, especially in the Bay Area. IMO in this instance that was quite bad and they should have been kicked out earlier than they eventually were (which was like 4 years ago or so).
The HN title added the word "rationalist", which isn't in the source article. This is editorializing in a way that feels kind of slander-y. Their relationship to the bay area rationalist community is that we kicked them out long before any of this started.
I mean, they seemed kind of visibly crazy, often saying threatening things to others, talking about doing crazy experiments with their sleep, often insinuating violence. They were pretty solidly banned from the community after their crazy "CFAR Alumni Reunion" protest stunt, and before then were already very far into the fringes.
In addition to the tragedy of the killings, I worry that this will give rationalism _as a concept_ a bad name. Deep thought is so important to progress, and already is somewhat stigmatized. My church underwent a nasty split when I was a kid, and the reason I heard was “pastor Bobby read too many books”. Obviously it wasn’t as simple as that, but the message was clear — don’t read too many books. I suspect this will be interpreted similarly.
It feels that our world has a resurgence of anti-intellectual, anti-science, anti-data, etc movements. I hate that.
Sadly rationalism is a movement where it's easy for someone who doesn't understand to wrap their desires in "rationality" and carry them out without any moral guilt.
Rationalism? The term has been used a lot of times since Pythagoras [0], but the combination of Bay Area, Oxford, existential risks, AI safety makes it sound like this particular movement could have formed in the same mold as Effective Altruism and Long-Termism (ie, the "it's objectively better for humanity if you give us money to buy a castle in France than whatever you'd do with it" crowd that SBF sprung from). Can somebody in know weigh in?
- SBF and Alameda Research (you probably knew this),
- the Berkeley Existential Risk Initiative, founded (https://www.existence.org/team) by the same guy who founded CFAR (the Center for Applied Rationality, a major rationalist organization)
- the "EA infrastructure fund", whose own team page (https://funds.effectivealtruism.org/team) contains the "project lead for LessWrong.com, where he tries to build infrastructure for making intellectual progress on global catastrophic risks"
- the "long-term future fund", largely AI x-risk focused
"Rationalism" is simply the wrong term here. The thing being referred to is "LessWrong-style rationality", which is fundamentally in the empirical school, not the rationalist one. People calling it rationalism are simply confused because the words sound similar.
(Of course, the actual thing is more closely "Zizian style cultish insanity", which honestly has very very little to do with LessWrong style rationality either.)
Just like HN grew around the writing of Paul Graham, the "rationalist community" grew around the writings of Eliezer Yudkowsky. Similar to how Paul Graham no longer participates on HN, Eliezer rarely participates on http://lesswrong.com anymore, and the benevolent dictator for life of lesswrong.com is someone other than Eliezer.
Eliezer's career has always been centered around AI. At first Eliezer was wholly optimistic about AI progress. In fact, in the 1990s, I would say that Eliezer was the loudest voice advocating for the development of AI technology that would greatly exceed human cognitive capabilities. "Intentionally causing a technological singularity," was the way he phrased it in the 1990s IIRC. (Later "singularity" would be replaced by "intelligence explosion".)
Between 2001 and 2004 he came to believe that AI has a strong tendency to become very dangerous once it starts exceeding the human level of cognitive capabilities. Still, he hoped that before AI started exceeding human capabilities, he and his organization could develop a methodology to keep it safe. As part of that effort, he coined the term "alignment". The meaning of the term has since broadened drastically: when Eliezer coined it, he meant the creation of an AI that stays aligned with human values and human preferences even as its capabilities greatly exceed human capabilities. In contrast, these days, when you see the phrase "aligned AI", it is usually being applied to an AI system that is not a threat to people only because it is not cognitively capable enough to disempower human civilization.
By the end of 2015, Eliezer had lost most of the hope he initially had for the alignment project in part because of conversations he had with Elon Musk and Sam Altman at an AGI conference in Puerto Rico followed by Elon and Sam's actions later that year, which actions included the founding of OpenAI. Eliezer still considers the alignment problem solvable in principle if a sufficiently-smart and sufficiently-careful team attacks it, but considers it extremely unlikely any team will manage a solution before the AI labs cause human extinction.
In April 2022 he went public with his despair and announced that his organization (MIRI) would cease technical work on the alignment project and would focus on lobbying the governments of the world to ban AI (or at least the deep-learning paradigm, which he considers too hard to align) before it is too late.
The rationalist movement began in November 2006 when Eliezer began posting daily about human rationality on overcomingbias.com. (The community moved to lesswrong.com in 2009, at which time overcomingbias.com became the personal blog of Robin Hanson.)
The rationalist movement was always seen by Eliezer as secondary to the AI-alignment enterprise. Specifically, Eliezer hoped that by explaining to people how to become more rational, he could increase the number of people who are capable of realizing that AI research was a potent threat to human survival.
To help advance this secondary project, the Center for Applied Rationality (CFAR) was founded as a non-profit in 2012. Eliezer is neither an employee nor a member of the board of CFAR. He is employed by and on the board of the non-profit Machine Intelligence Research Institute (MIRI), which was founded in 2000 as the Singularity Institute for Artificial Intelligence.
I stress that believing that AI research is dangerous has never been a requirement for posting on lesswrong.com or for participating in workshops run by CFAR.
Effective altruism (EA) has separate roots, but the two communities have become close over the years, and EA organizations have donated millions to MIRI.
He has no formal education. He hasn't produced anything in the actual AI field, ever, except his very general thoughts (first that it would come, then about alignment and doomsday scenarios).
He isn't an AI researcher except he created an institution that says he is one, kind of as if I created a club and declared myself president of that club.
He has no credentials (that aren't made up), isn't acknowledged by real AI researchers or scientists, and shows no accomplishments in the field.
His actual verifiable accomplishments seem to be having written fan fiction about Harry Potter that was well received online, and also some (dodgy) explanations of Bayes, a topic that he is bizarrely obsessed with. Apparently learning Bayes in a statistics class, where normal people learn it, isn't enough -- he had to make something mystical out of it.
Why does anyone care what EY has to say? He's just an internet celebrity for nerds.
A great example of superficially smart people creating echo chambers which then turn sour, but which they can't escape. There's a very good reason that "buying your own press" is a cliched pejorative, and this is an extreme end of that. More generally it's just a depressing example of how rationalism in the LW sense has become a sort of cult-of-cults, with the same old existential dread packaged in a new "rational" form. No god here, just really unstable people.
That castle was found to be more cost-effective than any other space the group could have purchased, for the simple reason that almost nobody wants castles anymore. It was chosen because it was the best calculation; the optics of it were not considered.
It would be less disingenuous if you were to say EA is the "it's objectively better for humanity if you give us money to buy a conference space in France than whatever you'd do with it" crowd -- the fact that it was a castle shouldn't be relevant.
Nobody wants castles anymore because they’re impractical and difficult to maintain. It’s not some sort of taboo or psychological block, it’s entirely practical.
Actually, the fact that people think castles are cool suggests that the going price for them is higher than their concrete utility would make it, since demand would be boosted by people who want a castle because it’s cool.
Did these guys have some special use case where it made sense, or did they think they were the only ones smart enough to see that it’s actually worth buying?
> That castle was found to be more cost-effective than any other space the group could have purchased
In other words, they investigated themselves and cleared themselves of any wrongdoing.
It was obvious at the time that they didn't need a 20 million dollar castle for a meeting space, let alone any other meeting space that large.
They also put the castle up for sale 2 years later to "use the proceeds from the sale to support high-impact charities" which was what they were supposed to be doing all along.
The depressing part is that the "optics" of buying a castle are pretty good if you care about attracting interest from elite "respectable" donors, who might just look down on you if you give off the impression of being a bunch of socially inept geeks who are just obsessed with doing the most good they can for the world at large.
Both are factual, the longer statement has more nuance, which is unsurprising. If the emphasis on the castle and SBF - out of all the things and people you could highlight about EA - concisely gives away that I have a negative opinion of it then that was intended. I view SBF as an unsurprising, if extreme, consequence of that kind of thinking. I have a harder time making any sense of the OP story in this context, that's why I was seeking clarification here.
Why buy a conference space? Most pubs will give you a separate room if you promise to spend some money at the bar. There were probably free spaces available had they researched it.
If I am donating money and you are buying a conference space on day 1 I'd want it to be filled with experienced ex-UN field type of people and Nobel peace prize winners.
On the fringes of the rationalist community, there are obviously questionable figures who may be playing an evil game (Bankman Fried) or have lost their way intellectually and morally.
My impression from occasional visits to astral codex ten, however, is that the vast majority of people are reasonable and sometimes succeed in making the world a better place.
> My impression from occasional visits to astral codex ten, however, is that the vast majority of people are reasonable and sometimes succeed in making the world a better place.
This is a blog that for the last several months had a vigorous ongoing debate about whether or not shoplifters should be subject to capital punishment.
> My impression from occasional visits to astral codex ten, however, is that the vast majority of people are reasonable and sometimes succeed in making the world a better place.
The rationalist bloggers are very good at optics and carefully distance themselves from the fringes at the surface. They have a somewhat circular definition of rationalism that defines rationalists as being reasonable, which makes it easy to create post facto rationalizations that anyone who ends up on the wrong side of public opinion was actually not part of their tribe, rewriting any history.
The more uncomfortable topics are generally masked in coded language or carefully split off into isolated subforums for plausible deniability. Slate Star Codex (Astral Codex Ten’s precursor) had a “culture war thread” for years that was well known to contain a lot of very toxic positions dressed up in rationalist-style language. Around 2019 they realized how much of a problem it was and split it into a separate forum (“The Motte”) for distance and plausible deniability. The Motte was a wild place where you could find people pushing things like Holocaust denial or stolen-election theories, but wrapped up in rationalist language (I remember phrases like “questions about orthodox Holocaust narratives” instead of outright denial).
There’s also a long history of Slate Star Codex engaging with neoreactionary content over and over again and flirting with neoreactionary ideas in a classic rationalist “what if they’re actually right” context for plausible deniability. There have been some leaked emails from Scott revealing his engagement with the topic and it’s been an ongoing point of confusion for followers for years (See https://www.reddit.com/r/slatestarcodex/comments/9xm2p8/why_... )
The history of rationalist blogs and communities is largely lost on people who only occasionally visit the blogs and enjoy the writing style. There is a long history of some very unsavory topics not only being discussed, but given the benefit of the doubt or even upvotes. These are harder to associate with the main blogs since the 2019 split of contentious topics into “The Motte” side forum, but anyone around the community long enough remembers the ever-present weirdness of things like this Reddit thread, where a post on /r/SlateStarCodex preaches white nationalism and gets nearly 50 upvotes (in 2014): https://web.archive.org/web/20180912215243/https://www.reddi...
Reading a couple SSC posts for the first time here myself, so my impression is fairly limited, but it sounds like you might be blaming SSC unfairly for simply intellectually engaging with reactionary ideas, which I can't fault someone for, and nor should you.
Can you link to some specific examples which more explicitly have the "What if they're right?" subtext you're referring to?
There are also plenty of good reasons to be aware of these political ideas, given that, e.g., New Confucianism, which just happens to be quite influential in China, is essentially a kind of "Neo-Reaction with Chinese Characteristics". And some people argue that the highly controversial Project 2025 - which seems to be driving policy in the new Trump administration - may be inspired by neo-reactionary ideas.
Had it not been in a serious article, I would have believed it had to be a parody or a joke of some sort.
“We are just like Darth Maul, but we like salads and drink soy milk instead of regular milk… and then kill people while dressed in tactical black outfits”?
What is even going on? Real life now sounds like some kind of broken LLM hallucination.
For just one thing, when you have a broken education system and omnipresent media franchises, you have a significant percentage of the population who know more about the Star Wars backstory structure and theories of diet than about history, civics or conventional morality.
I am not defending any of these fuckwits, but I don't know that it's much different from any organized religion. All of them are stories that get retold over and over until people accept them on faith. I can envision a world where our stories (movies, books), the history of their creation lost, become facts. "Of course there were Jedi, we've just forgotten…"
Now, they're all fuckwits, but it's not outside the realm of thought.
A far more likely possibility is that their ideology is actually centered around "Sith who happen to originate from Vega (a.k.a. α Lyrae)", not "Sith who abstain from animal products".
(A residual possibility is "Sith from Las Vegas, Nevada".)
Look up Joshua Citarella’s coverage of the ideological milieus that Generation Z cultivated during COVID on platforms like Discord.
And then check out the terms “metairony” or “postirony” and this story makes more sense… at least as much sense as the absurdity of it all will allow you to make of it.
We seem to live in a post-ironic moment. Look no further than the Boogaloo Boys. They want to start a second American civil war and are named after the 1984 movie "Breakin' 2: Electric Boogaloo".
It would be a joke except they've engaged in violent attacks up to and including murder in the service of trying to start that aforementioned civil war. Are they serious or a joke? I think their embrace of a ridiculous name makes them almost more frightening because it shows their contempt for reasonableness, for lack of a better term.
It's comparable to how the Nazi "goose step" march was terrifying precisely because it was so awkward and unnatural. It's like, if these guys are capable of doing this with a straight face, what else are they capable of?
We really need a phrase for "appears to behave normally in regular matters and is capable of independent living, does not match other DSM symptoms, but believes insane things". It's a very common component in mass shootings.
> it doesn't absolve them from knowing right from wrong however
Certain mental conditions absolutely do, in a legal sense. Some conditions onset from brain injury and tumors, some to drug exposure, some are genetic, etc. For those unfortunate few who are afflicted, the most humane outcome is to intervene to keep them away from harming themselves and others, though often they are left unattended to by overstretched mental health services.
Since it’s a group, I wonder if a group mental illness would make sense. I guess it could work for the court: “look how out of touch with reality my client is, sith vegans, clearly they didn’t know what they were doing!”
What is going on: the internet is letting you know all this shit. Imagine you had a ticker feed of every story amongst the 8bn. That would be a crazy story per millisecond.
I don't know if these statistics are even kept, but the current social environment in the US feels like a ripe breeding ground for cults. I've had so many people in the past couple years be like "I just want to farm with my friends and family and get away from all this".
It has become a bit of a meme lately. I think there's something to be said about a malaise era leading to an uptick in erratic behavior.
But at the risk of sounding smug and condescending, as someone who actually bought 10 acres "to get away from it all", I get the sense that the type of people in this saga would pack up pretty quick after a little taste. Lesswrongers aren't exactly known for pragmatism, which is sorta the only mindset that works. There's all this work that you don't know you don't know about. I just fell down the rabbit hole of the ziz lore and goddamn do these people sound inept. Like they couldn't even fix their RV to get off their landlord's property. Lots of quasi-intellectual masturbatory posting and not a lot of skills.
All that is to say I'm not super worried about any of these cults really taking off. Logistics remain challenging.
I'm sure a historian could give a more detailed explanation, but my guess is that:
Once steel meets flesh the Role Play ends and all that's left is Live Action, and to take a life is not a meager task when life as you knew it was a virtual mediation.
They brought a sword to a gun fight. Presumably they would have killed the landlord on the first go if he hadn't managed to shoot and kill one of the assailants first.
But it seems they went back later and finished the job. (He survived the original attack, only to be murdered later.)
Most ‘samurai swords’ on the market are barely-functional wall-hangers. And the article describes it as a stabbing. Curved swords are actually quite hard to effectively stab with, so it’s hardly surprising that an untrained user using a poor-quality weapon incorrectly couldn’t cause a lethal wound.
"'He had a samurai sword stuck to his back with about a foot of it sticking out in front, his face cut up all over,' said his friend Patrick McMillan."
Every time I hear about anything related to Effective Altruism, it's because people who believe this stuff have done crimes sufficient to make it in to the news.
Are there other contexts they show up where they are not doing crimes, or really non-normative things?
Effective Altruists donate to charities which save lives, etc. They help people to choose effective charities using evidence-based approach, e.g.: https://en.wikipedia.org/wiki/GiveWell
But that doesn't get into the news. Saving people is boring. Killing people? That gets the clicks.
Suppose you have 1000 people in the movement: 998 well-adjusted people, 1 crook who uses the movement as a shield, and 1 insane freak.
Of course, you'll hear about the crook and the freak. Well-adjusted people are not newsworthy.
LessWrong always felt culty and weird to me, tbh. It's why I stick to HN for my social discourse. No central personality drives this place. It's the opposite of culty. It can be a bit of an echo chamber, maybe, but that's a different problem.
I’m surprised the killing of the customs officer has gotten so little press. I guess if you want your killing spree in the press, don’t do it in Vermont.
The salacious nature of a group of trans cult members killing people across the country is a story made for tabloid insanity. The hooks into Silicon Valley and the side quests about AI overlords etc. are frosting on the cake.
That murder was the top story on CNN, AP and NPR for about half a day IIRC. It got pushed off because so much happened in other areas nationally. It took about two days for the prosecutors to announce the connections.
Every time lesswrong or rationalists or Yudkowsky or any derivative show up here, I try to read some of the material around them, and frankly I don't understand most of it; it's all so self-referential, full of impenetrable jargon, and intentionally obtuse. It sounds mentally deranged and cultish. And it turns out that it can one-shot you into a death cult.
I think it’s worth discussing the fact that many folks in EA- and rationalist-adjacent circles are deeply panicked about developments in AI, because they believe these systems pose a threat to human existence. (Not all of them, obviously. But a large and extraordinarily well-funded contingent.) The fear I have with these organizations is that eventually panic - in the name of saving humans - leads to violent action. And while this is just a couple of isolated examples, it does illustrate the danger that violent action can become normalized in insular organizations.
I do hope people “on the inside” are aware of this and are working to make sure those organizations don’t have bad ideas that hurt real people.
Well, it should be pretty easy for the prosecutor; it sounds like, by their own definition, they meet the criteria for RICO. Should be able to wrap up every Zizian and at least shake them down for a plea.
After puzzling over the threads on this head-spinning story, my sense is that maybe the most neutral thing to do is re-up this submission of a news report and merge most of the comments hither.
Edit: I guess we can have the current thread be about the recent events and keep https://news.ycombinator.com/item?id=42897871 for discussing / arguing about the generic aspects. Sorry for the complexity but in this case it appears to be essential not accidental.
With religion I think we largely just go with "this is what the group identifies themselves with / as" and leave it as that.
Arguably the politically active "religious right" in the US has long since abandoned most everything Jesus had to offer. We still identify them as such.
The Zizians were only ever a tiny fraction of lesswrong.com, which gets about 2 million page views per month (according to a site maintainer) -- not as much as HN surely, but not a small web site.
Lesswrong.com has been in operation since 2009 and is closely related to the SL4 mailing list which dates back to 1998 or 1999.
The more I read about these Zizians, the more I'm reminded of Final Fantasy House, whose members had been led to believe, or at least go along with, the idea that they were spiritual manifestations of Final Fantasy VII characters, and were manipulated and exploited by the house's dysfunctional leader, "Jenova":
Is there a grand figure of rationalist philosophy? Someone like a rationalist “Sartre” or “Heidegger”?
I think a lot of the weirdness of rational communities comes from them not properly integrating the existing canon of western philosophy and rediscovering pitfalls.
Without the benefit of centuries of discussions it’s easy to come to strange conclusions.
I agree with your diagnosis as well. It's Dunning-Kruger. They keep reinventing the wheel poorly due to ignorance, which is maintained by confidence, which itself stems from that ignorance.
Ziz (the cult leader) had a blog at sinceriously.fyi, where she elaborated on her philosophy. The blog has since devolved into a string of death threats and subsequently been deleted, but anyone who's interested in the backstory can still find some posts on archive.ph (e.g. [1]) and more on web.archive.org.
From Ziz and other cult members' writings, it's obvious that the cult has been in conflict with the rationalist community, rather than part of it, for years. It's disappointing but not surprising that SFGATE decided to pin the murders on the rationalist community while contorting their writing to avoid mentioning the elephant in the room: the cult leader is a trans vegan, almost all of the members are also trans vegans, and these are core parts of their ideology. They were discussed over and over on Ziz's blog.
In fact, after the katana stabbing and the self-defense shooting by the wounded landlord, another cult member used her blog to blame Zack for the events. Zack wasn't involved, physically present, or in communication with the cult at the time, but he does believe in a different theory of transness than they do [2], so they can't stop being enraged at him. That theory was actually one of the main reasons the cult has split off from the rationalist community. I'm worried that Zack may be in danger.
"Trans vegan 'Siths' who kill people with katanas" is not a sentence I thought I'd ever read, let alone type, yet here we are, in the most ludicrous timeline.
any violence that is adjacent to rationalism will see a media response like this and a healthy dose of astroturfing to support it. the cia is very eager to make rationalists persona non grata because they provide a cultural nucleation point for anti-AI sentiment. they want to suppress these ideas so that our victory in the AI arms race is secure — unhindered by pesky human rights nonsense. they killed suchir balaji, too.
Yeah they're TPOT adjacent. While the original rationalists group were pretty good about watching their biases the later ones are more accurately rationalizers. Some TPOT people have warned them they're making mistakes to no avail.
It's just the MOPs, geeks, sociopaths and Eternal September thing again. They all use the jargon and reach absurd conclusions and don't have the wisdom to realize that an absurd conclusion with a sound argument means a false assumption.
Just because you say "my priors" like priests say "my prayers" doesn't make you rational. They are in the midwit area where you can reason.
The fool dismisses the absurd conclusion because he cannot reason. The wise man because he knows what it means. The average rationalist decides it's true.
Almost everybody does. But probably there will always be those who don't. Some people's lives are a sad story, and manipulators can learn who is most vulnerable.
Not to imply the Zizians aren't embracing some dangerous ideas, but this is the strangest use of "string of killings" I've ever seen. Author could have gone with "wave of violence," but even then they'd be talking about a "wave" of three incidents in three years.
This story is as mad as a sub-plot in a Philip K. Dick novel. If a lot of weird stuff happens, how would this affect observers' estimate of the probability that we live in a simulation? I wonder if that was one of the intents.
I'm so annoyed at how popular rationalism is. Human rationality (along with symbolic creativity) is only useful for generating testable hypotheses, not for direct knowledge of reality. Once someone grants the axioms of rationality they become delusional and spiral (half-life of around 6 years).
What I find interesting about these cases and the seemingly unrelated United Healthcare CEO assassination is that all were committed by extremely online data scientists who went to elite universities.
I’ve been fairly skeptical of the right wing narrative that these schools have been “radicalized” (seeing as universities and young people have always been the hotbeds of leftist thought, how soon we forget the hippie movement)…but this definitely has me wondering.
When your entire philosophy is a jupyter notebook filled with bugs from beginning to end but you don't know it because you've never heard of a unit test things go wrong.
Feels like we're going back to a period of political decentralization. History suggests that large empires tend to collapse after the rule of law loses its meaning and power. I've been predicting the rise of political gangs for a long time.
It already feels like everyone is in a tribe... Now the level of violence is being dialed up and the overarching structures are losing their ability to enforce the law. Look at how many criminals have been pardoned, nobody cares anymore. Nobody even agrees about what a criminal is.
How can we reduce crime if we cannot even agree on what it means?
yikes the results of attention seeking mentally ill being normalized and not getting the attention they feel they deserve for snipping themselves. maybe we shouldnt normalize mental illnesses..
Could you please not use HN for ideological battle? It's not what this site is for, and destroys what it is for, regardless of which ideology you're for or against.
Pardon the cult-like musings ahead of time, but it’s par for the course.
My initial thought was that rationalism is obviously egoic and self-obsessed, loving and trusting one's own thoughts. Set theory should tell you that you can't make a mental image of yourself to act by that will be more encompassing than the totality of what you are. You can't build a mental model inside of your ego that will work better than your natural instinct for all interaction with reality. Trust sheer emotion to say that rationalising any loss of life means an upside-down philosophy, a castle in the sky. This cult, with its "functional decision theory" ["…that the normative principle for action is to treat one's decision as the output of a fixed mathematical function"], makes actions a sort of cold choice without emotion. Like people using religion in war to remain cool when killing: a misuse of a neutral idea such as a mathematical function.
But it can't be that easy to handwave it away. When Aum Shinrikyo was mentioned down below, I changed my mind: there are no easy answers. A sick leader can justify anything, and you can judge any tree by its fruits. From the doctrine section of that Wikipedia article: "Their teachings claimed a nuclear apocalypse was predicted to occur soon" (parallel to AI now in rationalism), and "Furthermore, Lifton believes, Asahara 'interpreted the Tibetan Buddhist concept of phowa in order to claim that by killing someone contrary to the group's aims, they were preventing them from accumulating bad karma and thus saving them'" (parallel to rational behavior guidance gone wrong; these data scientists just lost touch, Norm Macdonald would say they're real jerks, pardon the humor).
I just the other day listened to Eckhart Tolle's podcast where he talked about doomsday fear; at the bottom of the transcript it says: [“There's also an unconscious desire in many human beings for what we could call the collapse of the human made world.
Because most humans experience the world as a burden. They have to live in this world, but the world is problematic. The world has a heaviness to it.
You have your job, you have the taxes, you have money, and the world is complex and heavy. And there's an unconscious longing for people in what we might bluntly call the end of the world. But ultimately, what they are longing for is, yes, it is a kind of liberation, but the ultimate liberation that they are really longing for is the liberation from themselves, the liberation from what they experience as their problematic, heavy sense of self that's inseparable from the so-called outer world.
And so there's a longing in every human for that. But that's not going to happen yet.”]
Eckhart Tolle: Essential Teachings: Essential Teachings Special: Challenging Times Can Awaken Us 30 jan. 2025
An obvious parallel to AI doomsaying can be drawn.
When we were children we experienced unfiltered reality without formulas to replace our decisions. But we could even then be wrong, stupid, or convinced to do stupid shit by a charismatic playground bully. But when we were wrong it resulted in falling and scraping our knee or whatever. There are no reality checks in internet culture bubbles.
This is sick people huddling together under a sick, charismatic, warlord-ish leader whose castle in the sky is so self-coherent that it makes others want to systemize it, aided by the brainwashing methods. ["Zizians believe normal ideas about morality are mostly obfuscated nonsense. They think real morality is simple and has already been put to paper by Jeremy Bentham with his utilitarian ideas. Bentham famously predicted an expanding scope of moral concern. He says if humanity is honest with itself it will eventually adopt uncompromising veganism. Zizians think arguments which don't acknowledge this are not about morality at all, they're about local struggles for power delaying the removal of an unjust status quo."] Insert Watchmen pic of grandiose narc Adrian Veidt asking Dr Manhattan if utilitarian mass killing was the right choice.
And then the sleep-deprivation indoctrination method dulls their rationality even further, so they can all become ego clones of the cult leader.
And that other link in this thread mentioned other groups of rationalists debugging demons sent by adversary groups and other psychotic stuff. Is it chicken-or-egg: do those people gather in a place where people loop with their minds, or is it the mind-looping that sends them into a downward spiral? Maybe we should calculate the Bayesian.
Holy crap, it reads like the anti-Electric Kool-Aid Acid Test, with petty revenge, guns, and murder du jour instead of, well, an epic road party that is still going... Bobby just played the Grammys... there were glitches with the sound system and they raised 15 mill.
I like some of what Aella has written, but I had no idea that Rationalists had just rebranded nihilistic hate... so cleverly.
A later article by the same author: https://www.sfgate.com/bayarea/article/leader-alleged-bay-ar.... Probably makes sense to read both or neither.
[Former member of that world, roommates with one of Ziz's friends for a while, so I feel reasonably qualified to speak on this.]
The problem with rationalists/EA as a group has never been the rationality, but the people practicing it and the cultural norms they endorse as a community.
As relevant here:
1) While following logical threads to their conclusions is a useful exercise, each logical step often involves some degree of rounding or unknown-unknowns. A -> B and B -> C means A -> C in a formal sense, but A -almostcertainly-> B and B -almostcertainly-> C does not mean A -almostcertainly-> C. Rationalists, by tending to overly formalist approaches, tend to lose the thread of the messiness of the real world and follow these lossy implications as though they are lossless. That leads to...
2) Precision errors in utility calculations that are numerically-unstable. Any small chance of harm times infinity equals infinity. This framing shows up a lot in the context of AI risk, but it works in other settings too: infinity times a speck of dust in your eye >>> 1 times murder, so murder is "justified" to prevent a speck of dust in the eye of eternity. When the thing you're trying to create is infinitely good or the thing you're trying to prevent is infinitely bad, anything is justified to bring it about/prevent it respectively.
3) Its leadership - or some of it, anyway - is extremely egotistical and borderline cult-like to begin with. I think even people who like e.g. Eliezer would agree that he is not a humble man by any stretch of the imagination (the guy makes Neil deGrasse Tyson look like a monk). They have, in the past, responded to criticism with statements to the effect of "anyone who would criticize us for any reason is a bad person who is lying to cause us harm". That kind of framing can't help but get culty.
4) The nature of being a "freethinker" is that you're at the mercy of your own neural circuitry. If there is a feedback loop in your brain, you'll get stuck in it, because there's no external "drag" or forcing functions to pull you back to reality. That can lead you to be a genius who sees what others cannot. It can also lead you into schizophrenia really easily. So you've got a culty environment that is particularly susceptible to internally-consistent madness, and finally:
5) It's a bunch of very weird people who have nowhere else they feel at home. I totally get this. I'd never felt like I was in a room with people so like me, and ripping myself away from that world was not easy. (There's some folks down the thread wondering why trans people are overrepresented in this particular group: well, take your standard weird nerd, and then make two-thirds of the world hate your guts more than anything else, you might be pretty vulnerable to whoever will give you the time of day, too.)
TLDR: isolation, very strong in-group defenses, logical "doctrine" that is formally valid and leaks in hard-to-notice ways, apocalyptic utility-scale, and being a very appealing environment for the kind of person who goes super nuts -> pretty much perfect conditions for a cult. Or multiple cults, really. Ziz's group is only one of several.
G.K.Chesterton knew it, 100 years ago:
"... insanity is often marked by the dominance of reason and the exclusion of creativity and humour. Pure reason is inhuman. The madman’s mind moves in a perfect, but narrow, circle, and his explanation of the world is comprehensive, at least to him."
Or David Hume, 300 years ago:
"Reason is, and ought only to be, the slave of the passions"
So did Fyodor Dostoevsky.
This sounds a lot like the psychopath Anton Chigurh in the movie No Country for Old Men. His view of the world is he is the arbiter of people's destiny, which often involves them being murdered.
Another thing I'll add after having spent a few years in a religious cult:
It's all about the axioms. If you tweak the axioms you can use impeccable logic to build a completely incorrect framework that will appeal to otherwise intelligent(and often highly intelligent) people.
Also, people are always less rational than they are able to admit. The force of things like social connection can very easily warp the reasoning capabilities of the most devout rationalist(although they'll likely never admit that).
I'm kinda skeptical these folks were following some hyper-logical process from flawed axioms that led them to the rigorous solution "I should go stab our landlord with a samurai sword" or "I should start a shootout with the Feds".
The rationalist stuff just seems like some meaningless patter they stuck on top of more garden-variety cult stuff.
The axioms of rationality, morality, etc. I've always found interesting.
We have certain axioms, (let me chose an arbitrary, and possibly not quite an axiomy-enough example): "human life has value". We hold this to be self-evident and construct our society around it.
We also often don't realize that other people and cultures have different axioms of morality. We talk/theorize at this high level, but don't realize we have different foundations.
Right, and a related problem is a lot of the logic is more like Zeno’s paradox.
Succinct. This should be handed out as a “Signs you’re being manipulated” flyer to young people.
Wow, what a perfect description of why their probability-logic leads to silly beliefs.
I've been wondering how to argue within their frame for a while, and here's what I've come up with: Is the likelihood that aliens exist, are unfriendly, and AGI will help us beat them higher or lower than the likelihood that the AGI itself that we develop is unfriendly to us and wants to FOOM us? Show your work.
It’s pointless. They aren’t rational. Any argument you come up with that contradicts their personal desires will be successfully “reasoned” away by them because they want it to be. Your mistake was ever thinking they had a rational thought to begin with, they think they are infallible.
Much of philosophy throughout history seems to operate this way.
I think philosophy is a noble pursuit, but it's worth noting how often people drew very broad conclusions, and then acted on them, from not very much data. Consider the dozens of theories of the constitution of the world from the time of the Greek thinkers (even the atomic theory doesn't look very much at all like atoms as we now understand them), or the myriad examples of political philosophies that ran up against people simply not acting the way the philosophy needed them to act to cohere.
The investigation of possibility is laudable, but a healthy and regular dose of evidence is important.
They think they can predict the future and by extension know what's good for us. If they could choose, you wouldn't get a vote.
AGI would be extremely helpful in navigating clashes with aliens, but taking the time to make sure it's safe is very unlikely to make a difference to whether it's ready in time. Rationalists want AGI to be built, and they're generally very excited about it, e.g. many of them work at Anthropic. They just don't want a Move Fast and Break Things pace of development.
the term you're looking for is pascal's mugging, and it originates from within rationalism
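To make the structure concrete, here is a minimal sketch (illustrative numbers only, nothing from anyone's actual argument) of how a naive expected-value calculation gets "mugged": an arbitrarily tiny probability is swamped by an arbitrarily huge claimed payoff.

    # Illustrative numbers only: naive expected value lets a huge claimed payoff
    # dominate no matter how small its probability is.
    def expected_value(prob, payoff):
        return prob * payoff

    mundane = expected_value(0.9, 100)       # 90.0: a likely, modest outcome
    mugging = expected_value(1e-12, 1e15)    # ~1000: absurd premise still "wins"
    print(mundane, mugging)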
It seems that you didn't understand the main point of the exposition. I'll summarize the OP's comment a bit further.
Points 1 and 2 only explain how they are able to erroneously justify their absurd beliefs, they don't explain why they hold those beliefs.
Points 3 through 5 are the heart of the matter; egotistical and charismatic (to some types of people) leaders, open minded, freethinking and somewhat weird or marginalized people searching for meaning plus a way for them all to congregate around some shared interests.
TLDR: perfect conditions for one or more cults to form.
No, it’s the “rationality.” Well maybe the people too, but the ideas are at fault.
As I posted elsewhere on this subject: these people are rationalizing, not rational. They’re writing cliche sci-fi and bizarre secularized imitations of baroque theology and then reasoning from these narratives as if they are reality.
Reason is a tool not a magic superpower enabling one to see beyond the bounds of available information, nor does it magically vaporize all biases.
Logic, like software and for the same reason, is “garbage in, garbage out.” If even one of the inputs (premises, priors) is mistaken the entire conclusion can be wildly wrong. Errors cascade, just like software.
That's why every step needs to be checked with experiment or observation before a next step is taken.
I have followed these people since stuff like Overcoming Bias and LessWrong appeared and I have never been very impressed. Some interesting ideas, but honestly most of them were recycling of ideas I’d already encountered in sci-fi or futurist forums from way back in the 1990s.
The culty vibes were always there and it instantly put me off, as did many of the personalities.
“A bunch of high IQ idiots” has been my take for like a decade or more.
> As I posted elsewhere on this subject: these people are rationalizing, not rational.
That is sometimes true, but as I said in another comment, I think this is on the weaker end of criticisms because it doesn't really apply to the best of that community's members and the best of its claims, and in either case isn't really a consequence of their explicit values.
> Logic, like software and for the same reason, is “garbage in, garbage out.” If even one of the inputs (premises, priors) is mistaken the entire conclusion can be wildly wrong. Errors cascade, just like software.
True, but an odd analogy: we use software to make very important predictions all the time. For every Therac-25 out there, there's a model helping detect cancer in MRI imagery.
And, of course, other methods are also prone to error.
> That's why every step needs to be checked with experiment or observation before a next step is taken.
Depends on the setting. Some hypotheses are not things you can test in the lab. Some others are consequences you really don't want to confirm. Setting aside AI risk for a second, consider the scientists watching the Trinity Test: they had calculated that it wouldn't ignite the atmosphere and incinerate the entire globe in a firestorm, but...well, they didn't really know until they set the thing off, did they? They had to take a bet based on what they could predict with what they knew.
I really don't agree with the implicit take that "um actually you can never be certain so trying to reason about things is stupid". Excessive chains of reasoning accumulate error, and that error can be severe in cases of numerical instability (e.g. values very close to 0, multiplications, that kind of thing). But shorter chains conducted rigorously are a very important tool to understand the world.
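A minimal sketch of that error accumulation (illustrative numbers, not anyone's actual argument): if each step in a chain only holds "almost certainly", confidence in the conclusion decays geometrically with the chain's length.

    # Illustrative only: each link holds with probability p, so an n-step chain
    # holds with probability p**n, which shrinks quickly even for p near 1.
    def chain_confidence(p_per_step, n_steps):
        return p_per_step ** n_steps

    for n in (1, 5, 10, 20):
        print(n, round(chain_confidence(0.9, n), 4))
    # 1 0.9, 5 0.5905, 10 0.3487, 20 0.1216 -- a "90% sure" step repeated
    # 20 times leaves only ~12% confidence in the conclusion.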
>They have, in the past, responded to criticism with statements to the effect of "anyone who would criticize us for any reason is a bad person who is lying to cause us harm".
Which leader said anything like that? Certainly not Eliezer or the leader of the Center for Applied Rationality (Anna Salamon) or the project lead of the web site lesswrong.com (Oliver Habryka)!
Hello, can confirm, criticism is like the bread and butter of LW, lol. I have very extensively criticized tons of people in the extended rationality ecosystem, and I have also never seen anyone in any leadership position react with anything like this quote. Seems totally made up.
> Rationalists, by tending to overly formalist approaches,
But they don't apply formal or "formalist" approaches, they invoke the names of formal methods but then extract from them just a "vibe". Few to none in the community know squat about actually computing a posterior probability, but they'll all happily chant "shut up and multiply" as a justification for whatever nonsense they instinctively wanted to do.
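For contrast, here is what actually computing a posterior looks like, as a minimal sketch with made-up numbers rather than anything from the community's own writing:

    # Bayes' rule with made-up numbers: P(H|E) = P(E|H) P(H) / P(E).
    def posterior(prior, p_e_given_h, p_e_given_not_h):
        p_e = p_e_given_h * prior + p_e_given_not_h * (1 - prior)
        return p_e_given_h * prior / p_e

    # A 1% prior plus evidence that is 90% likely under H and 10% likely
    # otherwise yields only about an 8.3% posterior.
    print(posterior(0.01, 0.9, 0.1))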
> Precision errors in utility calculations that are numerically-unstable
Indeed, as well as just ignoring that uncertainties about the state of the world or the model of interaction utterly dominate any "calculation" that you could hope to do. The world at large does not spend all its time in lesswrongian ritual multiplication or whatever... but this is not because they're educated stupid. It's because in the face of substantial uncertainty about the world (and your own calculation processes), reasoning things out can only take you so far. A useful tool in some domains, but not a generalized philosophy for life... The cognitive biases they obsess about and go out of their way to eschew are mostly highly evolved harm-mitigation heuristics for reasoning against uncertainty.
> that is particularly susceptible to internally-consistent madness
It's typical for cults to cultivate vulnerable mind states for cult leaders to exploit for their own profit, power, sexual fulfillment, etc.
A well-regulated cult keeps its members' mental illness within bounds that maximize the benefit for the cult leaders in a sustainable way (e.g. not going off and murdering people, even when doing so is the logical conclusion of the cult philosophy). But sometimes people are won over by a cult's distorted thinking yet aren't useful for bringing the cult leaders their desired profit, power, or sex.
> But they don't apply formal or "formalist" approaches, they invoke the names of formal methods but then extract from them just a "vibe".
I broadly agree with this criticism, but I also think it's kind of low-hanging. At least speaking for myself (a former member of those circles), I do indeed sit down and write quantitative models when I want to estimate things rigorously, and I can't be the only one who does.
> Indeed, as well as just ignoring that uncertainties about the state of the world or the model of interaction utterly dominate any "calculation" that you could hope to do.
This, on the other hand, I don't think is a valid criticism nor correct taken in isolation.
You can absolutely make meaningful predictions about the world despite uncertainties. A good model can tell you that a hurricane might hit Tampa but won't hit New Orleans, even though weather is the textbook example of a medium-term chaotic system. A good model can tell you when a bridge needs to be inspected, even though there are numerous reasons for failure that you cannot account for. A good model can tell you whether a growth is likely to become cancerous, even though oncogenesis is stochastic.
Maybe a bit more precisely, even if logic cannot tell you what sets of beliefs are correct, it can tell you what sets of beliefs are inconsistent with one another. For example, if you think event X has probability 50%, and you think event Y has probability 20% conditional on X, it would be inconsistent for you to believe event Y has a probability of less than 10%.
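A quick check of that consistency claim, as a sketch using the same numbers:

    # P(Y) = P(Y|X)P(X) + P(Y|not X)P(not X) >= P(Y|X)P(X)
    p_x, p_y_given_x = 0.5, 0.2
    lower_bound = p_y_given_x * p_x
    print(lower_bound)   # 0.1 -- so believing P(Y) < 0.1 is incoherent with the above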
> The world at large does not spend all its time in lesswrongian ritual multiplication or whatever... but this is not because they're educated stupid
When I thought about founding my company last January, one of the first things I did was sit down and make a toy model to estimate whether the unit economics would be viable. It said they would be, so I started the company. It is now profitable with wide operating margins, just as that model predicted it would be, because I did the math and my competitors in a crowded space did not.
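(A minimal illustration of that kind of unit-economics check, with hypothetical placeholder numbers rather than the actual figures:)

    # Hypothetical placeholder numbers, just to show the shape of the check.
    def monthly_profit(customers, price, variable_cost, fixed_costs):
        contribution = customers * (price - variable_cost)  # gross margin per month
        return contribution - fixed_costs

    print(monthly_profit(400, 50.0, 12.0, 9_000.0))  # 6200.0 -> viable on paper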
Yeah, it's possible to be overconfident, but let's not forget where we are: startups win because people do things in dumb inefficient ways all the time. Sometimes everyone is wrong and you are right, it's just that that usually happens in areas where you have singularly deep expertise, not where you were just a Really Smart Dude and thought super hard about philosophy.
I noticed years ago too that AI doomers and rationalist types were very prone to (infinity * 0 = infinity) types of traps, which is a fairly autistic way of thinking. Humanity long time ago decided that infinity * 0 = 0 for very good practical reasons.
> Humanity long time ago decided that infinity * 0 = 0
I'm guessing you don't mean this in any formal mathematical sense; without context, infinity multiplied by zero isn't formally defined. There could be various formulations and contexts where you could define / calculate something like infinity * zero to evaluate to whatever you want. (E.g. define f(x) := C x and g(x) := 1/x. What does f(x) * g(x) evaluate to in the limit as x goes to infinity? C. And we can interpret f(x) as going to infinity while g(x) goes to zero, so we can use that to justify writing "infinity * 0 = C" for an arbitrary C...)
So, what do you mean by "infinity * 0 = infinity" informally? That humans regard the expected value of (arbitrarily large impact) * (arbitrarily small probability) as zero?
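A minimal sketch of that limit argument (using sympy; the expressions are just the examples above, with C an arbitrary positive constant):

    # "infinity * 0" is an indeterminate form: the answer depends on how fast
    # each factor grows or shrinks.
    import sympy

    x, C = sympy.symbols('x C', positive=True)
    print(sympy.limit((C * x) * (1 / x), x, sympy.oo))      # C
    print(sympy.limit((C * x**2) * (1 / x), x, sympy.oo))   # oo
    print(sympy.limit((C * x) * (1 / x**2), x, sympy.oo))   # 0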
No, humanity decided that infinity doesn't exist and anyone trying to tell you about it is selling you religion.
They actively look for ways for infinity to happen. Look at Eli's irate response to Roko's basilisk. To him even being able to imagine that there is a trap means that it will necessarily be realised.
I've seen "rationalist" AI doomers who say things like "given enough time technology will be invented to teleport you into the future where you'll be horifically tortured forever".
It's just extrapolation, taken to the extreme, and believed in totally religiously.
>which is a fairly autistic way of thinking.
Any prominent ones I've read or met either openly share their diagnosis or 100% fit the profile.
Reminds me of the measure problem in cosmology: https://en.wikipedia.org/wiki/Measure_problem_(cosmology)
I think you are putting too many people in one bucket
> Humanity decided a long time ago that infinity * 0 = 0 for very good practical reasons.
Among them being that ∞ × 0 = ∞ makes no mathematical sense. Multiplying literally any other number by zero results in zero. I see no reason to believe that infinity (positive or negative) would be some exception; infinity instances of nothing is still nothing.
Brilliant summary, thanks.
I'm interested in #4, is there anywhere you know of to read more about that? I don't think I've seen that described except obliquely in eg sayings about the relationship between genius and madness.
I don't, that one's me speaking from my own speculation. It's a working model I've had for a while about the nature of a lot of kinds of mental illness (particularly my own tendencies towards depression), which I guess I should explain more thoroughly! This gets a bit abstract, so stick with me: it's a toy model, and I don't mean it to be definitive truth, but it seems to do well at explaining my own tendencies.
-------
So, toy model: imagine the brain has a single 1-dimensional happiness value that changes over time. You can be +3 happy or -2 unhappy, that kind of thing. Everyone knows when you're very happy you tend to come down, and when you're very sad you tend to eventually shake it off, meaning that there is something of a tendency towards a moderate value or a set-point of sorts. For the sake of simplicity, let's say a normal person has a set point of 0, then maybe a depressive person has a set point of -1, a manic person has a set point of +1, that sort of thing.
Mathematically, this is similar to the equations that describe a spring. If left to its own devices, a spring will tend to its equilibrium value, either exponentially (if overdamped) or with some oscillation around it (if underdamped). But if you're a person living your life, there are things constantly jostling the spring up and down, which is why manic people aren't crazy all the time and depressed people have some good days where they feel good and can smile. Mathematically, this is a spring with a forcing function - as though it's sitting on a rough train ride that is constantly applying "random" forces to it. Rather than x'' + cx' + kx = 0 (the cx' term is the damping), you've got x'' + cx' + kx = f(t) for some external forcing function f(t), where f(t) critically does not depend on x or on the individual internal dynamics involved.
These external forcing functions tend to be pretty similar among people of a comparable environment. But the internal equilibria seem to be quite different. So when the external forcing is strong, it tends to pull people in similar directions, and people whose innate tendencies are extreme tend to get pulled along with the majority anyway. But when external forcing is weak (or when people are decoupled from its effects on them), internal equilibria tend to take over, and extreme people can get caught in feedback loops.
If you're a little more ML-inclined, you can think about external influences like a temperature term in an ML model. If your personal "model" of the world tends to settle into a minimum labeled "completely crazy" or "severely depressed" or the like, a high "temperature" can help jostle you out of that minimum even if your tendencies always move in that direction.
Basically, I think weird nerds tend to have low "temperature" values, and tend to settle into their own internal equilibria, whether those are good, bad, or good in some cases and bad in others (consider all the genius mathematicians who were also nuts). "Normies", for lack of a better way of putting it, tend to have high temperature values and live their lives across a wider region of state space, which reduces their ability to wield precision and competitive advantage but protects them from the most extreme failure-modes as well.
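If it helps to see the toy model run, here's a minimal sketch in code: a damped spring with a set point plus random outside forcing. Every parameter value is arbitrary; the only point is that the same depressive set point behaves very differently at low versus high "temperature".

    # Toy set-point model: x'' + c*x' + k*(x - set_point) = f(t),
    # where f(t) is random "life noise" scaled by a temperature parameter.
    import random

    def fraction_of_time_above_water(set_point, temperature,
                                     steps=200_000, dt=0.01, k=1.0, c=0.5):
        x, v = 0.0, 0.0
        above = 0
        for _ in range(steps):
            force = random.gauss(0.0, temperature)     # external jostling f(t)
            a = -k * (x - set_point) - c * v + force   # spring + damping
            v += a * dt
            x += v * dt
            if x > 0:                                  # "feeling okay" region
                above += 1
        return above / steps

    random.seed(0)
    # Same depressive set point (-1), different coupling to outside noise:
    print(fraction_of_time_above_water(-1.0, temperature=0.5))   # roughly 0: stuck at the set point
    print(fraction_of_time_above_water(-1.0, temperature=20.0))  # roughly 0.3: jostled out often

The low-"temperature" run just sits at its internal equilibrium the whole time; the high-"temperature" run gets dragged across a much wider range of states, for better and for worse - which is the trade-off I'm gesturing at above.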
There's another way around it. People that see themselves as "freethinkers" are also ultimately contrarians. Taking contrarianism as part of your identity makes people value unconventional ideas, but turn that around: It also means devaluing mainstream ideas. Since humanity is basically an optimization algorithm, being very contrarian means that, along with throwing away some bad assumptions, one also throws away a whole lot of very good defaults. So one might be right in a topic or two, but overall, a lot of bad takes are going to seep in and poison the intellectual well.
Here's another analysis which comes from a slightly different angle.
https://threadreaderapp.com/thread/1361045568663945216.html
This dynamic is not exclusive to those claiming to be part of an insular community of freethinkers:
https://news.ycombinator.com/item?id=25667362
It is like they actually managed to install a shitty low-precision distilled LLM in their brain.
What does this mean?
I mean, isn't the problem that they actually aren't that smart or rational? They're just a group of people who've built their identity around believing themselves to be smart...
They're also not freethinkers. They're a community that demand huge adherence to their own norms.
Yes. Hard to imagine a more hypocritical outcome of “free thinking” than “think like me or die”
Great summary, and you can add utilitarianism to the bucket of ideologies that are just too rigid to fully explain the world and too rational for human brains not to create a misguided cult around
Ok but social clustering is how humans work. Culture translated to modern idiomatic language is “practice of a cult”. Ure translates to “practice of”, Ur being the first city so say historians; clusters of shared culture is our lived experience. Forever now there have been a statistical few who get stuck in a while loop “while alive recite this honorific code, kill perceived threats to memorized honorific chants”.
We’ve observed ourselves do this for centuries. Are your descriptions all that insightful?
How do you solve isolation? Can you? Will thermodynamics allow it? Or are we just neglecting a different cohort?
Again due to memory or social systems are always brittle. Everyone chafes over social evolution of some kind, no matter how brave a face they project in platitudes, biology self selects. So long as the economy prefers low skilled rhetoricians holding assets, an inflexible workforce constrains our ability to flex. Why is there not an “office worker” culture issue? Plainly self selecting for IT to avoid holding the mirror up to itself.
Growing up in farmland before earning to STEM degrees, working on hardware and software, I totally get the outrage of people breaking their ass to grow food while some general studies grad manages Google accounts and plays PS5 all night. Extreme addiction to a lived experience is the American way from top to bottom.
Grammatically correct analysis of someone else. But this all gets very 1984 feeling; trust posts online, ignore lived experience. It’s not hard to see your post as an algebraic problem; the issues of meatspace impact everyone regardless of the syntax sugar analysis we pad the explanation with. How do you solve for the endless churn of physics?
> Culture translated to modern idiomatic language is “practice of a cult”. Ure translates to “practice of”, Ur being the first city so say historians
Excuse me but what in the name of ever-loving fuck did I just read.
> 3) Its leadership - or some of it, anyway - is extremely egotistical and borderline cult-like to begin with
I'm always surprised at how common this is in rationalist and EA organizations. The revelations about the cult-like behavior at MIRI / CFAR / Leverage are eye-opening: https://www.lesswrong.com/posts/MnFqyPLqbiKL8nSR7/my-experie...
The issues with sexual misconduct and drug-assisted assault in these communities even made the mainstream news: https://www.bloomberg.com/news/features/2023-03-07/effective...
It's equally fascinating to see how effectively these issues are rapidly retconned out of the rationalist discourse. Many of these leaders and organizations who get outed were respected and frequently discussed prior to the revelations, but afterward they're discussed as an inconsequential sideshow.
> TLDR: isolation, very strong in-group defenses, logical "doctrine" that is formally valid and leaks in hard-to-notice ways, apocalyptic utility-scale, and being a very appealing environment for the kind of person who goes super nuts -> pretty much perfect conditions for a cult.
I still think cults are a rare outcome. More often, I've seen people become "rationalist" because it gives them tools to amplify their pre-existing beliefs (#4 in your list). They link up with other like-minded people in similar rationalist communities which further strengthens their belief that they are not only correct, but they are systematically more correct than anyone who disagrees with them.
> They have, in the past, responded to criticism with statements to the effect of "anyone who would criticize us for any reason is a bad person who is lying to cause us harm". That kind of framing can't help but get culty.
I have never seen this, and I've been active around this for almost two decades now.
> isolation
Also very much doesn't match my experience. Only about a quarter of my friends are even rationalists.
I disagree. It's common for any criticisms of rationalism or the rationalist community to be dismissed as having ulterior motives. Even the definition of rationalism is set up in a way that it is de facto good, and therefore anyone suggesting anything negative is either wrong or doesn't know what they're talking about.
What you said should be happily accepted verbatim as a guest post on any rationalist blog because it is scientific and shows critical thinking.
Maybe so! They didn't kick me out. I chose to leave c. early 2021, because I didn't like what I saw (and events since then have, I feel, proven me very right to have been worried).
How is it scientific? What do you mean by scientific?! Do you mean logical?
This is a very insightful comment. As someone who was 3rd-degree connected to that world during my time in the bay, this matches the general vibe of conversations and people I ran into at house parties and hangouts very very well.
It's amazing how powerful isolation followed by acceptance is at modifying human behavior.
None of that seems very rational
>The problem with rationalists/EA
I see two: a superiority complex, and a lack of such an "irrational" thing as empathy. Basically they use crude logical-looking constructions to excuse their own narcissism and related indulgences.
>The problem with rationalists/EA as a group has never been the rationality, but the people practicing it and the cultural norms they endorse as a community
It's precisely that kind of person, though, who would be so deluded and so lacking in self-awareness as to start a group about rationality - and declare themselves its arbiters.
"the problem with rationalism is that you're a fucking idiot" never fails
I agree with all of this, but in points 1 and 2 you’ve clearly and succinctly described two issues that I’ve always struggled to express. Thank you!
Great summary
[flagged]
[flagged]
any reason you decided to use slurs here?
Eliezer is many things, but I don't see him behaving egotistically. He has always come across as very genuine.
> (the guy makes Neil deGrasse Tyson look like a monk)
So years after an investigation into the allegation of sexual misconduct ended empty handed, it's still ok to throw mud?
That seemed to be “monk” in the context of humility, not chastity
It's unsettlingly weird that you assume all mentions of NdT to be about the sexual misconduct allegations against him.
From the article:
A 2023 post on Rationalism forum LessWrong.com warned of coming violence in the Zizian community. “Over the past few years, Ziz has repeatedly called for the deaths of many different classes of people,” the anonymous post read. Jessica Taylor, a friend of Baukholt’s, told Open Vallejo she warned Baukholt about the Zizians, describing the group on X as a “death cult.”
The post: https://www.lesswrong.com/posts/T5RzkFcNpRdckGauu/link-a-com...
[flagged]
This story just keeps getting more and more bizarre. Reading the charges and supporting affidavits, the whole thing sounds more and more like some sort of Yorgos Lanthimos film. The rationalist connection - a literal sequel to the 2022 events (in turn a sequel to the 2019 CFAR stuff) - is already weird enough. But I can't get over the ridiculousness of the VT situation. I have spent time in that area of VT, and the charged parties must have been acting quite bizarrely for the clerk to alert the police. Checking into a motel wearing all black and open carrying does NOT cut it. The phones wrapped in foil are comical, and the fact that they were surveilled over several days is interesting, especially because it reads like the FBI only became aware of their presence after the stop and shootout?
The arresting agent seems pretty interesting, a former risk adjuster who recently successfully led the case against a large inter-state fraud scheme. This may just be the plot of Fargo season 10. Looking forward to the season arc of the FBI trying to understand the "rationalist" community. The episode titled "Roko's Basilisk", with no thematically tied elements, but they manage to turn Yudkowsky into a rat.
This story happened in my backyard. The shootout was about 40 minutes from me but Youngblut and Felix Bauckholt were reported by a hotel clerk dressed in tactical gear and sporting firearms in a hotel a few blocks from me.
Weird to see a community I followed show up so close to home and negatively like this. I always just read LW and appreciated some of the fundamentals that this group seems to have ignored. Stuff like rationality has to objectively make your life and the world better or it's a failed ideology.
Edit: I've been following this story for over a week because it was local news. Why is this showing up here on HN now?
> Weird to see a community I followed show up so close to home and negatively like this.
I had some coworkers who were really into LessWrong and rationality. I thought it was fun to read some of the selected writings they would share, but I always felt that online rationalist communities collected a lot of people with reactionary, fascist, misogynistic, and far-right tendencies. There’s a heavily sanitized version of rationality and EA that gets presented online with only the highlights, but there’s a lot more out there in the fringes that is really weird.
For example, many know about Roko’s Basilisk as a thought exercise and much has been written about it, but fewer know that Roko has been writing misogynistic rants on Twitter and claiming things like having women in the workforce is “very negative” for GDP.
The Slate Star Codex subreddit was a home for rationalists on Reddit, but they had so many problems with culture war topics that they banned discussion of them. The users forked off and created “The Motte” which is a bit of a cesspool dressed up with rationalist prose. Even the SlateStarCodex subreddit has become so toxic that I had to unsubscribe. Many of the posts and comments on women or dating were becoming indistinguishable from incel communities other than the rationalist prose style.
Even the real-world rationalist and EA communities aren’t immune, with several high profile sexual misconduct scandals making the news in recent years.
It’s a weird space. It felt like a fun internet philosophy community when my coworkers introduced it years ago, but the longer I’ve observed it the more I’ve realized it attracts and accepts a lot of people whose goals aren’t aligned with objectively “make the world better” as long as they can write their prose in the rationalist style. It’s been strange to observe.
Of course, at every turn people will argue that the bad actors are not true rationalists, but I’ve seen enough from these communities to know that they don’t really discriminate much until issues boil over into the news.
Sophistry is actually really really old:
>In the second half of the 5th century BCE, particularly in Athens, "sophist" came to denote a class of mostly itinerant intellectuals who taught courses in various subjects, speculated about the nature of language and culture, and employed rhetoric to achieve their purposes, generally to persuade or convince others. Nicholas Denyer observes that the Sophists "did ... have one important thing in common: whatever else they did or did not claim to know, they characteristically had a great understanding of what words would entertain or impress or persuade an audience."
The problem then, as now, is sorting the wheat from the chaff. Rationalist spaces like /r/SSC, The Motte, et al. are just modern sophistry labs that like to think they're filled with the next Socrates when they're actually filled with endless Thrasymachi. Scott Alexander and Eliezer Yudkowsky have something meaningful (and deradicalizing) to say. Their third-degree followers? Not so much.
The community/offshoot I am part of is mostly liberal/left. My impression is that LessWrong is also liberal/left.
I think Astral Codex Ten did a survey recently and the majority of respondents were politically left
It's somewhat odd to represent a community as being right wing when the worst thing to come from it was a trans vegan murder cult. Most "rationalists" vote Democrat, and if the franchise were limited to them, Harris would have won in a 50 state landslide.
The complaint here seems to be that rationalists don't take progressive pieties as axiomatic.
I have had some rather… negative vibes, for lack of a better term, from some of the American bits I've encountered online; but for what it's worth, I've not seen what you described in the German community.
There is, ironically, no escape from two facts that was well advertised at the start: (1) the easiest person for anyone to fool is themselves, and (2) politics is the mind-killer.
With no outwardly visible irony, there's a rationalist politics podcast called "the mind killer": https://podcasts.apple.com/de/podcast/the-mind-killer/id1507...
Saying this as someone who read HPMOR and AI to Zombies and used to listen to The Bayesian Conspiracy podcast:
This is feeling a bit of that scene in Monty Python Life of Brian where everyone was chanting in unison about thinking for themselves.
The whole internet mainstream zeitgeist with dating among men has become identical to incel talking points from 5 years ago.
Reading about the Roko's Basilisk saga, it seems clear that these people are quite far from rational and of extremely limited emotional development. It reads like observing a group of children who are afraid of the monster in the closet, which they definitely brought into existence by chanting a phrase in front of the bathroom mirror…
Members of these or other similar communities would do well to read anything on them dispassionately and critique anything they read. I'd also say that if they use Yudkowsky's writings as a basis for understanding the world, that understanding is going to have the same inadequacies as Yudkowsky and his writings. How many people without PhDs or even relevant formal education are putting out high quality writing on both philosophy and quantum mechanics (and whatever other subjects)?
For what it's worth, there's a thriving liberal rationalist-adjacent community on Twitter that despises people like Roko.
[flagged]
(To answer that last procedural question: there have been assorted submissions, but none spent much time on the front page. More at https://news.ycombinator.com/item?id=42901777)
Many people were curious here that the perpetrators were using Vim or Emacs.
Wait, OR?!
Clearly this is a poorly organized movement, with wildly different beliefs. There is no unity of purpose here. Emacs or vi, used without core beliefs being challenged?!
And one does not form a rationalist movement, and use emacs after all.
After seeing this news, I recall watching a video by Julia Galef about "what is rationality". Would it be fair to say that in this situation, they lack epistemic rationality but are high in instrumental rationality?
If they had high instrumental rationality, they would be effective at achieving their goals. That doesn’t seem to be the case - by conventional standards, they would even be considered "losers": jobless, homeless, imprisoned, or on the run.
Hard to say without hearing them speak for themself.
So far I have 0 idea of any motive.
Supposedly it should be rational, so I would at least like to hear it, before judging deeper.
What is LW?
Less Wrong
[dead]
Why does a hotel clerk wear tactical gear and guns?
That sentence was slightly awkward, the hotel clerk reported that those two people were in tactical gear with guns.
Relevant link (2023): https://www.lesswrong.com/posts/T5RzkFcNpRdckGauu/link-a-com...
The top comment has an interesting take: "Unless Ziz gets back in the news, there’s not much reason for someone in 2025 or later to be reading this."
This whole rabbit hole of rationalism, LessWrong, and Ziz feels like a fever dream to me. Roaming trans vegan tactical death squads shooting border officers and stabbing 80-year-olds with swords.
This is the kind of thing where it is warranted that the feds get every single wiretap, interception, and surveillance possible on everyone involved in the Zizian movement.
Calling them a roaming band or "tactical death squad" is giving far too much credit. It is a handful of crazy people who convinced themselves that a couple murders would solve their problems.
In particular the attack on border patrol was obviously random and illogical. And the fact that no one was convicted of the Pennsylvania murders seems to reflect more on the police and prosecutors than the intelligence of the perpetrators.
Speaking of random and illogical, what prompted the Border Patrol to stop their car in the first place, I wonder? None of the news stories have elaborated on that.
Conflating ziz and less wrong feels a bit like conflating Aiden Hale with the LGBTQ movement or the Branch Davidians with Christianity.
Or even just the Branch Davidians and the seventh day adventists, of whom the branch Davidians were an offshoot.
I’ve read a couple rationalist blogs for over a decade and this past week is the first I’ve ever heard of these “Zizians”
Split the beliefs from the crime. A bunch of murderers were caught. Given that they are dangerous killers - one killing a witness and one faking their death - yeah, they should get warrants.
> Split the beliefs from the crime.
Pretty hard to do that when the beliefs explicitly endorse murder. Ziz used to run a blog on which she made thinly veiled death threats, argued for a personal philosophy of hair-trigger escalation and massive retribution, raged at the rationalist community for not agreeing with her on that philosophy and on theories of transness, and considered most people on Earth to be irredeemably evil for eating meat.
It appears the Venn diagram of the beliefs and crimes overlaps quite a bit. Sometimes the beliefs are that certain crimes should be committed.
This is a free country (disputably) and you should be able to think and say whatever you want, but I also think it is reasonable for law enforcement in the investigation of said crimes to also investigate links to other members in the movement.
Can you really split the beliefs of the nazi movement in germany in 1940 from the crimes the believers committed?
Yes, thank you for saying so- reading about all this, but especially all the people chiming in who already knew about a lot of it? The fact that the founder of LessWrong coined the term “alignment,” a subject I’ve read about many times… it feels like learning lizard people always walked among us
Honestly it feels like this is the first time people are realizing that six degrees of separation means that crazy people can usually be connected to influential people. In this case they're just realizing it with the rationalists.
At least it's clear that they aren't receiving proper sword handling training. Good grief.
[flagged]
Your about tells me you could make this comment way more specifically and with evidence.
Got a source for that claim?
Rest assured, I'm pretty sure among the easiest ways to make yourself the target of surveillance is to do anything interesting at all involving technology. All serious AI researchers, for example, should assume that they are the victims of this.
>This whole rabbit hole of rationalism, LessWrong, and Ziz feels like a fever dream to me. Roaming trans vegan tactical death squads shooting border officers and stabbing 80-year-olds with swords.
I don't exactly see how it's different from a group of habitual alcoholics discussing politics and having a fatal disagreement, which is a normal day of the week in any police department with enough demographics to have this sort of low-effort, low-gain crime. It's more scandalous because the details and the people involved are more interesting, but everyone will forget about it after a week, as they don't matter.
>I don't exactly see how it's different from a group of habitual alcoholics discussing politics and having a fatal disagreement
Intent, premeditation, possibly being designated a terrorist group depending on other factors. Big differences.
These people?
https://nypost.com/2025/01/30/us-news/killing-of-border-patr...
Is the appellation in the headline, "radical vegan trans cult," a true description?
> Authorities now say the guns used by Youngblut and Bauckholt are owned by a person of interest in other murders — and connected to a mysterious cult of transgender “geniuses” who follow a trans leader named Jack LaSota, also known by the alias “Ziz.”
Is all this murder stuff broadly correct?
The NY Post tried to frame them as "radical leftist", but that's a big stretch. I don't think most rationalists would consider themselves leftist. The article also seems to be leaning into the current "trans panic" - pretty typical for the NYP.
I also dislike Right/Left categorizations. Most people don't even know the history of the terms and their roots in the French Revolution. Though the "Cult of Reason" established then certainly had the Left categorization at the time.
But is the trans element not a major part of this cult? It seemed to be from the linked story in the top link. But if there is something incorrect there, or false in the NYP reporting, you should point it out. If it is a major element of this cult, then far from complaining about NYP, I would complain about any news organization leaving it out of its reporting.
> I don't think most rationalists would consider themselves leftist
Yes they do.
https://docs.google.com/forms/d/e/1FAIpQLSf5FqX6XBJlfOShMd3U...
Who is making a statement about "most rationalists" here? The claim is about a trans vegan murder cult, which doesn't appear to be a natural member of the right side of the political spectrum.
Many rationalists do consider themselves leftist. Many others do not. It's a big tent and anyone can wander in.
Left libertarian would be more likely, I think?
> Is the appellation in the headline, "radical vegan trans cult," a true description?
For this small group, yes. Their leader believes in Nuremberg-style trials for people who eat meat. If you want to go down the rabbit hole, it gets much weirder: https://zizians.info/
The cult does seem to target people who identify as trans - OP has some discussion of this. Not sure if that justifies calling it a "radical vegan trans cult" though. Trans folks seem to be overrepresented in rationalist communities generally, at least on the West Coast - but there may be all sorts of valid reasons for that.
None of the murder victims I'm aware of were transgender?
[dead]
[flagged]
[flagged]
I’m from Burlington and a couple weeks ago downtown I noticed a group of 2 or 3 people walking past me in full black clothing with ski masks (the kind you rob banks with).
I thought it was strange, having never seen that before except on Halloween, but didn’t think to alert any authorities specifically because Burlington is filled with people dressing differently and doing strange things. But 99% of the time it’s totally non violent and benign.
I’m guessing this was them. Scary!
I can't speak to Burlington but in philly balaclavas (which is what those masks are called) are quite common and have been since 2020. I suspect this is true of many cities. It's been the subject of some controversy involving mask bans. In fact seeing someone in all black with a ski mask on is a pretty typical, if intimidating, fashion.
Some bookmarks from early 2023 that seem relevant now:
https://zizians.info
https://old.reddit.com/r/SneerClub/
https://aiascendant.substack.com/p/extropias-children-chapte...
I do not engage with any of those people nor their weird communities.
Sneer Club is one of the most nasty, uncharitable, and bad-faith subs out there these days. They generally hate HN, as well. I think any community which exists solely to make cheap shots at another community is poison at its core; SC's parasocial relationship with LW is a perfect example.
N-gate was among the only things that made this website worth reading. They solely existed to make "cheap shots" at HN. If that's "poison", then I don't want an antidote!
Just wanted to say fantastic substack writing. Thanks for linking as I go down this rabbit hole
Yeah! I devoured the entire series of posts in one go back then, I had no idea about all the people and their ties. Plus it was a super engaging read, I could imagine being there.
I especially enjoy how the author from the substack series described the singularity as “The Rapture of the Nerds”.
I really loved the language describing the singularity as "an inescapable runaway feedback loop which leads to the ascension of an enemy god". Beautiful.
Holy shit, there are 7 chapters to that last one. Chapter 1 is fucking mind-blowing. I could never figure out why they were obsessed with Roko's basilisk but it makes total sense now considering how it all started.
This is such an epic unbelievable story. We live in this world?
Chapter 6 covers how SlateStarCodex comes into the picture, by the way. I always wondered that too.
Most of the news coverage I've seen of this story is omitting what some might consider a relevant detail: almost all the members of this group are trans.
This is a divisive topic, but failing to mention this makes me worry a story is pushing a particular agenda rather than trying to tell the facts. Here's what the story looks like if the trans activism is considered central to the story:
https://thepostmillennial.com/andy-ngo-reports-trans-terror-...
While Ngo's version is definitely biased, and while I don't know enough about the story to endorse or refute his view, I think it's important to realize that this part of the story is being suppressed in most of the coverage elsewhere.
it's been an exhausting couple of weeks for me, as a trans person. one executive order after another, explicitly attacking us. scrambling to update all my documents, navigating a Kafkaesque bureaucracy with constantly shifting rules.
now this.
there are like six Zizians. there are millions of trans people. I'm sure that many of the Zizians being trans says something about the Ziz cult, but Ziz doesn't say anything about "trans activism."
any evil one trans person does, is used to stain all trans people. recognize this tendency; don't let this become like blood libel.
I’m not a big George W Bush fan but this quote of his has stuck with me for years:
> Too often, we judge other groups by their worst examples while judging ourselves by our best intentions
> any evil one trans person does, is used to stain all trans people. recognize this tendency; don't let this become like blood libel.
As a Christian, I can empathize. The wrongs and hypocrisies of so many are heaped on those who have no relation to the actions.
Sorry this is happening to you.
What really shatters my faith in common sense is the fact that trans people are a minority of a minority and are made out to be a problem when they're not.
Just remember that you have allies in this industry. Lots of love.
I know I sound crazy saying what I'm about to say but it is the truth as I understand it and I think it's important.
It appears to me that there is a certain modality of thought that occurs much more often in people with hEDS, specifically those with TNXB SNPs. If you're super deep into type theory the odds substantially increase that you have hEDS - it's how I found out that I had it. And this same group is far more likely to be trans than the general population. A link that would be far more obvious if hEDS wasn't so underdiagnosed.
Additionally, it appears to me that mental disorders are often caused by auto-immune conditions, which are extremely common in those with hEDS. So with a strong selection bias on math ability and trans, you're gonna end up with a lot of hEDS people who are strongly predisposed to mental disorders. I know someone with hEDS who obsessively studies the link between hEDS and serial killers - not something I want to be associated with, but the stats were pretty convincing. I do think it is possible that two TNXB SNPs are sufficient to explain why I think the way I do, why I'm far more like Ted Kaczynski than I would like to be. Of note: Ted Kaczynski did consider gender reassignment back in 1966.
Which is to say two things. I think what people are observing is a real phenomenon, and it is not purely from personal biases, though I'm not denying personal biases play a part in perception. And perhaps with that in mind the solution is in fact in diagnosing and treating the underlying auto-immune conditions. And to put a hat on a hat on my 'crazy', I think people are going to find that GLP-1 agonists like Ozempic, specifically at the lower doses, are quite helpful in managing auto-immune conditions, among other things.
Precisely.
Most US cults are made up almost entirely of cis people, but nobody jumps to conclusions about the impact of cisness on indoctrination susceptibility.
Absolutely. I think the rationalists feel this way too.
Keep fighting, Sterlind. Most people aren't full of hate. Just the assholes who take the time to comment mean things on social platforms.
It’s not “like blood libel.” It is blood libel. It is literally the same thing.
Yes, this. Please. I am so very tired, every day I wake up to the news that more legal protections are being stripped from me and the people I care about. I didn't need "trans terror" flashed in my face in large boldface type on top of everything else tonight. The GP didn't make this clear, but The Post Millenial is apparently a far-right publication and the author of that article seems to have built his brand on painting large groups of people as violent.
Thank you for being you. Stay safe; the consequences of this election are ugly.
I am so so sorry you have to deal with this. As an Australian I have been watching on in horror this week at the way trans persons are being demonized and oppressed in your country. I know HN is meant to be an apolitical space, but I hope that the mods here have the sense to allow a certain amount of pushback against this fascist nonsense.
It’s not about blood libel or attribution of evil. But even trans people acknowledge that transness is comorbid with other personality disorders.
[flagged]
[flagged]
> almost all the members of this group are trans.
The Zizians.info site (linked by one of the HN posts re: this story) mentions that the Zizians did target people who identified as transgender for indoctrination, so this is not really surprising. People who are undergoing this kind of stress and marginalization may well be more vulnerable to such tactics.
The Ziz method of indoctrination involves convincing his minions they are trapped inside a mental framework they need to break free of. Trans people already feel trapped in a body not aligned with who they are, and are naturally susceptible to this message (and therefore natural targets for recruitment).
Yup. 100% a cult indoctrination technique.
The vulnerability is the crowbar the cult uses
[flagged]
I think the relevance of their transness is not very significant.
The lesswrong apocalypse cult has been feeding people's mental illness for years. Their transness likely made them more outsiders to the cult proper, so e.g. they didn't get diverted off into becoming Big Yud's BDSM "math pets" like other women in the cult.
I doubt they are significantly more mentally ill than other members of the cult; they just had less support to channel their vulnerability into forms more beneficial to the cult leaders.
Yudkowsky wrote an editorial in Time advocating for the use of nuclear weapons against civilians to prevent his imagined AI doomsday... and people are surprised that some of his followers didn't get the memo that think-pieces are only for navel gazing. If not for the fact that the goal of cult leaders is generally to freeze their victims into inaction and compliance, we probably would have seen more widespread murder as a result of the Yud cult's violent rhetoric.
>I doubt they are significantly more mentally ill than other members.
Why would this particular group defy the US trend of being 4-7x more likely to be afflicted by depressive disorder? We are talking about a demographic with a 46% rate of suicidal ideation, and you doubt that's significant why?
I shudder to ask, but what exactly is a math pet?
… lesswrong apocalypse cult?
Like the guy who wrote that insufferable Harry Potter fanfiction?
Marginalized groups seem to be a target / susceptible to this kind of thing.
I had a weird encounter on reddit with some users who expressed that "only X people understand how this character in the movie feels". Interestingly, there was no indication that the movie intended this interpretation. But the idea wasn't unusual or all that out there so I didn't think much of it. But that group showed up again and again, and eventually someone asked; their theory all but seemed to imply that nobody else could possibly have ... feelings, and that this lack of understanding made those people lesser and them greater.
It seemed to come from some concept that their experience imparted some unique understanding that nobody else could have, and that just led down a path that led to zero empathy / understanding with anyone outside.
Reddit encounters are always hard to understand IMO so I don't want to read too much into it, but that isolation that some people / groups feel seem to potentially lead to dark places very easily / quickly.
This group formed in the SF Bay Area, which is known for being one of the most accepting places in the world for LGBT people. If marginalization were the main cause, it seems to me that the group would have been located somewhere else. I think it's more likely that these people had an underlying mental disorder that made them likely to engage in both violent behavior and trans identity.
One big difference the Zizians have with the LessWrong community is that LW people believe that human minds cannot be rational enough to be absolute utilitarians, and therefore a certain kind of deontology is needed.[1] In contrast, the Zizians are absolutely convinced of the correctness of their views, which leads them to justify atrocities. In that way it seems similar to the psychology of jihadists.
1. https://www.lesswrong.com/posts/K9ZaZXDnL3SEmYZqB/ends-don-t...
>I had a weird encounter on reddit with some users who expressed that "only X people understand how this character in the movie feels". Interestingly, there was no indication that the movie intended this interpenetration.
The death of the author is a reasonable approach to reading a work. But what you said reminded me of the more delusional view in which a) the watcher/reader's approach is the only "correct" one, and b) anyone who disagrees is *EVIL*. An instance of this happened among Tumblrinas obsessed with the supposed homosexual relationship between Holmes and Watson on BBC's Sherlock, and who were certain that the next episode of the show would reveal this to the world. Welp. <https://np.reddit.com/r/the_meltdown/comments/5oc59t/tumblr_...>
I see it mainly as a reaction to a dysfunctional and abusive system/culture, and not necessarily a constructive one.
Fix the world and these problems don't exist.
There is a well-documented correlation between gender dysphoria, mental health conditions, and autism spectrum disorder. These overlapping factors may contribute to increased vulnerability to manipulative groups, such as cults.
Thanks, the pronouns were confusing me and making it hard for me to follow the complex story. I assumed I had made a mistake when the article mentioned a Jack, referred to them as Jack the whole way through, but used she at the end.
Unfortunately, the gendered language we use is also a mechanism for providing clues and context as you read a story. So if I can't rely on that, they need to call it out to help the reader.
I'd rather the article mention it.
Why are they not? Is this a chilling effect?
It goes unmentioned because there is an unwritten rule in progressive media that marginalized groups must never be perceived as doing wrong, because that will deepen their marginalization.
In practice it creates a moral blind spot where the worst extremists get a pass, in the name of protecting the rest. Non-progressive media are all too happy to fill in the gap. Cue resentment, cue backlash, cue Trump. Way to go, guys!
The fact that many are transgender seems to be relevant because it’s a recruiting and manipulation tactic, not because of a connection to “trans activism.” I haven’t seen any evidence of that connection besides people involved being transgender.
Why scare quotes? There are political organizations representing trans people (and doing quite a bit of activism).
I don't think it's so much pushing an agenda, as it is avoiding a thermonuclear hot potato of modern life. If you start talking about gender identity, everyone has STRONG opinions they feel they must share. Worse, a subset of those opinions will be fairly extreme, and you're potentially exposing yourself to harassment or worse. If you sound like you're attacking trans people, that's going to end badly. If you sound like you're supporting them, especially as this new US administration takes off... that's going to end badly.
So if you can tell the story without the possibly superfluous detail of the genders of the people involved, that's a pretty obvious angle to omit. Andy Ngo is obviously not doing this, but that's really only because he has a very clear agenda and in fact his entire interest in this story probably stems from that.
Yes, that's a reasonable possibility as well. It's not proof of an agenda, and might be prudent, but I do think it's a form of bias. There's a thin line between skipping "possibly superfluous" details and skipping core parts of a story that might provide evidence for viewpoints one disagrees with. The result is still that readers need to realize that they are being presented with a consciously edited narrative and not an unbiased set of facts.
No, that is omitting quite a significant detail. If the majority of the people involved apparently have characteristic X, which occurs in only a tiny percentage of the overall population, there is some correlation or something newsworthy there.
> If you sound like you're attacking trans people, that's going to end badly. If you sound like you're supporting them, especially as this new US administration takes off... that's going to end badly.
That’s not true: 99% of news outlets have absolutely no fear of supporting trans activism.
It’s trivial to find hundreds of such cases from sfgate with a google search.
The local paper did a pretty fair job as far as I can tell. https://sfist.com/2025/01/29/suspect-and-possible-cult-membe...
Ngo certainly notes the trans aspect with some charged words. Calls them “leftist trans militants with alleged ties to a trans terror cell.”
I suspect the answer is closer to "one trans person preyed upon the trust and vulnerability of people in their circle" and that happened to include multiple other trans people (who were likely extremely vulnerable to the charismatic charms of a cult leader).
They’re also all software developers, right?
I'm sorry, but Andy Ngo is beyond "biased" - he is deliberately derogatory towards the trans community whenever he has the opportunity.
If the gender identities of the Zizians aren't being brought up in the mainstream press, it's likely because it isn't relevant to the story. Responsible reporting includes not bringing up details that could encourage moral panic or hate crimes if they aren't demonstrably relevant to the story. This kind of "well actually" response is really no different than the racist complaints people make in the comments of every crime story that failed to mention how black the accused was, when race wasn't a factor, and nobody ever cares when the accused is white.
Most cults are filled with straight, cisgender, white people. If every story about cult violence brought this up, its connection to the story would be rightfully mocked as contrived.
Citing the guy who tries to dig up dirt on every trans person he can isn't exactly a revelation. It's exactly what I would expect Ngo to do, and only because it validates his neo-fascist peers' anti-trans views.
The fact that a death/murder cult might be deliberately targeting vulnerable trans folks for recruitment and indoctrination can certainly be relevant. I agree that merely talking about "a vegan trans" murder cult, as some media outlets did, would be something rather different however.
I don't know, seems like if we're trying to do that kind of targeting, LessWrong is the better place to start. 100% of these people are LessWrong people, right?
You think LessWrong is a better place to start probably because you've heard a lot about trans young people, whereas you probably haven't heard much about and don't know much about LessWrong, but I am confident that if you were to get to know us, you'd find that we are mostly good people and we have a healthy community.
> Bauckholt was a biological male who identified as trans and used feminine pronouns. He was an award-winning youth math genius who later graduated from the University of Waterloo in Ontario, Canada ..... Around 2021, he was hired as a quantitative trader at Tower Research Capital in New York.
Honestly, being trans is the least interesting thing about this dude (girl?). This is not some random angry person with uncontrolled emotions.
I'm probably the furthest thing from an active supporter of trans (whatever you take it to mean, I'm old-fashioned about gender). But how does it matter to this story at all? You could take any group of people and find a crazed group of killers among them.... And people tend to stick to people like them, so again, how is it relevant?
> omitting what some might consider a relevant detail
This sounds weaselly enough to detract from the rest of the comment, even though you later say why it might be relevant.
“Ziz seems to go out of her way to target transgender people.” https://zizians.info/
Are people confusing transgender and transhumanism somehow? Looking up rationalist philosophy it seems to be about 'improving' the human species, human potential movement related, people becoming cyborgs and living forever as AI uploads, etc.? Vaguely eugenic in outlook, if more individualist. I suppose such a philosophy views gender as an irrelevant issue, so recruiting transgender people would be something they do?
A good rule of thumb is that people who view philosophy as something other than an amusing pastime are best avoided, especially when they're spending their time trying to recruit others into their cult.
Almost all are trans and all are rationalists. OK.
I know something else that is over-represented in killings. Soldiers. And soldiers are mostly male. So male is the natural killing machine, right?
But male humans are mostly selected to be soldiers by design. In some countries the only possible gender for soldiers.
So maybe it could be that there is some other agenda at play here? Maybe it is not related to trans people but to grooming a target group into becoming cult members? Why is it that we always have to think there is a /Big Conspiracy/ somewhere? Don't spread around FUD that you have no clue about with phrases like "omitting something I consider relevant" without making damn sure it really is relevant. You just feed the trolls if you keep doing this.
The Post Millennial is a right-wing rag from an anticommunist sex cult and any assertion by Ngo should be discarded out of hand.
[flagged]
[flagged]
[flagged]
[dead]
[flagged]
[flagged]
Andy Ngo is not a credible source of news about trans people. Media Matters describes him as a “right wing troll” who spread misinformation about this issue [1] and The Advocate points out that right wing media have repeatedly ignored the facts around supposed trans shooters and continued to spread misinformation on the subject. [2]
[1] https://www.mediamatters.org/diversity-discrimination/apalac...
[2] https://www.advocate.com/news/apalachee-school-shooter-trans...
Media matters is just as biased as Andy Ngo. They are the definition of a hit piece mill, and will find any reason possible to criticize popular figures with right wing beliefs.
IMO, the media frenzy on the subject was part of a corporate plot to promote certain beliefs in order to silence contrarian ideas which could trigger a conversation around corporate negligence topics such as the increase of endocrine disruptors in our environment and their effects on our health.
The plot worked to some extent. Hence, I cannot fully express myself here in clear language. We can see that health has become a central topic of American politics but we're still dancing around some of the more important issues, because implying certain connections is taboo.
The first step to fixing a problem is acknowledging it. If not fixed, it will get worse until it becomes impossible to ignore.
The main group is a sex cult and nobody wants to have sex with the trans people so they got kicked out or something.
WARNING: The Post Millennial is an extremist website.
I can’t believe that getting “news” about an extremist group from another extremist organization is a productive way to make sense of the world.
Honestly, read whatever you want but just be aware that radical extremist exist and commit horrific crimes and other radical extremist will exploit that.
It is radical extremism that’s dangerous in and of itself—not just a particular brand of radical extremism.
Carry on.
*WARNING: The Post Millennial is an extremist website.*
Ok, but is there anything false stated?
In what way is this supposed to be a relevant detail? Unless you think that they are killing people because they are trans, why should you report that they are part of a marginalized group? If they were mostly blondes or had freckles, should that be part of the story too?
It seems as if the group targeted trans members in their recruitment - and then used evidence of general marginalization to justify their crimes.
If you look up old reddit threads about the murder of the landlord, you can see many people defending the crime as the landlord was transphobic. It's not just a random detail like freckles, it seems like the identity shaped the way this group interacted with the world.
It's basically impossible that it's a random irrelevant detail, I'd say any such detail is fair to share.
For example, if every member of this group was Indian American I'd consider that a fair detail to note, the chances of that happening at random are minuscule, yet that's orders of magnitude more probable than all of them being trans for no reason.
They're over-represented by ~1000x compared to background population; that's relevant.
Or what if they were rationalists? Would they include that in the headline?
[flagged]
Doesn't sound rationalist to me (from Ziz quoted section of article linked below):
"Ziz
Impostors keep thinking it's safe to impersonate single goods. A nice place to slide in psyche/shadow, false faces, "who could ever falsify that I'm blaming it on my headmate!"
Saying you're single good is saying, "Help, I have a Yeerk in my head that's a mirror image of me. I need you to surgically destroy it, even if I'm then crippled for life or might die in the process. Then kill me if I ever do one evil act for the rest of my life. That's better than being a slave. Save me even though it is so easy to impersonate me. And you will aggro so many impostors you'll then be in a fight to the death(s) with. Might as well then kill me too if I don't pass an unthinkable gom jabbar. That'll make us both safer from them and I care zero about pain relative to freedom from my Yeerk at any cost."
It's an outsized consequentialist priority, even in a doomed timeline, to make it unsafe to impersonate single goods.
Critical to the destiny of the world. The most vulnerable souls impostors vex. To bring justice to individual people, from collective punishment."
https://openvallejo.org/2025/01/31/zizian-namesake-who-faked.... More detail.
I put this post into Suno
https://suno.com/song/13bdb0ef-a219-4a2c-b1bf-646698593e56
needs more male falsetto
This Ziz person is really unhinged. I read some of their writing, it reminds me of every eloquent, manipulative narcissist I've met. They are never as smart as they think they are - or as smart as they want you to think they are - though they may be smart, charming, and engaging. They've created an alternate universe in their mind and haphazardly abuse whatever ideas they've encountered to justify it.
what the heck does any of this mean??
Sadly, I completely understand after reading all the links in this thread tonight.
The specific theory they're speaking in: https://zizians.info/
Backstory (7 chapters): https://aiascendant.substack.com/p/extropias-children-chapte...
They write and talk in their group lingo so outsiders can't understand it without diving deep into their lore, mindset and community. It's a common thing. Seen it numerous times. Don't waste your time.
I'm sure it makes sense to the most indoctrinated of the cult members.
One of the many rabbit holes, the deeper layers of.
Head of LessWrong and generally active rationality community leader here. Happy to answer any questions people have. These people haven't been around the community for a long time, but I am happy to answer questions with my best guesses on why they are doing what they are doing.
They've been banned on LW and practically all in-person things for like 5+ years now. My guess is the reason why they hung around the rationality community this much years ago is just that it's a community with much higher openness to people and ideas than normal, especially in the Bay Area. IMO in this instance that was quite bad and they should have been kicked out earlier than they did end up getting kicked out (which was like 4 years ago or so).
I don't understand how the police encountered a bail-skipping presumed dead person at a crime scene and just let them go.
Cops are fundamentally lazy.
Unfortunately it seems that a cop's laziness is inversely correlated with the melanin in the suspect's skin.
This summary doc, "The Zizian Facts", is another collection of relevant information from various sources (including recent events):
https://docs.google.com/document/u/0/d/1RpAvd5TO5eMhJrdr2kz4...
The HN title added the word "rationalist", which isn't in the source article. This is editorializing in a way that feels kind of slander-y. Their relationship to the bay area rationalist community is that we kicked them out long before any of this started.
It does appear in the article.
> The group is a radical offshoot of the Rationalism movement, focusing on matters such as veganism and artificial intelligence destroying humanity.
You yourself seem to acknowledge this as a fact.
Can you tell us more about how they were kicked out? Are there other groups that have been kicked out?
I mean, they seemed kind of visibly crazy, often saying threatening things to others, talking about doing crazy experiments with their sleep, often insinuating violence. They were pretty solidly banned from the community after their crazy "CFAR Alumni Reunion" protest stunt, and before then were already very far into the fringes.
In addition to the tragedy of the killings, I worry that this will give rationalism _as a concept_ a bad name. Deep thought is so important to progress, and already is somewhat stigmatized. My church underwent a nasty split when I was a kid, and the reason I heard was “pastor Bobby read too many books”. Obviously it wasn’t as simple as that, but the message was clear — don’t read too many books. I suspect this will be interpreted similarly.
It feels that our world has a resurgence of anti-intellectual, anti-science, anti-data, etc movements. I hate that.
> ...our world has a resurgence of anti-intellectual...
That's the US, not the world. There has always been an anti-intellectual strain in American culture.
I believe Stalin's purges included intellectuals as a group.
Mao's hundred flowers thing was a purge of people who dared think they knew better than him.
Or for current popular things, the US is far from alone in seeing the electorate reject the standard academic position on national borders.
The fact that you can make a cult of the Rationalist movement is truly a testament to the fact that some humans are able to turn anything into a cult
If anything, the reading I have done (mostly SSC and LW) has made me less radical and much, much more humble
Rationalism has a number of characteristics typical of cults.
1. Apocalyptic world view.
2. Charismatic and/or exploitative leader.
3. Insularity.
4. Esoteric jargon.
5. Lack of transparency or accountability (often about finances or governance).
6. Communal living arrangements.
7. Sexual mores outside social norms, especially around the leader.
8. Schismatic offshoots.
9. Outsized appeal and/or outreach to the socially vulnerable.
Sadly rationalism is a movement where it's easy for someone who doesn't understand to wrap their desires in "rationality" and carry them out without any moral guilt.
Rationalism? The term has been used a lot of times since Pythagoras [0], but the combination of Bay Area, Oxford, existential risks, and AI safety makes it sound like this particular movement could have formed in the same mold as Effective Altruism and Long-Termism (i.e., the "it's objectively better for humanity if you give us money to buy a castle in France than whatever you'd do with it" crowd that SBF sprang from). Can somebody in the know weigh in?
[0] https://en.wikipedia.org/wiki/Rationalism#History
You're correct. Those communities heavily overlap.
Take, for example, 80,000 Hours, among the more prominent EA organizations. Their top donors (https://80000hours.org/about/donors/) include:
- SBF and Alameda Research (you probably knew this),
- the Berkeley Existential Risk Initiative, founded (https://www.existence.org/team) by the same guy who founded CFAR (the Center for Applied Rationality, a major rationalist organization)
- the "EA infrastructure fund", whose own team page (https://funds.effectivealtruism.org/team) contains the "project lead for LessWrong.com, where he tries to build infrastructure for making intellectual progress on global catastrophic risks"
- the "long-term future fund", largely AI x-risk focused
and so on.
It also bothers me how the word “rationalist” suddenly means the LW crowd, while I keep thinking of Leibniz
Their sneers are longer than their memories
Check your notifications
Doom scroll
Refresh
Refre
Ref
Rationalism is simply an error. The thing being referred to here is "LessWrong-style rationality", which is fundamentally in the empiricist, not rationalist, school. People calling it rationalism are simply confused because the words sound similar.
(Of course, the actual thing is more closely "Zizian style cultish insanity", which honestly has very very little to do with LessWrong style rationality either.)
They're virtually identical. Seven-chapter history: https://aiascendant.substack.com/p/extropias-children-chapte...
Just like HN grew around the writing of Paul Graham, the "rationalist community" grew around the writings of Eliezer Yudkowsky. Similar to how Paul Graham no longer participates on HN, Eliezer rarely participates on http://lesswrong.com anymore, and the benevolent dictator for life of lesswrong.com is someone other than Eliezer.
Eliezer's career has always been centered around AI. At first Eliezer was wholly optimistic about AI progress. In fact, in the 1990s, I would say that Eliezer was the loudest voice advocating for the development of AI technology that would greatly exceed human cognitive capabilities. "Intentionally causing a technological singularity," was the way he phrased it in the 1990s IIRC. (Later "singularity" would be replaced by "intelligence explosion".)
From 2001 to 2004 he started to believe that AI has a strong tendency to become very dangerous once it starts exceeding the human level of cognitive capabilities. Still, he hoped that before AI starts exceeding human capabilities, he and his organization could develop a methodology to keep it safe. As part of that effort, he coined the term "alignment". The meaning of the term has broadened drastically: when Eliezer coined it, he meant the creation of an AI that stays aligned with human values and human preferences even as its capabilities greatly exceed human capabilities. In contrast, these days, when you see the phrase "aligned AI", it is usually being applied to an AI system that is not a threat to people only because it's not cognitively capable enough to dis-empower human civilization.
By the end of 2015, Eliezer had lost most of the hope he initially had for the alignment project in part because of conversations he had with Elon Musk and Sam Altman at an AGI conference in Puerto Rico followed by Elon and Sam's actions later that year, which actions included the founding of OpenAI. Eliezer still considers the alignment problem solvable in principle if a sufficiently-smart and sufficiently-careful team attacks it, but considers it extremely unlikely any team will manage a solution before the AI labs cause human extinction.
In April 2022 he went public with his despair and announced that his organization (MIRI) would cease technical work on the alignment project and would focus on lobbying the governments of the world to ban AI (or at least the deep-learning paradigm, which he considers too hard to align) before it is too late.
The rationalist movement began in November 2006 when Eliezer began posting daily about human rationality on overcomingbias.com. (The community moved to lesswrong.com in 2009, at which time overcomingbias.com became the personal blog of Robin Hanson.) The rationalist movement was always seen by Eliezer as secondary to the AI-alignment enterprise. Specifically, Eliezer hoped that by explaining to people how to become more rational, he could increase the number of people who are capable of realizing that AI research was a potent threat to human survival.
To help advance this secondary project, the Center for Applied Rationality (CFAR) was founded as a non-profit in 2012. Eliezer is neither an employee nor a member of the board of this CFAR. He is employed by and on the board of the non-profit Machine Intelligence Research Institute (MIRI) which was founded in 2000 as the Singularity Institute for Artificial Intelligence.
I stress that believing that AI research is dangerous has never been a requirement for posting on lesswrong.com or for participating in workshops run by CFAR.
Effective altruism (EA) has separate roots, but the two communities have become close over the years, and EA organizations have donated millions to MIRI.
What puzzles me about Eliezer Yudkowsky is this:
He has no formal education. He hasn't produced anything in the actual AI field, ever, except his very general thoughts (first that it would come, then about alignment and doomsday scenarios).
He isn't an AI researcher except he created an institution that says he is one, kind of as if I created a club and declared myself president of that club.
He has no credentials (that aren't made up), isn't acknowledged by real AI researchers or scientists, and shows no accomplishments in the field.
His actual verifiable accomplishments seem to be having written fan fiction about Harry Potter that was well received online, and also some (dodgy) explanations of Bayes, a topic that he is bizarrely obsessed with. Apparently learning Bayes in a statistics class, where normal people learn it, isn't enough -- he had to make something mystical out of it.
Why does anyone care what EY has to say? He's just an internet celebrity for nerds.
A great example of superficially smart people creating echo chambers which then turn sour, but they can't escape. There's a very good reason that, "Buying your own press" is a cliched pejorative, and this is an extreme end of that. More generally it's just a depressing example of how rationalism in the LW sense has become a sort of cult-of-cults, with the same old existential dread packaged in a new "rational" form. No god here, just really unstable people.
That castle was found to be more cost-effective than any other space the group could have purchased, for the simple reason that almost nobody wants castles anymore. It was chosen because it was the best calculation; the optics of it were not considered.
It would be less disingenuous if you were to say EA is the "it's objectively better for humanity if you give us money to buy a conference space in France than whatever you'd do with it" crowd -- the fact that it was a castle shouldn't be relevant.
Nobody wants castles anymore because they’re impractical and difficult to maintain. It’s not some sort of taboo or psychological block, it’s entirely practical.
Actually, the fact that people think castles are cool suggests that the going price for them is higher than their concrete utility would make it, since demand would be boosted by people who want a castle because it’s cool.
Did these guys have some special use case where it made sense, or did they think they were the only ones smart enough to see that it’s actually worth buying?
> That castle was found to be more cost-effective than any other space the group could have purchased
In other words, they investigated themselves and cleared themselves of any wrongdoing.
It was obvious at the time that they didn't need a 20 million dollar castle for a meeting space, let alone any other meeting space that large.
They also put the castle up for sale 2 years later to "use the proceeds from the sale to support high-impact charities" which was what they were supposed to be doing all along.
The depressing part is that the "optics" of buying a castle are pretty good if you care about attracting interest from elite "respectable" donors, who might just look down on you if you give off the impression of being a bunch of socially inept geeks who are just obsessed with doing the most good they can for the world at large.
Both are factual, the longer statement has more nuance, which is unsurprising. If the emphasis on the castle and SBF - out of all the things and people you could highlight about EA - concisely gives away that I have a negative opinion of it then that was intended. I view SBF as an unsurprising, if extreme, consequence of that kind of thinking. I have a harder time making any sense of the OP story in this context, that's why I was seeking clarification here.
The irony of pure rationalists buying a castle, unable to see what every other market participant can.
Why buy a conference space? Most pubs will give you a separate room if you promise to spend some money at the bar. There were probably free spaces available had they researched.
If I am donating money and you are buying a conference space on day 1 I'd want it to be filled with experienced ex-UN field type of people and Nobel peace prize winners.
Otherwise it looks like a grift.
I'd love to see the logic they used to determine the castle was the best option.
Optics are an important part of being effective
[flagged]
Wow. Didn't they learn about ecosystems at school? And who says they are suffering?
On the fringes of the rationalist community, there are obviously questionable figures who may be playing an evil game (Bankman Fried) or have lost their way intellectually and morally.
My impression from occasional visits to astral codex ten, however, is that the vast majority of people are reasonable and sometimes succeed in making the world a better place.
> My impression from occasional visits to astral codex ten, however, is that the vast majority of people are reasonable and sometimes succeed in making the world a better place.
This is a blog that for the last several months had a vigorous ongoing debate about whether or not shoplifters should be subject to capital punishment.
'Reasonable' is not the word I would choose.
Scott Alexander writes very well, perhaps he is blinding me.
https://www.astralcodexten.com/p/in-continued-defense-of-eff...
> My impression from occasional visits to astral codex ten, however, is that the vast majority of people are reasonable and sometimes succeed in making the world a better place.
The rationalist bloggers are very good at optics and carefully distance themselves from the fringes at the surface. They have a somewhat circular definition of rationalism that defines rationalists as being reasonable, which makes it easy to create post facto rationalizations that anyone who ends up on the wrong side of public opinion was actually not part of their tribe, rewriting any history.
The more uncomfortable topics are generally masked in coded language or carefully split off into isolated subforums for plausible deniability. Slate Star Codex (Astral Codex Ten’s precursor) had a “culture war thread” for years that was well known to contain a lot of very toxic positions dressed up in rationalist-style language. Around 2019 they realized how much of a problem it was and split it into a separate forum (“The Motte”) for distance and plausible deniability. The Motte was a wild place where you could find people pushing things like holocaust denial or stolen-election theories, but wrapped up in rationalist language (I remember phrases like “questions about orthodox holocaust narratives” instead of outright denial)
There’s also a long history of Slate Star Codex engaging with neoreactionary content over and over again and flirting with neoreactionary ideas in a classic rationalist “what if they’re actually right” context for plausible deniability. There have been some leaked emails from Scott revealing his engagement with the topic and it’s been an ongoing point of confusion for followers for years (See https://www.reddit.com/r/slatestarcodex/comments/9xm2p8/why_... )
The history of rationalist blogs and communities is largely lost on people who only occasionally visit the blogs and enjoy the writing style. There is a long history of some very unsavory topics not only being discussed, but given the benefit of the doubt or even upvotes. These are harder to associate with the main blogs since the 2019 split of contentious topics into “The Motte” side forum, but anyone around the community long enough remembers the ever-present weirdness of things like this Reddit thread on /r/SlateStarCodex, where a post preaching white nationalism got nearly 50 upvotes (in 2014): https://web.archive.org/web/20180912215243/https://www.reddi...
Reading a couple SSC posts for the first time here myself, so my impression is fairly limited, but it sounds like you might be blaming SSC unfairly for simply intellectually engaging with reactionary ideas, which I can't fault someone for, and nor should you.
Can you link to some specific examples which more explicitly have the "What if they're right?" subtext you're referring to?
Slate Star Codex's engagement with neoreactionary thought is not exactly a secret. He wrote both "Reactionary philosophy in an enormous, planet-sized nutshell" https://slatestarcodex.com/2013/03/03/reactionary-philosophy... (a sympathetic treatment of these ideas) and "The Anti-Reactionary FAQ" https://slatestarcodex.com/2013/10/20/the-anti-reactionary-f...
There's also plenty of good reasons to be aware of these political ideas, given that, e.g., New Confucianism, which just happens to be quite influential in China, is essentially a kind of "Neo-Reaction with Chinese Characteristics". And some people argue that the highly controversial Project 2025 - which seems to be driving policy in the new Trump administration - may be inspired by neo-reactionary ideas.
> self-described “vegan Sith” ideology
Had it not been in a serious article, I would have believed it had to be a parody or a joke of some sort.
“We are just like Darth Maul, but we like salads and drink soy milk instead of regular milk… and then kill people while dressed in tactical black outfits”?
What is even going on? Real life now sounds like some kind of broken LLM hallucination.
"Vegan Sith"?
For just one thing, when you have a broken education system and omnipresent media franchises, you have a significant percentage of the population who know more about the Star Wars backstory structure and theories of diet than about history, civics or conventional morality.
These kids were very well educated and went to some high-end schools, according to the article
I am not defending any of these fuckwits, but I don't know that it's much different than any organized religion. All of them are stories that get retold over and over until people accept them on faith. I can envision a world where our stories (movies, books), once the history of their creation is lost, become facts. "Of course there were Jedi, we've just forgotten…"
Now, they're all fuckwits, but it's not outside the realm of thought.
(BRB, gonna go start a sci-fi story.)
A far more likely possibility is that their ideology is actually centered around "Sith who happen to originate from Vega (a.k.a. α Lyrae)", not "Sith who abstain from animal products".
(A residual possibility is "Sith from Las Vegas, Nevada".)
Look up Joshua Citarella’s coverage of the ideological milieus that Generation Z cultivated during COVID on platforms like Discord.
And then check out the term “metairony” or “postirony”, and this story will make more sense… at least as much sense as the absurdity of it all will allow you to make of it.
This post came to mind for me
https://moonmetropolis.substack.com/p/antioch-the-first-food...
We seem to live in a post-ironic moment. Look no further than the Boogaloo Boys. They want to start a second American civil war and are named after the 1984 movie "Breakin' 2: Electric Boogaloo".
It would be a joke except they've engaged in violent attacks up to and including murder in the service of trying to start that aforementioned civil war. Are they serious or a joke? I think their embrace of a ridiculous name makes them almost more frightening because it shows their contempt for reasonableness, for lack of a better term.
It's comparable to how the Nazi "goose step" march was terrifying precisely because it was so awkward and unnatural. It's like, if these guys are capable of doing this with a straight face, what else are they capable of?
https://en.wikipedia.org/wiki/Boogaloo_movement
People can and do have mental illness. It doesn't absolve them from knowing right from wrong, however.
We really need a phrase for "appears to behave normally in regular matters and is capable of independent living, does not match other DSM symptoms, but believes insane things". It's a very common component in mass shootings.
> it doesn't absolve them from knowing right from wrong however
Certain mental conditions absolutely do, in a legal sense. Some conditions onset from brain injury and tumors, some to drug exposure, some are genetic, etc. For those unfortunate few who are afflicted, the most humane outcome is to intervene to keep them away from harming themselves and others, though often they are left unattended to by overstretched mental health services.
Since it’s a group, I wonder if a group mental illness would make sense. I guess it could work in court: “look how out of touch with reality my client is, sith vegans, clearly they didn’t know what they were doing!”
(This was originally posted in https://news.ycombinator.com/item?id=42849281, but we merged that thread hither)
People love to fall into these sort of traps where they convince themselves they are fighting for some sort of cause. The more hopeless the better.
What is going on: the internet is letting you know all this shit. Imagine you had a ticker feed of every story amongst the 8bn. That would be a crazy story per millisecond.
[flagged]
define insane here.
I think modern social acceptability hinges more on the degree to which personal beliefs infringe on others, at least in the west.
If someone refuses to personally eat pasta because they believe in the great spaghetti monster, they can knock themselves out for all I care.
I don't know if these statistics are even kept, but the current social environment in the US feels like a ripe breeding ground for cults. I've had so many people in the past couple years be like "I just want to farm with my friends and family and get away from all this".
It has become a bit of a meme lately. I think there's something to be said about a malaise era leading to an uptick in erratic behavior.
But at the risk of sounding smug and condescending, as someone who actually bought 10 acres "to get away from it all", I get the sense that the type of people in this saga would pack up pretty quick after a little taste. Lesswrongers aren't exactly known for pragmatism, which is sorta the only mindset that works. There's all this work that you don't know you don't know about. I just fell down the rabbit hole of the ziz lore and goddamn do these people sound inept. Like they couldn't even fix their RV to get off their landlord's property. Lots of quasi-intellectual masturbatory posting and not a lot of skills.
All that is to say I'm not super worried about any of these cults really taking off. Logistics remain challenging.
How does a 27-year-old fail to kill an 80-year-old with a samurai sword?
Probably by trying to do it from first principles.
Living in a container, playing with a Samurai sword… Neal we need you to go on TV and tell these people to quit fooling around.
They failed because they didn't fold the glorious Nippon steel themselves.
I'm sure a historian could give a more detailed explanation, but my guess is that:
Once steel meets flesh the Role Play ends and all that's left is Live Action, and to take a life is not a meager task when life as you knew it was a virtual mediation.
They brought a sword to a gun fight. Presumably they would have killed the landlord on the first go if he hadn't managed to shoot and kill one of the assailants first.
But it seems they went back later and finished the job. (He survived the original attack, only to be subsequently successfully murdered)
I've worked professionally on cases that have parallels.
It suffices to say people sometimes have second thoughts in the moment.
More like life is messy and difficult compared to pretending.
Most ‘samurai swords’ on the market are barely-functional wall-hangers. And the article describes it as a stabbing. Curved swords are actually quite hard to effectively stab with, so it’s hardly surprising that an untrained user using a poor-quality weapon incorrectly couldn’t cause a lethal wound.
"'He had a samurai sword stuck to his back with about a foot of it sticking out in front, his face cut up all over,' said his friend Patrick McMillan."
https://www.ktvu.com/news/two-held-in-death-of-fellow-squatt...
Perhaps they weren't trying?
He knew his Judo well.
[dead]
[flagged]
Every time I hear about anything related to Effective Altruism, it's because people who believe this stuff have done crimes sufficient to make it in to the news.
Are there other contexts they show up where they are not doing crimes, or really non-normative things?
Effective Altruists donate to charities which save lives, etc. They help people choose effective charities using an evidence-based approach, e.g.: https://en.wikipedia.org/wiki/GiveWell
But that doesn't get into the news. Saving people is boring. Killing people? That gets the clicks.
Suppose you have 1000 people in the movement: 998 well-adjusted people, 1 crook who uses the movement as a shield, and 1 insane freak.
Of course, you'll hear about the crook and the freak. Well-adjusted people are not newsworthy.
LessWrong always felt culty and weird to me, tbh. It's why I stick to HN for my social discourse. No central personality drives this place. It's the opposite of culty. It can be a bit of an echo chamber, maybe, but that's a different problem.
I’m surprised the killing of the customs officer has gotten so little press. I guess if you want your killing spree in the press, don’t do it in Vermont.
The salacious nature of a group of trans cult members killing people across the country is a story made for tabloid insanity. The hooks into Silicon Valley and side-quests about AI overlords etc. are frosting on the cake.
That murder was the top story on CNN, AP and NPR for about half a day IIRC. It got pushed off because so much happened in other areas nationally. It took about two days for the prosecutors to announce the connections.
Every time lesswrong or rationalists or Yudkowsky or any derivative show up here, I try to read some of the materials around, and frankly, I don't understand most of it; it's all so self-referential, full of impenetrable jargon, and intentionally obtuse. It sounds mentally deranged and cultish. And it turns out that it can one-shot you into a death cult.
Me too, until now.
https://news.ycombinator.com/item?id=42904925
For related background read the (horrifying) description of Leverage, a “research” institution with links to Zizians: https://medium.com/@zoecurzi/my-experience-with-leverage-res...
There’s an undercurrent of cults and cult-like institutions in the rationalist crowd (think: lesswrong.com folks) and this is one instance of this.
I think it’s worth discussing the fact that many folks in EA- and rationalist-adjacent circles are deeply panicked about developments in AI, because they believe these systems pose a threat to human existence. (Not all of them, obviously. But a large and extraordinarily well-funded contingent.) The fear I have with these organizations is that eventually panic - in the name of saving humans - leads to violent action. And while this is just a couple of isolated examples, it does illustrate the danger that violent action can become normalized in insular organizations.
I do hope people “on the inside” are aware of this and are working to make sure those organizations don’t have bad ideas that hurt real people.
Scared to click the link!
> focusing on matters such as veganism and artificial intelligence destroying humanity
Not sure if they are proponents of veganism or if they think it will, in addition to AI, destroy humanity.
This title here is editorialized. The original title on the website is
"String of recent killings linked to Bay Area 'death cult'"
No mention of "rationalist".
But it’s an accurate and useful summary. This group is specifically organized around the principles of veganism and Rationalism.
Coming soon from Scott Alexander: Rationalist Death Cult House Party
Well, it should be pretty easy for the prosecutor; it sounds like, by their own definition, they meet the criteria for RICO. Should be able to wrap up every Zizian and at least shake them down for a plea.
After puzzling over the threads on this head-spinning story, my sense is that maybe the most neutral thing to do is re-up this submission of a news report and merge most of the comments hither.
The other recent threads are/were:
The Zizians (2020) - https://news.ycombinator.com/item?id=42898323
The Zizians and the rationalist death cult - https://news.ycombinator.com/item?id=42897871
Suspects in 2 murder cases shared connection to 'Zizian' rationalist group - https://news.ycombinator.com/item?id=42849281
Edit: I guess we can have the current thread be about the recent events and keep https://news.ycombinator.com/item?id=42897871 for discussing / arguing about the generic aspects. Sorry for the complexity but in this case it appears to be essential not accidental.
I’ve seen so many posts about this on HN I can’t help but wonder if it’s a recruiting drive…
The submitter of this story inserted 'rationalist' into the title; it's not in the headline and IMO it's calumny. Would you take it back out?
Transgender technology cult that kills people somehow doesn't seem "rationalist" to me but who am I to judge.
Most religious wars don't seem that Christlike either.
What people preach and what they do often diverge.
With religion I think we largely just go with "this is what the group identifies themselves with / as" and leave it as that.
Arguably the politically active "religious right" in the US has long since abandoned most everything Jesus had to offer. We still identify them as such.
If you think of the suffix "ist" as a form of negation then it makes sense.
The Zizians were only ever a tiny fraction of lesswrong.com, which gets about 2 million page views per month (according to a site maintainer) -- not as much as HN surely, but not a small web site.
Lesswrong.com has been in operation since 2009 and is closely related to the SL4 mailing list which dates back to 1998 or 1999.
And yet so many cults and politically extreme movements seem to originate out of the LessWrong-o-sphere. Strange, that.
True Anon just did a great episode on this.
https://www.patreon.com/posts/121111568?utm_campaign=postsha...
https://soundcloud.com/trueanonpod/zizian-murder-cult-1 The latest episode of TrueAnon is about these rationalist weirdos.
not very informative, throws a lot of mud
Pretty sure Zizians haven't been considered rationalists for at least five years.
The more I read about these Zizians, the more I'm reminded of Final Fantasy House, whose members had been led to believe, or at least go along with, the idea that they were spiritual manifestations of Final Fantasy VII characters, and were manipulated and exploited by the house's dysfunctional leader, "Jenova":
http://www.demon-sushi.com/warning/
But with deadly consequences in the Zizians' case.
Related thread from a few hours earlier:
https://news.ycombinator.com/item?id=42897871
Is there a grand figure of rationalist philosophy? Someone like a rationalist “Sartre” or “Heidegger”?
I think a lot of the weirdness of rational communities comes from them not properly integrating the existing canon of western philosophy and rediscovering pitfalls.
Without the benefit of centuries of discussions it’s easy to come to strange conclusions.
Yes, https://aiascendant.substack.com/p/extropias-children-chapte...
I agree with your diagnosis as well. It's Dunning-Kruger. They keep re-inventing the wheel poorly due to their ignorance, which is maintained because of their confidence, which stems from their ignorance.
The NYPost has an article about them: https://nypost.com/2025/01/30/us-news/killing-of-border-patr...
Every time I hear about these "rationalist" guys they always seem completely bonkers. Not rational at all.
Well, believing that you can be strictly rational is going to lead you down that dark path.
That’s because the “rational” part is a personal shroud for their delusion.
Ziz (the cult leader) had a blog at sinceriously.fyi, where she elaborated on her philosophy. The blog has since devolved into a string of death threats and subsequently been deleted, but anyone who's interested in the backstory can still find some posts on archive.ph (e.g. [1]) and more on web.archive.org.
From Ziz and other cult members' writings, it's obvious that the cult has been in conflict with the rationalist community, rather than part of it, for years. It's disappointing but not surprising that SFGATE decided to pin the murders on the rationalist community while contorting their writing to avoid mentioning the elephant in the room: the cult leader is a trans vegan, almost all of the members are also trans vegans, and these are core parts of their ideology. They were discussed over and over on Ziz's blog.
In fact, after the katana stabbing and the self-defense shooting by the wounded landlord, another cult member used her blog to blame Zack for the events. Zack wasn't involved, physically present, or in communication with the cult at the time, but he does believe in a different theory of transness than they do [2], so they can't stop being enraged at him. That theory was actually one of the main reasons the cult has split off from the rationalist community. I'm worried that Zack may be in danger.
[1] https://archive.ph/jChxP
[2] http://unremediatedgender.space/about/
"Trans vegan 'Siths' who kill people with katanas" is not a sentence I thought I'd ever read, let alone type, yet here we are, in the most ludicrous timeline.
Any violence that is adjacent to rationalism will see a media response like this and a healthy dose of astroturfing to support it. The CIA is very eager to make rationalists persona non grata because they provide a cultural nucleation point for anti-AI sentiment. They want to suppress these ideas so that our victory in the AI arms race is secure — unhindered by pesky human rights nonsense. They killed Suchir Balaji, too.
Lots of parallels here with Luigi Mangione who also referenced a lot of LessWrong ideas
Couldn't be more different.
This cult leader hatched a scheme to house people on a fleet of boats… this is a great analogy for the rest of the story. She could have just moved.
Yeah, they're TPOT-adjacent. While the original rationalist group was pretty good about watching their biases, the later ones are more accurately rationalizers. Some TPOT people have warned them they're making mistakes, to no avail.
It's just the MOPs, geeks, sociopaths and Eternal September thing again. They all use the jargon and reach absurd conclusions and don't have the wisdom to realize that an absurd conclusion with a sound argument means a false assumption.
Just because you say "my priors" like priests say "my prayers" doesn't make you rational. They are in the midwit area where you can reason.
The fool dismisses the absurd conclusion because he cannot reason. The wise man because he knows what it means. The average rationalist decides it's true.
When will people learn to identify and stay away from narcissists?
Almost everybody does. But probably there will always be those who don't. Some people's lives are a sad story, and manipulators can learn who is most vulnerable.
By "string" here, we mean one.
Not to imply the Zizians aren't embracing some dangerous ideas, but this is the strangest use of "string of killings" I've ever seen. Author could have gone with "wave of violence," but even then they'd be talking about a "wave" of three incidents in three years.
Four killings: the landlord in Vallejo, Michelle Zajko's parents Richard and Rita in Pennsylvania, and now the border patrol agent in Vermont.
Who is Ziz? Is she still alive and active and free? Does anyone know her real identity?
https://news.ycombinator.com/item?id=42902017 covers your questions
Psyops and synthetic narratives out of nowhere always look like this https://trends.google.com/trends/explore?geo=US&q=zizian&hl=...
This whole story is so bizarre. I honestly can barely wrap my head around it.
It also vaguely reminds me of a Monk episode.
Who are these people?
Some sort of EA on steroids?
Eugenics?
Genuinely curious: what are their typical beliefs?
This story is as mad as a sub-plot in a Philip K. Dick novel. If a lot of weird stuff happens, how would this affect observers' estimation of the probability that we live in a simulation? I wonder if that was one of the intents.
How the hell haven't I heard of this? Bizarre!
[dead]
I'm so annoyed at how popular rationalism is. Human rationality (along with symbolic creativity) is only useful for generating testable hypotheses, not for direct knowledge of reality. Once someone grants the axioms of rationality, they become delusional and spiral (half-life of around 6 years).
What the fuck is this history?
The more I read, the more I'm confused.
It's like four people from a weird community doing violent shit, but online.
What I find interesting about these cases and the seemingly unrelated United Healthcare CEO assassination is that all were committed by extremely online data scientists who went to elite universities.
I’ve been fairly skeptical of the right wing narrative that these schools have been “radicalized” (seeing as universities and young people have always been the hotbeds of leftist thought, how soon we forget the hippie movement)…but this definitely has me wondering.
Also, why is data science a common thread?
When your entire philosophy is a jupyter notebook filled with bugs from beginning to end but you don't know it because you've never heard of a unit test things go wrong.
Probably because they are focusing on "artificial intelligence destroying humanity", as the article says? Many data scientists work on AI nowadays.
Because the size is so small, you can connect many dots.
>Also, why is data science a common thread?
People on the spectrum are susceptible to mind-viruses and enjoy grand ideas. At least two of them, it seems.
A good chunk of the current administration is Ivy League trained.
Feels like we're going back to a period of political decentralization. History suggests that large empires tend to collapse after the rule of law loses its meaning and power. I've been predicting the rise of political gangs for a long time.
It already feels like everyone is in a tribe... Now the level of violence is being dialed up and the overarching structures are losing their ability to enforce the law. Look at how many criminals have been pardoned, nobody cares anymore. Nobody even agrees about what a criminal is.
How can we reduce crime if we cannot even agree on what it means?
So these are related to the "Effective Altruism" far-right movement?
If so, I'm not surprised they've started killing.
The most 2025 headline I have seen yet.
Is this thread an exemplary performance of bleeding edge AI models or just a bunch of delusional e/acc zealots on too much Prozac?
Uhhh… what? Is this an Aum Shinrikyo type of thing?
"Rationalism"
[dead]
[dead]
[dead]
[dead]
[dead]
Yikes, the results of attention-seeking mentally ill people being normalized and not getting the attention they feel they deserve for snipping themselves. Maybe we shouldn't normalize mental illnesses.
[flagged]
Nah
[flagged]
What does this comment even mean?
I assume food was meant to be full, but why "trans"?
[flagged]
[flagged]
Could you please not use HN for ideological battle? It's not what this site is for, and destroys what it is for, regardless of which ideology you're for or against.
If you wouldn't mind reviewing https://news.ycombinator.com/newsguidelines.html and taking the intended spirit of the site more to heart, we'd be grateful.
How about presenting any kind of argument against what seems to be obviously true to anyone who has been paying attention?
Nah, just downvote it out of perception and get back to Netflix.
[flagged]
[flagged]
[flagged]
What nonsense; the gall to call this rationalist.
Pardon the cult-like musings ahead of time, but it’s par for the course.
My initial thought was that rationalism is obviously egoic and self-obsessed, loving and trusting one's own thoughts. Set theory should tell you that you can't make a mental image of yourself to act by that will be more encompassing than the totality of what you are. You can't build a mental model inside of your ego that will work better than your natural instinct for all interaction with reality. Trust sheer emotion to say that rationalising any loss of life means an upside-down philosophy, a castle in the sky. This cult, with its "functional decision theory" ["…that the normative principle for action is to treat one's decision as the output of a fixed mathematical function"], makes actions a sort of cold choice without emotion. Like people using religion in war to remain cool when killing, a misuse of a neutral idea such as a mathematical function.
But it can't be that easy to handwave it away. When Aum Shinrikyo was mentioned down below, I changed my mind; there are no easy answers. A sick leader can justify anything, and you can judge any tree from its fruits. From the doctrine section on that Wikipedia page: "Their teachings claimed a nuclear apocalypse was predicted to occur soon" (parallel to AI now in rationalism), and "Furthermore, Lifton believes, Asahara 'interpreted the Tibetan Buddhist concept of phowa in order to claim that by killing someone contrary to the group's aims, they were preventing them from accumulating bad karma and thus saving them'" (parallel to rational behavior guidance gone wrong; these data scientists just lost touch. Norm Macdonald would say they're real jerks, pardon the humor).
I just the other day listened to Eckhart Tolle's podcast, where he talked about doomsday fear; at the bottom of the transcript it says: [“There's also an unconscious desire in many human beings for what we could call the collapse of the human made world.
Because most humans experience the world as a burden. They have to live in this world, but the world is problematic. The world has a heaviness to it.
You have your job, you have the taxes, you have money, and the world is complex and heavy. And there's an unconscious longing for people in what we might bluntly call the end of the world. But ultimately, what they are longing for is, yes, it is a kind of liberation, but the ultimate liberation that they are really longing for is the liberation from themselves, the liberation from what they experience as their problematic, heavy sense of self that's inseparable from the so-called outer world.
And so there's a longing in every human for that. But that's not going to happen yet.”] Eckhart Tolle: Essential Teachings: Essential Teachings Special: Challenging Times Can Awaken Us, 30 Jan. 2025
An obvious parallel to AI doomsaying can be drawn.
When we were children we experienced unfiltered reality, without formulas to replace our decisions. But we could even then be wrong, stupid, or convinced to do stupid shit by a charismatic playground bully. But when we were wrong, it resulted in falling and scraping our knee or whatever. There are no reality checks in internet culture bubbles.
This is sick people huddling together under a sick, charismatic, warlord-ish leader whose castle in the sky is so self-coherent that it makes others want to systemize it, aided by brainwashing methods. [”Zizians believe normal ideas about morality are mostly obfuscated nonsense. They think real morality is simple and has already been put to paper by Jeremy Bentham with his utilitarian ideas. Bentham famously predicted an expanding scope of moral concern. He says if humanity is honest with itself it will eventually adopt uncompromising veganism. Zizians think arguments which don't acknowledge this are not about morality at all, they're about local struggles for power delaying the removal of an unjust status quo.”] Insert Watchmen pic of grandiose narc Adrian Veidt asking Dr Manhattan if utilitarian mass-killing was the right choice.
And then the sleep-deprivation indoctrination method dulls their rationality even further, so they can all become ego clones of the cult leader.
And that other link in this thread mentioned other groups of rationalists debugging demons sent by adversary groups and other psychotic stuff. Yeah, is it the chicken or the egg: do those people gather in a place where people loop with their minds, or is it the mind-looping that sends them into a downward spiral? Maybe we should calculate the Bayesian.
Holy crap, it reads like the anti Electric Kool-Aid Acid Test, with petty revenge, guns, and murder du jour, instead of, well, an epic road party that is still going... Bobby just played the Grammys... there were glitches with the sound system and they raised 15 mill. I like some of what Aella has written, but had no idea that Rationalists had just rebranded nihilistic hate... so cleverly.
Well, they haven't. It's one group of fewer than 10 people, considerably removed from "the rationalists" at this point.
We are not alone in our own minds. There is something exogenous driving us crazy from within.