Rationality Freiburg

Cults & Cultishness

Start: Friday, February 2, 2024 6:00 pm
End: Friday, February 2, 2024 8:30 pm
Location: Haus des Engagements, Rehlingstraße 9 (inner courtyard), 79100 Freiburg (Map, Coordinates: 47.98934, 7.83945)
Host(s): Omar
Type of event: discussion, exercise
This event on meetup.com
This event on lesswrong.com
Feedback and statistics


There are three possible levels of preparation. Depending on your (self-reported) level of preparation you will join one or another discussion group.

I encourage you to read the texts carefully, optimizing for quality rather than quantity, and to take some notes to use during the discussion.

The preparation time is estimated from the word count of the articles, assuming a reading speed of 270 words per minute plus some extra time for reflection and note-taking.

Preparation: Basic

Anyone who comes unprepared to the meetup will read the following articles during the meetup. Nonetheless, I encourage you to read the texts beforehand and re-read them on Friday. I will provide printed copies and links.

Estimated preparation time: 20 - 30 minutes.

Preparation: Intermediate

Note that this list includes all texts from the “Basic” section above.

Estimated preparation time: 1 - 1.5 hours.

Preparation: Advanced

Note that this list includes all texts from the “Basic” and “Intermediate” sections above.

Estimated preparation time: 2 - 3 hours.

What will we do?

Sometimes people ask, half-jokingly, whether rationality is a cult. The answer (at least for Rationality Freiburg) is of course: no. But… isn’t that exactly what a cult member would say? Yes, of course! And isn’t it also the case that a cult member might recognize other groups as cults but would rarely see that they themselves belong to one? Yes! So where does that leave us? Well, it’s a valid concern, and in fact, as our supreme leader Eliezer Yudkowsky wrote in our holy book The Sequences (yes, that was irony):

[…] every Cause wants to be a cult. It’s a high-entropy state into which the system trends, an attractor in human psychology. It may have nothing to do with whether the Cause is truly Noble. You might think that a Good Cause would rub off its goodness on every aspect of the people associated with it—that the Cause’s followers would also be less susceptible to status games, ingroup-outgroup bias, affective spirals, leader-gods. But believing one true idea won’t switch off the halo effect. A noble cause won’t make its adherents something other than human.


And as Kaj Sotala wrote:

I’m reminded of something I recall Eliezer Yudkowsky once saying: “if you tell your doting followers not to form a cult, they will go around saying ‘We Must Not Form A Cult, Great Leader Mundo Said So’.”


So what we will do today is discuss cults and cultishness.

  • Do human groups have a natural tendency towards cultishness? If so, why is that?
  • What can be done to avoid it? How would we recognize if it happens?
  • Does cultishness include characteristics that are valuable? Is it possible to take those characteristics and avoid the problems?

The modality of the meetup will be: we will divide into several five-person groups depending on the level of preparation. Those in the “Basic” group, which includes everyone who came unprepared, will read their articles during the meetup. At the end we will form one big discussion circle to share some insights and do a practical exercise called “Dissent Collusion”, which can be summarized as: take turns being confronted with a group that’s presenting a united front, and grow comfortable agreeing or disagreeing with them.


Worried that you have nothing to contribute? No worries! Everyone is welcome!

There is always a mix of German and English speakers, and we configure the discussion rounds so that everyone feels comfortable participating. The primary language is English.

This meetup will be hosted by Omar.

There will be snacks and drinks.

We will go and get dinner after the meetup. Anyone who has time is welcome to join.


Learn more about us.

Group meditating

Image generated with DALL·E.