What is this place?
This website is a place for people who want to move past shady thinking and test their ideas in a
court of people who don't all share the same biases. Our goal is to
optimize for light, not heat; this is a group effort, and all commentators are asked to do their part.
The weekly Culture War threads host the most
controversial topics and are the most visible aspect of The Motte. However, many other topics are
appropriate here. We encourage people to post anything related to science, politics, or philosophy;
if in doubt, post!
Check out The Vault for an archive of old quality posts.
You are encouraged to crosspost these elsewhere.
Why are you called The Motte?
A motte is a stone keep on a raised earthwork common in early medieval fortifications. More pertinently,
it's an element in a rhetorical move called a "Motte-and-Bailey",
originally identified by
philosopher Nicholas Shackel. It describes the tendency in discourse for people to move from a controversial
but high-value claim to a defensible but less exciting one upon any resistance to the former. He likens
this to the medieval fortification, where the desirable land (the bailey) is abandoned when in danger for
the more easily defended motte. In Shackel's words, "The Motte represents the defensible but undesired
propositions to which one retreats when hard pressed."
On The Motte, always attempt to remain inside your defensible territory, even if you are not being pressed.
New post guidelines
If you're posting something that isn't related to the culture war, we encourage you to post a thread for it.
A submission statement is highly appreciated, but isn't necessary for text posts or links to largely-text posts
such as blogs or news articles; if we're unsure of the value of your post, we might remove it until you add a
submission statement. A submission statement is required for non-text sources (videos, podcasts, images).
Culture war posts go in the culture war thread; all links must either include a submission statement or
significant commentary. Bare links without those will be removed.
If in doubt, please post it!
Rules
- Courtesy
- Content
- Engagement
- When disagreeing with someone, state your objections explicitly.
- Proactively provide evidence in proportion to how partisan and inflammatory your claim might be.
- Accept temporary bans as a time-out, and don't attempt to rejoin the conversation until it's lifted.
- Don't attempt to build consensus or enforce ideological conformity.
- Write like everyone is reading and you want them to be included in the discussion.
- The Wildcard Rule
- The Metarule
EA is a social movement. Arguments in themselves are completely inert.
Not really. If we have some seemingly plausible argument that nonetheless constantly leads people to heinous actions, then a good heuristic is to reject that argument even if we can't conclusively prove it is always false. Humans are flawed, and personally evaluating the consequences and strength of every argument is hard work. I am not sure a sufficiently clever demagogue couldn't trick me into following something bad if I treated each argument as inert. However, if I put up some guardrails in dangerous places - like when adherents of one concept often turn out to be psychopathic conmen, or adherents of another frequently end up committing genocide when they rise to power - I may be restricting myself a bit, but at least I cut off a huge chunk of the possibility space in which I could be convinced to follow something very bad.
Sure, this Bankman guy fucked up, but I think you'd be hard-pressed to find any ideology without its share of bad actors. It can be fair to dismiss an ideology at some point based on this heuristic, after enough consistent failures with few counterbalancing successes (communism may come to mind as an example of this category). But does EA fit that condition?
Does it consistently lead people to bad actions? More frequently than other systems?
That remains to be established. I hope the answer is "no", because we have enough nice-looking ideological packages leading people into bad places; we don't need more. But my point is not that it's bad; my point is that it's not "inert". We need to watch out and evaluate whether it's bad or not. We can't just rely on the idea that "ideas are inert", because very often they aren't.
Semantics.
I'm persuaded by the ethical arguments. If, in practice, the ethical arguments are not honored, then the "social movement of EA" is uninteresting to me.
“The social movement of EA” is the topic of the post that you’re commenting under, so…
As I said, that (i.e. semantics) is at issue.