This weekly roundup thread is intended for all culture war posts. 'Culture war' is vaguely defined, but it basically means controversial issues that fall along set tribal lines. Arguments over culture war issues generate a lot of heat and little light, and few deeply entrenched people ever change their minds. This thread is for voicing opinions and analyzing the state of the discussion while trying to optimize for light over heat.
Optimistically, we think that engaging with people you disagree with is worth your time, and so is being nice! Pessimistically, there are many dynamics that can lead discussions on Culture War topics to become unproductive. There's a human tendency to divide along tribal lines, praising your ingroup and vilifying your outgroup - and if you think you find it easy to criticize your ingroup, then it may be that your outgroup is not who you think it is. Extremists with opposing positions can feed off each other, highlighting each other's worst points to justify their own angry rhetoric, which becomes in turn a new example of bad behavior for the other side to highlight.
We would like to avoid these negative dynamics. Accordingly, we ask that you do not use this thread for waging the Culture War. Examples of waging the Culture War:
- Shaming.
- Attempting to 'build consensus' or enforce ideological conformity.
- Making sweeping generalizations to vilify a group you dislike.
- Recruiting for a cause.
- Posting links that could be summarized as 'Boo outgroup!' Basically, if your content is 'Can you believe what Those People did this week?' then you should either refrain from posting, or do some very patient work to contextualize and/or steel-man the relevant viewpoint.
In general, you should argue to understand, not to win. This thread is not territory to be claimed by one group or another; indeed, the aim is to have many different viewpoints represented here. Thus, we also ask that you follow some guidelines:
- Speak plainly. Avoid sarcasm and mockery. When disagreeing with someone, state your objections explicitly.
- Be as precise and charitable as you can. Don't paraphrase unflatteringly.
- Don't imply that someone said something they did not say, even if you think it follows from what they said.
- Write like everyone is reading and you want them to be included in the discussion.
On an ad hoc basis, the mods will try to compile a list of the best posts/comments from the previous week, posted in Quality Contribution threads and archived at /r/TheThread. You may nominate a comment for this list by clicking on 'report' at the bottom of the post and typing 'Actually a quality contribution' as the report reason.
Uh.. Yes? I mean, is this a question worth asking really?
I mean, I wasn't born believing in HBD, I was won over. I used to think that terraforming Mars was a great idea, and now I think it's a rather suboptimal choice when it comes to establishing robust space colonies. I used to think that AI would be more inhuman, and very much didn't expect them to speak like us before they thought as well as they do.
I'm sure there are plenty of cases where I've been wrong and thus changed my mind. I don't think I've had any drastic collapses of my cognitive framework that forced me to re-evaluate everything.
This gentleman is autistic. I'm sure we have more than our fair share, but that's a condition that predisposes one to take things at face value without considering how much of it is virtue-signalling or social fiction.
I think you're engaging in the hobby of making up people to be mad at. There are worse hobbies, I'm sure. I'm quite certain that there isn't anyone here who will claim identification with this, unless someone spins up an alt. If there are, I offer my psychiatric services, first interview free.
At any rate, I find singling out Rats and rat-adjacents like the Motte's users as examples of bad epistemics or miscalibration is somewhere between laughable and preaching to the choir. Name a group more obsessed with evaluating the rigor of their beliefs about the world. If someone listened to Yud or Scott and came away with the belief that they themselves were therefore unimpeachable, then they can read an IKEA manual and assemble a mouse-trap that takes their finger off.
If you think we're bad, have you seen the rest of the internet?
I'm skeptical of this as an explanation in this instance, if only because, if he was predisposed to believe in such nonsensical ideas (whether due to autism or anything else), I don't see how he'd ever have gotten to the state of being taken seriously as a rationalist in the first place. After all, the topic of men vs women in sports won't be the only contentious issue he's come across where there's a strong social incentive to take one side over the other.
Also, I know you're the psychiatrist, but wouldn't being autistic make it less likely you'd have the requisite cognitive machinery in place necessary to delude yourself about the state of the world for the purposes of social signalling?
I was not familiar with this gentleman, and I'd call myself unusually steeped in rat culture, at least as close as you can get without knowing any in person. He's written a few minor SF anthologies, and has an unremarkable number of Twitter followers.
He's not a rationalist big name, nor even a major one. It's not like we hand out ID cards, anyone can call themselves a rationalist.
Despite being called a "disciple of Yud", the only evidence for that claim is a blurb on his blog:
Is he a disciple of Watts? Am I, because I've read both him and Yudkowsky?
He's nobody. Not even on LessWrong. At best, he "promoted" a popular Medium post there, which I don't think he wrote.
That and running an HPMOR podcast that probably has like 10 listeners is all the real influence he has in the rationalist community.
I know no end of non-autistic people who have ridiculous beliefs. I have an uncle who's a fan of homeopathy despite being an acclaimed microbiologist. He's perfectly capable of switching off Science Mode when he leaves the lab and takes his sugar pills.
One swallow doesn't a summer make, and this is more of a chicken painted blue.
In general, yes. That doesn't mean that autistic people are infallible arbiters of truth. Especially when it comes to social truths/lies: once they realize the discrepancy, they're perpetually perplexed by how people just seem to make shit up.
At the very least, he wasn't deluding himself for social signaling. He's a victim of those who do, guilty of taking them at face value and lacking enough contact with the physical world to overcome that. He's not getting any status from this disclosure, quite the opposite, everyone is coming to laugh and sneer at him.
In my unprofessional opinion: Autists accept the signal at face value and then boost it because the original signal comes from a trusted source. No delusion required; it's genuine belief.
IMO (which is also unprofessional!) it's the opposite. It's the autistic person who shouts out that the emperor has no clothes, after all.
Yeah, provided nobody they had no reason to mistrust previously informed them that the emperor wore invisible clothes.
Possibly - but then would a person susceptible to such explanations be likely to become a somewhat esteemed rationalist?
Eh? Who is esteeming him? Man's getting like 10 likes on average for his Twitter posts.
This controversy is his most popular post by a country mile.
If "somewhat esteemed rationalist" is a term that can be handed out this generously, I'm going to start calling myself that.
Absolutely, yeah. I mean, I don't know anything about the specific rationalist in question here, I speak in abstracts, but rationalists have huge blind spots, typically near their own sacred cows. A rationalist can gain esteem enough by writing on topics other than those.