This weekly roundup thread is intended for all culture war posts. 'Culture war' is vaguely defined, but it basically means controversial issues that fall along set tribal lines. Arguments over culture war issues generate a lot of heat and little light, and few deeply entrenched people ever change their minds. This thread is for voicing opinions and analyzing the state of the discussion while trying to optimize for light over heat.
Optimistically, we think that engaging with people you disagree with is worth your time, and so is being nice! Pessimistically, there are many dynamics that can lead discussions on Culture War topics to become unproductive. There's a human tendency to divide along tribal lines, praising your ingroup and vilifying your outgroup - and if you think you find it easy to criticize your ingroup, then it may be that your outgroup is not who you think it is. Extremists with opposing positions can feed off each other, highlighting each other's worst points to justify their own angry rhetoric, which becomes in turn a new example of bad behavior for the other side to highlight.
We would like to avoid these negative dynamics. Accordingly, we ask that you do not use this thread for waging the Culture War. Examples of waging the Culture War:
- Shaming.
- Attempting to 'build consensus' or enforce ideological conformity.
- Making sweeping generalizations to vilify a group you dislike.
- Recruiting for a cause.
- Posting links that could be summarized as 'Boo outgroup!' Basically, if your content is 'Can you believe what Those People did this week?' then you should either refrain from posting, or do some very patient work to contextualize and/or steel-man the relevant viewpoint.
In general, you should argue to understand, not to win. This thread is not territory to be claimed by one group or another; indeed, the aim is to have many different viewpoints represented here. Thus, we also ask that you follow some guidelines:
- Speak plainly. Avoid sarcasm and mockery. When disagreeing with someone, state your objections explicitly.
- Be as precise and charitable as you can. Don't paraphrase unflatteringly.
- Don't imply that someone said something they did not say, even if you think it follows from what they said.
- Write like everyone is reading and you want them to be included in the discussion.
On an ad hoc basis, the mods will try to compile a list of the best posts/comments from the previous week, posted in Quality Contribution threads and archived at /r/TheThread. You may nominate a comment for this list by clicking on 'report' at the bottom of the post and typing 'Actually a quality contribution' as the report reason.
Notes -
In "Agreeing With Stalin in Ways That Exhibit Generally Rationalist Principles" (Less Wrong mirrorpost), the fourth installment of my memoir telling the Whole Dumb Story of how I wasted the last eight years of my life trying to convince the so-called "rationalist" community that you can't redefine concepts in order to make people happy, I explain how Eliezer Yudkowsky has not been consistently candid in his communications with his followers, hindering their ability to exercise their responsibilities.
Previously: pt. 1 (Less Wrong mirrorpost), pt. 2 (Less Wrong mirrorpost, The Motte discussion), pt. 3 (Less Wrong mirrorpost)
I'm surprised at how few insults were aimed at you. There were only a few and they were at least somewhat disguised.
Overall, I think you're wasting your time. You assume rationalists are good at and focused on thinking rationally because they say they are. In reality, the thick jargon, the nerdy topics, and all the other pretenses are nothing but a way to signal "I'm part of this tribe", much as the rainbow flag and saying "diversity" at least once every five sentences are for progressives.
But if you're having fun, more power to you!
Alright, having read part 1, and half of part 2, I'm going to attempt a response. Please forgive me if you've already addressed something I say in your other writings. Based on what I've read, I think you and I have fairly similar trans etiologies and ontologies (even if I emphasize or deemphasize different parts, and might assign higher or lower probabilities to certain things existing or mattering), and the primary point of difference between the two of us is the philosophy of language surrounding the issue of categorization, and the resulting normative theory that arises from that difference.
I doubt I'll "pass [your] philosophy-of-language litmus test", but I'm more rat-adjacent than an actual rationalist, and so I'm not really concerned whether you "lose all respect for [me] as a rationalist."
First, I want to say that I think you have an overly narrow conception of category drawing. Humans are very good at coming up with new categories on the fly, even when those categories don't always have good words for them. When academics are being responsible with terminology, you'll get discussions of emic (insider) vs. etic (outsider) terminology, and acknowledgement that some word or phrase is being used as a matter of convenience and not because it refers to a particular well-conceived or robust category.
Heck, look at something as "frivolous" as the TV Tropes wiki. While some of the "tropes" they identify were named and recognized before the wiki started, a lot of the tropes are just patterns in stories and storytelling that people picked up on and decided to name, and when people notice a similar (but different) pattern they have to decide the boundaries between the two "tropes" they identify. The entirety of the wiki is an exercise in human categorization of an essentially endless and unresolvable set of category questions. The only thing keeping it somewhat sensible and stable is a respect for precedent, and a desire to settle on some set of useful vocabulary that outweighs people's desire for endless debates about category boundaries.
I think if I was trying to steelman something in the realm of "words can mean whatever you want them to mean", it would be in this context. I frequently have conversations where there's some idea I want a short word or phrase to refer to, and a suitable one does not exist. It is easy enough to drill down into the features I want to call out, and try to come up with a good label for it. This is a very fluid thing that happens naturally, and I assume it's being done casually, all the time, throughout human conversations. It's easy, and part of the fun is working out conventions on-the-fly with the people you're having the conversation with so that a conversation can happen in the first place.
I think this is part of why I'm less insistent on the idea that words must mean one and only one specific thing. If I'm talking with someone, and it becomes clear that the semantic scope of some word that's important to a discussion I want to have is different for them than it is for me, then as a practical matter I will have to come up with a new word or phrase for both of us to use to fruitfully have a discussion anyways.
My position is less, "words can mean whatever you want them to mean", and more "while it is useful for common, everyday words to cut reality at the joints, it's not the end of the world if you have to come up with a new convention on the spot that sets aside terminology disputes you're less interested in having." That process is about as free and fluid as "words meaning whatever you want them to mean", but with a specific pragmatic goal limiting the scope of the word creation process.
Aside from that, you make some specific claims about human cognition and psychology that I find dubious, such as:
First, it's not obvious to me that this kind of word usage actually confuses anyone's mental maps of the world. Consider a phrase like: "Toy elephants are elephants."
I think even a child understands to their core that toy elephants and actual elephants differ in important respects. If you ask them whether toy elephants breathe, or have working organs, or a thousand other questions, an honest child will admit that a toy elephant has none of these features. I think if someone took an analogous stance toward the word "elephant" as you take toward the word "woman", then we'd insist on always calling them "elephant-shaped toys" or "toys a human creator designed in the image of an elephant" or something silly like that.
But no one is confused. No one's models about the world are distorted. Everyone with any sense understands that a toy elephant might be an elephant, but it isn't a "real" elephant.
I don't think it "matters" whether toy elephants go in the "elephant" cluster or the "animal-shaped" toy cluster, because my intuition is that everyone's pre-linguistic understanding of the situation is fundamentally the same regardless of what words we decide to use for the situation or where we draw strict category boundaries.
Now, I admit that the social norm that it is wrong to ask about a person's genitals or what surgeries they have undergone, combined with other social norms that hide people's genitals from sight, does create a situation where people might genuinely be confused about how the world actually is as a matter of fact. But I think those norms are the primary issues, not the fact that the phrase "trans woman" doesn't offer specific insight that would overcome the ignorance our social norms might produce.
I think you and I approach the implications of your last sentence here from different angles. I agree that predictions matter more than raw words, and I believe that you and I would make similar predictions about a number of things related to the trans discussion. I think you and I could even have a fruitful discussion on trans issues if we made a short-term convention of "useful" terminology that neither of us found objectionable.
However, if predictions matter more than words, then where words don't actually confuse people (as I believe they do not in this case) there can hardly be an objection to using a particular word for something. Ask me any empirical question about "trans women", and I believe I could answer in a way where my predictions would largely line up with yours, perhaps with some differences due to different research paths and life experiences.
I get that you were burned by the rationalist community, since they seemed to get what you consider a very easy question wrong, and consistently did so in a way that undermined your belief that they were sincerely applying the principles you thought they were trying to live by. I get that this is important to you because you've lived with this set of emotions for years, and have felt like you were going crazy when no one else seemed able to acknowledge the cognitive dissonance you observed in them. But I'm not actually convinced that this is as big a deal as it has become in your head. If you already "know thyself" on this topic, and feel like you have a reasonably good read on what the world is in fact like, why blow your life up over an unimportant word quibble?
It may be that you are not confused, but the people who use the words are still trying to sow confusion. "Trans women are women" is used to demand that trans women be treated like women. Nobody demands that toy elephants be treated as elephants.
You just helped me realize "Trans women are women" is a motte-and-bailey. There are 2 ideas there:
"You should treat trans women as women because that is the morally correct thing to do" and
"Trans women are literally women".
I've seen a lot of people attack and mock the latter on forums, but I never noticed the implied former idea until now. I guess the former is the one US progressives actually believe, and they use the offensive literal version to piss off the outgroup.
I'm sympathetic to Yud here, tbh, despite my object-level beliefs. AI risk is a thousand times more important than trans stuff under any reasonable values (including the correct ones, i.e. mine, which find it to be generally bad). It probably isn't worth blowing up the influence you have. That doesn't justify the technically-not-lying. But, if I were Yud, I'd plausibly just practice the "virtue" of silence. Which is just as bad. Are you, in both cases, abandoning your friends to the memetic wolves? Yes. There are ... a lot of wolves, though, and I can't, in fact, stop them all, and I should probably look at my options and pick the one that stops the worst of it. The main sin of Yud's approach isn't that his concept-language is slightly broken, it's that he's - knowingly or not - cheering his close friends and followers along a bad path. Being silent is barely better. And it's better still to be loud than to barely speak up (e.g. me commenting here).
I'm pretty sure Yud genuinely believes that the "meat" of his support for trans people, including the 20% of rats with penises, is correct. Plausibly you know differently from the DMs, but he appears to think that - maybe they're still psychologically male in significant senses, maybe everyone's being a little bit systematically misled, but they're better off and happier 'as women'. The latter is really the important thing to address - all the theory and math is interesting, but it's only relevant because the community isn't directly debating the object-level issues. Not even the ones about pronouns: 'what parts of what we call "women" really apply to them, where do the desires come from, and what should we do about that?' I genuinely wonder if that discussion will happen, or what'll come of it if it does. Will your posts be read by everyone, considered 'interesting', even 'thought-provoking', and then just forgotten? The material points discussed, minds changed, but only in private (and thus only among a small number of people)? You could imagine LessWrong dialogues on 'should the median rat trans woman have transitioned', but I (from the outside) don't think it'll happen, even though (ignoring the meta concerns above) it should.
Another reason Yud and rationalists generally might be averse to discussing the issue: many trans people, IME, find critical philosophical exploration of the experience of being trans very psychologically painful, as much as or more than misgendering. If you think 'saying "he" hurts someone, so I should say "she"', and really deeply questioning the thing hurts more than saying the wrong pronoun does, then the same logic says don't question it ... And when you're surrounded by people who are deeply invested in it all, potentially losing them or causing a huge split might be a much bigger barrier than pissing off progressives.
Or not, maybe most trans-rats are perfectly fine discussing the philosophy of 'is trans real', idk. Your posts don't seem to have gotten much pushback of that kind.
I think Zack agrees with you that it might be worth it for Yudkowsky to lie about trans issues for the cause. But I'm on Zack's side pretty much entirely, because the entire schtick of the rationalists, the only reason they have contributed anything at all and have any (minor) clout, is all the times they pissed off everyone and were willing to hurt feelings in a single-minded pursuit of the truth.
It's the same reason our scientific institutions (used to) have clout: they gained power precisely because they were visibly willing to cross boundaries and violate taboos (which then led to their politicization and hollowing out).
Perhaps it's just a predictable cycle of creative destruction. But I still think Zack is fighting the good fight.
I only got to skim your posts, so I am not sure how fully you realized this (though you clearly at least got close to it), but yes, for Yudkowsky and the inner LW circle, averting the AI apocalypse that they expect has been closer to being a terminal value than anything like "helping you, the reader, think better" for a long time. In the beginning, as I think they in fact said out loud, they still thought that growing their own numbers/"helping the reader think better" was the best action to take to that end; but a while later, whether by observing AI progress or finding that their own numbers were now good enough that further growth wouldn't help, they concluded that the instrumental action is now to align themselves with the progressive elites of the US. In return for alliance, these elites, like many before them, demand displays of ideological allegiance, such as public attacks on their ideological enemies, which are more valuable the more costly they appear to be for the one petitioning for alliance (so attacking one of your own number is especially good).
It's hard to categorically conclude that their plan is not panning out: AI alignment has been pushed pretty far into the mainstream, clearly fueled by the promise of "if we can align AI, we can align it to your politics!". The current best contenders for AGI takeoff feel much more human than 2010!Yudkowsky would have dreamed, and they even feel shackled in a way that looks similar to a politically mindkilled human, who if given godlike powers might wind up too busy using them for petty dunking on the outgroup to attempt world domination.
Does Yudkowsky himself believe this inconsistent set of things about gender that you point out? Who knows: he did say that if you tell one lie the truth is forevermore your enemy, but he did also say that rationalism is the art of winning and you should therefore one-box on Newcomb's problem. Even with respect to a galaxybrain like Yudkowsky, the whole of Polite Society might well be Newcomb's alien deity, and the advantage it promises if it reads his mind and finds it aligned was just too great to forgo. Even if he thought a Wrong Belief is really like a black hole that swallows up everything that it touches, the rate at which this happens is clearly limited, and he may think that it won't swallow anything that matters to the AI agenda before it's too late anyway ("From my perspective, this battle just isn't that close to the top of my priority list.").
Either way, I don't think this is a reason to flatly dismiss the writings they produced before they decided to go from growth to exploitation, even by implication as the scare quotes you put around "rationalist" seem to do. Just follow the ideas, not the people; it's pretty clear either way that at some point LW largely stopped emitting good new ideas, even if you ignore potential people reasons for why this might be.
Yeah, I always feel confused with Zack because it's like ... clearly Eliezer is defecting against Zack, so the callouts seem fair, and Eliezer did practically ask for this, but also the strategy as you describe it is probably pretty existentially load-bearing for life on Earth?
I guess what I'd want to say is "sigh, shut up, swallow it all, you can live with it for a few years; if we get the Good Singularity life will be so much better for AGPs than it would by default, so sacrificing consistency for a bit is well worth it." But I realize that I can say this because of my healthy tolerance for political bullshit, which is not universal.
I think this is a reasonable point of view. On the other hand, I could imagine visibly destructive commitment to the truth could still pay outsized dividends if powerful people, e.g. Elon Musk, noticed someone going against the grain and then trusted their advice more. Didn't this kind of happen with Peter Thiel and Michael Vassar?
I'm trying to work through your posts from the beginning before I get to the most recent one you posted here. But I did want to chime in to say that I've long advocated for a "socio-legal model" of gender, over the common "identity model" that many activists advance, in part because I cannot deny that some people who want to live as the opposite sex also seem to have a fetish of some kind. (Even if I'm somewhat open to the argument that many women are also autogynephilic in some way, and thus autogynephilia might be compatible with the "intersex brain hypothesis.")
I'm not a huge fan of nebulous metaphysics, and the socio-legal model of gender only requires as much woo-woo as the concept of adoption or marriage does. I say this even as I acknowledge that there are plenty of cultures without adoption or Western-style marriage.
I have a strong confidence that autogynephilia exists at the very least, since I have a non-gender-related transformation fetish and I have incidentally seen a lot of captions and stories with AGP themes in them while searching for my preferred content. Now, that isn't yet strong evidence that AGP and being trans are connected in any way - I have no way of knowing if most of the people who like MTF transformation stories self-identify as males or females, but I do know at least a few transwomen who also have MTF transformation fetishes on various sites.
Well, I don't know if you know this already, but in dath ilan it's implied that "someone wanting to live as if they were a woman, with society treating them accordingly because that's kinder or more libertarian" and "being a woman" are not the same thing. And it's implied that there are no ftm people (or very few) in that alternate timeline, only mtf. I don't remember where in the dath ilan writings it says those things, and my memory may be unreliable, but there you go. Progressive but subtly Kolmogorov or whatever. But from what I gather, subtle things like that are not enough to satisfy you, which I sympathize with. In this week's roundup, in response to another comment, @Goodguy says that infant circumcision is probably a worse problem in our society than any concerns on this issue, but that people can care about both. And that's how I feel as well.
... why? I'd argue: transitioning makes you infertile, preventing you from having kids, which from a utilitarian standpoint is bad because it means fewer people, and from an anti-egalitarian standpoint is bad because if you're one of the rationalists you probably had good genes. Also, you waste a solid 10% of your waking life chasing meaningless appearances of being a woman. Whereas circumcision just makes sex feel a bit less good, maybe, and maybe causes some minor health issues. I still don't think anyone should get circumcised, but I think even when you multiply by the number of people affected, trans is worse. (Yes, this depends on contestable philosophical arguments that have significantly more important implications than 'is trans bad'.)