Culture War Roundup for the week of December 11, 2023

This weekly roundup thread is intended for all culture war posts. 'Culture war' is vaguely defined, but it basically means controversial issues that fall along set tribal lines. Arguments over culture war issues generate a lot of heat and little light, and few deeply entrenched people ever change their minds. This thread is for voicing opinions and analyzing the state of the discussion while trying to optimize for light over heat.

Optimistically, we think that engaging with people you disagree with is worth your time, and so is being nice! Pessimistically, there are many dynamics that can lead discussions on Culture War topics to become unproductive. There's a human tendency to divide along tribal lines, praising your ingroup and vilifying your outgroup - and if you think you find it easy to criticize your ingroup, then it may be that your outgroup is not who you think it is. Extremists with opposing positions can feed off each other, highlighting each other's worst points to justify their own angry rhetoric, which becomes in turn a new example of bad behavior for the other side to highlight.

We would like to avoid these negative dynamics. Accordingly, we ask that you do not use this thread for waging the Culture War. Examples of waging the Culture War:

  • Shaming.

  • Attempting to 'build consensus' or enforce ideological conformity.

  • Making sweeping generalizations to vilify a group you dislike.

  • Recruiting for a cause.

  • Posting links that could be summarized as 'Boo outgroup!' Basically, if your content is 'Can you believe what Those People did this week?' then you should either refrain from posting, or do some very patient work to contextualize and/or steel-man the relevant viewpoint.

In general, you should argue to understand, not to win. This thread is not territory to be claimed by one group or another; indeed, the aim is to have many different viewpoints represented here. Thus, we also ask that you follow some guidelines:

  • Speak plainly. Avoid sarcasm and mockery. When disagreeing with someone, state your objections explicitly.

  • Be as precise and charitable as you can. Don't paraphrase unflatteringly.

  • Don't imply that someone said something they did not say, even if you think it follows from what they said.

  • Write like everyone is reading and you want them to be included in the discussion.

On an ad hoc basis, the mods will try to compile a list of the best posts/comments from the previous week, posted in Quality Contribution threads and archived at /r/TheThread. You may nominate a comment for this list by clicking on 'report' at the bottom of the post and typing 'Actually a quality contribution' as the report reason.

That's the part that caught my interest: how did the rationalist community, with its obsession with establishing better epistemics than those around it, wind up writing, embracing, and spreading a callout article with shoddy fact-checking?

People occasionally ask whether the ratsphere is just reinventing the wheel of philosophy (my response then). I suspect that EA is similarly reinventing the wheel of non-profit profiteering.

This is something I've been thinking about a lot lately, but so far all I have to show for it is a scattered mess of loosely-connected (as though by yarn and pushpins) thoughts. Some of them are even a bit Marxist--we live in a material world, we all have to eat, and if you aren't already independently wealthy then your only options for going on living are to grind, or to grift (or some combination of the two). And the Internet has a way of dragging more and more of us into the same bucket of crabs. AI is interesting stuff, but 99% of the people writing and talking about it are just airing views. MIT's recent AI policy briefs do not contribute any technical work to the advancement of AI, and do not express any substantive philosophical insight; all I see there is moralizing buzzwords and wishful thinking. But it is moralizing buzzwords and wishful thinking from top researchers at a top institution discussing a hot issue, which is how time and money and attention are allocated these days.

So for every one person doing the hard work of advancing AI technology, there seem to be at least a hundred grasping hands reaching out in hopes of being the one who gets to actually call the shots, or barring that at least catches some windfall "crumbs" along the way. For every Scott Alexander donating a damn kidney to strangers in hopes of making the world an ever-so-slightly better place to live, there are a hundred "effective altruists" who see a chance to collect a salary by bouncing between expenses-paid feel-good conferences at fancy hotels instead of leveraging their liberal arts degree as a barista. And I say that as someone with several liberal arts degrees, who works in academia where we are constantly under pressure to grift for grants.

The cliche that always comes to my mind when I weigh these things is, "what would you do, if money were not an issue?" Not in the "what if you had unlimited resources" sense, but like--what would the modal EA-AI acolyte do, if they got their hands on $100 million free and clear? Because I think the true answer for the overwhelming majority of them is something like "buy real estate," not "do more good in the world." And I would not condemn that choice on the merits (I'd do the same!) but people notice that kind of apparent hypocrisy, even if, in the end, we as a society seem basically fine with non-profits like "Black Lives Matter" making some individual persons wealthy beyond their wildest dreams. Likewise, someone here did a deep dive into the Sound of Freedom guy's nonprofit finances a while back (I can't find the link right now, and the post has since been deleted, but I thought it was an AAQC?), and he was making a lot of money.

So if you want to dig in, the 2020 return is here and the 2021 is here.

As far as the most concerning stuff goes, there is a pretty large amount of money flowing out to Ballard and his wife: $335,000 of salary to Ballard in 2021 and $113,858 of salary to his wife. These aren't super eye-popping numbers, but it's still a pretty high amount.

The second thing is that they seem to be hoarding a lot of cash. They have something like $80 million cash on hand, and are spending much less than they raise. This isn't inherently an issue if they're trying to build an organization that's self-sustaining, but it does mean that, as a donor, your money is not likely to go toward actual programs in the short or medium term.

Speaking of those actual programs, they don't seem to spend most of what goes out the door on their headline-generating work. A pretty big chunk of their outflow is just grants to other 501(c)(3)s, which is not something you need to be spending millions in executive compensation to do. As best I can figure, in 2021 they made just shy of $11 million in grants to other nonprofits. It's a little tricky to suss out their spending on program expenses versus admin, but they claim a total of just shy of $8 million in program expenses outside the US.

Legal expenses are also very high (over $1.5 million). Not sure if they're involved in some expensive litigation or what is going on there. Travel is also really high at $1.9 million, but given the nature of their organization, a good chunk of that is likely programmatic.
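
To make the proportions explicit, here's a quick back-of-the-envelope sketch in Python, using only the figures quoted above. The grouping into "itemized outflows" is my own rough construction for illustration, not a line item on the 990s:

    # Rough sketch from the 2021 figures cited above. "itemized" is my own
    # grouping of the quoted outflows, not an official 990 line item.
    salaries = 335_000 + 113_858   # Ballard + wife, 2021
    grants_to_others = 11_000_000  # "just shy of $11 million" to other 501(c)(3)s
    overseas_programs = 8_000_000  # "just shy of $8 million" outside the US
    legal = 1_500_000              # "over $1.5 million"
    travel = 1_900_000
    cash_on_hand = 80_000_000      # "something like $80 million"

    itemized = salaries + grants_to_others + overseas_programs + legal + travel
    print(f"Itemized outflows: ${itemized:,.0f}")                             # ~$22.8M
    print(f"Regranted to other nonprofits: {grants_to_others/itemized:.0%}")  # ~48%
    print(f"Years of runway in cash: {cash_on_hand/itemized:.1f}")            # ~3.5

On those numbers, nearly half of the identifiable spending is simply passed through to other nonprofits, and the cash pile would cover roughly three and a half years of it.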

Now it looks like, even if maybe he did (?) save some kid(s) from trafficking along the way, it was mostly a grift? Anyway, the point is, stories like this abound.

So it would be more surprising, in the end, if the rationalist community had actually transcended human nature in this case. And by "human nature" I don't even mean greedy and grubbing; I just mean that anyone who isn't already independently wealthy must, to continue existing, find a grind or a grift! As usual, I have no solutions. This particular case is arguably especially meta, given the influence AI seems likely to have on the grind-or-grift options available to future (maybe, near-future) humans. And maybe this particular case is especially demonstrative of hypocrisy, given the explicit opposition of both effective altruism and the ratsphere to precisely the kind of grind-or-grift mentality that dominates every other non-profit world. But playing the game one level higher apparently did not, at least in this case, translate into playing a different game. Perhaps, so long as we are baseline homo sapiens, there is no other game available to us.

there are a hundred "effective altruists" who see a chance to collect a salary by bouncing between expenses-paid feel-good conferences at fancy hotels instead of leveraging their liberal arts degree as a barista

Yeah, I think that's so. If you're in the geographic bubble and can't parlay your way into some kind of Silicon Valley start-up with free money from venture capitalists, the next best thing is to hop aboard the EA train. Especially if you knock together something about AI risk. There's money in that now (see Microsoft, Altman, and OpenAI). Put together a convincing pitch that you're working on existential risk, and there are donors, grant-makers, and a lot of deep pockets who will listen attentively.

Right now this makes it fertile ground for hucksters, scammers, and the like.

Right now this makes it fertile ground for hucksters, scammers, and the like.

Or also (I imagine; I'm not actually familiar) relatively sincere people, who do care about the goals in question but also care about living well, or social status, or whatever else.