Culture War Roundup for the week of November 7, 2022

This weekly roundup thread is intended for all culture war posts. 'Culture war' is vaguely defined, but it basically means controversial issues that fall along set tribal lines. Arguments over culture war issues generate a lot of heat and little light, and few deeply entrenched people ever change their minds. This thread is for voicing opinions and analyzing the state of the discussion while trying to optimize for light over heat.

Optimistically, we think that engaging with people you disagree with is worth your time, and so is being nice! Pessimistically, there are many dynamics that can lead discussions on Culture War topics to become unproductive. There's a human tendency to divide along tribal lines, praising your ingroup and vilifying your outgroup - and if you think you find it easy to criticize your ingroup, then it may be that your outgroup is not who you think it is. Extremists with opposing positions can feed off each other, highlighting each other's worst points to justify their own angry rhetoric, which becomes in turn a new example of bad behavior for the other side to highlight.

We would like to avoid these negative dynamics. Accordingly, we ask that you do not use this thread for waging the Culture War. Examples of waging the Culture War:

  • Shaming.

  • Attempting to 'build consensus' or enforce ideological conformity.

  • Making sweeping generalizations to vilify a group you dislike.

  • Recruiting for a cause.

  • Posting links that could be summarized as 'Boo outgroup!' Basically, if your content is 'Can you believe what Those People did this week?' then you should either refrain from posting, or do some very patient work to contextualize and/or steel-man the relevant viewpoint.

In general, you should argue to understand, not to win. This thread is not territory to be claimed by one group or another; indeed, the aim is to have many different viewpoints represented here. Thus, we also ask that you follow some guidelines:

  • Speak plainly. Avoid sarcasm and mockery. When disagreeing with someone, state your objections explicitly.

  • Be as precise and charitable as you can. Don't paraphrase unflatteringly.

  • Don't imply that someone said something they did not say, even if you think it follows from what they said.

  • Write like everyone is reading and you want them to be included in the discussion.

On an ad hoc basis, the mods will try to compile a list of the best posts/comments from the previous week, posted in Quality Contribution threads and archived at /r/TheThread. You may nominate a comment for this list by clicking on 'report' at the bottom of the post and typing 'Actually a quality contribution' as the report reason.


EA does not value ownership rights; if your money could do more good somewhere else it would be positive for it to be taken from you and directed somewhere else.

I think there's this idea that utilitarianism is all like "sure, go ahead, rob people iff you can use that money better" but that's dumb strawman-utilitarianism.

The reason it's dumb is that you have to take second-order effects into account in whatever you're doing, and the second-order effects of dishonest and coercive actions are nearly always profoundly negative, generally resulting in a society where nobody can trust anyone well enough to coordinate (and where nobody would want to live).

There is a reason nobody on the EA side is defending Bankman-Fried.

that's dumb strawman-utilitarianism.

As we see from live examples, no, it's not. And that's how it works in general. Utilitarians have not noticed the skulls; nobody ever notices the skulls. Putin's Russia is simply fascist, Xi is not a Palladium-pilled Governance Futurist but a two-bit dictator, your enemies simply hate you, there are more malicious conspiracies and frauds than «emergent memetic attractors», and the simplest, crudest, cringiest explanation is usually correct.

The reason it's dumb is that you have to take second-order effects into account in whatever you're doing, and the second-order effects of dishonest and coercive actions are nearly always profoundly negative

It's not dumb, and this is a fig leaf for useful idiots with their prosocial Hajnal brains. Every time EAs try to explain how they are acktchually rationally compliant with conventional morality, it screams «but in less tortured scenarios we can calculate when to defect and get maximum utility» just as loudly as this yield-farming, aka «Ponzi with extra steps», story does. That story is from April, as you can see; a lot of SBF-glorifying, EA-backed content came after it. EAs know they are smart and can trick commoners; they believe themselves to be smarter still, able to calculate ripple effects. In fact, calculating ripple effects using raw brainpower is what they do under the rubric of longtermism.

There is a reason nobody on the EA side is defending Bankman-Fried.

The reason is that he has apparently failed them and burned their piggy bank, decreasing the utility they could have produced.

And it's the same reason they did not renounce him before his fraudulent empire crashed. They are not deontologists, after all. They assumed he was a math whiz who had it all figured out.

  1. This was sort of an argument for the sake of argument, to make the point that the problem is not that EA fails to communicate its values. I don't think EA comes into this saga at all.

  2. I get what you're saying about second-order effects, but it sort of makes the whole thing banal. Once you start justifying things by their second- and third-order effects, you just get all the way back to deontology.

When EA, or EA-affiliated people, started getting involved in politics (or trying to) with the Carrick Flynn race, that was a very big red flag. That is moving away from the original principles into "well, how do we really do the most good? Isn't it getting guys elected who can make laws and commit funding to our pet projects?"

And that makes you the same as the guys lobbying for farm subsidy grants, or tobacco companies lobbying against legislation banning their advertising, or every other group out there wanting the government to pass laws and give money to their pet projects.

On reflection, I think EA as a tribal signifier has come to mean a whole bunch of different things to different people, from "we should value the lives of future people more than our own" to "maybe we should think for two seconds about cost efficiency" to "defrauding people can be good, actually" to "just donate to whoever GiveWell says." This is unhelpful.

Agreed. I am not a strict utilitarian, but I still support EA, more on the “think for two seconds about cost efficiency” side, along with the idea that the first world has a moral obligation to help lift the rest of the world out of poverty.

I don’t buy into longtermism or AI doom scenarios at all, though, and find them rather annoying. People forget that most of the work done in EA, and most of the money spent, is on global development. Unfortunately, controversy drives headlines, so most people don’t see that.

One of my consistent frustrations with progressive-style politics, going back decades now, is the total, violent allergy to any kind of consideration of second-order effects. I am not familiar enough with EA to say how they handle that stuff, but I would not be overly surprised if they found themselves in a situation where Scott and ten other high-decouplers uselessly decry a new trend of EAers embezzling for malaria nets.

Alternatively, this has already happened, and yet EAers still seem to mostly support bigger government.

I would not be overly surprised if they found themselves in a situation where Scott and ten other high-decouplers uselessly decry a new trend of EAers embezzling for malaria nets

The entire EA forum is filled with people saying 'this is bad and evil and disgusting, EA would never support this, we made a severe mistake in blindly trusting SBF, we deeply apologize, and we must be very careful to make sure this doesn't happen again'. And those posts are now the top posts of all time on the EA forum. They're also explicitly saying things like 'utilitarianism taken too far can look like this, which is why we endorse pluralism and moderation', and they said things like that beforehand. So I don't think the 'allergic to second-order effects' criticism applies!

And those posts are now the top posts of all time on the EA forum.

The ironic thing about this is that all of the well-considered counterarguments come off as exactly what a manipulative sociopath would say in this situation, and there are numerous comments pointing that out. "I condemn and distance myself from this bad thing now that it has come to light, but we should not link criticism to any particular people, and we totes promise not to do it again." What they fail to realize is that the credible pre-commitment to strong business ethics they are talking about is deontology.

In the sense that a real witch would condemn witches to maintain their cover, and maintain a long track record of doing so just to make it extra secure, sure.

It's more like "Yes, we all condemn witchcraft, it's the worst. But in the wake of this witchcraft scandal involving my good friend, we should focus on general condemnations and totally not worry about any particular people who might also be witches. Also, we probably don't need any particular new anti-witch policies beyond general frowning and finger-wagging."

What they fail to realize is that the credible pre-commitment to strong business ethics they are talking about is deontology

Setting aside that plenty of avowed Christian deontologists and liberals have committed fraud on a similar scale over the years (crypto exchange rug pulls are incredibly common), meaning it's not clear EA somehow makes people more likely to commit fraud, 'deontology' and 'believing in business ethics' are rather different things. Deontology is a claim about all morals and decision-making; business ethics are a specific set of rules (ones that basically all EAs and philosophical utilitarians endorse, as far as I know).

Also, for deontology in a practical sense, the problem of 'which rules' confuses things. Am I being consequentialist if I accept 'business ethics' instead of 'Christian ethics' for finance? How do I decide that business ethics are better? Aren't you just 'deontologically' embracing rules that were, literally, created by people who planned out the consequences of those rules? Isn't that just consequentialism-by-proxy? Can I even think things like 'wow, maybe liberalism and universalism aren't ideal' under deontology, if those are the prevailing rules?

I mean, the de facto rules in crypto were, kind of, 'commit as much fraud as you can without getting caught'.

I've argued before that ethics should be viewed as a stack, from virtue ethics to deontology to utilitarianism to consequentialism, where the core difference at each level is the trade-off between ideal outcomes and ease of implementation. My point is that the post you linked is, at best, arguing for accepting a trade-off down-stack in the case of business ethics. They want to implement a "No Fraud" rule because the risk of tricking yourself into harmful bullshit when reasoning about business ethics on utilitarian grounds is too high, so you should just round off to "all clever plans to increase utility that violate standard business ethics should be assumed false on sight". And the way you credibly signal that commitment is to switch to a deontological framework for those cases, instead of continuing in a utilitarian framework (which implies a whispered "unless it seems like a really good idea!" caveat).

IIRC EY tweeted something to the effect of "go like 75% of the way from deontology to utilitarianism and you're basically in the right place until you've become a god", which sounds about right.