Culture War Roundup for the week of September 16, 2024

This weekly roundup thread is intended for all culture war posts. 'Culture war' is vaguely defined, but it basically means controversial issues that fall along set tribal lines. Arguments over culture war issues generate a lot of heat and little light, and few deeply entrenched people ever change their minds. This thread is for voicing opinions and analyzing the state of the discussion while trying to optimize for light over heat.

Optimistically, we think that engaging with people you disagree with is worth your time, and so is being nice! Pessimistically, there are many dynamics that can lead discussions on Culture War topics to become unproductive. There's a human tendency to divide along tribal lines, praising your ingroup and vilifying your outgroup - and if you think you find it easy to criticize your ingroup, then it may be that your outgroup is not who you think it is. Extremists with opposing positions can feed off each other, highlighting each other's worst points to justify their own angry rhetoric, which becomes in turn a new example of bad behavior for the other side to highlight.

We would like to avoid these negative dynamics. Accordingly, we ask that you do not use this thread for waging the Culture War. Examples of waging the Culture War:

  • Shaming.

  • Attempting to 'build consensus' or enforce ideological conformity.

  • Making sweeping generalizations to vilify a group you dislike.

  • Recruiting for a cause.

  • Posting links that could be summarized as 'Boo outgroup!' Basically, if your content is 'Can you believe what Those People did this week?' then you should either refrain from posting, or do some very patient work to contextualize and/or steel-man the relevant viewpoint.

In general, you should argue to understand, not to win. This thread is not territory to be claimed by one group or another; indeed, the aim is to have many different viewpoints represented here. Thus, we also ask that you follow some guidelines:

  • Speak plainly. Avoid sarcasm and mockery. When disagreeing with someone, state your objections explicitly.

  • Be as precise and charitable as you can. Don't paraphrase unflatteringly.

  • Don't imply that someone said something they did not say, even if you think it follows from what they said.

  • Write like everyone is reading and you want them to be included in the discussion.

On an ad hoc basis, the mods will try to compile a list of the best posts/comments from the previous week, posted in Quality Contribution threads and archived at /r/TheThread. You may nominate a comment for this list by clicking on 'report' at the bottom of the post and typing 'Actually a quality contribution' as the report reason.

The fundamental problem with utilitarianism is that it depends on a belief that "utility" is fungible: that some number x of happy puppies can directly offset some number y of dead babies, or that the suffering of one person can be converted into the enjoyment of another without loss. I do not believe that either of these is the case at all.

I think this is a good example of how attacking 'utilitarianism' is used as a shield to avoid difficult moral choices. Society simply has made and will always make difficult choices involving people living and dying. People die of, or are severely disabled by, vaccine side effects somewhat regularly. These people are, of course, different people than the people who would've died of the disease itself. But since there are many fewer of the former, we recommend vaccinations. Around forty thousand people will die in traffic accidents next year. Many of them won't be at fault. We could massively lower that number by simply reducing speed limits significantly, yet we choose not to, because we like getting to places quickly. Yet we could also raise the speed limit even more, and get to places even faster, in exchange for more deaths! Utilitarianism or not, people are making these decisions and will keep making them, based on the tradeoffs between lives and lives, or lives and other useful things.

And the point of OP's thought experiment is to make you think about that. If the choice was 'no cars' or '1 random child dies per year', obviously we're picking the latter, because it's much better than what we have now. If the choice is 'no cars' or '5% of the population dies per year', we'd very quickly ditch cars. I believe you'd make those decisions too, if you had to! So you do recognize that tradeoffs exist, and that tough decisions must be made. And the question then is, how? why? what for?

And the question then is, how? why? what for?

Okay, here is one possible answer: gut feeling, instinct, and heuristics. I may choose to hold a dog owner criminally liable if he keeps a dangerous dog - so the kid is mauled and the dog owner's ass is hauled into prison. We may implement any of a range of laws, from a "one free bite" rule to mandatory muzzles/leashes, etc. In fact, more often than not I find this more correct than endless harping over hypothetical trolley problems. One issue with this "rational" thinking is that it leads you astray. It strips the situation of context, forces you to abstract - looking at the problem from some inhumane bird's-eye view - and to reconstruct things from first principles, which often smuggles in more assumptions. More often than not, it is not useful, or is sometimes outright misleading.

I will actually use one example, that of Rawls's famous original position. At first glance it seems reasonable: you are just using a thought experiment to sharpen your moral instinct "rationally". However, what it really does is smuggle in some strange or outright spiritual assumptions. The Original Position assumes that people are distinct from their bodies. They are all indistinguishable immaterial souls, possessing all rational faculties and all knowledge, about to decide into what society to reincarnate. This is a deeply spiritual and political assumption. The Original Position basically assumes a kind of spiritual space communism and utopian equity between souls about to be trapped in sometimes stupid or weak human bodies, maybe even of the "wrong" sex, and asks: wouldn't you want to recreate Social Justice communism here on Earth, since that is what you would want in this scenario - right?

But let me ask you, dear reader, to consider another thought experiment, which I will call Georgioz's Position. It is identical to the Original Position, except that there also exists a Christian God floating above all the commie souls from Rawls's thought experiment. The teachings of Jesus Christ are correct, and if you do not adhere to them, you will go to hell after you die. Under such assumptions, you would surely want to be incarnated into the body of a good Christian, right?

It is the same here with this example: let's assume this thought experiment, where you are presented with a choice between two outcomes across some range of values - please calibrate the exact percentage threshold of your choice. We are assuming utilitarianism in this thought experiment, which I refuse. Maybe I will refuse to calculate the values X, Y, C, B, A and the number of children or dogs killed, as you steer me to do. Fuck that, maybe all I care about is adherence to an eye-for-an-eye, Hammurabi-style law: if a child gets mauled by some dog, then the dog owner is sentenced to a colosseum, where he himself gets mauled by a pack of bloodthirsty mastiffs - and a portion of the ticket sales for the spectacle is used to pay for damages. Justice was served, next case. And also, please, dear utilitarian, calculate the exact threshold between 5 and 10 mastiffs that you think is the best choice when executing the transgressor in this thought experiment. I am sure you will come up with something interesting here.

As I said to the other commenter, I'm not defending utilitarianism!

Okay, here is one possible answer: gut feeling, instinct, and heuristics. I may choose to hold a dog owner

I agree the dog example isn't a very practical one. This is why I brought up the speed limit and vaccine examples, which are actual decisions of this type our society is currently making. I don't think we should decide speed limits on gut feelings and instincts. Or what the proper rules of war are.

The veil of ignorance is, imo, fine but uninformative. If I think, as a matter of fundamental good, the strongest should do what they will and the weak will suffer what they must, or that kings should rule by divine right and subjects should obey, I can simply believe that behind the veil of ignorance too. The veil of ignorance basically says "if everyone's main ends are benefit and pleasure for themselves, and everyone's human experience matters roughly equally, then we should have equality", and that's as true as 1+1=2, but the "veil" argument hides the importance of the assumptions.

As I said to the other commenter, I'm not defending utilitarianism!

Good, utilitarianism is a monstrous moral system so I give you a point here.

This is why I brought up the speed limit and vaccine examples, which are actual decisions of this type our society is currently making. I don't think we should decide speed limits on gut feelings and instincts. Or what the proper rules of war are.

Yes, I think we should do exactly that: bring back common sense and gut feelings. I have zero faith in our philosopher kings running a utilitarian calculation for the meaning of life, speed limits, and war, and spitting out the number 42. Especially as they scramble to readjust their speed-limit calculation after their computer mistakenly said that legalizing marijuana was universally good.

The veil of ignorance basically says "if everyone's main ends are benefit and pleasure for themselves, and everyone's human experience matters roughly equally, then we should have equality", and that's as true as 1+1=2, but the "veil" argument hides the importance of the assumptions.

Exactly. The veil of ignorance - a very apt name, I have to add - just restates the old Marxist trick of declaring every other moral and philosophical system "ideology" while hiding itself from the same criticism. That is why, under the veil of ignorance, the values of the world you are incarnated into, such as religion, are divided up among "ideologies", while Rawls's own values are "sacred" and kept outside the incarnated world, as assumptions embodied by the commie soul(s) waiting to be incarnated. It is the stupidest and oldest trick in the book.

My point was that your argument is in a similar vein: let's assume we have to do a utilitarian calculation about speed limits, which means measuring the exact exchange rate between speed and lives lost. What is the value? Meanwhile, there are other approaches. For instance: we incarcerate, and strip of their driver's licenses, all the psychopathic people who are "obviously" speeding, while the rest of the population drives normally - slowing down even to 5 miles per hour instead of the allowed 15 when they see a school and small kids jaywalking to get there in time, while going 100 miles per hour in broad sunny daylight on an empty highway. No need for autistic philosopher-king-utilitarians to abstract away all these "details" and "context" to bring us their perfect formula relating miles per hour to deaths. Just the normal heuristics of normal people, enforced by good old justice, where child killers and reckless or drunk drivers are reported, shamed, and punished.

I think this is a good example of how attacking 'utilitarianism' is used as a shield to avoid difficult moral choices.

Is it? Or is utilitarianism "a cope" to avoid dealing with the concept of a necessary evil, i.e. the idea that a decision can be both terrible and correct? Or that bad things will happen as a result of bad actions, and that this is a good thing?

I wasn't defending utilitarianism; my implication was that what you took for utilitarianism in the initial comment was, in fact, a willingness to acknowledge difficult but necessary moral choices. Deaths in car crashes and slowness of travel are, quite literally, fungible, in the sense that society is actively making decisions that exchange one for the other. A decision must and will be made, and whether via utilitarianism, base instincts, or some other method, the two valuable things will be measured and compared.