This weekly roundup thread is intended for all culture war posts. 'Culture war' is vaguely defined, but it basically means controversial issues that fall along set tribal lines. Arguments over culture war issues generate a lot of heat and little light, and few deeply entrenched people ever change their minds. This thread is for voicing opinions and analyzing the state of the discussion while trying to optimize for light over heat.
Optimistically, we think that engaging with people you disagree with is worth your time, and so is being nice! Pessimistically, there are many dynamics that can lead discussions on Culture War topics to become unproductive. There's a human tendency to divide along tribal lines, praising your ingroup and vilifying your outgroup - and if you think you find it easy to criticize your ingroup, then it may be that your outgroup is not who you think it is. Extremists with opposing positions can feed off each other, highlighting each other's worst points to justify their own angry rhetoric, which becomes in turn a new example of bad behavior for the other side to highlight.
We would like to avoid these negative dynamics. Accordingly, we ask that you do not use this thread for waging the Culture War. Examples of waging the Culture War:
- Shaming.
- Attempting to 'build consensus' or enforce ideological conformity.
- Making sweeping generalizations to vilify a group you dislike.
- Recruiting for a cause.
- Posting links that could be summarized as 'Boo outgroup!' Basically, if your content is 'Can you believe what Those People did this week?' then you should either refrain from posting, or do some very patient work to contextualize and/or steel-man the relevant viewpoint.
In general, you should argue to understand, not to win. This thread is not territory to be claimed by one group or another; indeed, the aim is to have many different viewpoints represented here. Thus, we also ask that you follow some guidelines:
- Speak plainly. Avoid sarcasm and mockery. When disagreeing with someone, state your objections explicitly.
- Be as precise and charitable as you can. Don't paraphrase unflatteringly.
- Don't imply that someone said something they did not say, even if you think it follows from what they said.
- Write like everyone is reading and you want them to be included in the discussion.
On an ad hoc basis, the mods will try to compile a list of the best posts/comments from the previous week, posted in Quality Contribution threads and archived at /r/TheThread. You may nominate a comment for this list by clicking on 'report' at the bottom of the post and typing 'Actually a quality contribution' as the report reason.
Not all values are compatible. Some are mutually antagonistic, making peaceful coexistence difficult or impossible. It's turned out to be surprisingly easy to achieve mutually-incompatible values with baseline human brains. Add in brain modification, and you jump straight to semantic apocalypse.
I’ll agree that brain modification could lead to some nasty outcomes, but overall I think the benefits outweigh the risks as with most technologies. I trust us to use it at least relatively wisely.
What's your conception of the consequences if, in fact, we do not use it responsibly?
From the previous link:
Every method of conflict resolution other than naked, merciless force is founded on the idea that the core nature of Us and Them is in fact fundamentally similar, that at some point we find common ground in our values. You are talking here about technology that could very easily render this idea empirically false. Merely calling this assumption into question in the last century caused a drastic increase in the concentration and intensity of human misery. It is hard to imagine how definitively falsifying it would work out better.
The most dangerous consequence is basic wireheading. Unfortunately, I suspect most of humanity will fall prey to it; I see that as more or less inevitable. I'd argue that the majority of Westerners are already living a wirehead-lite version of life with junk food, porn, entertainment, etc.
To respond to your quote: I think it's far too optimistic about how readily we can change our brains. We have zero clue how to do it right now, so I expect that if we play our cards right we'll slowly learn to live with the technology and adapt it into our society.
There will always be casualties. The millions of people who are obese are casualties of cheap and tasty food. I still think that creating the abundance which allows them to exist is worth the cost. I hope we can find solutions like semaglutide which will help stem the negative effects while keeping the positives.
In your worst case scenario where we do get a fast takeoff of brain manipulation, I'd imagine the vast majority, as I mentioned above, will wirehead themselves. A small group of those with the willpower or interest in resisting that fate will hopefully persist, reproduce, and become the vanguard of the next society. With over 7 billion human beings on the planet, I'd be shocked if we all succumbed to this sort of problem through a social contagion, although I admit it's possible.
More optimistic than full-brain uploads or functional immortality?
If they become "casualties" in the sense that they put themselves in an early grave via choices freely made, that seems like something to try to ameliorate, but ultimately an acceptable outcome. What seems very questionable to me is the idea that, in a scenario that comes anywhere close to fitting the definition of "transhumanism", they do this before tech enables the sort of fragmentation Bakker is pointing out.
Of those who resist wireheading, what's your basis for supposing that they will be able to get along in any way? Why one society, and not two or five or five million? What's the force that's going to bind them together into a unified, cooperative whole, as opposed to simply letting them fragment into a bloody swamp of fratricide and predation?