
Culture War Roundup for the week of March 27, 2023

This weekly roundup thread is intended for all culture war posts. 'Culture war' is vaguely defined, but it basically means controversial issues that fall along set tribal lines. Arguments over culture war issues generate a lot of heat and little light, and few deeply entrenched people ever change their minds. This thread is for voicing opinions and analyzing the state of the discussion while trying to optimize for light over heat.

Optimistically, we think that engaging with people you disagree with is worth your time, and so is being nice! Pessimistically, there are many dynamics that can lead discussions on Culture War topics to become unproductive. There's a human tendency to divide along tribal lines, praising your ingroup and vilifying your outgroup - and if you think you find it easy to criticize your ingroup, then it may be that your outgroup is not who you think it is. Extremists with opposing positions can feed off each other, highlighting each other's worst points to justify their own angry rhetoric, which becomes in turn a new example of bad behavior for the other side to highlight.

We would like to avoid these negative dynamics. Accordingly, we ask that you do not use this thread for waging the Culture War. Examples of waging the Culture War:

  • Shaming.

  • Attempting to 'build consensus' or enforce ideological conformity.

  • Making sweeping generalizations to vilify a group you dislike.

  • Recruiting for a cause.

  • Posting links that could be summarized as 'Boo outgroup!' Basically, if your content is 'Can you believe what Those People did this week?' then you should either refrain from posting, or do some very patient work to contextualize and/or steel-man the relevant viewpoint.

In general, you should argue to understand, not to win. This thread is not territory to be claimed by one group or another; indeed, the aim is to have many different viewpoints represented here. Thus, we also ask that you follow some guidelines:

  • Speak plainly. Avoid sarcasm and mockery. When disagreeing with someone, state your objections explicitly.

  • Be as precise and charitable as you can. Don't paraphrase unflatteringly.

  • Don't imply that someone said something they did not say, even if you think it follows from what they said.

  • Write like everyone is reading and you want them to be included in the discussion.

On an ad hoc basis, the mods will try to compile a list of the best posts/comments from the previous week, posted in Quality Contribution threads and archived at /r/TheThread. You may nominate a comment for this list by clicking on 'report' at the bottom of the post and typing 'Actually a quality contribution' as the report reason.


To vastly outclass humans in 'technological development, politics, economic productivity, war, and general capability' I think an AI would actually need to have an advantage in any given problem.

I'm not sure I understand why? There are many problems of the form 'reverse 100k iterations of SHA3', or 'what is the 1e10th digit of Chaitin's constant', or 'you are in a straitjacket in a river, don't be eaten by piranhas', and supersmart AIs probably can't solve those. But tech/politics/economics/war problems aren't like those! To an extent, it's just 'do what we're doing now, but better and faster'. The - well-trodden at this point - example is 'a society where the median person is as smart as John von Neumann'. It's obviously harder than just cloning him a bunch of times, but assume that society would also have a small fraction of people significantly smarter than JvN. Wouldn't that society have significant military / technological / political advantages over ours?
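The 'reverse 100k iterations of SHA3' case can be made concrete with a small sketch (illustrative only; the function name and round count are mine). Computing the chain forward is trivial; going backward means solving a hundred thousand preimage problems in a row, a difficulty that is cryptographic rather than a matter of cleverness:

```python
import hashlib

def iterate_sha3(seed: bytes, rounds: int) -> bytes:
    """Apply SHA3-256 repeatedly; each round scrambles away any
    usable structure from the previous input."""
    digest = seed
    for _ in range(rounds):
        digest = hashlib.sha3_256(digest).digest()
    return digest

# Forward direction: cheap, a fraction of a second.
out = iterate_sha3(b"some secret seed", 100_000)

# Backward direction: recovering the seed from `out` would require
# inverting 100,000 sequential preimage problems. No known shortcut
# exists, and it's not obvious that raw intelligence provides one.
print(out.hex())
```

The point of the example is just that some well-posed problems stay intractable no matter how smart the solver is, while war, politics, and economics don't appear to be in that class.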

Why is it always Von Neumann? Last I recall he was a mathematician and physicist, not a brilliant leader, warlord, or businessman who solved coordination problems.

Because Von Neumann could do shit like argue convincingly with an expert on Byzantine history after someone gave him a set of encyclopedias on the subject: he read the books once and could remember everything in them well enough to give the expert trouble. At dinner parties people would call out books, and he'd just start reciting them from memory until someone told him to stop. He didn't train to do any of this; it was just an incidental fact about his brain. He revolutionised a bunch of separate branches of mathematics and physics by the time he died at 53, and he wasn't some social recluse like Dirac - he was apparently quite socially adept.

I am entirely sure that if Von Neumann had tried his hand at business instead of being obsessed with physics (like a lot of smart people get), he'd have been one of the best businessmen of all time. Same for pretty much any other field. The anecdotes about him really do point to his brain being a completely unambiguous upgrade on the normal human brain, with basically no downsides.

An upgrade, perhaps, but it still looks to be an upgrade primarily in areas where he could just do his own thing, once allocated resources, and drop a breakthrough paper. Allocating resources sounds like the bigger problem to me when moving towards superhumanity.

So Von Neumann really was the original Nth-Dimensional Hyperbeing In A Skin Suit, and not John Carmack?

I mean, he did make crucial technical developments for the atomic bomb. That's sort of warlord-like.

To an extent, it's just 'do what we're doing now, but better and faster'.

If you want to do something 'better' or 'faster', you have to do it differently in some way from how it was being done before. If you are just doing the same thing the same old way, it won't be any better or faster. So an intelligence would have to make war, do politics, run economies, etc. in a different way than humans do, and it's not clear that "just be smarter bro" instantly unlocks those 'different' and scarily efficient ways of making war or doing politics.

It is difficult to answer this question empirically; the only real way to do so would be to look at historical conflicts, where it's far from clear that the 'smarter' side always wins - unless you define 'smarter' tautologically as 'the side that was able to win'.

If you want to do something 'better' or 'faster' you have to do it differently in some way from how it was being done before

Take 'theoretical mathematics' as an example. Progress there is, to some extent, made by 'smart people thinking, scribbling, and talking'. An AI that was just 'JvN and colleagues, but 1000x faster' could do exactly that, 1000x faster. Something similar applies to technology, economics, and war - the connection to the physical world means there's less of a speedup, but there's still some. An AI doesn't have to do anything that different from humans to reach in 50 years what humans would in 500.

progress there is, to some extent, made by 'smart people thinking, scribbling, and talking'.

A lot rides on "to what extent?" It's not clear to me that if you just 'sped up' the brains of top mathematicians or physicists by 10x or 50x or whatever it would actually cause a commensurate explosion in scientific breakthroughs.