Culture War Roundup for the week of October 14, 2024

This weekly roundup thread is intended for all culture war posts. 'Culture war' is vaguely defined, but it basically means controversial issues that fall along set tribal lines. Arguments over culture war issues generate a lot of heat and little light, and few deeply entrenched people ever change their minds. This thread is for voicing opinions and analyzing the state of the discussion while trying to optimize for light over heat.

Optimistically, we think that engaging with people you disagree with is worth your time, and so is being nice! Pessimistically, there are many dynamics that can lead discussions on Culture War topics to become unproductive. There's a human tendency to divide along tribal lines, praising your ingroup and vilifying your outgroup - and if you think you find it easy to criticize your ingroup, then it may be that your outgroup is not who you think it is. Extremists with opposing positions can feed off each other, highlighting each other's worst points to justify their own angry rhetoric, which becomes in turn a new example of bad behavior for the other side to highlight.

We would like to avoid these negative dynamics. Accordingly, we ask that you do not use this thread for waging the Culture War. Examples of waging the Culture War:

  • Shaming.

  • Attempting to 'build consensus' or enforce ideological conformity.

  • Making sweeping generalizations to vilify a group you dislike.

  • Recruiting for a cause.

  • Posting links that could be summarized as 'Boo outgroup!' Basically, if your content is 'Can you believe what Those People did this week?' then you should either refrain from posting, or do some very patient work to contextualize and/or steel-man the relevant viewpoint.

In general, you should argue to understand, not to win. This thread is not territory to be claimed by one group or another; indeed, the aim is to have many different viewpoints represented here. Thus, we also ask that you follow some guidelines:

  • Speak plainly. Avoid sarcasm and mockery. When disagreeing with someone, state your objections explicitly.

  • Be as precise and charitable as you can. Don't paraphrase unflatteringly.

  • Don't imply that someone said something they did not say, even if you think it follows from what they said.

  • Write like everyone is reading and you want them to be included in the discussion.

On an ad hoc basis, the mods will try to compile a list of the best posts/comments from the previous week, posted in Quality Contribution threads and archived at /r/TheThread. You may nominate a comment for this list by clicking on 'report' at the bottom of the post and typing 'Actually a quality contribution' as the report reason.


I think of intelligence like I think of processing power in a computer. Now below a certain level, if you don’t have enough, it’s going to be nearly impossible to do anything useful. I think there are several types: linguistic, mathematical, artistic, social. These can’t be used interchangeably, meaning I can’t use artistic intelligence to understand math or language, nor can I use mathematical intelligence to learn to write poetry. To my mind these sit atop a more general CPU that is needed for any type of thinking. I further think that we’re dealing with multiple genes in multiple places, which would complicate any sort of simple correlation with ethnicity. Until we know which genes exist in which population, it’s impossible to tell for sure.

I think of intelligence like I think of processing power in a computer. Now below a certain level, if you don’t have enough, it’s going to be nearly impossible to do anything useful.

The reason I can't quite use this analogy is that even if you have a slow computer, as long as it is Turing-complete, it CAN complete any given task you put before it, even if it takes literal centuries.
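
For what it's worth, the universality claim can be made concrete with a toy simulator. The sketch below is purely my own illustration: it runs an arbitrary one-tape Turing machine as a plain Python loop. The example machine only inverts a bit string, but the same loop would run any machine you feed it, however slowly.

```python
# A minimal one-tape Turing-machine simulator. The point: even this crude,
# slow model can in principle carry out any computation, given enough tape
# and enough steps.

def run_tm(tape, transitions, state="start", blank="_", max_steps=10_000):
    """transitions maps (state, symbol) -> (new_state, write_symbol, move),
    where move is -1 (left), +1 (right), or 0. Returns the final tape."""
    cells = dict(enumerate(tape))  # sparse tape: position -> symbol
    head = 0
    for _ in range(max_steps):
        if state == "halt":
            break
        symbol = cells.get(head, blank)
        state, write, move = transitions[(state, symbol)]
        cells[head] = write
        head += move
    return "".join(cells[i] for i in sorted(cells)).strip(blank)

# Example machine: walk right, flipping 0 <-> 1, halt at the first blank.
flip = {
    ("start", "0"): ("start", "1", +1),
    ("start", "1"): ("start", "0", +1),
    ("start", "_"): ("halt", "_", 0),
}

print(run_tm("10110", flip))  # -> 01001
```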

So being faster or slower to complete tasks is not quite the same as being able to handle more complex tasks. I sincerely believe there are problems that 150+ IQs can handle that are utterly beyond a 100 IQer, even if you gave the 100 specific, detailed instructions on how to complete them and gave them years to work without interference. Maybe a team of cooperative 100s, at least capable of delegating tasks and getting along, could manage it.

So there are other bottlenecks. "Working memory" is probably the big one. I think extremely high IQ people are also defined by being able to fit a LOT more information in their working memory, and thus can bring all those mental resources to bear at once, rather than having to painstakingly write everything out and do each individual mental calculation one at a time.

So perhaps add in RAM to the equation. If you can't fit the majority of the problem in your head, at least big enough chunks of it to make progress, then you'll find yourself unable to ever solve it.

Side note, this is often how I feel most constrained when faced with complex problems. I can't actually 'visualize' the problem in my head because trying to load all the details in ends up pushing some parts out, and I can compensate by writing out bits of info, but this always slows me down substantially.
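
The RAM analogy actually maps onto how computers cope when data doesn't fit in memory: external sorting. Here's a rough sketch (function names are mine, purely illustrative) of the same tradeoff — sort what fits in "working memory", write each sorted chunk out, then stream-merge the chunks.

```python
# "Working memory as RAM": if a list fits in memory you sort it in one pass;
# if only `capacity` items fit at once, you sort chunk by chunk ("write each
# piece down") and then stream-merge. Same answer, more passes over the data.

import heapq

def sort_with_limited_memory(items, capacity):
    chunks = []
    for i in range(0, len(items), capacity):
        chunks.append(sorted(items[i:i + capacity]))  # each chunk fits in "RAM"
    return list(heapq.merge(*chunks))  # merge holds ~one item per chunk

data = [5, 3, 8, 1, 9, 2, 7, 4, 6]
print(sort_with_limited_memory(data, capacity=3))  # -> [1, 2, 3, ..., 9]
```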

even if you have a slow computer, as long as it is Turing-complete

Here's the thing, though. No real-world computer is Turing-complete. They all have finite storage and thus fail the infinite-tape requirement. For an obvious example, try running, e.g., Stable Diffusion on an early-90s PC: you simply can't, because it doesn't have enough storage for the model and results, even if you allowed infinite time.
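
To make the finite-storage point concrete, here is a toy "computer" with a fixed-size scratch tape (entirely my own illustration): a task whose data outgrows the tape can never finish, no matter how much time you allow.

```python
# A machine with bounded storage: some tasks fail not for lack of time
# but for lack of space.

def double_string(s, tape_size):
    """Try to build s + s on a scratch tape of fixed size."""
    tape = []
    for ch in s + s:
        if len(tape) >= tape_size:
            raise MemoryError("out of tape")
        tape.append(ch)
    return "".join(tape)

print(double_string("ab", tape_size=4))  # fits: abab
try:
    double_string("abc", tape_size=4)    # needs 6 cells, has 4
except MemoryError as err:
    print(err)                           # -> out of tape
```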

Reminds me of the assertion that a 2004-2006 research supercomputer probably could have trained GPT-3.

I would be careful associating working memory with the brain's ability to actively model complex problems. The latter is a conscious process, while the former is unconscious. An 80 IQ person can, with pen and paper, rotate any shape or model any system given enough time, or calculate out a 6-move chess sequence that Magnus Carlsen could perform in seconds mentally, but he could never have the spontaneous causal associations in his mind that naturally occur to more intelligent people. The lack of this faculty, and exclusively this, is what precludes low IQ people from complex things. This is why the computer analogy is weak. And why low IQ civilizations just can't get it together. If it were only a matter of processing power, nothing would stop them from busting out the compasses and graph paper. But intelligence is really a phenomenon of the subconscious, of the brain noticing a pattern and showing this to the conscious mind. For that reason it can never be taught or compensated for.

While I agree with you, I think "busting out the compasses and graph paper" is what science is. We've reached the limits of what we can do in our minds, so now we do mathematics on paper. This allows us to calculate things that we cannot wrap our heads around (try visualizing infinite-dimensional spaces, for instance).

One thing I have noticed that less intelligent people do is solving the same problem over and over again. Even culture wars are like this. "X people are discriminated against, and it's totally not their fault, so we need to give special rights to X group to prevent this, as it's only fair to escalate their power/position in society". Even as a teenager I generalized this problem to every related problem which could exist, but somehow society still sees a sense of novelty in "We are X, we are victims, give us power or you're bad and support bad things"

Edit: My point is that, even with pens and stacks of paper, stupid people cannot generalize or reach the levels of abstraction which give them the advantages of space. Space is really powerful though, even more than Time (which is probably why PSPACE and EXPSPACE contain PTIME and EXPTIME respectively; not that I actually know complexity theory).
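
One concrete sense in which space is "more powerful" than time: space can be reused and time cannot. This toy brute-force counter (my own example, not anything from complexity theory proper) visits all 2**n assignments of n booleans — exponential time — yet its recursion never holds more than one length-n partial assignment, so it uses only linear space, reused over and over.

```python
def count_true(f, n, assignment=()):
    """Count the assignments of n booleans for which predicate f is true."""
    if len(assignment) == n:
        return 1 if f(assignment) else 0
    # Explore the False branch, then reuse the same space for the True branch.
    return (count_true(f, n, assignment + (False,))
            + count_true(f, n, assignment + (True,)))

# Example: 4-bit assignments with an even number of True bits.
print(count_true(lambda a: sum(a) % 2 == 0, 4))  # -> 8
```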

Not sure if I'm agreeing or disagreeing, but consider that writing something out is not the equivalent of having it in your working memory. Although human language is very rich, if we consider writing out a problem to be the equivalent of forcing some million-parameter vector in latent space into a sentence of Unicode text, then there's likely to be a huge loss of information/nuance that we can't perceive consciously. It may be that the ability to hold slightly larger/more concepts in your mind is responsible for the spontaneous causal associations you describe.

Working memory is a passive process; it's not what we use to consciously model things. Not sure what we'd call the modeling area of the brain; I've heard "sensorium" used.

It may be that the ability to hold slightly larger/more concepts in your mind is responsible for the spontaneous causal associations you describe.

It's a bit of a mystery, really. All we know for sure is that working memory, modeling ability, and intelligence are strongly correlated. When you and I say modeling ability we're probably thinking about shape rotation or figures and so on, but I believe each form of intelligence has its own type of modeling ability, accompanied by a strong working memory (at least in that field). So I suppose there's no knowing which is the 'essential' component; the two always occur simultaneously.