Culture War Roundup for the week of September 23, 2024

This weekly roundup thread is intended for all culture war posts. 'Culture war' is vaguely defined, but it basically means controversial issues that fall along set tribal lines. Arguments over culture war issues generate a lot of heat and little light, and few deeply entrenched people ever change their minds. This thread is for voicing opinions and analyzing the state of the discussion while trying to optimize for light over heat.

Optimistically, we think that engaging with people you disagree with is worth your time, and so is being nice! Pessimistically, there are many dynamics that can lead discussions on Culture War topics to become unproductive. There's a human tendency to divide along tribal lines, praising your ingroup and vilifying your outgroup - and if you think you find it easy to criticize your ingroup, then it may be that your outgroup is not who you think it is. Extremists with opposing positions can feed off each other, highlighting each other's worst points to justify their own angry rhetoric, which becomes in turn a new example of bad behavior for the other side to highlight.

We would like to avoid these negative dynamics. Accordingly, we ask that you do not use this thread for waging the Culture War. Examples of waging the Culture War:

  • Shaming.

  • Attempting to 'build consensus' or enforce ideological conformity.

  • Making sweeping generalizations to vilify a group you dislike.

  • Recruiting for a cause.

  • Posting links that could be summarized as 'Boo outgroup!' Basically, if your content is 'Can you believe what Those People did this week?' then you should either refrain from posting, or do some very patient work to contextualize and/or steel-man the relevant viewpoint.

In general, you should argue to understand, not to win. This thread is not territory to be claimed by one group or another; indeed, the aim is to have many different viewpoints represented here. Thus, we also ask that you follow some guidelines:

  • Speak plainly. Avoid sarcasm and mockery. When disagreeing with someone, state your objections explicitly.

  • Be as precise and charitable as you can. Don't paraphrase unflatteringly.

  • Don't imply that someone said something they did not say, even if you think it follows from what they said.

  • Write like everyone is reading and you want them to be included in the discussion.

On an ad hoc basis, the mods will try to compile a list of the best posts/comments from the previous week, posted in Quality Contribution threads and archived at /r/TheThread. You may nominate a comment for this list by clicking on 'report' at the bottom of the post and typing 'Actually a quality contribution' as the report reason.

Surely there could be a point where technology advances enough that computers do everything better, no?

Currently, computers are better at chess than humans. Still, nobody wants to watch the computer world championship and many people want to watch the human world championship. In some jobs it's not just about being better. Maybe more such jobs will exist in the future?

nobody wants to watch the computer world championship and many people want to watch the human world championship

Yes, because Deep Blue is never going to open with Bongcloud.

That’s like at most 1% of jobs.

Sure. And at that point we are discussing hypothetical scifi futures. Like in Accelerando when the Hello Kitty artificial intelligence explains to newly created people that things like monster trucks are free and they can have as many as they want.

But I'm not very concerned about all human labor being made irrelevant soon. Maybe some portion of it. And that won't be very comfortable for some people. Like English clothing makers when machine looms were first introduced. A hard time, but society did not collapse or suffer permanent unemployment. They only had to slaughter a small number of people to stop them from destroying clothing factories. And clothes are now a tiny fraction of the cost. I'd say a clear net good. I'm hoping that when HR drones are replaced with software, we can figure out how to deal with them more peacefully than British soldiers dealt with the Luddites. I have been told that Excel put most accountants out of business and we navigated that without bloodshed or social upheaval.

This is the midwit argument.

A better argument is that AI will create an even more extreme power law distribution of returns to human capital and cognitive performance. You'll see software firms that used to need hundreds of developers to work on various pieces of the codebase turn into 10 elite engineers plus their own hand-crafted code LLM. That same firm used to have 100+ salespeople to cover various territories; now it just has a single LLM on 24/7 that can answer all the questions of prospects and only turns them over to an elite sales team of 10 once they reach a qualified position.

All of a sudden, we're at 30%+ unemployment because the marginal utility of the bottom 30% of cognitive performers is literally negative. It's not that they can't do anything, it's that whenever anyone thinks of something for them to do, there's an LLM on the way already.

I think we're actually starting to see this already. Anecdotally, I'm hearing that junior devs are having a really hard time getting jobs because a lot of what they used to do really is 90% handled by an LLM. Senior devs, especially those who can architect out whole systems, are just fine.

The AI doom scenario isn't paperclips or co-opted nukes, it's an economic shock to an already fragile political system that crashes the whole thing and we decide to start killing each other. To be clear, I still think that that scenario is very, very unlikely, but "killer robot overlords" is 100% Sci-Fi.

Are there really swarms of "junior devs" out there writing code so menial that their whole job can be replaced by an LLM? This is just totally discordant with my experience. Back when I started, they threw an active codebase at me and expected me to start making effective changes to a living system from the get-go. Sure, it wasn't "architecting whole systems," but there is no way you could type the description of the first intern project I built years ago into an LLM and get anything resembling the final product out.

These systems that claim to write code just aren't there. Type in simple code questions and you get decent answers, sure. They perform well on the kind of thing that appears in homework problems, probably because they're trained on homework problems. But the moment I took one slightly off the beaten path and asked it how to do some slightly more advanced queries in a database I wasn't familiar with, the thing just spat out a huge amount of plausible but totally incorrect information. I actually believed it and was pretty confused later that day when the code I wrote based on the LLM's guidance simply did not work. So I am incredulous that there is really any person doing a job out there who could be replaced by this type of program.

The junior devs graduating college over the past 5 years are drastically less capable than before. There are fully diploma'd CS majors who do not understand basic data structures. Yes, this is a problem.

Are there really swarms of "junior devs" out there writing code so menial that their whole job can be replaced by an LLM?

Yes, or close to it. Used to be stack overflow was full of them trying to get real devs to do their work for them.

Thanks for the kind words.

Yes, that is one possibility (i.e., the tech advances enough that it kills some but not all jobs, so those at the top become uber-rich and those at the bottom get UBI). Of course, that ignores the possibility that the situation you describe is a midpoint, not the end.