This weekly roundup thread is intended for all culture war posts. 'Culture war' is vaguely defined, but it basically means controversial issues that fall along set tribal lines. Arguments over culture war issues generate a lot of heat and little light, and few deeply entrenched people ever change their minds. This thread is for voicing opinions and analyzing the state of the discussion while trying to optimize for light over heat.
Optimistically, we think that engaging with people you disagree with is worth your time, and so is being nice! Pessimistically, there are many dynamics that can lead discussions on Culture War topics to become unproductive. There's a human tendency to divide along tribal lines, praising your ingroup and vilifying your outgroup - and if you think you find it easy to criticize your ingroup, then it may be that your outgroup is not who you think it is. Extremists with opposing positions can feed off each other, highlighting each other's worst points to justify their own angry rhetoric, which becomes in turn a new example of bad behavior for the other side to highlight.
We would like to avoid these negative dynamics. Accordingly, we ask that you do not use this thread for waging the Culture War. Examples of waging the Culture War:
- Shaming.
- Attempting to 'build consensus' or enforce ideological conformity.
- Making sweeping generalizations to vilify a group you dislike.
- Recruiting for a cause.
- Posting links that could be summarized as 'Boo outgroup!' Basically, if your content is 'Can you believe what Those People did this week?' then you should either refrain from posting, or do some very patient work to contextualize and/or steel-man the relevant viewpoint.
In general, you should argue to understand, not to win. This thread is not territory to be claimed by one group or another; indeed, the aim is to have many different viewpoints represented here. Thus, we also ask that you follow some guidelines:
- Speak plainly. Avoid sarcasm and mockery. When disagreeing with someone, state your objections explicitly.
- Be as precise and charitable as you can. Don't paraphrase unflatteringly.
- Don't imply that someone said something they did not say, even if you think it follows from what they said.
- Write like everyone is reading and you want them to be included in the discussion.
On an ad hoc basis, the mods will try to compile a list of the best posts/comments from the previous week, posted in Quality Contribution threads and archived at /r/TheThread. You may nominate a comment for this list by clicking on 'report' at the bottom of the post and typing 'Actually a quality contribution' as the report reason.
"AI will automate/do tasks A,B,C easily...here is an example of said task...major economic disruption promised, jobs will be destroyed!"
Unless the AI solution is 100% intuitive, somewhere in this chain a human has to teach another human how to use it, or has to learn how to input stuff into the AI, etc. Sure, AI can produce amazing outputs, but this depends on what you input. Too much hype focuses on the output, ignoring the input side. It's like making a similar claim in the 60s about computers and spreadsheets eliminating accountant jobs. Rather than eliminating those jobs, it led to the entire software/programming industry, of people who specialize in how to make inputs into computers, which is still a hard enough job that those who can do it well are among the highest paid in the world. Not just coders, but people who write guides/books on coding.
You’re missing the speed of growth. We’re seeing what in previous generations was a decade or two of change happen in weeks. The reason those industries still exist as viable careers is that the technology didn’t adapt to human needs, nor did it adapt to taking on more of the work. GPT has learned — in a week — to assign other AIs to help it, and to solve a problem it had never seen before. If that’s true, the pace will be too quick for humans to insert themselves.
Absolutely humans will be necessary in the near and even medium term. But there’s no question that this will 5x the production of many jobs and 10x or 100x others. This is a foolish objection because even if there is a human in the loop there will be a need for far fewer humans than there currently are.
Why? Why not just use those productivity gains to make more things?
I swear I rarely see AI doomers actually engage with the Schumpeterian argument on its terms. If you agree that humans are still in the loop at all, then how is this not creative destruction?
Is the idea that we'd run out of material resources faster than humans? Why not use the gains to get to space and get more, then?
I’m not a doomer, to be clear. I do believe the coming AI framework is creative destruction. That being said, it will happen fast. I think it’ll be too fast for our society to adapt without massive unrest.
During the industrial revolution there was also massive unrest, as a downthread discussion on anarchy went into. This revolution, being digital and software-based, will likely happen far faster than the industrial revolution. You don’t think that will lead to massive societal issues?
I see. That's indeed a much more agreeable position. It's hard to argue that fast change isn't a generator of unrest.
Any ideas for how to address it? Butlerian jihad?
We've done (something like it) before so the blueprints are all there for us: WWII scarcity programs. Rations, 'victory gardens,' etc.
Bit trickier now that everyone is living in an apartment building with 1000 strangers speaking different languages, but in principle...
Have the government heavily regulate AI and create corrupt monopolies that will massively stifle innovation?
Say the word "flexicurity" a lot whilst doing barely anything significant to curb the problem?
Have a big economic crisis and world war?
So many possibilities. I do like the idea of smashing the machines though.
Maybe all of the above.