This weekly roundup thread is intended for all culture war posts. 'Culture war' is vaguely defined, but it basically means controversial issues that fall along set tribal lines. Arguments over culture war issues generate a lot of heat and little light, and few deeply entrenched people ever change their minds. This thread is for voicing opinions and analyzing the state of the discussion while trying to optimize for light over heat.
Optimistically, we think that engaging with people you disagree with is worth your time, and so is being nice! Pessimistically, there are many dynamics that can lead discussions on Culture War topics to become unproductive. There's a human tendency to divide along tribal lines, praising your ingroup and vilifying your outgroup - and if you think you find it easy to criticize your ingroup, then it may be that your outgroup is not who you think it is. Extremists with opposing positions can feed off each other, highlighting each other's worst points to justify their own angry rhetoric, which becomes in turn a new example of bad behavior for the other side to highlight.
We would like to avoid these negative dynamics. Accordingly, we ask that you do not use this thread for waging the Culture War. Examples of waging the Culture War:
- Shaming.
- Attempting to 'build consensus' or enforce ideological conformity.
- Making sweeping generalizations to vilify a group you dislike.
- Recruiting for a cause.
- Posting links that could be summarized as 'Boo outgroup!' Basically, if your content is 'Can you believe what Those People did this week?' then you should either refrain from posting, or do some very patient work to contextualize and/or steel-man the relevant viewpoint.
In general, you should argue to understand, not to win. This thread is not territory to be claimed by one group or another; indeed, the aim is to have many different viewpoints represented here. Thus, we also ask that you follow some guidelines:
- Speak plainly. Avoid sarcasm and mockery. When disagreeing with someone, state your objections explicitly.
- Be as precise and charitable as you can. Don't paraphrase unflatteringly.
- Don't imply that someone said something they did not say, even if you think it follows from what they said.
- Write like everyone is reading and you want them to be included in the discussion.
On an ad hoc basis, the mods will try to compile a list of the best posts/comments from the previous week, posted in Quality Contribution threads and archived at /r/TheThread. You may nominate a comment for this list by clicking on 'report' at the bottom of the post and typing 'Actually a quality contribution' as the report reason.
Knowing about things that might transform your life or society as a whole is important? Even if there's nothing you can do about the trend of AI as a whole, whatever smaller-scale goals you have are surely impacted, so knowing about it is important. And more generally, the fact that the forum was capable of demonstrating to you, viscerally, that AI will transform the world suggests it might inform you of similarly important things in the future.
I just don't think that's very true.
Because the effects it will have are going to be relatively unpredictable, and your choices trying to respond to every little development have the potential to make things worse for you.
It's like trying to 'time the market' and day-trade versus just sticking with a long-term investment strategy.
There are almost certainly diminishing returns to becoming deeply informed about [current thing].
For instance, if you're a woman who makes her living spinning fabric and selling it, knowing about the 'industrial revolution' or 'factory production of cloth' is incredibly relevant. Knowing about it three years earlier seems very useful. That "long-term investment strategy" of continuing to spin fabric to feed your kids doesn't work.
You say there are diminishing returns to being deeply informed, but without being deeply informed, you might just stay confused. It's difficult to know beforehand what the 'big things' will be. A lot of people were not, three years ago, sure that AI would be a 'big thing' in five years, even though they probably saw something about 'neural networks identify cats in youtube videos' in a news headline. And today, most people still don't really care. So if you just 'read the headlines once every few months', maybe you'll hear about ChatGPT as a cool thing your young friend plays with, and write it off as something that doesn't matter. Maybe ten years ago, you could've trained to be an ML engineer or a lesswrong alignment person or something.
Yeah, it'll make you sad to think about too much, or something, but ... humans being obviated in all aspects of life is, at least potentially, sad, right? Being sad isn't an unconditional bad! It's being aware that something not-good is happening. Consider: we could easily, by tweaking a few dozen/hundred genes, not feel any sadness after a family member dies - and, yeah, at that point it's too late to do anything - so why feel sad? Would that be good?
What does she do with the information?
Develop another skillset... which is ALSO going to be disrupted in short order?
How does she act when, knowing that the change is coming, she still can't tell what the second order impacts might be?
That's my point. Knowing about the coming change is perhaps useful, but how much information must one obsessively seek out in order to make a good decision with that information? And how much time should one spend before it is counterproductive?
For instance, I'm pretty sure AI is coming for my job inside of ~~10~~ ~~5~~ 2 years. But how in the hell can I predict which jobs are going to be 'safe' with any precision?
So basically, I've done the best I can by buying stocks in companies that might take off due to AI development, and I'm preparing myself to jump when the inflection point arises.
But I am not obsessively churning through AI news to try and predict the outcomes.
... In that non-hypothetical historical situation, yes, you develop another skillset. And that skillset won't 'also be disrupted in short order', given we're hundreds of years later and plenty of people hold occupations for decades. But, given the primary occupations aren't 'farm / household laborer' anymore, every single person eventually retrained, whether because they saw the way the wind was blowing or because the price of their labor went to zero.
It's tough! But "plumber" or "doctor" are better jobs than "copyeditor" or "commodity artist", I think.
I'm not so sure about "doctor":
https://www.cnbc.com/2023/03/14/googles-medical-ai-might-soon-answer-questions-about-health.html
Surgeon, maybe.
And while I agree about "plumber", I'm no longer very confident in my own predictions, so I wouldn't be too surprised if we get "PipeGPT" sometime this year.