This weekly roundup thread is intended for all culture war posts. 'Culture war' is vaguely defined, but it basically means controversial issues that fall along set tribal lines. Arguments over culture war issues generate a lot of heat and little light, and few deeply entrenched people ever change their minds. This thread is for voicing opinions and analyzing the state of the discussion while trying to optimize for light over heat.
Optimistically, we think that engaging with people you disagree with is worth your time, and so is being nice! Pessimistically, there are many dynamics that can lead discussions on Culture War topics to become unproductive. There's a human tendency to divide along tribal lines, praising your ingroup and vilifying your outgroup - and if you think you find it easy to criticize your ingroup, then it may be that your outgroup is not who you think it is. Extremists with opposing positions can feed off each other, highlighting each other's worst points to justify their own angry rhetoric, which becomes in turn a new example of bad behavior for the other side to highlight.
We would like to avoid these negative dynamics. Accordingly, we ask that you do not use this thread for waging the Culture War. Examples of waging the Culture War:
- Shaming.
- Attempting to 'build consensus' or enforce ideological conformity.
- Making sweeping generalizations to vilify a group you dislike.
- Recruiting for a cause.
- Posting links that could be summarized as 'Boo outgroup!' Basically, if your content is 'Can you believe what Those People did this week?' then you should either refrain from posting, or do some very patient work to contextualize and/or steel-man the relevant viewpoint.
In general, you should argue to understand, not to win. This thread is not territory to be claimed by one group or another; indeed, the aim is to have many different viewpoints represented here. Thus, we also ask that you follow some guidelines:
- Speak plainly. Avoid sarcasm and mockery. When disagreeing with someone, state your objections explicitly.
- Be as precise and charitable as you can. Don't paraphrase unflatteringly.
- Don't imply that someone said something they did not say, even if you think it follows from what they said.
- Write like everyone is reading and you want them to be included in the discussion.
On an ad hoc basis, the mods will try to compile a list of the best posts/comments from the previous week, posted in Quality Contribution threads and archived at /r/TheThread. You may nominate a comment for this list by clicking on 'report' at the bottom of the post and typing 'Actually a quality contribution' as the report reason.
What is the point of having that discussion here? I am one of the "little people" / NPCs; the options available to us are quite minimal. No compounds to buy, no ancient Japanese villages to pour money into. I am debating whether it would make sense for me to buy a Japanese car.
The mottizens who have the resources to meaningfully prepare probably have better networks than this forum. I am puzzled: if your well-connected portfolio manager buddies are prepping, why come to this internet forum for a second opinion? A flex? Doomerism for doomerism's sake?
Some people just want to talk? There aren't that many places where people discuss AI x-risk, at least not with our standards of discourse.
This isn't LessWrong, where the majority of people believe that this is the most likely outcome. We've got everything from AI skeptics to true believers and Doomers.
I know I've discussed my beliefs here plenty of times, and I appreciate the debate that ensues. I want to not have to worry about the future, but not to the degree that I would dismiss my very real concerns for the sake of peace of mind. Having to defend them against skeptics is good epistemic practice.
Forgive me if I'm misremembering @2rafa's previous stance, but she used to be far more skeptical of such concerns, including AI-induced mass unemployment. I suppose the evidence has only mounted, and there's something to be said for seeing many of your peers, namely rich, intelligent professionals, begin to take things seriously and buckle down for the wild ride.
Correct or not, seeing billionaires and politicians talking about it did things that no amount of Yudkowsky did.
The idea that people with lots of resources are better positioned to find ways to prepare is off. It's not like advisors at the family office have any particular insight into AI. They have been selected for basic competency and controlled risk management, not predicting radical step changes in the world. If they fail to predict ruin from AI, they'll have lots of good company; if they stick their neck out on AI predictions and fail, they'll face much worse consequences. At most, they'll say "this AI thing seems important, let's reallocate your portfolio to include more IBM."
With potential AGI, no one has a solid understanding of what will happen. In those situations, mainstream opinion sources default to status quo bias, which is about the worst thing to do. Weird randos on obscure Internet forums at least offer the potential for some variance.
While controversial in certain spheres, richer people tend to be smarter and better educated.
All else being equal, the opinions of someone who fits that bill are worth more than those of someone who doesn't have their shit together.
I agree with much of your comment, but keep in mind that when you're already rich and powerful, a lot of the usual downsides of risky plays become minimal. The upsides here are things like potentially making out like a gangster, outperforming the competition that relies on Mk 1.0 humans, and so on. (I know you've said something similar downthread, I'm elaborating, not contesting this bit).
I don’t think they have more insight but having more wealth means that you have the ability to retool when your industry goes AI. You can save to FIRE when it happens, you can go back to school, you can start a business, and so on. Poorer people can’t do that stuff and thus when AI takes those jobs, they’ll have very few options.
This is silly. Yes, more resources mean more capacity to misallocate them, but it's better than not having them.
If we get paperclips or fully automated luxury gay space communism, all the money in the world will do you no good, but there are a lot of other possible scenarios.
There's communism, and there's communism, even holding the fully automated luxury bit equal.
I can easily see the trajectory of our civilization leading to a situation where everyone is incredibly wealthy and happy by modern standards, but some people, by virtue of having more starting capital to invest into the snowball, own star systems while others make do with their own little space habitat. I'd consider this a great outcome, all considered. Some might even say that by the standards of the past, much of the world is already here.
My point was different from the one you're responding to: I was addressing the idea that people of means have access to special networks of information. Money gives optionality, which is an undisputed good, but it doesn't give special access to information about how to prepare for AI.