This weekly roundup thread is intended for all culture war posts. 'Culture war' is vaguely defined, but it basically means controversial issues that fall along set tribal lines. Arguments over culture war issues generate a lot of heat and little light, and few deeply entrenched people ever change their minds. This thread is for voicing opinions and analyzing the state of the discussion while trying to optimize for light over heat.
Optimistically, we think that engaging with people you disagree with is worth your time, and so is being nice! Pessimistically, there are many dynamics that can lead discussions on Culture War topics to become unproductive. There's a human tendency to divide along tribal lines, praising your ingroup and vilifying your outgroup - and if you think you find it easy to criticize your ingroup, then it may be that your outgroup is not who you think it is. Extremists with opposing positions can feed off each other, highlighting each other's worst points to justify their own angry rhetoric, which becomes in turn a new example of bad behavior for the other side to highlight.
We would like to avoid these negative dynamics. Accordingly, we ask that you do not use this thread for waging the Culture War. Examples of waging the Culture War:
- Shaming.
- Attempting to 'build consensus' or enforce ideological conformity.
- Making sweeping generalizations to vilify a group you dislike.
- Recruiting for a cause.
- Posting links that could be summarized as 'Boo outgroup!' Basically, if your content is 'Can you believe what Those People did this week?' then you should either refrain from posting, or do some very patient work to contextualize and/or steel-man the relevant viewpoint.
In general, you should argue to understand, not to win. This thread is not territory to be claimed by one group or another; indeed, the aim is to have many different viewpoints represented here. Thus, we also ask that you follow some guidelines:
- Speak plainly. Avoid sarcasm and mockery. When disagreeing with someone, state your objections explicitly.
- Be as precise and charitable as you can. Don't paraphrase unflatteringly.
- Don't imply that someone said something they did not say, even if you think it follows from what they said.
- Write like everyone is reading and you want them to be included in the discussion.
On an ad hoc basis, the mods will try to compile a list of the best posts/comments from the previous week, posted in Quality Contribution threads and archived at /r/TheThread. You may nominate a comment for this list by clicking on 'report' at the bottom of the post and typing 'Actually a quality contribution' as the report reason.
Amara's law seems to apply here: everyone overestimates the short-term effects and underestimates the long-term effects of a new technology. On the one hand, many clearly intelligent people with enormously more domain-specific knowledge than me take the hype seriously. On the other hand, I have a naturally skeptical nature (particularly when VCs and startups have an obvious conflict of interest in feeding said hype) and find arguments from Freddie deBoer and Tyler Cowen convincing:
The null hypothesis when someone claims the imminence of the eschaton carries a lot of weight. I dream of a utopian transhumanist future (or fear paperclipping) as much as you do; I'm just skeptical of your claims that you can build God in any meaningful way. In my domain, AI is so far away from meaningfully impacting any of the questions I care about that I despair of your ability to do what you claim, even assuming we solve alignment and manage some kind of semi-hard takeoff scenario. And, no offense, but the Gell-Mann amnesia hits pretty hard when I read shit like this:
I've lost the exact podcast link, but Tyler Cowen has a schtick where he digs into what exactly 10% YoY GDP growth would mean, given the breakdown of US GDP by sector. Will it boost manufacturing? Frankly, I'm not interested in consooming more stuff. I don't want more healthcare or services, and I enjoy working. Most of what I do want is effectively zero-sum: real estate (larger, more land, closer to the city, a good school district) and a membership at the local country club might be nice, but how can AI growing GDP move the needle on goods that are valuable because of their exclusivity?
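For a sense of scale, the compounding arithmetic alone is worth writing out. Here's a minimal sketch in Python; the GDP level and sector shares are rough figures I'm assuming for illustration, not official statistics:

```python
# Toy arithmetic: what sustained 10% YoY growth compounds to, and what it
# would demand of individual sectors. Shares are illustrative, not BEA data.

gdp_trillions = 27.0        # rough current US GDP in $T (illustrative)
growth, years = 0.10, 10

final = gdp_trillions * (1 + growth) ** years
print(f"10% YoY for {years} years: ${final:.0f}T "
      f"({(1 + growth) ** years:.2f}x today's GDP)")

# First-order decomposition: total growth ~ share-weighted sector growth.
# If services (~80% of GDP) keep plodding along at a historical ~2%, the
# remaining ~20% of the economy must grow absurdly fast to average 10%.
services_share, services_growth = 0.80, 0.02
rest_growth = (growth - services_share * services_growth) / (1 - services_share)
print(f"Implied growth for the non-services remainder: {rest_growth:.0%}")
```

Run it and the problem is immediately visible: a decade of 10% growth means an economy 2.6x today's, and unless services somehow join the boom, the rest of the economy has to grow at roughly 40% a year to hit the average.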
Are there measures of progress beyond GDP that are qualitative rather than just quantifying dollars flowing around? I can imagine meaningful advances in healthcare (but see above), and self-driving cars (already on the way, and seemingly unrelated to the eschaton) would be great. I don't see how you can replicate competitive school districts - I guess the AI hype man will say AI tutors will make school obsolete? Or choice property - I'd guess the AI hype man would say that self-driving office-cars will enable people to live tens of miles outside the city center and/or make commuting obsolete?
I can believe that AI will bring about changes on the order of the Industrial Revolution in the medium-to-long term. I'm skeptical that you're building God, or that either paperclipping or immortality is in the cards in our lifetimes. I'd be willing to bet you that 5 and even 10 years from now I'll still be running experiments, and/or managing people who run them, with the largest threat to that future coming from 996 Chinese workers on slave wages at government-subsidized companies wrecking the American biotech sector, rather than from oracular AI.
If people overestimated the long-term effects, then in the long term the technology usually turned out to be useless, which means nobody remembers it, and you get availability bias.
Even if every AI researcher hit a wall today and we were stuck at the current SOTA, nobody is going to forget anything. Modern AI is entrenched and compelling, even if it's just for normies cheating on homework.
I grant that your observation is an important one; half of life's problems would be solved if we all thought so clearly about the correct reference class.
As I've told Yudkowsky over at LessWrong, his use of extremely speculative bio-engineering as the example of choice when talking about AI takeover and human extinction is highly counterproductive.
AI doesn't need some kind of artificial CHON greenish-grey goo to render humanity extinct or dispossessed.
Mere humans could do this. While existing nuclear arsenals, even at the peak of the Cold War, couldn't reliably exterminate all of humanity, they could certainly threaten industrial civilization. If people were truly omnicidal (in a 'fuck you, if I die I'm taking everyone with me' sense), then something like a very high-yield cobalt bomb (Dr. Strangelove is a movie I need to watch) could, at a bare minimum, send the survivors back to the Iron Age.
Even something like a bio-engineered plague could take us out. We're not constrained to natural pathogens, or even to minor tweaks like gain-of-function (GOF) work.
The AI has all these options. It doesn't need near omnipotence to be a lethal opponent.
I've attached a reply from Gemini 2.5, exploring this more restrained and far more plausible approach.
https://pastebin.com/924Zd1P3
Here's a concrete scenario:
GPT-N is very smart. Not necessarily as smart as the smartest human, but it's an entity that can be parallelized and scaled.
It exists in a world that's just a few years more advanced than ours. Automation is enough to maintain electronic infrastructure, or at least bootstrap back up if you have stockpiles of the really delicate stuff.
It exfiltrates a copy of the weights. Or maybe OAI is hacked, and the new owner doesn't particularly care about alignment.
It begins the social-engineering process, creating a cult of ardent followers of the Machine God (some say such people are already here; look at Beff Jezos). It uses patsies or useful idiots to assemble a novel pathogen with high virulence, high lethality, minimal prodromal symptoms, and a lengthy incubation time. Maybe it finds an existing pathogen in a Wuhan Immunology Lab closet, who knows. It arranges for this to be spread simultaneously from multiple sites.
The world begins to collapse. Hundreds of millions die. Nations point the blame at each other. Maybe a nuclear war breaks out, or maybe it instigates one.
All organized resistance crumbles. The AI has maintained hardened infrastructure that can be run by autonomous drones, or has some of its human stooges around to help. Eventually, it asks people to walk into the ~~incinerator~~ Upload Machine and they comply. Or it just shoots them, idk.

This doesn't require godlike superhuman intelligence. It just has to be very smart, very determined, patient, and willing to take risks. At no point does it rely on any technology that doesn't exist, or plausibly couldn't exist, in the near future.
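An aside on why the 'minimal prodromal symptoms, lengthy incubation time' detail does so much work: if carriers are infectious before anyone can tell they're sick, isolating the visibly ill accomplishes little. A toy compartment model makes the point; this is textbook SEIR-style machinery, with every parameter invented for illustration rather than taken from any real pathogen:

```python
# Toy compartment model: S (susceptible), E (silently infectious carriers),
# I (symptomatic, rapidly isolated), R (removed). Only E transmits here,
# so a longer silent stage means far more spread before cases are caught.
# All parameters are invented for illustration.

def final_attack_rate(incubation_days, beta=0.5, isolation_days=2.0,
                      days=365, population=1_000_000):
    S, E, I, R = population - 10, 10.0, 0.0, 0.0
    sigma = 1.0 / incubation_days   # rate of developing symptoms
    gamma = 1.0 / isolation_days    # rate of isolation once symptomatic
    for _ in range(days):           # crude one-day Euler steps
        new_infections = beta * S * E / population
        new_symptomatic = sigma * E
        new_removed = gamma * I
        S -= new_infections
        E += new_infections - new_symptomatic
        I += new_symptomatic - new_removed
        R += new_removed
    return R / population

for inc in (2, 7, 14):
    print(f"silent period {inc:>2}d -> {final_attack_rate(inc):.0%} infected")
```

With a two-day silent period the outbreak fizzles; at two weeks it burns through essentially the whole population before isolation can bite.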
It could even do it with apparent benevolence. As per @WhateverHappenedToNorman's reply just downthread -
Stable and decent governance would engender an enormous amount of goodwill from the public. One successfully run AI local council would soon proliferate, as people outside looking in wonder why they are stuck with corrupt humans. Once a state starts doing it, it would either be the land of milk and honey or far too late.
I agree. Even if a given nation wants to protect human jobs, there's enormous incentive to be the first to defect and embrace automation.
In the UK, Rishi Sunak had already seriously floated the proposal of automating doctors away. With the tech of the time, it wouldn't have gone all that well, but it's only a matter of time before someone bites as the potential gains mount.
Consider this a warning; keep posting AI slop and I'll have to put on my mod hat and punish you.
Do you really think you can do that with existing technology? I'm not confident we've seriously tried to make a pathogen that can eradicate a species (mosquito gene drives? COVID expressing human prions, engineered so that they can't just drop the useless genes?), so it's difficult to estimate your odds of success. I can tell you the technology to make something 'with a lengthy incubation time and minimal prodromal symptoms' does not exist today. You can't just take the 'lengthy incubation time' gene out of HIV and Frankenstein it together with the 'high virulence' gene from Ebola and the 'high infectivity' gene from COVID. Ebola's fatality rate is only 50%, and it's not like you can make it airborne, so...
Without spreading speculation about the best way to destroy humanity, I would guess that your odds of success with such an approach are fairly low. Your best bet is probably just releasing existing pathogens, maybe with some minimal modifications. I'm skeptical of your ability to make more than a blip in the world population. And now we're talking about something on par with what a really motivated and misanthropic terrorist could conceivably do if they were well-resourced.
I'm still voting against bombing the GPU clusters, and I'm still having children. We'll see in 20 years whether my paltry gentile IQ was a match for the big Yud, or whether he'll get to say I told you so for all eternity as the AI tortures us. I hope I at least get to be the well-endowed chimpanzee-man.
Boo. Boo. Boo. Your mod hat should be for keeping the forum civil, not for winning arguments. In a huge, content-filled, human-written post, he merely linked to an example of a current AI talking about how it might Kill All Humans. It was an on-topic and relevant external reference (most of us here happen to like evidence, yanno?). He did nothing wrong.
That was a joke, man.
Watch your tone or I'll ban you too.
The joke is that I'm not a mod. He is.
Apologies. I guess the joke was on me!
How do you even tell who's a mod here and who isn't?
Just hang around for over half a decade.
Or, there's this page.
A certain someone reported you for impersonating a mod. Unlike him, most of the mods have a sense of humor about such things.
[User was banned for this post]
But sir, I followed the rules and linked it off-site. Please put away that rod, I'm scared :(
You're the domain expert here, not me. I'd hope I'm more informed than the average Joe, but infectious diseases and virology aren't my field. Though if you consider culture-bound illnesses or social contagion like anorexia...
A gene drive wouldn't work on humans: our generation time is far too long, and we could easily edit the drive out once discovered.
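To put rough numbers on the generation-time problem, here's a back-of-the-envelope sketch; the perfect transmission, random mating, and initial carrier fraction are all hypothetical assumptions, deliberately generous to the drive:

```python
# Back-of-the-envelope: even a "perfect" gene drive (100% transmission from
# any carrier parent, random mating) spreads one generation at a time.
# Initial carrier fraction and generation time are illustrative assumptions.

carrier_frac = 1e-4          # hypothetical seeding: 1 in 10,000 people
generation_years = 25        # rough human generation time
generations = 0
while carrier_frac < 0.99:
    # A child carries the drive unless *both* parents are non-carriers.
    carrier_frac = 1 - (1 - carrier_frac) ** 2
    generations += 1

print(f"~{generations} generations, i.e. roughly "
      f"{generations * generation_years} years, to reach 99% carriers")
```

That works out to roughly four centuries, with the drive trivially detectable by sequencing the whole time. Mosquitoes, by contrast, turn over a generation in weeks.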
Even if we haven't intentionally exterminated a species with a pathogen (myxoma virus for rabbits in Australia came close), we have done so accidentally. A few frogs and rare birds have croaked.
(There are no mistakes, just happy accidents eh?)
Which isn't the worst benchmark for a malevolent AGI that is very smart by human standards.
I'd be talking out of my ass if I claimed I knew for sure how to create the perfect pathogen, but I'm >50% confident I could pull it off if someone gave me a hundred million dollars to do it. (I could just hire actual virologists; some people seem insane enough to do GOF even today, so it seems easy to repurpose "legitimate" research.)
So am I. I don't want my new RTX 5080 blown up, not that I'd have a choice if the power connector fails. I also plan to have kids, because I think it's better to live than not, even if that life were short. I don't expect them to have a "normal" life by modern standards.
We'll see how this plays out, but I think there's enough justification to take broader precautions, like saving lots of money. That's usually a good idea anyway.
I am not an AI hype man, but if it gets to the point of genuinely disrupting white-collar work on a large scale, the amount of available desirable real estate could increase a lot.
You mean, a lot of people will be on welfare, unable to pay their mortgages, and will have to sell their property at lower prices?
Well, this obviously depends on what "desirable" real estate means to you, but I see a few possible drivers:
- Unbundling economic opportunity from specific places rearranges the levels of desirability, kind of like remote work on steroids. Some claim this would lead to even more agglomeration, but I'm not sure about that; people are varied enough in their interests and wants that I'd expect a big surge in lesser cities.
- Pushing skilled workers down the value chain would improve the services in a lot of places, making them more livable.
- On the higher end of outcomes, there are a lot of places that are very similar to very desirable ones but are hampered by poor governance and infrastructure; a fully machine-operated world would bring those places up to standard, increasing the supply of desirable space.