This weekly roundup thread is intended for all culture war posts. 'Culture war' is vaguely defined, but it basically means controversial issues that fall along set tribal lines. Arguments over culture war issues generate a lot of heat and little light, and few deeply entrenched people ever change their minds. This thread is for voicing opinions and analyzing the state of the discussion while trying to optimize for light over heat.
Optimistically, we think that engaging with people you disagree with is worth your time, and so is being nice! Pessimistically, there are many dynamics that can lead discussions on Culture War topics to become unproductive. There's a human tendency to divide along tribal lines, praising your ingroup and vilifying your outgroup - and if you think you find it easy to criticize your ingroup, then it may be that your outgroup is not who you think it is. Extremists with opposing positions can feed off each other, highlighting each other's worst points to justify their own angry rhetoric, which becomes in turn a new example of bad behavior for the other side to highlight.
We would like to avoid these negative dynamics. Accordingly, we ask that you do not use this thread for waging the Culture War. Examples of waging the Culture War:
- Shaming.
- Attempting to 'build consensus' or enforce ideological conformity.
- Making sweeping generalizations to vilify a group you dislike.
- Recruiting for a cause.
- Posting links that could be summarized as 'Boo outgroup!' Basically, if your content is 'Can you believe what Those People did this week?' then you should either refrain from posting, or do some very patient work to contextualize and/or steel-man the relevant viewpoint.
In general, you should argue to understand, not to win. This thread is not territory to be claimed by one group or another; indeed, the aim is to have many different viewpoints represented here. Thus, we also ask that you follow some guidelines:
- Speak plainly. Avoid sarcasm and mockery. When disagreeing with someone, state your objections explicitly.
- Be as precise and charitable as you can. Don't paraphrase unflatteringly.
- Don't imply that someone said something they did not say, even if you think it follows from what they said.
- Write like everyone is reading and you want them to be included in the discussion.
On an ad hoc basis, the mods will try to compile a list of the best posts/comments from the previous week, posted in Quality Contribution threads and archived at /r/TheThread. You may nominate a comment for this list by clicking on 'report' at the bottom of the post and typing 'Actually a quality contribution' as the report reason.
I like Anatoly Karlin's argument:
> Naturally, people who speculate that their safetyism is protecting quintillions of eventual superduperhappy podmen scoff at a few tens or hundreds of excruciating megadeaths of their contemporaries.
Yudkowsky, for me, was at his most sympathetic when he lamented the death of his brother Yehuda Nattan.
It's still about one minute, likely zero, for our cryonics technology hasn't progressed much since then. I liked Yud more when he worried about that instead of freaking out on podcasts about the need to slow progress to a crawl.
OTOH, never cared much for Roko.
Nice try with the "if," as if to convince the mods that you are not, in fact, calling the other poster an idiot.
You've been warned and banned multiple times for this kind of petty antagonism. You do not seem to post anything but these attempted dunks on other people, providing no value whatsoever to the discussion. Your last ban was one week; this one will be two. The next one will likely be permanent. This place is not for giving yourself a dopamine hit by testing how creatively you can call people stupid.
That was Big Yud at his most sympathetic?
I dunno, I just can't put myself in that mindset. I think it's probably because I don't really like anyone currently alive very much, so I don't feel "thousands of deaths of sentient people every minute" as a thousand tiny knives stabbing at my soul. People are a renewable resource! Sure, some will die, but no big loss: basically identical ones will take their place.
...until they don't, because mankind wholesale gets paperclipped. At THAT, I feel Yud's doomer schizo panic.
Ah yes, «people are the new oil».
I like people. A few of them very much so, and many more at least a bit. My sympathy is egoistic: people are the set of beings to which I belong. I am a history and a world unto myself, and a particular take on the universe that we share. Others are the same. We are similar in nature and in scope, but unique in a way that needs no justification – unique like numbers are. So every death is a catastrophe, every death immiserates the sum of reality that remains.
Trivializing this with childish cynicism is, to my eyes, merely a petition to be excluded from that set of valuable beings. If you do not see yourself as an entity of immense worth, I can understand not extending the same courtesy to others.
(This is not my view; I don't like this longtermist Pascal's-mugging spree people have just rolled over for. Humanist all the way, bayyy beee)
Why do you give a shit?
All those future humans are the same boring assholes you don't feel anything for right now.
An infinitely large pile of garbage might be impressively massive, but it's still garbage.
If you are truly selfish, better to be the last human on earth, enjoying the fruits of the most modern economies of production right up until you get turned into computational substrate.
I don't think most people roll over for a Pascal's mugging. Most EA/LW people believe there's a high probability that humanity can make transformative AGI over the next 15/50/100 years, and a notable probability that it won't be easily alignable with what we want by default.
I'm skeptical that I'd agree with calling longtermism a Pascal's mugging (it has the benefit that it isn't as 'adversarial' to investigation and reasoning as the traditional Pascal's mugging), but I'm also skeptical that it is a load-bearing part. I'd be taking roughly the same actions whether or not I primarily cared about my own experience over the next (however long).
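For what it's worth, the structural point can be made concrete with a toy expected-value calculation. A minimal sketch; every number in it is invented for illustration, not anyone's actual estimate:

```python
# Toy illustration of the expected-value structure behind the "Pascal's
# mugging" objection. All numbers are invented for the example.

def expected_value(probability: float, payoff: float) -> float:
    """Chance of an outcome times its magnitude."""
    return probability * payoff

# A mundane intervention: near-certain, modest payoff.
mundane = expected_value(0.9, 1e6)

# A mugging-style intervention: vanishingly unlikely, astronomical payoff.
speculative = expected_value(1e-20, 1e52)

print(f"mundane:     {mundane:.1e}")      # 9.0e+05
print(f"speculative: {speculative:.1e}")  # 1.0e+32
```

The mugging works by letting an arbitrarily large payoff swamp an arbitrarily small probability. The point above is that the EA/LW case rests on a claimed high probability, so it doesn't have that structure.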
Because my not encountering anyone interesting in the thousand or so people I've met in my 30 or so years of living at the turn of the 21st century does not mean that humanity couldn't produce anyone interesting among the 10^90 transhuman people who could exist across the next trillion years of seizing the cosmic endowment.
Welcoming the paperclipper because people are boring in 2023 is like killing yourself because you're a kissless virgin at 16. There is still plenty of time for the situation to improve, provided you stay alive.
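The scale comparison can be put as back-of-envelope arithmetic. In the toy sketch below, only the 10^90 figure comes from the comment; the one-in-a-billion rarity rate is an arbitrary stand-in:

```python
# Back-of-envelope version of the comment's point: a small sample says almost
# nothing about a vastly larger population. Only 10**90 is from the comment;
# the rarity rate is an arbitrary stand-in.

people_met = 1_000          # roughly the commenter's lifetime sample
future_people = 10**90      # the comment's cosmic-endowment figure
interesting_rate = 1e-9     # suppose "interesting" people are 1 in a billion

print(f"expected in sample: {people_met * interesting_rate:.0e}")    # 1e-06
print(f"expected in future: {future_people * interesting_rate:.0e}") # 1e+81
```

Even under an absurdly pessimistic rarity assumption, a thousand-person sample is expected to contain nobody interesting, while the hypothetical future population still contains an astronomical number.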
My concern with safetyism is that it all but ensures AI will first be realized by the most dangerous people to be doing so. Everyone concerned with safety, or willing to weigh safety alongside profits, stops; everyone else laughs at the stupidity, goes on building AGI, and leapfrogs ahead. China might well be willing to take the risk of a containment failure to be guaranteed global leadership in every aspect of military, economic, and social power for centuries. If the AI boosters are correct, this is the chance of the millennium, and nobody with power who understands the technology is going to let a concern for safety keep such staggering power from themselves and their children for the next fifty generations. If we have a moratorium, that just means AGI comes from China, India, or somewhere else.
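The racing dynamic described here, which a comment downthread calls "the game theory argument against AI safety," has the shape of a prisoner's dilemma. A minimal sketch with invented payoffs:

```python
# Prisoner's-dilemma sketch of the racing dynamic described above.
# Payoffs are invented for illustration: (row player, column player),
# higher is better. "Pause" = adopt a moratorium; "Race" = keep building.

payoffs = {
    ("Pause", "Pause"): (3, 3),  # both slow down: shared safety and progress
    ("Pause", "Race"):  (0, 5),  # you pause, the rival leapfrogs you
    ("Race",  "Pause"): (5, 0),  # you leapfrog the pausing rival
    ("Race",  "Race"):  (1, 1),  # both race: maximal risk for everyone
}

def best_response(opponent_move: str) -> str:
    """The row player's payoff-maximizing move against a fixed opponent."""
    return max(("Pause", "Race"), key=lambda m: payoffs[(m, opponent_move)][0])

for opp in ("Pause", "Race"):
    print(f"If the rival plays {opp}, the best response is {best_response(opp)}")
# "Race" dominates either way, which is the worry: unilateral safetyism
# just hands the lead to whoever doesn't stop.
```

Whether real AGI development actually has these payoffs is, of course, the whole dispute; the sketch only shows why unilateral restraint looks bad if it does.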
I wonder if there is anything at all that can shake American confidence in this projection. Not to mention the premise that Chinese leaders understand what is at stake.
Anyone with a brain would do so, again, if they understood the significance of this technology's potential. AGI, once achieved to a reasonable standard and given the ability to iterate more intelligent versions of itself, is quickly going to be a technological leap on the order of the invention of writing or agriculture. Those who get there first will rule over the rest of us as the Europeans ruled the Native Americans. Those who don't have access will be at the mercy of those who do.
I've watched too many Liveleak videos of Chinese industrial accidents to believe that anyone with a position of power in the PRC isn't a degenerate high-time-preference psychopath who'd take the safety rails off anything if he thinks he can sell them for scrap metal and earn a few extra yuan.
Would anyone with a brain initiate zero COVID, suffocate the country for three years, then cancel it overnight and overload the medical system?
China is observably not maximizing their odds of geostrategic dominance, nor much of anything else, sans ass-covering by party elites.
And before we start worrying about Choynese AGI, we should focus on something they've had much more of a head start in: Choynese Eugenics.
How's it been going for the last decade?
China is not a competitor to the West. China will implement any braindead regulation the West devises, faster and harsher and stupider. China is less relevant than Turkey or, certainly, Israel. I will say it as many times as I have to.
So they couldn't be doing both? I'm not sure they are, but given that they aren't poised to impose stupid moratoriums on research as the West seems ready to (we've already banned eugenics), it seems that Americans, and perhaps the Atlantic countries generally, will hobble themselves as decadent falling empires often do, reaping the benefit of an irrelevant moral high ground while watching others overtake them.
This could be a Western version of the Haijin (https://en.wikipedia.org/wiki/Haijin), in which we just decide not to move forward in science and technology. That's never worked.
I agree, but with the catch that non-competitors can radically change for contingent reasons, get their shit together, and become competitors, sometimes. It'd be easy to say something similar about historical China's economy, and now they're top two in nominal GDP.
It's historically normal for China to have the world's largest GDP. The USA is the weird one here.
Even now, most people don't understand the significance. At least in polite circles, the main thoughts around AGI seem to be worries about how it will help kids cheat on their homework or reinforce racial/gender biases (that applies all around; see all the worries about ChatGPT being too woke). A significant number of those who do recognize its power are caught up in catastrophizing sci-fi stories about Clippy. As for China, it's not treating AGI as an existential issue; it mostly seems interested in not losing a couple of productivity points relative to the US and in using it to more efficiently enforce its internal security, and it wouldn't hesitate to give up all its (so far trailing) efforts if it could get Taiwan in exchange.
Eh, that wasn't as dramatic as usually depicted. Europeans had the massive benefit of playing local tribes off against one another, and of horrific waves of disease that killed 95% of the native population.
That being said, I agree with the game theory argument against AI safety.
It's a good thing, then, that there are no bitter tribal divisions within Western countries and no dangerous infectious diseases spreading from China.
It also took a couple of centuries for Europeans to establish their absolute dominance over and destruction of Native Americans. Even that would have been uncertain if not for the aforementioned disease-induced massive depopulation of the Americas; something more like sub-Saharan Africa would have been the more likely outcome.
I pray that we lift the ridiculous restrictions on bio-tech and longevity soon. Even if we do get AI, regulation is a powerful force and I'm not holding out any hope.
One thing I wish transhumanists/cryonics folks would get better at is swaying public opinion.
furiously takes notes
no, you are on to something:
If the transhumanist/cryonics crowd actually seemed to give a shit about the population at large, maybe the population would give a shit about them.
As is, they all get dragged down by their lunatic fringe sucking up all the oxygen.
Transhumanists aren't Machiavellian enough. They need to throw away all their other principles and mouth whatever BS the zeitgeist wants in order to get their research done.
Reframed more snidely:
Transhumanists give more of a shit about their politics than their stated goals: they'd rather protect their NAP and genderinos and take the TRANS out of transhuman than actually accomplish anything, which is why they haven't and never will.
Nah, I think they've been nerd-sniped by AI safety; that's the failure mode. They're still attached to the goal of saving the world, they've just failed at rationality and chosen the wrong means.