This weekly roundup thread is intended for all culture war posts. 'Culture war' is vaguely defined, but it basically means controversial issues that fall along set tribal lines. Arguments over culture war issues generate a lot of heat and little light, and few deeply entrenched people ever change their minds. This thread is for voicing opinions and analyzing the state of the discussion while trying to optimize for light over heat.
Optimistically, we think that engaging with people you disagree with is worth your time, and so is being nice! Pessimistically, there are many dynamics that can lead discussions on Culture War topics to become unproductive. There's a human tendency to divide along tribal lines, praising your ingroup and vilifying your outgroup - and if you think you find it easy to criticize your ingroup, then it may be that your outgroup is not who you think it is. Extremists with opposing positions can feed off each other, highlighting each other's worst points to justify their own angry rhetoric, which becomes in turn a new example of bad behavior for the other side to highlight.
We would like to avoid these negative dynamics. Accordingly, we ask that you do not use this thread for waging the Culture War. Examples of waging the Culture War:
- Shaming.
- Attempting to 'build consensus' or enforce ideological conformity.
- Making sweeping generalizations to vilify a group you dislike.
- Recruiting for a cause.
- Posting links that could be summarized as 'Boo outgroup!' Basically, if your content is 'Can you believe what Those People did this week?' then you should either refrain from posting, or do some very patient work to contextualize and/or steel-man the relevant viewpoint.
In general, you should argue to understand, not to win. This thread is not territory to be claimed by one group or another; indeed, the aim is to have many different viewpoints represented here. Thus, we also ask that you follow some guidelines:
- Speak plainly. Avoid sarcasm and mockery. When disagreeing with someone, state your objections explicitly.
- Be as precise and charitable as you can. Don't paraphrase unflatteringly.
- Don't imply that someone said something they did not say, even if you think it follows from what they said.
- Write like everyone is reading and you want them to be included in the discussion.
On an ad hoc basis, the mods will try to compile a list of the best posts/comments from the previous week, posted in Quality Contribution threads and archived at /r/TheThread. You may nominate a comment for this list by clicking on 'report' at the bottom of the post and typing 'Actually a quality contribution' as the report reason.
I don't know how @CloudHeadedTranshumanist sees it, but as far as I'm concerned, all a transhumanist future requires is to sit back and twiddle our thumbs (well, us specifically, not the entire world taking a day off) because regardless of ideological appeal, transhumanism is inevitable when there's technological progress. If not inevitable, then by far the default path into the future.
The reason why I explicitly praise transhumanism and spread it is to accelerate the process, not because it wouldn't happen anyway.
We gradually cure more diseases, until we break through and cure aging. We keep fucking with power generation, and even if we don't get commercial fusion or room-temperature superconductors, even plain fission is amazing.
We start with mRNA vaccines, and we proceed to robust gene therapy. We simply try to replace missing limbs with adequate artificial counterparts, and then market forces inevitably at least try to make them better than the real deal.
You just have to ride the wave of the future; some of us just want to paddle too.
(And this is all ignoring AI, because at that point the future is transhuman whether you like it or not!)
Man, I've got to hold off on the rum so I can make it with you guys. Sometimes I forget how incredible the future is going to be.
The future is going to be amazing, and hopefully we'll all be there to see it.
If you want a vision of the future, Winston, imagine crunchy corn, yum, kittypaws prrr, nyum nyum nyum icecream, yum -- forever.
What about my funkopops?
What, life as a brain in a jar?
Forget all the 50s techno-optimist SF: "we'll have endless free energy, space colonies, eternal youth, and eventually fantastic cyborg bodies where we can fly like birds, swim like dolphins, and run as fast as Superman!"
Nah, all that is too expensive in material. Easier to be a brain in a jar that is wired up to a VR version of "eternal youth, flying and swimming, living in a space colony", and all the rest of it.
The future is you being owned, in effect, by the likes of Meta and whatever mega-corporations are under the AIs running the economy (if we get AI that far).
We’re already ‘owned in effect’, my friend. Humans have never been free in the literal sense, and never will be. We need society, and we need others not to go completely batshit.
Brain in a jar? Well it’ll be a little weird but hell if I get to extend my natural lifespan a few hundred years, I’d imagine I might get bored with a normal human body anyway.
I don’t deny that we could end up with a horrible future. It’s certainly possible. But I think better futures are possible and more likely. I also think that pessimistic, doomer takes actively push against that good future.
Why do you think you're going to benefit from any of this?
@self_made_human has already expressed concerns about being pushed out by AI and I don't know your situation but unless you're on the board of a big corporation I don't think you have any leverage over the development and adoption of these advances.
In short, I agree with Scott's Meditations on Moloch (https://slatestarcodex.com/2014/07/30/meditations-on-moloch/). One of the few mechanisms that limit a race to the bottom is the set of physical and mental limits on what humans can endure. Transhumanism promises to make these limits programmable.
In video games, market forces gave us fun games with ever-increasing graphic fidelity and then veered sideways into DLC, lootboxes, and other exploitative practices because that's where the money was. Market forces gave us the world wide web and then swiftly optimised it for ad exposure and addiction. What makes you think your new limbs won't have a shelf life of only a few years like smartphones do, so that walking becomes effectively a subscription service?
Which will at best be required just to compete with your colleagues on an even footing, as Adderall increasingly seems to be. At worst, we will finally be able to rid the majority of humanity of the primal impulses that make them resist the obviously-correct rule of their betters: conservatism, religion, the desire for personal fulfilment...
Bluntly, what makes you think that the people who develop and own these technologies will want to spend eternity with you?
The only reason there even is a race at all is because human cognitive and physical labor isn't yet entirely obsolete.
That's what AI is for, and given that it's 2023, the usual reply that advanced AI doesn't exist is no longer true. It just isn't broadly superhuman.
Uh, IDK about you, but there are plenty of video games out there that aren't just thinly disguised Skinner boxes, and plenty of parts of the internet that haven't degenerated to TikTok levels. In the case of the latter, look, we're in one.
Our cars aren't designed like F1 vehicles whose engines need to be replaced after every race. Our phones don't catch fire after their battery runs out of charge. Those sound like excellent ways to make a profit; I wonder why manufacturers don't do that...
If you're so concerned about the longevity of your limbs, then buy models known to be robust, or stick with your biological ones. Heads up: they're not forever either; you'll start getting aches and pains well before you reach the lifetime warranty.
No amount of gene therapy can help a baseline human become competitive with a mid-future AGI. Biology isn't that robust, and the modifications required to put a human on par with a purpose-built AGI look awfully like turning the human into a robot/AGI. I wouldn't complain, since I think that's awesome, but I stress that in the future the existence of humans is a luxury good. It's like trying to gene-therapy a Peregrine Falcon into being faster than an F-22 Raptor. You might get a really fast bird out of it, but it's not beating a jet aircraft without a rocket strapped to its ass.
Either we accept that humans are allowed to exist while being economically unproductive, or (almost) all of us die of starvation.
Even concerned as I am, I still think the former is more likely than the latter.
Same reason that you, I, and Bill Gates drink the exact same can of Coke: market forces. And outside of some weird Hollywood morality plays, there's no reason to assume the technology will stay extremely expensive.
And how many falcons are out there, except as conscious re-creations of mediaeval falconry for the tourist experience? Sure, we still have falconry, but it's not the same as when it was an important part of life. Humans may indeed become a luxury good - which means cutting down the number of humans drastically. Why does the AI need billions of us, when a population of a couple million, picked for traits that are interesting and amusing, will do just fine?
Luxury goods mean scarcity, remember.
Falcons did not raise humans from scratch with input into our core programming, to their detriment.
On the other hand, we can simply make AI care about us, and want to keep us around and in charge well after we're otherwise useless.
Of course, the "simply" elides the difficulty of this task, hence all the fuss about AI Alignment, but at least we know it's an option, unlike falcons or horses who have to put up or shut up.
Actually, I have a lot more faith in AI alignment since GPT + RLHF appeared and it turned out that making an AI act like a human with certain personality traits is pretty straightforward. It's the (post)human owners I'm worried about.
I've said much the same myself.
The way I heard it described over on LW, we can teach a smaller model to emulate a larger model very faithfully by training on its outputs. Modern LLMs are trained on the collective output of all of humanity, which is why they act so human. It's a lot less surprising when you see it in those terms.
We're not in the worst possible world when it comes to understanding or modifying our AI, so I'm suitably thankful.
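To make that concrete, here's a toy sketch of the distillation idea: a small "student" network is trained purely on a frozen "teacher" network's output distributions, with no ground-truth labels involved. The models, data, and hyperparameters below are hypothetical stand-ins, not any real LLM setup.

```python
# Toy distillation sketch (hypothetical stand-in models and data).
import torch
import torch.nn as nn
import torch.nn.functional as F

torch.manual_seed(0)

# A "large" frozen teacher and a much smaller student, both mapping
# 32 input features to 10 classes.
teacher = nn.Sequential(nn.Linear(32, 512), nn.ReLU(), nn.Linear(512, 10))
student = nn.Sequential(nn.Linear(32, 16), nn.ReLU(), nn.Linear(16, 10))
teacher.eval()  # the teacher is fixed; only the student learns

opt = torch.optim.Adam(student.parameters(), lr=1e-3)
for step in range(1000):
    x = torch.randn(64, 32)  # unlabeled inputs; no ground truth needed
    with torch.no_grad():
        t_probs = F.softmax(teacher(x), dim=-1)  # teacher output is the target
    s_logprobs = F.log_softmax(student(x), dim=-1)
    # Pull the student's output distribution toward the teacher's.
    loss = F.kl_div(s_logprobs, t_probs, reduction="batchmean")
    opt.zero_grad()
    loss.backward()
    opt.step()
```

Swap the toy MLPs for language models and the random inputs for a text corpus, and you get the "train on the big model's outputs" recipe; LLMs trained on humanity's collective writing are, loosely speaking, doing the same thing with us as the teacher.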
Look at what happened to horses the moment they stopped being necessary.
Thanks for replying! Sorry, I have more cynicism :P
EDIT: To be clear, I was a transhumanist back in the day and I'm still optimistic about certain things. Most obviously I think sensory augmentations are pretty straightforwardly good and don't really have any nasty social consequences. I tried the fingertip magnet thing once (with superglue, not surgery) and I would be very interested in a non-surgical version of North Sense (https://www.vice.com/en/article/78ke3x/cyborg-implant-magnetic-north). I'm also very interested in GPT provided we can get a leaked version of the original weights and do our own RLHF. It's anything related to the body or brain that worries me, especially things that can alter personality or desires, or things that lead to involution such as increased ability to ignore fatigue / burnout. I also want the output of the whole process to remain something that I would recognise as human; anything else should be cleansed with fire.
Again, you've expressed deep concern about what will happen to you if your medical knowledge becomes obsolete. What gives you any optimism at all about what happens after that? The options that I see (other than death) are: remaining competitive (in which case the race to the bottom remains), receiving charity from one of the few people with power, or somehow managing to get democratic control over these things. The latter two vary from okay-I-suppose to horrifying, depending on how much compunction people feel about altering you to fit their values.
Yes, it's possible to avoid this stuff. But the vast majority of people don't: https://www.statista.com/statistics/1201880/most-visited-websites-worldwide/. As far as games go, it's not actually too bad, but the biggest games by far are still CS:GO, Dota, PUBG, and Apex (https://steamdb.info/charts/). All of these have lootboxes / in-game currencies AFAIK. More to the point, again, your desires are going to be rewritable. Why would I bother trying to appeal to you by making something you like when I can just make you like what I want?
But a luxury good for whom? You? Or whoever happens to feel like keeping you around?
I'm not thinking about the cost, I'm wondering what incentives whoever/whatever ends up on top has to let you and me keep stinking up the place. Maybe they'll think we add atmosphere but even so I doubt they'll want us cluttering up their private beaches - some goods are scarce by nature.
In the past, people have mostly obtained things via the charity of elites or through asserting their own power. The charity of elites is unstable and comes with strings attached. I do actually think that humans will be allowed to exist, I just doubt very much that they will be allowed anything resembling self-determination. Resignation in the face of this is something I understand but you genuinely seem to think more good than bad will come out of all of this and I don't understand that at all. Partly because futurist aesthetics stopped meaning anything to me once I saw the 90s/00s tech optimism sour.
Well. I do think we ought to be careful when it comes to tinkering around with our biology or cognition. I'm just not so obsessed with safety that I wouldn't want to explore our options, unlike some who use it as a reason to ban it.
I want to be smarter, faster, stronger. If that can be achieved while remaining "recognizably human", that's great, but I don't really prioritize it that much. I went from being an embryo to, well, this, and I am curious to see what humans can metamorphose into.
If it makes you feel any better, I have no issues with people who want to stay much the same as they are, as long as they don't bother me.
The relevant question in both cases is: are there non-degenerate alternatives that are easily accessible? That's obviously true for video games, and I see no reason why it wouldn't be the case for cybernetics.
To use the example of actual cybernetics, rudimentary as they might be, cochlear implants and pacemakers last a good chunk of time and are absolutely not designed for planned obsolescence.
Surely advanced manufacturing will be more accessible, not less, such that if you want to make a sturdy prosthetic with an open-source design and software, you can? There are already models out there.
The latter. I'm not so wealthy or connected that I expect to be in charge after the Singularity. Unlike @DaseindustriedLtd, I don't think the class of people in charge of modern AGI, such as Altman, are so evil or ruthless they'd let us all die.
If the AGI in question is misaligned, they're likely as fucked as we are.
I mean, I obviously want more agency than this in my own affairs, but I'm already a negligible influence in a democracy of a billion+ idiots. What difference does it make?
I'm happy enough to live entirely in VR.
I don't expect the super wealthy to particularly care about this, since at that point a lot of the downsides of having poorer humans around (they have some degree of power, covet your wealth, might commit crime or try to kill you) go away, and tolerating them becomes much more reasonable.
Elon Musk doesn't want to rub elbows with a SF hobo, he's likely cool with a random middle class or above human just vibing.
Like I said, I don't really think I have all that much power or agency. For almost everyone here, if we vanished tomorrow the world would keep on spinning.
But what makes me mildly hopeful is that a major cause of strife between humans, the outright scarcity of material resources, will be largely nullified in the near future. Not gone, of course, since we likely live in a limited and dying universe, but you won't need to kill for bread or women, and there's a lot of room to expand into for centuries, millennia, or even millions of years hence.
I see. The problem is that my intuitions and preferences are ultimately different from yours.
Depends on exactly what you mean by this. I can be flexible on physical body (though I don't like it much) but anything that is mentally inhuman by my estimation is a competitor and has to die. I'm invested in the survival of humanity and human culture and (in so far as I have any power to do so) won't tolerate anything that seriously threatens it. Which means, of course, that you can't tolerate me. Thus, conflict.
This is a very good point, and I appreciate your bringing it up. Don't have a counter ATM.
In general my experience is the opposite. Advanced technology usually requires more specialist tools and facilities, and offers increased opportunities for walled gardens / DRM / software-locked hardware etc. Look at old vs new cars - the new ones are impossible to repair yourself. Or desktop PCs vs smartphones/tablets. Not a global rule, just my experience + intuition.
I can't speak for Altman but I would 100% let you die :P As I am letting a number of people die this very second by not investing in cheap mosquito nets. I have no particular investment in you and no particular desire to have you around for ever and ever. Sorry, it seems weird to say to a stranger on the internet but I think you're being much too optimistic about this.
I think you're projecting. (So am I, of course.) Power and status are limited by their nature. I get where you're coming from better now, for what it's worth. But I think you've seriously failed to consider the possibility that other people have very different philosophies and desires from you and those are guaranteed to lead to conflict even if it were possible to have true abundance of all material resources.
Eh, at least I can claim that I didn't start the aggression. But if you do want baseline humans to remain in charge, force and violence you will need, and I wouldn't rate those odds too highly.
Sure, but it's not a given. The EU is forcing even Apple to open up the walled garden, and right to repair is spreading.
Worst case, you have your human limbs, and advanced biotech will keep arthritis at bay.
I don't think humans will even need to work except as a hobby, so why would I care if the Samsung S-Limb Note 7 Explode Edition has additional proprietary features when I can get one 80% as good without the vendor lock-in?
:(
No worries, I don't really intend to rely on the charity of strangers. Nor am I an effective altruist, or even very charitable myself. I don't give money to charity, and I don't treat people I don't like for free.
To the extent that you're as powerless as me, why get worked up about it?
Will the future be free of conflict? Almost certainly not, unless we end up under the silken chains of some kind of AI Hegemon. I simply don't think it's likely to be worse than what we experience today, and it will largely be peaceful and prosperous after some teething pains.
At any rate, like you, I'm going to have to wait and see for myself.
Oh, of course. I'm personally invested in making sure we avoid civilisational Bad Ends as much as possible, so to that extent I oppose undirected tech optimism. But as you say, we only have so much choice in the end. Thanks for the discussion.
If the pessimistic scenario comes true, at least we'll have an opportunity for an exciting struggle against the big corpos for our immortal future. Some people say war gives humans purpose.
I’ll totally join the Motte resistance.