This weekly roundup thread is intended for all culture war posts. 'Culture war' is vaguely defined, but it basically means controversial issues that fall along set tribal lines. Arguments over culture war issues generate a lot of heat and little light, and few deeply entrenched people ever change their minds. This thread is for voicing opinions and analyzing the state of the discussion while trying to optimize for light over heat.
Optimistically, we think that engaging with people you disagree with is worth your time, and so is being nice! Pessimistically, there are many dynamics that can lead discussions on Culture War topics to become unproductive. There's a human tendency to divide along tribal lines, praising your ingroup and vilifying your outgroup - and if you think you find it easy to criticize your ingroup, then it may be that your outgroup is not who you think it is. Extremists with opposing positions can feed off each other, highlighting each other's worst points to justify their own angry rhetoric, which becomes in turn a new example of bad behavior for the other side to highlight.
We would like to avoid these negative dynamics. Accordingly, we ask that you do not use this thread for waging the Culture War. Examples of waging the Culture War:
- Shaming.
- Attempting to 'build consensus' or enforce ideological conformity.
- Making sweeping generalizations to vilify a group you dislike.
- Recruiting for a cause.
- Posting links that could be summarized as 'Boo outgroup!' Basically, if your content is 'Can you believe what Those People did this week?' then you should either refrain from posting, or do some very patient work to contextualize and/or steel-man the relevant viewpoint.
In general, you should argue to understand, not to win. This thread is not territory to be claimed by one group or another; indeed, the aim is to have many different viewpoints represented here. Thus, we also ask that you follow some guidelines:
- Speak plainly. Avoid sarcasm and mockery. When disagreeing with someone, state your objections explicitly.
- Be as precise and charitable as you can. Don't paraphrase unflatteringly.
- Don't imply that someone said something they did not say, even if you think it follows from what they said.
- Write like everyone is reading and you want them to be included in the discussion.
On an ad hoc basis, the mods will try to compile a list of the best posts/comments from the previous week, posted in Quality Contribution threads and archived at /r/TheThread. You may nominate a comment for this list by clicking on 'report' at the bottom of the post and typing 'Actually a quality contribution' as the report reason.
I was using magic in the Arthur C. Clarke sense, hence why I said "sufficiently advanced". Sorry if I was being unclear, I just wanted to establish that when we're talking about this perfect germline editing tech then super advanced plastic surgery is probably within reach too and hence relevant to the conversation.
I think that in many cases the social difficulties and the high IQ are separate manifestations of the same underlying phenomenon - you can't get one without at least the risk of the other. While there's definitely a decent amount of low-hanging fruit (a lot of which doesn't even involve genetics - proper nutrition from conception onwards, ensuring no oxygen deprivation at early ages, etc.), I think that you're going to run out of low-hanging fruit and start running into the trade-off zone. As I've said before, IQ is not an unalloyed good - we can just observe the world and notice that there are environments which select for it and environments which actually select against it. Some of these tradeoffs we won't give a shit about in the future, like longer development times and higher nutritional requirements, but some of them we very much will care about (blindness, social difficulties, higher rates of neurological disorders as seen in Ashkenazim, etc.). There's a decent chance that we live in a world where you'd be able to get a baseline level of enhancement by clearing out the low-hanging fruit but eventually reach a point where you'd have to start taking risks - i.e. hearing a doctor present an option with "This configuration will result in an incredibly high IQ, but at the same time there'll be a moderate risk that they end up with a disorder."
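To make that "low-hanging fruit, then trade-off zone" shape concrete, here's a minimal sketch - every number in it is invented for illustration - where the first interventions are free and each further edit buys IQ points at increasing disorder risk:

```python
# Toy model of the "low-hanging fruit, then trade-off zone" intuition.
# All gains and risks below are invented for illustration only.
edits = [
    # (IQ points gained, added lifetime disorder risk)
    (3, 0.000),  # nutrition, avoiding early hypoxia: free gains
    (2, 0.000),
    (4, 0.005),  # mild genetic tweaks: small risk
    (5, 0.020),
    (6, 0.080),  # aggressive configurations: real trade-offs
    (8, 0.250),
]

def plan(edits, max_risk):
    """Greedily take the cheapest risk-per-point edits until the risk budget runs out."""
    total_iq, total_risk = 0, 0.0
    for gain, risk in sorted(edits, key=lambda e: e[1] / e[0]):
        if total_risk + risk > max_risk:
            break
        total_iq += gain
        total_risk += risk
    return total_iq, total_risk

for budget in (0.0, 0.05, 0.5):
    iq, risk = plan(edits, budget)
    print(f"risk budget {budget:.2f}: +{iq} IQ at total risk {risk:.3f}")
```

The free edits get taken under any budget; past them, every extra point costs more risk per point - which is exactly the doctor's-office conversation described above.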
My personal theory is that autism/Asperger's represents a developmental failure that grows more likely with certain combinations of alleles that lead to higher IQ. That supports both the existence of non-autistic individuals with a high IQ and the notion that there are risks associated with it. But that said, it's just my personal theory and I haven't done any real scientific study on the matter, so take it with an awful lot of salt.
However there's another issue that we've walked into here by using the word intelligence rather than, say, g. Given that we know a lot of political beliefs are biologically heritable, there's a decent case to be made that the various moral foundations that give rise to political opinions are ultimately genetic in basis. When you're selecting for intelligence, are you going to pick the alleles that make people more conservative or more liberal? There are plausible arguments that either side represents an increase in functional intelligence in the world, though at the same time that's also dependent upon the environment (a gene that makes you a hawk is a great idea in a world with nothing but doves, but that doesn't mean being a hawk is the optimal strategy all the time). Similarly, you can make the case that the intelligence required to be a really compelling artist in certain mediums is actually qualitatively different from, and mutually exclusive with, the cognitive traits required to be a world-class performer in other fields. g doesn't really have political connotations in the sense that it is pure problem-solving ability, but "intelligence" is a word with a much broader meaning that makes things a lot more complicated.
The blindness is neurological, so this won't actually help.
If you don't care about biology at all then you don't need to bother with germline editing - just go straight to making the AI. On that note, I don't think you can really make a good comparison between our current AI and the human brain - they just aren't the same thing - and in any case I've definitely seen autistic behaviour from ChatGPT and various other LLMs. But at this point we've stopped talking about anything resembling current technology and entered the realm of magic/post-singularity tech that we cannot talk about sensibly.
Thanks for the clarification. While I accept that in most times and places pinning your hopes and dreams on technological advancement within your lifetime is certainly fraught, we're living at a particularly unusual time, after all.
My own understanding is that autism is basically too much of a good thing. Some traits that by themselves are not on the spectrum will, if present in both parents and passed on to the child, produce outright autism, which becomes a net negative.
Compare this to being heterozygous for the allele that produces sickle cell anemia when homozygous. Having only one copy is very handy if you live in an area where malaria is endemic - hence its commonality in much of Africa - but having two copies produces a disease whose costs outweigh those benefits.
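That heterozygote-advantage story is the textbook overdominance setup, and it's easy to sanity-check numerically. A minimal sketch - the selection coefficients are assumed for illustration, not measured - showing the sickle allele settling at the standard equilibrium q* = s/(s+t) instead of either vanishing or taking over:

```python
# Balancing-selection sketch for the sickle-cell example.
# Fitness values are assumed for illustration, not measured:
#   AA (no sickle allele): loses fitness s to malaria
#   AS (carrier):          protected from malaria, no disease
#   SS (two copies):       loses fitness t to sickle cell disease
s, t = 0.15, 0.80

# Standard overdominance result: the allele settles at q* = s / (s + t)
q_star = s / (s + t)
print(f"predicted equilibrium sickle-allele frequency: {q_star:.3f}")

# Iterate one-locus selection to confirm it neither fixes nor vanishes.
q = 0.01  # start rare
for _ in range(500):
    p = 1 - q
    w_bar = p * p * (1 - s) + 2 * p * q + q * q * (1 - t)  # mean fitness
    q = (p * q + q * q * (1 - t)) / w_bar  # sickle-allele frequency after selection
print(f"simulated frequency after 500 generations: {q:.3f}")
```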
This is something I dimly recall and haven't double-checked, but it sounds plausible to me. We know that high-IQ individuals who mate assortatively within fields where autism-adjacent traits are valuable tend to have more autistic offspring - such as when two engineers or computer scientists have kids.
Of course there's also autism that occurs de novo from mutations or developmental anomalies, but I don't have figures at hand for which is more frequent in absolute terms. I suspect that high-functioning autism is likely the former.
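For intuition on why pairing two upper-tail parents matters so much, here's a toy liability-threshold simulation - every parameter is invented - where a polygenic trait plus noise has to cross a fixed threshold to count as outright autism:

```python
import random

# Toy additive model: the trait is the sum of many small allele effects plus
# noise, and "diagnosis" means crossing a fixed liability threshold.
# Every number here is invented for illustration.
N_LOCI, THRESHOLD, TRIALS = 100, 3.2, 50_000
random.seed(0)

def genome():
    # haploid simplification: one 0/1 effect allele per locus
    return [random.random() < 0.5 for _ in range(N_LOCI)]

def liability(g):
    # standardize the allele count so the population trait is roughly N(0, 1)
    return (sum(g) - N_LOCI / 2) / (N_LOCI / 4) ** 0.5 + random.gauss(0, 0.5)

def child(mom, dad):
    # inherit one allele per locus from either parent
    return [random.choice(pair) for pair in zip(mom, dad)]

def parent_pool(min_count, size=400):
    # rejection-sample parents whose raw allele count clears min_count
    pool = []
    while len(pool) < size:
        g = genome()
        if sum(g) >= min_count:
            pool.append(g)
    return pool

def offspring_rate(pool):
    hits = sum(liability(child(random.choice(pool), random.choice(pool))) > THRESHOLD
               for _ in range(TRIALS))
    return hits / TRIALS

random_pool = parent_pool(0)                # any two parents
assort_pool = parent_pool(N_LOCI // 2 + 8)  # both parents in the upper tail
print(f"random mating:      {offspring_rate(random_pool):.3%} of offspring past threshold")
print(f"assortative mating: {offspring_rate(assort_pool):.3%} of offspring past threshold")
```

The exact numbers are meaningless, but the qualitative effect is robust: restricting both parents to the upper tail multiplies the share of offspring past the threshold by an order of magnitude or more.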
Eh, I expect that to be solvable - and at the very least, that particular state of affairs sounds rather unlikely to be true. Neuroplasticity is strong: hook an ordered stream of information into the brain and the ability to interpret it almost inevitably develops. Hence current trials of systems like the one that encodes visual images through an electrode array on the tongue, which blind users come to recognize as a form of sight.
I wouldn't take that tradeoff if there were no available treatment, but at the end of the day I suspect that we'll all end up on the Pareto frontier, where hardly anybody will be objectively better off than anyone else.
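For what "on the Pareto frontier" means here - no configuration beats another on every axis at once - a tiny sketch with made-up trait bundles:

```python
# Hypothetical (IQ, social ease, health) trait bundles; all scores made up.
def dominates(a, b):
    """True if a is at least as good as b everywhere and strictly better somewhere."""
    return all(x >= y for x, y in zip(a, b)) and any(x > y for x, y in zip(a, b))

def pareto_frontier(candidates):
    # keep only the bundles no other candidate dominates
    return [c for c in candidates
            if not any(dominates(other, c) for other in candidates if other is not c)]

configs = [(130, 6, 9), (145, 3, 7), (120, 9, 9), (145, 3, 6), (110, 5, 8)]
print(pareto_frontier(configs))
# -> [(130, 6, 9), (145, 3, 7), (120, 9, 9)]; the other two are strictly dominated,
#    and the survivors are incomparable - each wins on at least one axis.
```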
I am certainly less fussed about our civilization's stupidity in not exploring avenues like genetic enhancement because I expect post-singularity tech to make it moot.
I still think we should be investing a great deal more into it than we currently do (close to zero), as a hedge in case the Singularity fails, for some unforeseen reason, to materialize on schedule. After all, if we refrain from creating ASI that isn't provably aligned, we could still get a great deal of utility from having smarter humans running around.
Same reason we should be working on fusion, commercial space travel and so on: they'd be amazing to have right now, even if it turns out that a future AGI could solve them in about two minutes of wall-clock time.