This weekly roundup thread is intended for all culture war posts. 'Culture war' is vaguely defined, but it basically means controversial issues that fall along set tribal lines. Arguments over culture war issues generate a lot of heat and little light, and few deeply entrenched people ever change their minds. This thread is for voicing opinions and analyzing the state of the discussion while trying to optimize for light over heat.
Optimistically, we think that engaging with people you disagree with is worth your time, and so is being nice! Pessimistically, there are many dynamics that can lead discussions on Culture War topics to become unproductive. There's a human tendency to divide along tribal lines, praising your ingroup and vilifying your outgroup - and if you think you find it easy to criticize your ingroup, then it may be that your outgroup is not who you think it is. Extremists with opposing positions can feed off each other, highlighting each other's worst points to justify their own angry rhetoric, which becomes in turn a new example of bad behavior for the other side to highlight.
We would like to avoid these negative dynamics. Accordingly, we ask that you do not use this thread for waging the Culture War. Examples of waging the Culture War:
- Shaming.
- Attempting to 'build consensus' or enforce ideological conformity.
- Making sweeping generalizations to vilify a group you dislike.
- Recruiting for a cause.
- Posting links that could be summarized as 'Boo outgroup!' Basically, if your content is 'Can you believe what Those People did this week?' then you should either refrain from posting, or do some very patient work to contextualize and/or steel-man the relevant viewpoint.
In general, you should argue to understand, not to win. This thread is not territory to be claimed by one group or another; indeed, the aim is to have many different viewpoints represented here. Thus, we also ask that you follow some guidelines:
- Speak plainly. Avoid sarcasm and mockery. When disagreeing with someone, state your objections explicitly.
- Be as precise and charitable as you can. Don't paraphrase unflatteringly.
- Don't imply that someone said something they did not say, even if you think it follows from what they said.
- Write like everyone is reading and you want them to be included in the discussion.
On an ad hoc basis, the mods will try to compile a list of the best posts/comments from the previous week, posted in Quality Contribution threads and archived at /r/TheThread. You may nominate a comment for this list by clicking on 'report' at the bottom of the post and typing 'Actually a quality contribution' as the report reason.
For me the cool part was that I'm not on twitter much and it seemed eminently plausible that the discourse there has actually already been captured by toxoplasmic AIs. Can't be long now, anyway.
The nice thing about twitter is that you can have little siloed areas of good conversation, for now at least.
That's how Reddit used to work until moderation killed all the good places.
Oh, there's lots on both reddit and Twitter (and 4chan, although most of that sewer is just regular troll posts). Some are so good they almost pass for human, but others still use all the instantly recognizable ChatGPT tells: "However, it's important to consider factors like", etc.
Others seem like real language until you realize they're just sampling random phrases from the conversation while throwing in slogans.
I think dead internet theory is going to be a reality by the end of next year, if it isn't already.
Scarier than a dead internet, I'm worried that the rise of AI assistants is going to lead to a dead humanity. Even logging off won't save you from that one.
I've already seen reddit arguments that descended into "chatgpt, lecture this bigot about The Science," and there's something absolutely horrifying about someone making an active decision to hit the "brain turn off, let machine move mouth" button.
I always got that sick feeling of watching an ideology speak through someone, but never expected it to become this literal.
No exaggeration, we're only a few advances in machine interface away from becoming wetware for mind parasites, like an even gayer John Barnes novel.
Do you, honestly, think ideology doesn't speak through you? Never?
I don't know if his threshold is at the same level as mine, but I don't think being ideologically captured is all that common. A normal person with an ideology will have their moments of cognitive dissonance and scrambling to come up with an excuse, but someone who is being spoken through by an ideology will say things like "imagine the backlash against Muslims, if ISIS set off a nuke in a major metropolis", or will refuse to publish a study because their political opponents might "weaponize the results". Or if you want an example from the other side, reportedly Ron Paul's reaction to 9/11 was "oh no, now we're gonna have big government!".
In any case, the scary thing about the discussed scenario has nothing to do with any particular ideology; ideologies are just an example of how people already outsource their thinking. The point is that with AI this is going to reach a whole new dimension. It will go beyond "we've all read the same book(s)" or "we all attended the same church"; it will be people literally refusing to engage their brains and outsourcing all their reasoning to AI. Be my guest if you want to focus on petty tribal squabbles, but the idea is just as terrifying to me no matter who this power is wielded by.
To be honest, I don't see the distinction you've drawn between "will have their moments of cognitive dissonance and scrambling to come up with an excuse" and your following examples. Certainly I can imagine those people having moments of cognitive dissonance and scrambling to come up with excuses for their priorities.
Your prophesied AI future is certainly unseemly. However, I'm more often disgusted by the extent to which we are already slaves to genetic propagation and social patterns. Not often enough to make me nihilistic about life in general, but enough to make me look towards the future rather than embrace the past.
The distinction is that cognitive dissonance and making up excuses are mostly just biding time. That reaction recognizes that it has run into something the ideology cannot explain, but isn't, just yet, ready to admit it. I think this is 100% valid, and people shouldn't just change their mind in a moment of doubt. On the other hand, the reaction I'm contrasting it with basically treats the inability to explain the facts as irrelevant. If the facts contradict the theory, so much the worse for the facts.
For all their faults, there is still an element of spontaneity in genes and social patterns. With the dystopia I'm seeing, we're talking about wholesale engineering, which is what freaks me out.
Are you worried that engineering will lock it in? From what I know, genes and societies can stay the same for millennia until sufficiently disturbed. In light of that, I'm not all that terrified by the longevity of an AI dystopia.