
Culture War Roundup for the week of May 15, 2023

This weekly roundup thread is intended for all culture war posts. 'Culture war' is vaguely defined, but it basically means controversial issues that fall along set tribal lines. Arguments over culture war issues generate a lot of heat and little light, and few deeply entrenched people ever change their minds. This thread is for voicing opinions and analyzing the state of the discussion while trying to optimize for light over heat.

Optimistically, we think that engaging with people you disagree with is worth your time, and so is being nice! Pessimistically, there are many dynamics that can lead discussions on Culture War topics to become unproductive. There's a human tendency to divide along tribal lines, praising your ingroup and vilifying your outgroup - and if you think you find it easy to criticize your ingroup, then it may be that your outgroup is not who you think it is. Extremists with opposing positions can feed off each other, highlighting each other's worst points to justify their own angry rhetoric, which becomes in turn a new example of bad behavior for the other side to highlight.

We would like to avoid these negative dynamics. Accordingly, we ask that you do not use this thread for waging the Culture War. Examples of waging the Culture War:

  • Shaming.

  • Attempting to 'build consensus' or enforce ideological conformity.

  • Making sweeping generalizations to vilify a group you dislike.

  • Recruiting for a cause.

  • Posting links that could be summarized as 'Boo outgroup!' Basically, if your content is 'Can you believe what Those People did this week?' then you should either refrain from posting, or do some very patient work to contextualize and/or steel-man the relevant viewpoint.

In general, you should argue to understand, not to win. This thread is not territory to be claimed by one group or another; indeed, the aim is to have many different viewpoints represented here. Thus, we also ask that you follow some guidelines:

  • Speak plainly. Avoid sarcasm and mockery. When disagreeing with someone, state your objections explicitly.

  • Be as precise and charitable as you can. Don't paraphrase unflatteringly.

  • Don't imply that someone said something they did not say, even if you think it follows from what they said.

  • Write like everyone is reading and you want them to be included in the discussion.

On an ad hoc basis, the mods will try to compile a list of the best posts/comments from the previous week, posted in Quality Contribution threads and archived at /r/TheThread. You may nominate a comment for this list by clicking on 'report' at the bottom of the post and typing 'Actually a quality contribution' as the report reason.


I acknowledge these risks are real but the other obvious application for LLMs is that mass government surveillance will get dramatically cheaper and more pervasive. It doesn't seem obvious to me that the boost towards a bad actor's capacity for destruction will outstrip the government's surveillance boon.

Should it? Do we want to live in a world where government capacity decisively outstrips that of individuals, where the authorities really can make people shut up and do as they're told?

If not, how badly do we wish to prevent such a world? If such a world seems to be what we're heading toward, but the balance of power still lies with the public, should the public take steps to forestall the formation of an unrivaled government?

I find it very, very difficult to believe that a future where the government has perfected truly effective, practically inescapable surveillance is one that I want to live in. There is no plausible route I can imagine where this sort of power doesn't result in mountain ranges of skulls.

In any case, your 100x multiplier is difficult to assess, mainly because most people aren't thinking about the problem from the right angle. I'm convinced the base threat is significantly underappreciated, and the second- and third-order effects are largely being ignored.

My post was descriptive, not prescriptive.

I absolutely do not endorse increased government surveillance, but it all seems to be careening toward inevitability anyway. Around the time of the Snowden leaks, one of the comforting refrains from those worried about surveillance was that at least the government lacked the gargantuan computing resources required to monitor everyone (newly minted Utah data center notwithstanding). That coping mechanism seems so quaint in retrospect, given the technological strides since.

Despite my aversion to government surveillance, I nevertheless must acknowledge that governments maintain a zeal for prosecuting acts of terrorism and mass violence, which likely serves as some kind of deterrent. A good illustration of this retributive zeal occurs with acts of violence where the perpetrator is too dead to be punished, so the state goes after tangential "accomplices" in its hunt for a scapegoat. This happened with the prosecution (and acquittal) of the Pulse nightclub shooter's wife; the prosecution of the friend who made a straw purchase for the 2019 Dayton shooting (the idiot invited the FBI into his home with weed in plain view and readily admitted to lying on the 4473 form; also, the shooter had no record that would've barred firearm purchases, so the straw purchase made no difference); and the ammunition dealer who got 13 months in federal prison after his fingerprints were discovered on unfired rounds from the 2017 Las Vegas shooting.

I'm not saying that I endorse this modern variant of collective punishment, but it is a good indicator of how much retributive energy animates the government's actions in these circumstances. Obviously governments have an interest in leveraging increased surveillance into suffocating population control, and this interest would only magnify as costs drop. But even as an anarchist, I would be lying if I claimed that the state's only motivation for surveillance is control. However clouded and selectively applied it might be, there's clearly a genuine interest from the state in punishing and preventing bad acts.

My post was descriptive, not prescriptive.

No, I get that. My question is whether we should be rooting for the Authorities or the Chaos, in the final analysis. Faced with that choice, my own bias is heavily in favor of the Chaos, but I try to be aware of it and compensate proportionally. This becomes harder when people argue persuasively that the road we're on clearly leads to the iron chains of long-term dystopia. Some people argue that terrible things are coming, but there's nothing to be done about it. Other people argue that there are things we can do to alter the future, but that we shouldn't be in a hurry to do so because intervening would be worse. And it has to be one or the other, doesn't it? Either the coming future is worse, or the things needed to forestall it are worse. One must prefer one or the other, must one not?

However clouded and selectively applied it might be, there's clearly a genuine interest from the state in punishing and preventing bad acts.

The question is, is it in our interest to tolerate the continued existence of the current state?

Ok, fair, I apologize for misinterpreting your post. The initial hypothetical is about LLMs empowering bad actors to cause immeasurable destruction, and my response to that hypothetical was to consider that in such a world, LLMs would also empower governments to establish immeasurable surveillance and policing. Whether or not we "should" do anything to stop that massive accumulation of power is impossible to answer decisively, because we're already buried under an avalanche of hypothetical layers. It depends in part on whether you agree that LLM-equipped terrorists are a risk worth worrying about in the first place.

I guess the way I'd put it is that it seems a lot more plausible that LLMs or similar can allow an effective panopticon than that they can allow mega-death terrorism, and so the assurance that mega-death terrorism would probably be prevented by a government panopticon leaves me more worried on balance, not less. What saves us from the government panopticon?

There was an old woman who swallowed a fly...

I find it hard to believe that the federal government is capable of building a perfect panopticon in any reasonable timeframe. There are just too many leaky gaps in how info is collected. I imagine that criminals have long since abandoned cellphones and Facebook for sending business communications, and even ChatGPT doesn't know what criminals are up to. What I think will be interesting is when we see a sort of parallel construction of evidence using AI: the feds could feed their mega-cache of comms data into a GPT-esque thing, ask it who the likely ne'er-do-wells are, and then go start busting doors. Presumably, if the input data is solid and the AI isn't seeing rainbows, you could get some hits, even if they're mixed in with some misses. Presumably some agency or PD will eventually try this, and presumably at some point the fact that it's happening will itself become an issue at trial.
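
For concreteness, here is a minimal sketch of what that kind of LLM-assisted triage might look like. Everything in it is an assumption for illustration: the record layout, the top_k cutoff, and the llm_suspicion_score function (a crude keyword count standing in for an actual model call) are inventions, not a description of any real agency tooling.

```python
# Hypothetical sketch of LLM-assisted triage over a bulk comms cache.
# All names and data shapes here are illustrative assumptions.
from collections import defaultdict


def llm_suspicion_score(messages: list[str]) -> float:
    """Stand-in for prompting some LLM to rate, 0 to 1, how likely a
    message history is to describe criminal planning. A real version
    would call whatever model is available; a crude keyword count
    keeps this sketch runnable."""
    flagged = ("shipment", "burner", "straw purchase")
    hits = sum(any(k in m.lower() for k in flagged) for m in messages)
    return min(1.0, hits / max(1, len(messages)))


def triage(comms: list[dict], top_k: int = 100) -> list[tuple[str, float]]:
    """Group cached records ({"sender": ..., "text": ...}) by sender
    and return the top_k senders ranked by model-assigned suspicion."""
    by_sender: dict[str, list[str]] = defaultdict(list)
    for record in comms:
        by_sender[record["sender"]].append(record["text"])

    scores = {
        sender: llm_suspicion_score(messages)
        for sender, messages in by_sender.items()
    }
    # Hits come mixed in with misses: this produces leads, not proof,
    # which is exactly why its use as evidence would be contested.
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)[:top_k]


if __name__ == "__main__":
    cache = [
        {"sender": "alice", "text": "see you at the game tonight"},
        {"sender": "bob", "text": "the shipment comes in on a burner"},
    ]
    print(triage(cache, top_k=1))  # -> [('bob', 1.0)]
```

The point the toy version makes obvious is that the output is a ranked list of leads, not evidence, so everything would turn on what investigators are then allowed to do with it.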

Parallel construction is super illegal. Would using an AI be a loophole until further notice? Who knows, but ultimately it's probably a bad decade to be starting up a Scarface-type situation.

Interestingly, I bet GPT would also be super amazing at figuring out who is dodging taxes, but with the IRS having only two rusty pennies to rub together, I doubt this will happen either.

I imagine that criminals have long since abandoned cellphones and Facebook for sending business communications,

The smart ones have, but we mostly catch the dumb ones. I recall one instance where they did a drug deal under a live CCTV camera. Other times, they all switch their phones off at the same time when going out to do some crime. There are also occasional sting programs where they import phones that are supposed to be secure, but the game was rigged from the start.

I find it hard to believe that the federal government is capable of building a perfect panopticon in any reasonable timeframe.

It doesn't have to be perfect to be almost unimaginably harmful. Removing the bottleneck of human labor in surveillance and analysis is a serious threat to the idea of limited or even responsive government.