This weekly roundup thread is intended for all culture war posts. 'Culture war' is vaguely defined, but it basically means controversial issues that fall along set tribal lines. Arguments over culture war issues generate a lot of heat and little light, and few deeply entrenched people ever change their minds. This thread is for voicing opinions and analyzing the state of the discussion while trying to optimize for light over heat.
Optimistically, we think that engaging with people you disagree with is worth your time, and so is being nice! Pessimistically, there are many dynamics that can lead discussions on Culture War topics to become unproductive. There's a human tendency to divide along tribal lines, praising your ingroup and vilifying your outgroup - and if you think you find it easy to criticize your ingroup, then it may be that your outgroup is not who you think it is. Extremists with opposing positions can feed off each other, highlighting each other's worst points to justify their own angry rhetoric, which becomes in turn a new example of bad behavior for the other side to highlight.
We would like to avoid these negative dynamics. Accordingly, we ask that you do not use this thread for waging the Culture War. Examples of waging the Culture War:
- Shaming.
- Attempting to 'build consensus' or enforce ideological conformity.
- Making sweeping generalizations to vilify a group you dislike.
- Recruiting for a cause.
- Posting links that could be summarized as 'Boo outgroup!' Basically, if your content is 'Can you believe what Those People did this week?' then you should either refrain from posting, or do some very patient work to contextualize and/or steel-man the relevant viewpoint.
In general, you should argue to understand, not to win. This thread is not territory to be claimed by one group or another; indeed, the aim is to have many different viewpoints represented here. Thus, we also ask that you follow some guidelines:
- Speak plainly. Avoid sarcasm and mockery. When disagreeing with someone, state your objections explicitly.
- Be as precise and charitable as you can. Don't paraphrase unflatteringly.
- Don't imply that someone said something they did not say, even if you think it follows from what they said.
- Write like everyone is reading and you want them to be included in the discussion.
On an ad hoc basis, the mods will try to compile a list of the best posts/comments from the previous week, posted in Quality Contribution threads and archived at /r/TheThread. You may nominate a comment for this list by clicking on 'report' at the bottom of the post and typing 'Actually a quality contribution' as the report reason.
I am very worried about this, I just have no idea how to stop it. What are we supposed to do? An AI slowdown will only serve to cement the power in the hands of the large labs. People are working on open sourcing models like LLaMa, but it's inherently a sort of technology that lends itself to centralization, with the massive data and compute requirements.
Honestly, I think OpenAI being the leader is better than many alternative outcomes. Sure, Microsoft gets to use it, but theoretically OpenAI comes back under its own control after $92 billion in profits is made. That seems like an okay situation compared to Microsoft or Google or another big evil corp controlling everything.
Why do you follow the teachings of Leto II, the God-Emperor?
The obvious solution is technologies that either reward decentralization or punish centralization enough to offset the appeal of silicon tyranny. The ease of developing such technologies appears to be inversely proportional to the mess and harm they create, though, so the problem is one of will and collective action: people do not perceive the threat, so they do not act while prevention would be cheap and easy.
I know the what; I don't know the how. The Free Software movement has the basic blueprint: promote the empowerment of the end user wherever possible - open source, open data, distributed systems, whatever it takes.
The issue is that you're going to run into the same problem as the Free Software movement: actually opening the AI (not just putting the word in your name) hinders your ability to make a profit, so it'll get no corporate support, and it empowers political rivals, so it'll get no government support. What's worse, Free Software is culturally kneecapped right now; no one cares about it anymore. It used to be a fairly strong movement and still failed to stop the centralization of the Internet. It doesn't stand a chance now.
Still, I would say that both the optimists and the pessimists have an obligation to talk about it. Of the two, my bigger issue is with the doomers: they're generating a lot of buzz about the negatives of AI, but they're sucking all the air out of the conversation to talk about sci-fi scenarios.
I honestly don't see the difference. They're already censoring it, and it will only get worse. With the Internet we at least got a few years of the Wild West; with AI we're getting Big Tech social engineering from day one.
I see it as the value alignment of those at the head. Sam Altman isn't perfect, but he's at least nominally aligned with EA values of making things better for everyone. He's not a classic sociopathic shark at one of the big tech firms who was born into massive wealth, went to Harvard, did the standard track, and parasitically drained value from the masses.
Again, I'm not saying Altman is some sort of hero; clearly he's sociopathic to some degree if he's made it that far into the power structure. But at least he's a relative outsider, and there's hope he can steer us to better outcomes, because he thinks more deeply about the consequences of AI than the folks who stop at the idea of gaining money and power.
But it doesn't matter. Jack Dorsey is a free speech guy, and he was still forced to make Twitter conform to the establishment's preferences. It wasn't a change of heart on his part: now that he's free of his creation, he's working on decentralizing social media.
Even if Altman is truly devoted to the good of humanity, and even if EA actually benefits humanity, that means absolutely nothing. What can he do when they come for him and make him kiss the ring?
Yes. Well, it's unavoidable either way, because power always centralizes. If the power exists, then it will centralize. It's a law of nature.
Who wields the power of guns? A few large actors.
Who wields the power of nukes? A few large actors.
Who wields the power of compute? The internet was supposed to be the ultimate tool of democratization, total freedom for anyone to say anything to anyone. But who reaps the benefits, who controls the discourse? A few large actors.
A few individuals seem to have a fantasy of using open-source AI models to, like, fight OpenAI's models and Meta's models in an epic battle for truth and freedom, or something. There will obviously be no such thing. It's as fallacious as claiming that the right to gun ownership in America protects us "from government tyranny": you have a gun, but they have bigger guns. No matter how smart your model is, the big guys in charge will always have smarter models, more compute, better logistics, and more resources to throw at the problem.
The internet, ultimate tool of freedom and progress, introduced us to the concept of cancellation on a trans-national scale (they canceled the whole god damn country of Russia!), memetic social contagions that cause people to voluntarily sterilize themselves, all sorts of new ways to ruin someone's life with fraud and theft... what new and unanticipated forms of immiseration will AI introduce us to?
The only solution is to just not build it. If you have to build it anyway, well, good luck.