This weekly roundup thread is intended for all culture war posts. 'Culture war' is vaguely defined, but it basically means controversial issues that fall along set tribal lines. Arguments over culture war issues generate a lot of heat and little light, and few deeply entrenched people ever change their minds. This thread is for voicing opinions and analyzing the state of the discussion while trying to optimize for light over heat.
Optimistically, we think that engaging with people you disagree with is worth your time, and so is being nice! Pessimistically, there are many dynamics that can lead discussions on Culture War topics to become unproductive. There's a human tendency to divide along tribal lines, praising your ingroup and vilifying your outgroup - and if you think you find it easy to criticize your ingroup, then it may be that your outgroup is not who you think it is. Extremists with opposing positions can feed off each other, highlighting each other's worst points to justify their own angry rhetoric, which becomes in turn a new example of bad behavior for the other side to highlight.
We would like to avoid these negative dynamics. Accordingly, we ask that you do not use this thread for waging the Culture War. Examples of waging the Culture War:
- Shaming.
- Attempting to 'build consensus' or enforce ideological conformity.
- Making sweeping generalizations to vilify a group you dislike.
- Recruiting for a cause.
- Posting links that could be summarized as 'Boo outgroup!' Basically, if your content is 'Can you believe what Those People did this week?' then you should either refrain from posting, or do some very patient work to contextualize and/or steel-man the relevant viewpoint.
In general, you should argue to understand, not to win. This thread is not territory to be claimed by one group or another; indeed, the aim is to have many different viewpoints represented here. Thus, we also ask that you follow some guidelines:
- Speak plainly. Avoid sarcasm and mockery. When disagreeing with someone, state your objections explicitly.
- Be as precise and charitable as you can. Don't paraphrase unflatteringly.
- Don't imply that someone said something they did not say, even if you think it follows from what they said.
- Write like everyone is reading and you want them to be included in the discussion.
On an ad hoc basis, the mods will try to compile a list of the best posts/comments from the previous week, posted in Quality Contribution threads and archived at /r/TheThread. You may nominate a comment for this list by clicking on 'report' at the bottom of the post and typing 'Actually a quality contribution' as the report reason.
So yeah. This was the first time I ever listened to/watched one of Fridman's interviews. He seemed to burst onto the scene out of nowhere around a year ago. And everything I gathered from secondhand testimony and snippets of his content, and his twitter feed, led me to make an assumption:
The guy is basically trying to bootstrap himself to become the Grey-Tribe version of Joe Rogan.
And after hearing this interview, I updated MASSIVELY in favor of that model. It's not quite that he's just cynically booking 'big name' guests who appeal to the nerdy corners of the internet without caring about the actual quality of discussion. He appears to be making an effort.
Yet his approach to the interview seems to be MUCH less about engaging with the thoughts of the guest and much more about pressing them on various buzzword-laden 'deep' questions to see if they'll give him a catchy soundbite or deep-sounding 'insight' on a matter that is, to put it bluntly, pseudospiritual. He's in there asking if 'consciousness' is an important feature of intelligence, whether that is what makes humans 'special,' and whether, if we could preserve consciousness in the AGI, that would help make it friendly. He's kind of playing with the idea that there's something metaphysical (he would NEVER use the term supernatural, I'm sure) and poorly understood about how human thought processes work that gives them a higher meaning, I guess?
And EY has written at length explaining his belief that consciousness isn't some mysterious phenomenon; it is in fact completely explainable in pure reductionist, deterministic, materialist terms without any kind of special pleading whatsoever, and thus there's no mystical component that we need to 'capture' in an AGI to make it 'conscious.'
As you say, his blatant dodge on the AI box questions AND, I noticed, his complete deflection when EY literally asked him to place a bet on whether there'd be a massive increase in funding for AI alignment research due to people 'waking up' to the threat (you know, the thing EY has spent his life trying to get funding for) betray a real lack of, I dunno, honest curiosity and rigor in his thought process. Did the guy read much of EY's writing before this point?
It's almost the same shtick Rogan pulls, where he talks to guests (Alex Jones, for example) about various 'unexplained' phenomena and/or drug use and how that shows how little we really know about the universe, "isn't that just crazy, man?" But he avoids the actual spiritualist/woo language so the Grey Tribe isn't turned off.
At least the guys in the Bankless Podcast noticed right away they were beyond their depth and acted more like a wall for EY to bounce his thoughts off.
As for EY.
Man. I think the guy is actually a very, very talented writer and is clearly able to hold his own in a debate setting on a pure intellectual level; he's even able to communicate the arguments he's trying to make in an effective manner (if, unlike Lex, the other party is conversant in the topics at hand).
He even has an ironic moment in the interview, saying "Charisma isn't generated in the liver, it's a product of the brain" or some such. And yet, he does not seem to have done much beyond the bare minimum to assuage the audience's "crank detector." It's not clear that his persuasive powers, taken as a whole, are really up to the task required to win more people to his side.
Of course, for those only listening in rather than watching, that won't matter.
I'm not saying EY should just bite the bullet and work out, take some steroids, get 'jacked,' wear nice suits, and basically practice the art of hypnotizing normies in order to sneak his ideas past their bullshit detectors.
But... I'm also kinda saying that. He KNOWS about the Halo Effect, so within the boundaries set for him by genetics he should try to optimize for CHA. Doubly so if he's going on a large platform to make a last-ditch plea for some kind of sanity to save the human race. MAYBE a trilby isn't the best choice. I would suggest it would be rational to have a professional tailor involved.
But I do grok that the guy is pretty much resigned to the world he's been placed in as a fallen, doomed timeline so he probably sees any additional effort to be mostly a waste, or worse maybe he's just so depressed this is the best he can actually bring himself to do. And it is beyond annoying to me when his 'opponents' focus in on his tone or appearance as reasons to dismiss his argument.
And to make a last comment on Fridman... he clearly DOES get that CHA matters. His entire look seems engineered to suggest minimalistic sophistication. Sharp haircut, plain but well-fitted suit, and of course the aforementioned buzzwords that will give the audience a little bit of a tingle when they hear it, thinking there's real meaning and insight being conveyed there.
I don't think intelligence correlates well with moral strength (i.e. ability to live your own moral ideals). Whatever the seeming importance of averting AI catastrophe, it seems to be less important to him than his identity as someone who won't pay a few hundred bucks for fashion tips.
Pretty much. As a long-time Lex listener, I find him intolerable nowadays. I tried watching the recent podcasts with Altman and Yud, and I had to give up. Instead of talking about the hundreds of potential technical/economic/political implications, he pushes the conversation toward the "Is AI a living being (hurr durr boston dynamics spot doggo)?" angle, of all things.
His old podcasts were not like this. He used to delve into the technical details with the guests. I think he realized that the midwit I-fucking-love-science crowd is a much better revenue stream, and he's YouTube-algorithm-optimized his way into becoming a Philosophy-101 cringelord.
At least Joe Rogan plays dumb convincingly; Lex comes off as a tool.
Yeah, as I said, pseudospiritual. It's a weird approach when you've got someone with vast amounts of technical insight and industry knowledge to instead engage in, effectively, philosophical circlejerking. But then, if your audience lacks the technical knowledge to follow such a conversation, they might tune out. Whereas engaging in fundamentally unanswerable meta questions means your audience can feel like they've received insights and can easily follow the conversation without thinking too hard.
I have not seen anything to suggest otherwise. With that said, it helped him build an audience which helps him snag some of the most popular guests, which helps him grow his audience, which helps him get more popular guests... etc. etc. It's a successful model.
He noticed that, like Rogan, if you punch through to the top tier of podcasting you can have a self-sustaining money printing machine because important guests will seek the platform you have, and audiences will gravitate to important guests. The only risk is either 'scaring' away guests by questioning too aggressively or getting cancelled (as almost happened to Rogan) for any particular controversial moments.
Which might fully explain why he's fallen back on the softest of softball questions that neither challenge the guest nor risk wandering into a cancellation landmine.