What is this place?
This website is a place for people who want to move past shady thinking and test their ideas in a
court of people who don't all share the same biases. Our goal is to
optimize for light, not heat; this is a group effort, and all commentators are asked to do their part.
The weekly Culture War threads host the most
controversial topics and are the most visible aspect of The Motte. However, many other topics are
appropriate here. We encourage people to post anything related to science, politics, or philosophy;
if in doubt, post!
Check out The Vault for an archive of old quality posts.
You are encouraged to crosspost these elsewhere.
Why are you called The Motte?
A motte is a stone keep on a raised earthwork common in early medieval fortifications. More pertinently,
it's an element in a rhetorical move called a "Motte-and-Bailey",
originally identified by
philosopher Nicholas Shackel. It describes the tendency in discourse for people to retreat from a controversial
but high-value claim to a defensible but less exciting one upon any resistance to the former. He likens
this to the medieval fortification, where desirable land (the bailey) is abandoned when in danger for
the more easily defended motte. In Shackel's words, "The Motte represents the defensible but undesired
propositions to which one retreats when hard pressed."
On The Motte, always attempt to remain inside your defensible territory, even if you are not being pressed.
New post guidelines
If you're posting something that isn't related to the culture war, we encourage you to post a thread for it.
A submission statement is highly appreciated, but isn't necessary for text posts or links to largely-text posts
such as blogs or news articles; if we're unsure of the value of your post, we might remove it until you add a
submission statement. A submission statement is required for non-text sources (videos, podcasts, images).
Culture war posts go in the culture war thread; all links must either include a submission statement or
significant commentary. Bare links without those will be removed.
If in doubt, please post it!
Rules
- Courtesy
- Content
- Engagement
- When disagreeing with someone, state your objections explicitly.
- Proactively provide evidence in proportion to how partisan and inflammatory your claim might be.
- Accept temporary bans as a time-out, and don't attempt to rejoin the conversation until it's lifted.
- Don't attempt to build consensus or enforce ideological conformity.
- Write like everyone is reading and you want them to be included in the discussion.
- The Wildcard Rule
- The Metarule
Notes -
I feel like you're strawmanning (or perhaps weakmanning) rationalists. Your first example, with the poorly reasoning Dr. John, reads like something straight off of LessWrong from 10 years ago; it is absolutely nothing new to the community. Your second example has the exact same issue. The rationalists already have pretty low opinions of mainstream media like Bloomberg. The term Gell-Mann amnesia has been floating around for years to describe the phenomenon of temporarily forgetting just how bad journalists really are; that's not exactly something you'd expect in a community that blindly trusts mainstream media. In the third example you spend a long time attacking Krugman, who has never been part of the rationalist movement. There are economists who could be considered well-known rationalists, like Robin Hanson and Tyler Cowen, but Krugman is not one of us.
Overall your post has a very /r/Iamverysmart vibe. You pat yourself on the back a lot for noticing things everyone else missed but you don't seem to actually have a good grasp of what the rest of the community actually thinks.
I wasn't talking about the rationalist movement, I was talking about people who are generally considered very smart / rational / scientific / humanist, or whatever term you want to call them.
That being said, people in the rationalist movement do suffer from precisely the same deficiency, and proof of that is that many were duped by Sam Bankman-Fried.
I think you are missing the forest for the trees. The examples in the article are used to exemplify a single problem, do you understand what that problem is?
If you want to explore the topic further, in a sort of corollary to Betteridge: if you say "x is too y" in the headline, then in the piece you have to say why "x is too y to achieve A" or that "x is too y in comparison to B." Otherwise the criticism "x is too y" just floats in space, unmoored from any standard by which to judge it, its impact unclear. A headline "Eagles' Quarterback Jalen Hurts doesn't spread the ball around to all his receivers enough" is probably just sports blog masturbatory analysis and nitpicking; "Jalen Hurts doesn't spread the ball around enough to defeat the more organized defenses the team will meet in the playoffs" is much better, showing the impact of the action on the team's goals and providing a clear standard to judge against; "Jalen Hurts doesn't spread the ball around as much as quarterbacks like Rodgers or Mahomes" gives us a direct comparison to show how other successful players have performed and where Hurts falls short if he wants to be an MVP candidate like them. It also gives us a standard by which to figure out if we agree with the author.
Right now all I'm getting is "Rationalists are easier to dupe than Nassim Taleb's imaginary friend Fat Tony.*" Consider some alternative theses, and see if you can support them from your evidence:
"Rationalists are more easily duped than other political groupings and actors."
This seems tough to argue for. Progressives fell for Smollett and Abrams and Beto, vast hordes of the Republican base fell for Q and Mike Lindell and JFK Jr. being alive, Fox News advertises nothing but gold buying scams and reverse mortgages, while MSNBC and CNBC gave you Tom Brady and Larry David pitching FTX out the wazoo. Congress fell for WMDs, the EU fell for Putin, the US military-industrial-intelligence complex just keeps getting duped into handing local partners in Vietnam and Afghanistan and Syria billions of dollars that fail to deliver any results, US corporations get duped into doing DEI work or partnering with Rivian, US investment banks got duped into holding the bag for financing Elon's Twitter acquisition, when academics aren't getting duped into printing fake papers they're getting duped by non-replicable studies p-hacked by researchers to make it look like they meant something, billionaires get duped into buying fake old wines, the greatest venture capitalists and tech investors get duped into wasting billions on WeWork or Robinhood or Peloton. To paraphrase the quote in my profile, ain't nobody who ain't gullible, I looked.
"Rationalists are too easily duped to achieve their goals."
This probably moves the argument into a debate about what the "goals" of the rationalist movement are. If we're going with the Yudkowskian maximalist theory that people who have read The Sequences will one day bestride the earth and rule like intellectual colossi, then there's a pretty good argument there. For Rationalists to be Ubermensch Philosopher Kings they'd need to be basically impossible to fool, they are not impossible to fool, ergo they cannot be Ubermensch Philosopher Kings. If we're going with the EA theory that they should be in charge of directing money to far-off charities for 10% of my income, I'd also find that pretty accurate, I don't want to risk giving money to an org that might pass it to a fake org or to a doomed and ill-conceived political campaign. My more modest view of the role Rationalists play in my life, writing fun blog posts and faux-reddit screeds that make me think, I don't think Rationalists are too gullible to handle. SA might be too gullible to be "the guy," he might even be too gullible to be the-guy-behind-the-guy, but I don't think he's too gullible to be the-guy-behind-the-guy's favorite author's favorite author.
"Rationalists are more easily duped than ordinary citizens."
This might be true in the sense that Rationalists are more likely to try a nootropic supplement or counterintuitive fitness trend than the common run of human, but that's more hobby than anything else. The ordinary run of humans are so easily fooled that it beggars belief. Not only do timeshare scams exist, there's a meta level of TV ads for scam companies that will help you get out of your scam timeshare for a fee.
*@Faceh I have known a lot of Fat Tonys, or at least guys who would have read that piece (if they were in the habit of reading books like that) and identified themselves with Fat Tony. Self-made greasy guys, realtors and salesmen and house flippers and developers and entrepreneurs. They're the kinds of guys to tell me that a slot machine is "hot," that the Sixers should stick with Shake Milton because he hit a few threes against the Lakers, that this particular craps table always hits 11s and to bet on them, that girls who wear Frye boots are always freaks in bed, to never trust Albanians. What Taleb trumpets as street-philosopher skepticism is often closer to over-active pattern recognition; in his thought experiment Tony calls bullshit after 99 tosses, in my experience of those kinds of guys they would have called bullshit after the first 3 landed heads and gone 5:1 odds on Heads for the fourth toss. Which, in Taleb's hypothetical, is great; but just as often guys like that end up having to sign themselves out of the casino, the house is very good at convincing people that their lucky streak will keep running. And they will all tell you how their ex-wife fooled them in the divorce. Great salesmen, in my experience, are often easily sold by other salesmen. Mamet and Miller agree, of course. The quality that enables them to sell so convincingly, their belief in the narrative of the superiority of their own insurance products, makes them vulnerable in turn to believing the narratives of boat salesmen and casino croupiers and loose women. Of course, there are times they recognize a danger that my sheltered education missed, there are also times their superstitious pattern recognition results in betting big on the Giants to win the division because they "have the will to win." There's a reason why the big successful hedge funds hire MIT math majors by the bale, not greasy Eyetalians who successfully guessed how many gumballs were in the jar.
**One could of course argue that maybe these guys seemed like Fat Tonys, but they weren't real Fat Tonys. Fat Tony isn't a useful category if he's impossible to tell from any other fat loser until after he's always right, he's less superpredictor and more Texas sharpshooter.
He's not even getting that. He's getting that imaginary rationalists are easier to dupe than Fat Tony. He doesn't give any actual example of a rationalist making this mistake.
Wait, Larry David was pushing FTX? As in the Curb Your Enthusiasm meme where he tries to do something, it fails spectacularly, and cut to the credits with that jaunty music?
…That’s hilarious.
https://youtube.com/watch?v=BH5-rSxilxo
Honestly, a fantastic ad. Sort of the opposite, which then worked out the other way.
Smartest celebrity in any crypto promo. He only agreed to it because he’d be written in as the non-believer.
Probably made 8 figures and no-one can even accuse him of endorsing it.
Wow I had never seen the whole thing. That's a solid commercial, too.
This whole thing works like an extremely meta CYE joke in and of itself. Just layers upon layers of irony at work here.
The more flak Larry David catches for his role in it, the better the meta joke gets.
Your essay doesn’t even use rationalists as an example once in the cases you examine. You have two examples, Ligma Johnson and a Scott Adams statement, the victims of which are journalists and Paul Krugman respectively, neither of whom would describe themselves as rationalists. Just a nonsensical essay.
Ooooh, Nassim Taleb set up a rigged thought experiment, well that has convinced me!
Taleb couldn't convince me grass is green. I agree the rationalists are a little too credulous, but that's not so much rationalism per se as that the people espousing it round these parts tend to be very nice, well-meaning types who are somewhat idealistic and progressive and are all about the openness to experience, trust, being charitable, and giving the benefit of the doubt.
Now, as to the coin flipping: who is the guy doing the flipping? Is he trustworthy? If you want to tell me that Fat Tony is right, because Taleb is the guy doing the flipping and he's shady as fuck so this is a loaded coin, okay - but I don't think that's the conclusion you want me to draw about the trustworthiness and believability of Taleb.
You want me to trust Taleb, that he is right about Fat Tony. But if Taleb is right about Fat Tony, then Fat Tony is right that this is not a fair coin, and so Taleb is a liar, and so Taleb is untrustworthy, so why should I believe him about Fat Tony?
(1) The coin toss could be fair, and Fat Tony is wrong.
(2) The coin toss could be rigged, and Fat Tony is right.
Just from the details given, we don't know which option is correct. Now, in general and in real life, if someone is trying to get you to bet money or agree to something based on "if I flip this coin 99 times and it comes up heads 99 times, we'll bet does it come up heads the 100th time" - yeah, be suspicious.
In a thought experiment? Where the guy has the incentive to make it that Fat Tony is right, not Dr. John? Yes, that's rigged - but not about the 50/50 chance of the coin coming up heads, but about Fat Tony never, ever being wrong.
With 99 heads in a row it doesn't really matter. There is literally no one in the entire world that you should trust enough that you would still believe the coin is fair. 2^-99 is a ridiculously small number.
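To put a number on that intuition, here is a minimal sketch (mine, not from the thread) of the Bayesian update Fat Tony is implicitly doing. The one-in-a-billion prior on "the coin is rigged" is an illustrative assumption, as is the simplification that a rigged coin always lands heads:

```python
from fractions import Fraction

def posterior_fair(prior_fair, heads):
    """Posterior probability the coin is fair after observing `heads`
    consecutive heads, assuming the only alternative is a coin that
    always lands heads (illustrative simplification)."""
    p_data_fair = Fraction(1, 2) ** heads   # fair coin: (1/2)^n
    p_data_rigged = Fraction(1)             # two-headed coin: certainty
    prior_fair = Fraction(prior_fair)
    num = prior_fair * p_data_fair
    den = num + (1 - prior_fair) * p_data_rigged
    return num / den

# Even granting Taleb 999,999,999-in-a-billion trustworthiness,
# 99 heads in a row swamps the prior: the posterior that the coin
# is fair ends up around 1.6e-21.
p = posterior_fair(Fraction(999_999_999, 1_000_000_000), 99)
print(float(p))
```

The exact prior barely matters; because 2^-99 is so small, any prior on "rigged" that isn't itself astronomically tiny wins.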
This feels all over the place. Your title doesn't seem to be related to most of what you wrote, and your conclusion comes out of left field; 'here's a bunch of examples about some jokes, this is why rationalists get scammed' seems more like nonsense than a coherent argument.
You also pattern-match badly here - Trump, Adams, Taleb and Musk are the kind of examples that intimate you aren't seriously thinking about this but simply want to dunk on the opposition.
Maybe take this one back to the writing board, and look for more salient examples that support the point you're trying to make.
They aren't jokes. It seems you don't want to see what actually happened, the pattern I pointed out, and the significance I very clearly explained.
Your assertion that Ligma Johnson was a genius 5D chess maneuver designed to undermine the authority of modern journalism as opposed to, y'know, a joke isn't compelling so much as it stretches the principle of charity to believe you believe it.
Your patterns aren't just non-obvious, they're non-existent and clearly contrived - badly contrived to support a single political side. You have a pattern of posting half-developed essays that meander for a long time and take a sharp left turn at the end into a conclusion completely unsupported by the argument.
Your thesis is "rationalists are too easily duped". Your supporting points are a hypothetical thought experiment from Nassim Taleb, a tweet including the eggplant emoji from Elon Musk and Krugman taking a Scott Adams tweet seriously to make his own point (this isn't duping and I have no idea why you think it is).
Notably, none of these events even involve scams, let alone rationalists. They involve 'deception' in the sense that Krugman doesn't really care if Adams votes for Trump or not, he was using Adams' tweet to make his own point. If you say 'I am a communist and I want higher taxes' in a discussion and I use that as a springboard to argue against higher taxes, you haven't fooled me if you're really a fascist and are just pretending to be a communist. I've said the piece I wanted to say, why do I care that you lied?
As for the end of your essay if you're genuinely in possession of a secret 'black pill' of deception and persuasion, why are you personally not convincing?
The answer lies in some fairly traditional elements. Pathos, ethos, and logos.
There's not much pathos to speak of here, so I won't.
Ethos matters. If your essay is 'wow, look at all these points from people who also hate the woke like I do', you lose a substantial amount of ethos from the get go. You've clearly picked a side and have an investment in it winning, or at least looking good. You seem untrustworthy - why would I believe you're genuine and honest about arguing this? This doesn't sink your essay, but it intrinsically loses you the trust you might need to stretch an argument further than it might ordinarily go.
Logos.
Why are none of your examples about real skeptics or rationalists failing to be skeptical? This is a complete failure of logos. You should bring a series of logical reasons I would believe in your stated argument, but you don't. You bring weirdly irrelevant culture-war bits - which ties back into ethos again. If your supporting evidence is both irrelevant and biased, your failure in logos simply increases the failure in ethos.
Hopefully that clarifies things.
I did not assert that.
But he didn't make his own point, because his own followers saw that he was fooled.
Which it isn't.
So, before criticizing what was being said, it would behoove you to actually listen to what was being said.
As for naivety in general, I would say that we can all make wrong judgements when we step outside our sphere of expertise. We can be wrong even in our own areas (for example, was the FDA right to approve aducanumab or not), but that tends to be corrected over time. But if I tried to judge when it is appropriate to raise or lower taxes depending on the current state of the economy, I would be wrong most of the time, even if I have some superficial understanding of macroeconomics.
Or an even more science-based example – someone complained that the main bridge in our city was unsafe and should be closed immediately, while a city official publicly announced that it was only a rumour and the bridge was safe. Whom to believe? I wouldn't know unless I had spent a really long time studying the dynamics of bridge safety. This actually happened 5 years ago in Latvia; the bridge is still standing and in use.
If you don't know who to believe, then don't believe anyone. Why must people trust anybody?
That doesn't always work, because you sometimes have to make decisions. For example, to get the covid vaccine or not. You may not trust doctors, but without the vaccine you could be fired from your job. That would make you trust doctors even less, even if you don't particularly trust the antivax side either. Not an easy decision to make.
You don't need to trust anyone to make a decision. Nor do you need to believe anything.
No, I need something to trust before I make a voluntary decision.
I mean, I can be forced to make a decision. If someone threatens me, I would comply. But a voluntary decision, for example whether to get a covid booster or not, is based on my understanding of and trust in the benefit or absence of any benefit.
No, you don't.
If you are thinking about asking a woman out, do you need to trust that she will say "yes" before making the decision to ask her out?
If you are thinking on rolling dice, do you need to trust that you will get a 7 before making the decision to roll the dice?
I don't understand how people don't even realize how they make decisions. Many of the decisions you make are based on chance, not trust, and most you are not even aware that you made them.
No, my medical decisions are not like asking a woman out.
Of course, you can make them like rolling a die, but that's not the best way. The whole history of medicine has led us to the point where we don't.
Yes they are. Rational medical decisions are based on probability, not trust.
I decided to use masks, did I trust that they would work? No.
I decided not to take any COVID-19 shot, did I trust that they weren't safe? No.
You can deny that you rolled the dice all you want, but you did. Taking a bet with a 99% chance of winning is still rolling the dice. You could have been wrong.
If you have 100% certainty that your medical decision is going to be correct, you are simply not rational.
And this is a red herring. You are basically saying "medical decisions are not black swans", but they don't have to be (even though they are), I showed you black swans. Your white swans are irrelevant. Case closed.
If I can show you 10 black swans that prove you wrong, you are just going to deny reality.
I decided not to wear masks because I had no evidence that I could trust. Sometimes I wore masks anyway, because the state compelled me.
I was OK with taking the first 2 covid shots because I trusted the evidence that I had about their effectiveness. I did not trust the evidence about booster shot effectiveness, but took it anyway because the state required it for me to travel within the EU. I didn't take any subsequent boosters because the state didn't require them and I didn't trust the evidence.
That's how I operate. I don't understand the rolling-a-die framing. I can trust the evidence of medicine with the understanding that it is not 100% certain, especially if I know that the medicine works in, for example, 80% or 60% of the people taking it.
Medical decisions are not black swans because they are completely different things. Black swans are unpredicted events; a medical decision is not an event but a decision. I don't know how to compare them.
Using a contrived example to warn us against trusting contrived examples. Either outcome is a gotcha. The only fair move is not to play, and the outrage Fat Tony has should apply to the OP as well.
It's only a "contrived example" after you have seen the result, which was intended for you to see. Con artists rely on contrived schemes that are not easy for you to see beforehand; their objective is the opposite. This is what happened with Bernie Madoff, Elizabeth Holmes, and Sam Bankman-Fried. You think you are able to see the "contrived examples" before the fact; well, everybody does, and that's precisely what the next con artist relies on.
What I'm saying is that the entire thought experiment is contrived in bad faith against Dr. John. The sin of Dr. John here is credulity in believing Taleb's assertion that the coin will be fair, but Fat Tony's priors have equally been captured. For all Fat Tony knows, Taleb switched the coin out so that his statement was true, and the story could just as easily have ended with the next toss coming out tails as a parable about not falling for the gambler's fallacy. Fat Tony, in taking the other side of the bet, has also been taken in by the con. The real lesson, if any can be taken from this story, is don't play sketchy probability games with other people's coins/dice. And the Fat Tonys of the world seem much more likely to buy things like lottery tickets on this same vague intuition I'm supposed to be so impressed by.
I have done the experiment with academics without mentioning that the coin is fair, the result is the same: they assume the probability is 0.5.
Yes, because everyone knows thought experiments don't translate to the real world.
You've had an academic sit there and watch you flip a coin 99 times landing it on heads each time?
But narratives where the conclusion depends entirely on what the author wants aren't even thought experiments. For it to be a thought experiment you'd need to have actually caught some flawed logic and worked out why it was flawed. If the author is trying to differentiate Fat Tony and Dr. John then the author needs the victory of Fat Tony over Dr. John to rely on something other than that the author would prefer Fat Tony to be right instead of Dr. John as the narrative could just as easily have been written the other way.
Here is an alternate ending to illustrate the points:
No. I ask them what is the probability that the next coin flip will land heads.
Which I did.
That's 1:3 odds.
So? You haven't illustrated anything. According to you, you need to show the flawed logic.
So at the end you told them the probability was 50/50 and then asked them what the probability was? Presumably you'd be the one determining if their answer was right or wrong. If they can't trust your premises, why should they trust your evaluation?
You did not. Their logic was "I'm going to accept the premise given". You got to decide whether the premise was true or not. The outcome depended entirely on whether you decided that the 99 coin flips in a row were the lie (in the form of a coin switch) or that the statement about the probability was the lie.
It not illustrating anything was the point; I agree I did not show the flawed logic of Tony. I was demonstrating the flawed logic of thinking these stories can be used to show anything at all.
No, I already said what I specifically did not ask them.
I very clearly explained it in the article.
So it had absolutely nothing to do with my thought experiment.
Fat Tony is basically a contrived pastiche that is used for this purpose throughout Taleb. He's an annoying device for exactly this reason, and I like Taleb.
He comes across as less contrived when you've actually met a few Fat Tonys in real life.
Exactly, Dr. John might have rigged the coin throw but in a way that it doesn't land heads on the last toss.
Scott Adams has done enough lolcow-ish things throughout his life that I am not willing to believe he's an intelligent person "just duping everyone". Sure, you can easily "dupe" people and intentionally get them to engage with you by saying stupid shit on Twitter. That's not particularly intelligent or insightful. Ditto for Trump.
Nothing in Krugman's tweets indicate this. He doesn't seem to have talked about the "very fine people" quote at all. Besides, the basic problem with Trump is that he says enough stupid shit to the point where anyone who dislikes him can always find something to get pissed over. It's not worth arguing with them even if they concede that he never called neo-Nazis fine people.
I wanted to like this piece because of the segment on Elon Musk and the Ligma Johnson hoax but the rest of it was pretty meh.
Sure, that's your opinion, but in the real world: it works.
He didn't have to; people searched for it. Barbra Streisand didn't have to share a link to her house, but by starting a lawsuit she got everyone to search for her mansion, which is what created the Streisand effect. This is memology 101.
That's the whole point of the dupe: the actions of the target create unintended consequences. It doesn't matter what the target consciously does.
What lolcow stuff has he done?
He keeps starting drama with people and saying dumb shit about mass shootings on Twitter. He has a KF thread, that says it all.
I mean, didn't Dilbert get yoinked from newspapers because Adams started injecting his politics into the strip? I'm amazed it took this long for it to happen, but nonetheless, it seems like a fairly strong repudiation of Adams' legacy.
Apparently it was pulled but probably not for his politics.
I read this and it doesn't convince me. Surely it wasn't just politics, but still. It's like when some people say that in no way could a white male be overlooked for promotion in preference to some minority candidate; it happens all the time, even when there are no specific quotas. There was one person here who got a freedom of information response from the Canadian government confirming that they only hire people with some minority status because they didn't want to sort through too many qualified candidates.
Okay, being cancelled from 77 newspapers at the same time is kinda suspicious, not gonna lie. But it's otherwise hard to tell the reason (though I grant that we would expect there to be no reason given to maintain plausible deniability).
Not at all. Those aren't really 77 newspapers; they're all owned by the same company. Given the wafer-thin local coverage in most of these papers these days, it's probably better to think of it as 77 local editions of the same paper. The articles in most of those papers will be 50-60% the same, 75-90% the same within the same state/region, with a handful of truly local articles sprinkled through by a small team of local reporters.
This wasn't a "coincidence" because nothing coincided. One guy at corporate decided he didn't like Dilbert, maybe because of politics, maybe because he thought the male counterpart to Cathy had outlived his usefulness.
Like I said, I'm amazed at how long it took, so this thesis is believable, but at the same time, the counter-current the comic represented probably made it an easier choice to axe.
Nice sub. But the author failed miserably with the previous article: https://felipec.substack.com/p/the-boy-who-had-to-bring-a-wolf.
It wasn't that the US thought that Russia was 50% weaker than it pretended to be. It was actually 50% weaker than the US thought it was.
Dictators tend to overestimate their power. The only reason the US does not push to overthrow them is that it becomes very ugly, like the war in Ukraine with many innocent people dying. But Putin had the choice to do nothing and keep the status quo. Now he has destroyed his lie about a powerful Russia.
But who is to say that cynicism can't itself be weaponized as easily as naivete? After all, we see politically-liberal people caution against cynicism and doomerism, saying it's deployed by those who wish to preserve the status quo.
It certainly could, which is why I'm not advocating for cynicism, what I advocate for is skepticism. Many true skeptics end up being cynics, but not all cynics are good skeptics.
Religions have clergy and laity. It's the clergy's job to study theology and know what's True, and it's the laity's job to trust the clergy and follow their lead. This archetypal form is so baked into human social programming that you're gonna be hard pressed to supplant it. So secular replacements for religion are bound to follow it.
"Boooring, muh everything's a religion." Rationalism doesn't try to be a religion, you say? Well I'm sure the clergy understands that but tell that to the laity. No actually the laity believe that too and recite it. But at the end of the day they follow the archetypal form. They will call themselves Rationalists but they have jobs and hobbies and lives (as is right and proper for laity of any religion—this is not a dunk) and so they will trust the EYs and the Scotts and the other clergy to do the actual work. The actual thinking. The actual application of Bayesian reasoning. Etc. For their part they simply have faith that they have found the True epistemology and that they follow the lead of trustworthy workers of Truth.
I do not think rationalists are duped more easily than normies who have never heard of Less Wrong, but people who claim to be the smartest people in the world should be held to a higher standard, and the claim that studying their super special sequences will make anyone as smart as they are should be put under close scrutiny.
If you boast that you are an Airborne Navy SEAL Ranger specially trained in gorilla warfare, and you regularly get your ass kicked by ordinary drunks in bar fights, people would be justified in doubting the value of the super secret martial arts training you offer.
Obligatory link:
Where are All the Successful Rationalists?
Where are all the former losers who read the sequences, pulled themselves by their bootstraps and became brilliant winners?
Point me to where the sequences claim that you will become unusually successful for having absorbed them. Or where they claim to be useful to everyone. In The Matrix, the red pill cannot be used on just anyone. And finally, who elected these mainstream media figures who were criticized as rationalist representatives? I don't even really call myself a rationalist, but these are weak swings.
Rationality is Systematized Winning
I guess it hasn't made me "successful", but I do think I live a happier and more fulfilling life because of the rationalist literature. Having the tools to make the world make sense is a value in itself.
Exactly. "I notice I am confused" is a damn superpower.
Before, I used to observe stuff that didn't make sense, think "Huh. That was weird," and then go on about my day.
Now I actually interrogate the phenomenon until it makes sense.
If you see a coin that turns up heads 100 times in a row, your first step should really be "Let me see that fucking coin" and not "wow, what a crazy random happenstance." Hell, if you see a coin that shows heads 99 times then tails on the 100th, you should DEFINITELY demand to examine said coin.
Although this has ruined the entire concept of magic performances for me.
I relate a lot. I have not read a lot of rationalist articles, but it seems to me that a lot of what they do is share ideas amongst themselves, and these ideas are not necessarily true or important, merely interesting. Few of them have anything to do with the real world.
Nassim Taleb talks about putting skin in the game as a way to escape this intellectual circle-jerking, because confronting ideas with reality is the only way to know if there's any truth to them. This follows Karl Popper's falsification principle: if your idea cannot be falsified (in the real world), then it's worthless.
I think the reason there are no successful rationalists is that they don't want their precious ideas to actually be tested in the real world; they'd rather keep them unopened, like collectors do, and just admire them.
✋ *raises hand* ✋
Hi, I was in a rut of depression and uncertainty and fear of the future in my early twenties, then one summer I read HPMOR, then worked through the sequences, recognized the value, and spent the next several years doing the hard work to adjust my life onto the track that allowed me to actually become happy, healthy, and financially secure. I truly enjoy life now.
Also, I tempered that knowledge with some extra examination of neuroscience, statistics/risk (Fooled by Randomness is a REALLY good book), and the art of rhetoric since, it turns out, merely thinking rationally doesn't get you very far if you can't deal with other "irrational" people in normal conversation.
A few things I credit rationality/the sequences with:
Being aware of and buying Bitcoin very early, recognizing the potential upside.
Knowing never to go all-in on Bitcoin or crypto at any given time (we see many, many people never grok this and blow up).
Avoiding every single collapsed exchange and rugpull, from Mt.Gox to FTX, and thus never losing my gains to some unexpected event.
Pulling the fuck out of crypto when it became clear it had gone full cheap-money-fueled casino.
It is likely that I would have been one of those poor rubes who got fucked by SBF if I had not gotten my epistemic foundation built on solid ground well in advance.
Now, the caveat is that I have defined "winning" in quite modest terms. So my success is not amazing when compared to what many others who aren't rationalists have achieved. But it has put me in a position where virtually no single event (not counting X-risks) can wipe me out. And that's the fucking dream.
So I will strongly maintain that the sequences are a force for good, even if they haven't caused humanity as a species to vault to a higher state of being in a single decade.
And I did all the same as you in crypto while never fully reading the sequences (have read some but find them to be too much of a waste of time to complete) and reading HPMOR fully but considering it a joke as anything other than an amusing diversion. Meanwhile there are also plenty of people who worship the sequences and HPMOR and "rationally" dismissed or significantly underestimated crypto (as complained about on LessWrong itself many times) in its free money bonanza days despite being fully aware of it.
It seems more likely to me that you simply have decent (at least in one proven realm) intellectual instincts and latched on to sequentialist rationalism as the means through which to express them. But it's the good instincts, not the book, without which you'd be nothing, same as plenty of people who had the book but not the instincts.
I think the "good instincts" amounted to being aware that I was behaving in irrational/suboptimal ways, to my own detriment, and that there were probably tools out there to improve on this, if only I could find them.
The question that kept recurring in my head was "there's plenty of people who can give me advice on various decisions I'll have to make... but how the fuck can I know which advice is good?" Blindly accepting the advice of people I considered "authority figures" had already failed me badly.
That was the "bootstrap" portion of it. Being able to assess information in a systematic way so as to identify and make use of good information and, generally, discard bad/useless information (and none too soon, given how the ratio of useful to useless information has decreased exponentially).
Or, as the sequences put it, to be more confused by lies/falsehoods than truth.
I lacked any reliable tools for doing this despite having, as stated, the intuitive sense that the tools ought to exist.
Which really speaks ill of my college education, I should add.
The main way this helped in Crypto was the very early realization that nobody on the crypto subreddits knew shit about finance, they were all self-interested, and mostly dishonest (or self-deluded). So I went and learned to understand finance and ignored 99% of what the community had to say.
Only regret I have is not jumping on Dogecoin early on. I had no reason to think it would have this kind of longevity, though I did predict that its community would fail to keep any of its early ethos intact.
I dunno, I think my life ends up very different if I never read the sequences. I would probably be one of those types who "fucking loves science" but really just uncritically accepts what experts say. And that would have caused me some problems when Covid hit.
Also, being plugged into the rationalist community (and, relatedly, /r/themotte) kept me like 3 months ahead of the curve on understanding the pandemic.
Over the years I've made better life decisions in a hundred little ways that would be hard to sufficiently articulate here, that I think the counterfactual version of me handles more poorly overall.
Really? This place was overflowing with doomer takes about the pandemic as the "big one" (as opposed to the big scam) that aged terribly, and as far as I can still tell there's still no widespread recognition here that people were overly hasty and insufficiently scrupulous about their vax shilling.
If I had listened to /r/themotte I'd probably have my furniture made of worthless (or at least mostly unnecessary) N95 masks by now.
That's the thing. Themotte was quicker to see that masks might be helpful (whilst the CDC was literally saying "stop buying masks!"), but also quicker to shift away from them as it became clear that this wasn't going to be the civilization-ending event it might have been.
The biggest insight I received from /r/themotte specifically was someone pointing out that viruses tend to mutate towards less lethal versions since that is optimal for long term spread.
Which is exactly. what. happened. Remember Omicron was more contagious and less deadly?
In absolute OCEANS of misinfo on the right and the left, and absolute collapse of expert guidance, themotte was basically the equivalent of a lighthouse in a storm.
Reddit at large was still in favor of mandatory masking FOR CHILDREN long after some posters here had already pointed out that this didn't actually help and might actually HURT young children's development. The latter being a point the CDC (I think) agreed with until it became politically unfavorable and they pulled that info from their site.
I'll go back and pull up the actual comments from the old sub if you don't believe it.
Of course, you do have to be able to sift useful information from non-useful to get the full benefit. But see my whole comment above about rationality teaching exactly that.
Fair, but I still resent the "rationalist" side of the Internet (well it wasn't all rationalists per se, but it was mostly fringey Internet commentators at least at the very beginning, not established media figures, Substacks and Mediums at best) for (and I'll admit contributing myself to some of the first point, to my regret, which is why I think it's worth pointing out):
Essentially greenlighting the whole hysteria. Sure, established authorities weren't taking the threat as seriously as they should have at the beginning and maybe needed a little kick in the pants, and sure many rationalists called BS on the alarmism once the novel virus became less novel and was revealed to be far less dangerous than initial concerns (which happened far before Omicron btw and as early as the first global strain, so anybody only admitting it then was way behind the curve), but if rationalists really were all that rational, they should have perhaps seen two steps ahead instead of just one and realized that it would be very hard to take back the panic they helped drum up once it got rolling, especially since it was known that viruses have a tendency to moderate their own mortality as they spread as you mentioned. Instead I think so many people were desperately excited to finally get to go into "X-risk" mode and prove how Serious™ they are, and then the resulting mindset of paranoid doomer absolutist safetyism was hijacked by established authorities for their totalitarian ends and became the dominant attitude of authority throughout the entirety of the pandemic until it was unceremoniously ended by Putin.
Particularly on /r/themotte (though obviously this particular issue was far worse in the non-terminally online realm in general), again the vax was shilled far beyond available sensible justification (and I haven't seen any retractions), especially for people who had supposedly appropriately absorbed SSC's reflections on metascience/the replication crisis and the flimsiness of so much "research" and so many "studies" because they are too hasty, unexacting, and corrupted by perverse incentives (like how about being conducted by the same people trying to sell the object of study as one of the most profitable pharmaceutical products of all time?). (But I'm pretty sure Siskind got the jab too (or I assume his polyorbit or whatever would have screeched at him until he had) so maybe even he didn't absorb his own reflections. Hopefully he faked getting it.)
I certainly won't say there was no insight on the subject to ever be found on /r/themotte. Its early campaign in favor of variolation was a good idea and probably would have been far superior to the vaccination we got.
The reason that you had many rationalists skeptical of the hysteria but still supporting vaccines is that rationalists tend to be scientifically trained, so even if they don't listen to the scientists who are signal-boosted by the media, they can understand what vaccines do based solely on their own knowledge.
If that were true then they would have been far more skeptical of taking them, especially for the younger demographics of their own community.
But no, I don't believe your average /r/themotte poster was in any way particularly "scientifically trained" in mRNA vaccine platforms before the debut of the most recent ones.
I am happy that you've found success. May you be your best self in enjoying the good that you've earned.
I would not describe myself as a rationalist, but I recognize what you've discovered--there's value to be found here, and it's worth the time to seek out. I suspect that SBF found some valuable facts about the world, but not the much more valuable attending wisdom, and proceeded to apply his lessons much less well than you've done. Perhaps he'll learn something from the ruin his mistakes have caused, but even if so, it will have been purchased at great cost to many others.
I'm increasingly convinced that SBF was acting with some level of malice aforethought and he was using EA as a decent camouflage.
But whatever he did learn from the rationalists, he missed the lessons on how deontology is extremely useful for putting up behavioral guardrails so that your fallible human hardware doesn't end up causing you to commit moral atrocities.
What is a Quokka?
Pejorative nickname given to rationalists when the Scott Alexander/NYT stuff went down. See this twitter thread by reactionary sci-fi/horror writer "Zero HP Lovecraft".
One of the thoughts that I've been kicking around in my head in relation to my long delayed (see procrastinated) effort-post is how a lot of blue tribe progressive types seem to be unfamiliar with the concept of the permissive vs contested vs hostile environment. I see people complaining about getting banned from an internet forum or reported to the FBI and my first reaction is the James Franco from Buster Scruggs meme, is this your first time? Similarly back in the Clinton days (that is the early 90s) I recall a lot of talk about "why are otherwise intelligent people buying this shit?" Nobody actually believes that the president didn't inhale or fuck Paula Jones do they?
My working theory is that wealthy Yale and Stanford types don't really get a lot of exposure to predators and con men at a young age, and thus they don't develop the mental antibodies against them before entering the business world. Meanwhile, the kid who grew up around used car salesmen probably understands "the nudge" better than those with a 4-year degree in marketing.
That's right. Rationalists claim it was rational to trust Sam Bankman-Fried, because if his pitch was part of an academic exam to see if this person was credible, trust would be the right answer.
But that's the thing: we are not in an academic exam, this is the real world, and people are going to try to exploit your blind spots.
I often wonder if these people play poker, video games, or any kind of board game where deception is part of the game.
I'm not sure this hypothesis is correct. IME, propensity to be conned doesn't have much to do with community values but does have a lot to do with education, time preference, and intelligence (though those are certainly not proof against it).
Nobody believes it. The important thing there is that the president kowtowed to prevailing norms by disavowing his behavior, even if his excuses are obviously bullshit.
IME, propensity to be conned is correlated with exposure to cons, and has no relation to education, time preference, or intelligence. It's just about having the mental habit of double checking "Could this person be conning me?" and a willingness to accept when the indicators are yes.
I agree, but I think the word is skepticism. You don't need to be intelligent or educated to be a skeptic. It's just a mental muscle: the more you doubt claims, the easier it becomes to doubt claims.
I don't follow. Not growing up around con men could result in the kind of naivete that would make you believe Bill Clinton did not inhale, but like you said, I don't think anyone actually believed that.
George Orwell was probably more on target with the concept of doublethink.
No submission statement, not reading.
The headline talks about rationalists, but the article actually talks a lot about people who aren't rationalists at all, like journalists, or Krugman, who are very easy to dupe because they want to be duped. They actively go out and look for people who can be used as props to launder their agenda, and in some cases, if they fail, they manufacture it (somehow this is considered much worse behavior than cherry-picking props, while being essentially the same).
This is an easy trap to fall into, and I am sure many people declaring themselves rationalists fell into it too, because they are human. If you build a trap skillfully and put tasty enough cheese inside (different cheese for different people), a lot of people will get caught. Some of them may call themselves "rationalists"; some of them may even try to become less easy to catch. But they are imperfect humans, so they'll get caught anyway. That is to be expected. Doubly so if they actually profit in one way or another from getting caught (like journalists or political activists, which are pretty much one and the same nowadays). For those, passing up a good "boo outgroup" story is almost inhumanly hard, so here are most of your examples.
To add to this, there's also the element of betting.
Humans, even rationalists, have to make decisions without the time to obtain perfect knowledge. It's only prudent to place bets if you think the upside might be big and the downside small. In other words, there were probably rationalists in the OP's sample that donated/took money from SBF while thinking this is all likely going to blow up in their face. This isn't the case of conflicting beliefs--it's playing the odds.
Plus, the characterization of "rationalists" seems to me a faulty generalization. There are probably very few people who make their lives revolve around rationalism. But rationalism isn't some monastic order that stamps out mentat-like Rationalists, so in the real world "rationalist" describes everyone from hyperlogical Bayesian wizards to folks who like a good argument and enjoy eating popcorn while watching the Culture War eat itself.
Yes, sometimes, but a lot of times they don't have to make a decision, and they do anyway. For example, if I enter a meeting I will want to sit down; I don't know that the chair isn't broken, but I sit down anyway. Is not checking the chair a mistake? No, I can make a decision without perfect knowledge. But what about a raffle? I also don't know that I'm going to lose, so it might make sense to buy a ticket, but I don't have to. You'll say that I made a decision anyway, but not necessarily; a lot of times the result is "I don't know", and that's not really a decision.
That depends on the odds. A small upside and big downside might make sense if the odds of losing are sufficiently small.
But those are two different things. Taking money from a person is one decision, trusting that person is a completely different one. You can take money from a person without trusting them.
The difference between skeptics and normal people is not readily apparent. We both sit on a chair without checking if it's broken, but I as a rational skeptic do not assume it is unbroken. The end result looks the same, but the mental model is different: I do not have a belief.
And yet you assume you have access to other people's mental models.
No, I ask them what they believe, and they tell me.
But the point is not that they get caught; all humans indeed have the potential to get caught at some point in their lives. The point is why. Why do people get burnt touching a pan?
So, what's your answer for the why that is special to rationalists? My answer is a common human one: they thought the pan was not hot, or maybe they wanted what was in the pot too much to reasonably evaluate the chances that it would be too hot. People do that. I'm not too proud to admit it has happened to me.
You are forgetting the most common reason: they have never encountered a hot pan in their life (e.g. kids). They get burnt because they didn't think they could get burnt. This also happens to adults who should know better after a while of not dealing with hot pans.
People who have never been scammed are the easiest to scam, precisely because they don't think it could possibly happen to them. Hubris and overconfidence are known to make intelligent people commit obvious mistakes they otherwise would not.
It's actually looking like the most uncommon reason. You can only do it once in your life. If you have ever been burned by a hot pan more than once in your life (I have, and I assume most other people have too; pretty much any adult has had this experience, and yet adults regularly get burned by hot pans), then that's not the most common reason for you. It's hard for something to be the most common reason when you can do it only once in your whole life and you have plenty of warning before it.
OTOH, I'm pretty sure a lot of people have tried to scam rationalists, because a lot of people try to scam everybody; look in your mailbox under "Spam" and you'll probably see a dozen scam attempts every day. Sure, they hadn't been scammed this particular way before, but nobody had been scammed this particular way before, so there's nothing special about rat circles. BTW, a lot of much more weathered people (journalists, politicians, Hollywood types, etc.) accepted SBF with open arms. It's not like everybody but rationalists rejected him and only those doofuses got caught. Nobody within the Well Respected People circles rejected him. He had investment from the best and most respected venture funds. Financial regulators planned to hold him up as an example of a "good crypto investor". He had a CFTC license. Those people have not only seen every scam there is, they are supposed to be the supreme authority of the land in determining what is a scam and what is not. They failed. Surely there were many reasons for that. Never having seen a scam before in their lives isn't one of them.
No it's not, it's basic statistics. You can only donate your heart once by dying, and guess what's the most common reason for heart donation: death.
False equivalence fallacy.
Yes they have. Financial fraud is not new.
This has nothing to do with my argument, you are attacking a straw man of it. Of course there are dumb journalists who fell for the scam, but the intelligent ones with solid epistemology likely did not, because they have epistemic humility.
That's a pretty cheap trick. Of course in a set of one, the only element is the maximum. But when you have multiple ways to do something, then it's different: it's hard for the way you can do only once to be the most common.
Fraud in general is not new. This one in particular is. You are just substituting the set we were talking about with a much larger set encompassing many more elements. It's like if I said "I have never heard this language; its sound is completely new to me!" and you replied "Lies! You have certainly heard people speak a language before!". A language, yes; this particular one, no. A fraud, yes; this particular one, no.
It remains to be proven that no intelligent people with solid epistemology in fact fell for it, and that only dumb ones did. And your criteria for "dumb" had better not include "falling for this particular scam".
Most people, when faced with something they had not imagined, complain about exactly that.
No it's not. Do I really have to explain it with statistics?
Say everyone experiences event X once in their lifetime, which is 80 years on average. That means in a population of 1000, around 12.5 people will experience it for that reason in any given year. Now say there's another way to experience X that also happens to everyone once in their lifetime, so it again accounts for 12.5 people per year. In that case each cause accounts for 50% of the people who experience X for the first time in a given year, so neither is the most common cause.
But what if the other way doesn't happen to 100% of people, because they learn their lesson, and it only happens to 50%? In that case it accounts for only 6.25 people, and the once-in-a-lifetime cause accounts for 67% of the people who experience X for the first time in any given year, therefore it's the most common cause.
Your failure of imagination is not an argument.
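The arithmetic in that hypothetical (population of 1000, 80-year average lifespan, both numbers illustrative) can be checked with a few lines:

```python
population = 1000
lifespan = 80  # average lifetime in years

# Cause A: everyone experiences X exactly once in their lifetime.
cause_a = population / lifespan            # 12.5 people per year

# Scenario 1: cause B also hits 100% of people once per lifetime.
cause_b = population / lifespan            # 12.5 people per year
share_a = cause_a / (cause_a + cause_b)
print(round(share_a * 100))                # 50 -> not the most common cause

# Scenario 2: cause B only hits 50% of people (they learn their lesson).
cause_b = 0.5 * population / lifespan      # 6.25 people per year
share_a = cause_a / (cause_a + cause_b)
print(round(share_a * 100))                # 67 -> now the most common cause
```

The point of the toy model: a once-in-a-lifetime cause can still dominate whenever the competing, repeatable cause is avoided by enough people.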
No. All fraud relies on people trusting without good reason, or more specifically: not distrusting enough. This is no exception.
Indeed, but it doesn't have to be proven, because the hallmark of a solid epistemology is not believing things without evidence, and in order to fall for the fraud you have to believe things without evidence. So if anyone with a solid epistemology fell for the fraud, they would almost by definition have to be a very rare exception.
That's a useless statement; it's like saying all deaths are caused by not living long enough and presenting it as some ultimate discovery in medicine. Of course fraud relies on trust, by definition, and of course in hindsight that trust was misplaced. But one absolutely cannot function in a society without trusting somebody with something. Even low-trust societies have some trust.
You go to a store and you trust the owner not to murder you, feed your body to the pigs, and take your money. You put your money in the bank and you trust the bank not to refuse to give it back, or the society to be on your side if they do. You get employed and you trust your employer to pay you and not to sell your data to identity thieves and ghost you, etc. (Sidenote: before you say "I actually never trust anybody, I grow my own food on top of a remote mountain and never speak to another human being unless I see them through the sights of my rifle, and only to procure ammunition for said rifle, and I demand it upfront": good for you, but that's not how human society works; please understand "you" as a collective pronoun here.)
We trust somebody many times a day if we live in a society, and in most of these cases the trust is reciprocated with cooperation. Sometimes, though, there are defectors. We recognize the pattern of defection and avoid trusting them: if somebody comes to you on the street and offers to sell you a genuine Rolex watch for $5, you rightfully mistrust them, because you have prior experience that says in this context trust is not warranted. However, absent such context, cases of misplaced trust will always exist, because it is not possible to perfectly calibrate one's trust without decent knowledge of the matter at hand.
Again, this is a banality which on closer consideration comes apart as useless. You can not evaluate the quality of evidence without experience in evaluating the particular kind of evidence, and not many have experience with evaluating evidence in this particular area.
No, you don't. You would just believe evidence that in hindsight proves wrong or of low quality. In most topics, you cannot evaluate evidence by yourself; nobody can. Most people rely on authority of some sort for that, so we're back to trust. The modern newspaper fashion of sanctifying "evidence" is a meaningless ritual: anything can be "evidence" or "not evidence", depending on how you evaluate it and relate it to the question at hand.
How do you know if some investment is good or a fraud? You check its description, its references, the opinions of other people, the data about similar investments, your knowledge of how the financial system works, your knowledge of who the particular person is. All of this relies on myriad sources which you cannot check empirically; it's trust all the way down.
There's no procedure that can guarantee the absence of any possibility of being deceived here, only methods to reduce that possibility to a level you find tolerable, and even those calculations rely on data you'd have to take on trust. Sometimes the whole house of cards fails, and you find yourself defrauded. It may be because you personally misjudged the evidence, because somebody you trusted made a mistake, or because somebody somewhere in the web of trust defected. There is no "solid epistemology" that guarantees against that. If you think there is, you are the one believing things without evidence.
This is all part of a ploy to get me viewing another mediocre substack.
Not today!
All that glitters is not gold.
I believe the appropriate retort would be: "Fare you well; your suit is cold."
...Well, not quite. This is the second article of yours I've read; both have been "haha, everyone's such a moron because they don't know this thing I know" gloating but getting basic facts wrong about the main subject of your post ("what is Z4?" and "who is and isn't a Rationalist?" respectively), and on top of that, both times you've gone in the comments section trying (quite fruitlessly if your reaction counts are any barometer) to out-rhetoric your critics.
I'll be blunt: if, the next time you make a thread linking your blog and I see it (i.e. not the one you've already made that I haven't read yet; a new one), the article is this poorly-researched and/or you're playing these kinds of games in the comments, I'll stop following your links. I'm not asking that you even write something good - that'd be unfair, since nobody can write good articles all the time and some people can't write good articles any of the time - just a) don't act all superior without getting the basic premise of your article right, b) either don't engage with the criticism, or do it in a constructive fashion rather than trying to score points.
Two converse error fallacies don't make one right.
Your conclusion is still unjustified.
You talk as if your intellect is superior to mine, but I seriously wonder if you even know what a converse error is, and if you can provide an example without looking it up.
If it's any consolation, I like your Substack and don't think it's too mediocre.
Thanks. It's a work in progress trying to question the fundamentals of belief, and the discussions it has generated show it's surprisingly difficult to get intelligent people to question their own cherished beliefs, which in the case of rationalists should, in theory, not be the case.
Only if your prior was that intelligent people should be easy to persuade to question their cherished beliefs. The reverse seems to be the case: it is dumb people who know they are dumb who change their beliefs easily. Smart people, by and large, do not change their beliefs, no matter the evidence.
http://culturalcognition.squarespace.com/browse-papers/motivated-numeracy-and-enlightened-self-government.html
Raw intelligence, g, or IQ is an impediment to wisdom. It allows us to bully others with complex arguments, Euler math, and factoids, which reinforces our intellectual arrogance. Being smart moves you further from Truth, not closer. It is a handicap to be struggled with, not a superpower.
Not really. People who claim to follow logic, reasoning, and scientific thinking tend to be intelligent, but not all intelligent people make that claim. These people (the scienticians) should in theory understand that they should conform their beliefs to the data, not the other way around. Science is supposed to be set up to avoid confirmation bias, which is why the falsification principle Karl Popper proposed was supposed to be so powerful.
But yeah, they disregard all that when their beliefs are sufficiently cherished.
That has been my experience.
Very interesting. But not at all surprising to me.
Weird, I started the article writing precisely about the difference between intelligence and wisdom, but it diverged so much that I changed the topic. I'll finish the article about wisdom later.
I think this is the case, but it shouldn't be. Smart people have the capacity to move closer to the truth, but only by using the right heuristic, and scientific thinking clearly isn't sufficient. Intellectual humility is necessary - accepting the possibility that they could be wrong - which many lack.
This is a very Motteish, meta-contrarian hipster thing to say. It seems absurd, contrary to reason and personal experience. Intelligent people are not right about everything, but I would find it hard to believe they are wrong more often than stupid people.
The upper classes are not entirely devoid of superstition and conspiracy theory, but talking to an average lower-class person for even a few minutes generally exposes truly wild reptilian-level beliefs in a senseless mishmash. You are romanticizing retards.
That's not what I or the research said. They are less likely to change their beliefs. Whether they are wrong more often than less intelligent people depends a great deal on the intellectual fashions of their social class and the subject of debate. If it's something boring, technical, and not culture-war, they're probably wrong less often than dumb people; if it's classifying the sexes of the human species in 2022, they're probably wrong more often.
"Rationalists" are just as inclined to use "rationalism" to reinforce the priors they came in with as opposed to challenging them. That's not the only reason, but I consider the whole "movement" silly.
I'm not very familiar with the movement, but after a few interactions with them I feel like they are even more inclined to reject evidence against their beliefs than the average person. I debated Scott Alexander on Reddit, and after I pointed out fallacies he committed, he straight-up rationalized that making fallacies wasn't a problem, and that me pointing them out was too basic and "uninteresting".
He said that by pointing out fallacies taught in Philosophy 101 I was not responding to his argument, but isn't the whole point of teaching fallacies in Philosophy 101 to avoid making them in arguments? A fallacious argument is invalid, so "this is a fallacy" is all the response needed.
I don't see how he could possibly think he is beyond the realm of fallacies.
I'd love to see a link to this if you wouldn't mind.
Sorry about the delay.
Here's the subthread: Rationalists are too easily duped.
That was a lot of words to say very little that the quokka tweet didn't say already.