You may be familiar with Curtis Yarvin's idea that Covid is science's Chernobyl. Just as Chernobyl was Communism's Chernobyl, and Covid was science's Chernobyl, the FTX disaster is rationalism's Chernobyl.
The people at FTX were the best of the best, Ivy League graduates from academic families, yet free-thinking enough to see through the most egregious of the Cathedral's lies. Market natives, most of them met on Wall Street. Much has been made of the SBF-Effective Altruism connection, but these people have no doubt read the sequences too. FTX was a glimmer of hope in a doomed world, a place where the nerds were in charge and had the funding to do what had to be done, social desirability bias be damned.
They blew everything.
It will be said that "they weren't really EA," and you can point to precepts of effective altruism they violated, but by that standard no one is really EA. Everyone violates some of the precepts some of the time. These people were EA/rationalist to the core. They might not have been part of the Berkeley polycules, but they sure tried to recreate them in Nassau. Here's Alameda Research CEO Caroline Ellison's Tumblr page, filled with rationalist shibboleths. She would have fit right in on The Motte.
That leaves the $10 billion question: How did this happen? Perhaps they were intellectual frauds just as they were financial frauds, adopting the language and opinions of those who are truly intelligent. That would be the personally flattering option. It leaves open the possibility that if only someone actually smart had been involved, the whole catastrophe would have been avoided. But what if they really were smart? What if they are millennial versions of Ted Kaczynski, taking the maximum expected-value path toward acquiring the capital to do a pivotal act? If humanity's chances of survival really are best measured in log odds, maybe the FTX team were the only ones with their eyes on the prize?
This post is silly. Effective Altruism != Rationalism. Rationalism doesn't say people need to place a huge emphasis on charity or helping others. At most, there's a peripheral link since some big names like Scott are involved in both.
There's also no direct link to Rationalism since crypto isn't intrinsically pro-Rationalism either. Perhaps this SBF guy considers himself a Rationalist (I don't know, this FTX blowup is the first I'm really hearing of him), but even if that was the case it doesn't impugn Rationalism any more than WW2 impugned facial hair since Hitler and Stalin happened to both have moustaches.
SBF is very much a rationalist.
Do you have a rundown on how he is one? Maybe a link or something? Again, I'm not familiar with this guy.
Was he justifying his actions in the name of Rationalism or something? People are acting like this is a huge deal that all Rationalists need to reflect on, but as far as I can tell this is just a guy who donated to EA and went down for fraud in his business. There's not much literature in the Rationalist movement that says it's a great idea to commit fraud, so I really don't see why people are making large mental updates on this.
They have 90%+ overlap, though. The line is very blurry.
Edit: it's pretty funny to me that one of the replies claims I'm wrong because EA is a subset of rats, and the other reply claims I'm wrong because rats are a subset of EA.
You can see how this would be confusing to an outsider, right?
It's a squares-and-rectangles thing. Most EA proponents are likely Rationalists, but Rationalism is much larger than EA. Again, nothing in Rationalism particularly predisposes people toward altruism; it's just that if you're conspicuously charitable, then EA gives you a framework for determining the effectiveness of your contributions.
Not all (American) Christians are pro-life, but nearly all (American) pro-lifers are Christian. And if you're pro-choice, this gives you a reason to dislike Christianity. So while we can agree that Rationalism != EA, if EA starts getting bad press because of the FTX implosion, it will start giving non-Rationalists a reason to distrust Rationalism.
Rationalism has a big overlap with EA, but EA does not have a big overlap with Rationalism. EA has grown significantly beyond its origins in rat-spheres.
EA is the subset of rationalism that takes utilitarianism Very Seriously.
There's much less overlap with TheMotte specifically, which seems pretty critical of utilitarianism. One of the main criticisms is this exact failure mode: because the numbers are all made up, a smart enough person can justify doing whatever it is they wanted to do anyway. "Why yes, of course I'm defrauding thousands of people, because I can better direct their money towards things which satisfy my utility function such as..."