You may be familiar with Curtis Yarvin's idea that Covid is science's Chernobyl. Just as Chernobyl was Communism's Chernobyl, and Covid was science's Chernobyl, the FTX disaster is rationalism's Chernobyl.
The people at FTX were the best of the best, Ivy League graduates from academic families, yet free-thinking enough to see through the most egregious of the Cathedral's lies. Market natives, most of them met on Wall Street. Much has been made of the SBF-Effective Altruism connection, but these people have no doubt read the sequences too. FTX was a glimmer of hope in a doomed world, a place where the nerds were in charge and had the funding to do what had to be done, social desirability bias be damned.
They blew everything.
It will be said that "they weren't really EA," and you can point to precepts of effective altruism they violated, but by that standard no one is really EA. Everyone violates some of the precepts some of the time. These people were EA/rationalist to the core. They might not have been part of the Berkeley polycules, but they sure tried to recreate them in Nassau. Here's Alameda Research CEO Caroline Ellison's Tumblr page, filled with rationalist shibboleths. She would have fit right in on The Motte.
That leaves the $10 billion question: How did this happen? Perhaps they were intellectual frauds just as they were financial frauds, adopting the language and opinions of the truly intelligent. That would be the personally flattering option. It leaves open the possibility that if only someone actually smart had been involved, the whole catastrophe would have been avoided. But what if they really were smart? What if they are millennial versions of Ted Kaczynski, taking the maximum expected-value path towards acquiring the capital to do a pivotal act? If humanity's chances of survival really are best measured in log odds, maybe the FTX team were the only ones with their eyes on the prize?
It's the worship of intellect in rationalist circles. They are very smart, in general; they do tend to be well-intentioned, in general; and they do think they are working on "how to win" where that means putting their smarts and good hearts to work for the benefit of all humanity. They forget that brains alone are not enough, and you do need a leavening of common sense or practical experience.
Whereas I and other old-fashioned types were pointing out all along that thinking you know how to do charity better than all the groups that have ever done it over the history of humanity is boundless conceit, and no, it doesn't matter if you use financial analysis and statistics and all the rest of the jargony tools. They dug a pit, and fell into it themselves.
I'm not happy this happened, but I think a little chastening about "all those other sets of individuals did it wrong and made dumb mistakes, but not us" is no harm in the long run - if they learn the correct lessons from this: don't believe your own hype, and even if Sam or Caroline or whoever were your good old pals from college, that doesn't mean a tap when it comes to running a billion-dollar business with no real experience.
I'm not sure how this really relates to SBF. Is it a tenet of EA that they are better at divining sources of ethical funds than normal charities? From what I can tell, the purpose of EA has always been that they would be better at spending funds effectively, not sourcing funds. That a big donor proved to be engaging in criminal actions doesn't really have anything to do with EA, does it?
I don't think this is a good example, considering it was skewered on LessWrong itself.
There's an obvious and inevitable problem of self-awareness if the way you approach a problem is "as a rationalist..." It would be like if I created a new school of thought in ethics called Obviously Morally Correctism. Then, as someone Obviously Morally Correct, I could fix all the world's problems, and you should trust me with your billions.
Couldn't have said it better myself. A big weakness of EA is that older folks are almost nowhere to be found in the movement, and despite the fact that retirees make up a huge share of the volunteers out there, EA tends to scoff at the idea of reaching out to them. I've heard various reasons, but it seems to boil down to "old people aren't cool and don't have interesting ideas."

I think EA's novel way of looking at things is valuable and will take them far, but yeah, the movement, especially at the higher levels of power, really needs to start courting older, more experienced folks.
Yeah. When I saw the photo of Caroline Ellison, my reaction was "bloody hell, is she still a teenager???" and even though I found out that she's twenty-eight, she doesn't look it. Even worse that she went straight from college to a quant fund for six months, then was put in charge of a parallel company run by her boyfriend. Take her on as a mid-level person, sure, but CEO of the whole shebang?
There badly needed to be a few fifty+ year old guys in charge, even if that means asking Evil Privileged White Cis Males for help. The other part of the problem is that they are well-connected, so the worst of both worlds: Daddy's influence and connections got them a whole heap of push up the ladder, but there were no corresponding "friends of Dad" in charge so the kids were given a fistful of dollars and let loose in the candy store.
Then when they ran into trouble, there was nobody senior enough to take the wheel, and we see what happened. This latest story of the alleged hack is just the icing on the cake; I don't know what rumour to believe next. (I could give credence to it being somebody down the chain at FTX who saw everything collapsing around them, knew that Sam wouldn't save their neck, and decided to help themselves to whatever was left in the piggybank, because at this stage it's every man for himself. But it could be an outsider altogether, which would just be the finishing touch on how bad this entire set-up was.)