You may be familiar with Curtis Yarvin's idea that Covid is science's Chernobyl. Just as Chernobyl was Communism's Chernobyl, and Covid was science's Chernobyl, the FTX disaster is rationalism's Chernobyl.
The people at FTX were the best of the best, Ivy League graduates from academic families, yet free-thinking enough to see through the most egregious of the Cathedral's lies. Market natives, most of them met on Wall Street. Much has been made of the SBF-Effective Altruism connection, but these people have no doubt read the sequences too. FTX was a glimmer of hope in a doomed world, a place where the nerds were in charge and had the funding to do what had to be done, social desirability bias be damned.
They blew everything.
It will be said that "they weren't really EA," and you can point to precepts of effective altruism they violated, but by that standard no one is really EA. Everyone violates some of the precepts some of the time. These people were EA/rationalist to the core. They might not have been part of the Berkeley polycules, but they sure tried to recreate them in Nassau. Here's Alameda Research CEO Caroline Ellison's Tumblr page, filled with rationalist shibboleths. She would have fit right in on The Motte.
That leaves the $10 billion question: How did this happen? Perhaps they were intellectual frauds just as they were financial frauds, adopting the language and opinions of those who are truly intelligent. That would be the personally flattering option. It leaves open the possibility that if only someone actually smart were involved the whole catastrophe would have been avoided. But what if they really were smart? What if they are millennial versions of Ted Kaczynski, taking the maximum expected-value path towards acquiring the capital to do a pivotal act? If humanity's chances of survival really are best measured in log odds, maybe the FTX team are the only ones with their eyes on the prize?
I agree. You could say that Chornobyl was just Russian (actually Ukrainian, but during Soviet times the difference was not important) negligence and that it shouldn't have an impact on our nuclear energy policy. After all, coal burning causes even more radioactive pollution, and people die in industrial accidents all around the world. Chornobyl was clearly an outlier that shouldn't prevent us from continuing to develop nuclear power.
But that's not what happened. Some countries decided to completely stop using nuclear energy despite serious problems with reaching sustainability goals.
On the other hand, crypto is not nuclear power. It has very little usefulness, and most of it is hype and speculation. Maybe that's why people will not reject it even when it becomes clear that most of it has no benefit for society.
Was Chernobyl really the reason? The process may have started thanks to that (plus Three Mile Island and all the other fearmongering from the 70's and 80's), but I was given to understand that civilian de-nuclearization really kicked into high gear after Fukushima.
Fukushima was the major prompt behind de-nuclearization, but most of the missing nuclear capacity today isn't from reactors that were shut down, it's from reactors that should have been built but weren't.
For those, Chernobyl wasn't the reason, Three Mile Island was, but Chernobyl was certainly the biggest nail in the coffin, in the end. What's the pro-nuclear rebuttal, at that point? "The Soviet nuclear industry could make disastrous mistakes, and the American nuclear industry could make major mistakes, but the American nuclear industry wouldn't make disastrous mistakes!" That doesn't sound believable to me, and I think it's true!
I think the stronger argument would be that Three Mile Island was not actually a major mistake. It was an embarrassing one, but ultimately there have been worse, deadlier, and more expensive pedestrian bridge failures, and it's not like we've stopped building pedestrian bridges. Or take the countless lesser industrial chemical processing disasters: you can make a drinking game out of USCSB videos. And those have Bhopal and a half-dozen kilodeath explosions to point toward!
In any other industry, we'd have smacked a few dozen wrists, written up a report that only a handful of nerds would have read, and hopefully implemented some better hardware.
How about "the mistakes weren't actually that disastrous and a Chernobyl every decade would be worth it if it meant we could get rid of coal"? (Not sure about the decade thing, but my understanding is that consequences of Chernobyl were seriously exaggerated.)
The messaging about nuclear seems to always implicitly admit that nuclear disasters are unimaginably bad but it's okay because we've made sure that the new reactor designs are so super safe that it could not happen again. That's bullshit and people know it. Could they instead be persuaded of a more grounded assessment of the risks?
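To put rough numbers on that more grounded assessment, here's a back-of-envelope sketch in Python. The mortality-per-TWh figures and the Chernobyl death projection are commonly cited estimates (roughly the Our World in Data and WHO numbers), not anything from this thread, and they vary a lot between sources:

```python
# Back-of-envelope comparison: coal as usual vs. nuclear with a bonus
# Chernobyl every decade. All figures are rough, commonly cited estimates
# and should be treated as order-of-magnitude only.

coal_deaths_per_twh = 24.6         # air pollution plus accidents
nuclear_deaths_per_twh = 0.03      # already includes Chernobyl and Fukushima
coal_generation_twh = 10_000       # roughly what coal supplies globally per year
chernobyl_eventual_deaths = 4_000  # WHO projection; other estimates run higher

decade = 10
coal_toll = coal_deaths_per_twh * coal_generation_twh * decade
nuclear_toll = (nuclear_deaths_per_twh * coal_generation_twh * decade
                + chernobyl_eventual_deaths)

print(f"coal, per decade:      {coal_toll:>12,.0f} deaths")
print(f"nuclear + 1 Chernobyl: {nuclear_toll:>12,.0f} deaths")
# ~2,460,000 vs ~7,000: even granting a full Chernobyl every ten years,
# replacing coal comes out orders of magnitude ahead.
```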
So, basically, "torture vs. dust specks," just magnified?
Chornobyl definitely had a strong impact on the anti-nuclear movement and on irrational fear. A lot of problems in the post-Soviet countries were blamed on Chornobyl.
Ah, so it's more a West-vs.-East thing? I admit I wouldn't have thought that Eastern Europeans had anything like the anti-nuclear movement here in the West, but I could also see how they might.
Lithuania closed its nuclear power station and didn't build a new one. Russia, Belarus, and Ukraine continued to build new ones only because their governments were not accountable to the people and could override all resistance. I probably sound like Anatoly Karlin now, but sometimes people make bad decisions.
I have mixed feelings about this. On one hand, some people in rationalist circles made very inflated claims about how their methods of rationality give them almost superpowers and make them way better than regular humans at things that regular humans are bad at. So for a regular human it's a bit reassuring that those supposed gods have exactly the same failure modes as mere mortals; on the other hand, it was nice to imagine, for a short time, that there were gods walking among us.
That said, I don't see how it really should hurt rationalism that much. I mean, yeah, a bunch of rationalists got taken in by a scam. Smart people getting taken in by a scam is not that rare. Isaac Newton - an intellectual titan by any measure - lost lots of money on the South Sea Company. There have been numerous other cases. It's not good, it's OK to feel bad about it, in fact it's probably necessary to feel bad about it, but it's hardly some kind of huge global catastrophe for rationalism or EA, IMHO. Stand up, dust off, analyze and admit your errors, try to do better next time.
It's not that they were taken in by a scam. They were the scam.
Nah. The scam is unrelated. Yeah, they overtly endorsed "double or nothing coin flips and high leverage", but that wasn't the scam. The scam was what they did when things went bad. If SBF had just let Alameda collapse in August (and massively damage FTX with it), that would have been a vindication of their philosophy and a success by their standards and those of most EAs.
The only thing that went wrong was that FTX, seeing Alameda on the verge of bankruptcy, gave them $8 billion of customer money rather than let it happen. That was a classic panic response of someone running a bad hedge fund. Everything before that was fine.
I'd argue that FTX's financials were broken before that point (even assuming no other wacky transfers show up, which I wouldn't be surprised by), but in a more boring way: extremely high (unnecessary!) expenditures compared to a business model that would very likely never support them. This is boring, compared to the fraud, but it's also a very high-visibility problem.
Maybe FTX could have pulled a rabbit out of a hat, in some alternate universe where Alameda had broken even or only had small losses. It's possible they could have scaled up another order of magnitude, and/or kept some sort of premium where people would be willing to pay five times the rates other exchanges offered. Even if they hadn't, the collapse almost certainly wouldn't have been exactly as abrupt or as entertaining. But they at least had a present and clear risk to nearly any pause or drop in crypto, or even decrease in growth, and it's important to notice that.
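To make that risk concrete, here's a toy runway calculation; every number below is hypothetical, picked only to show how fee-driven revenue interacts with heavy fixed spending, not FTX's actual books:

```python
# Toy runway model for a fee-funded exchange with heavy fixed spending.
# All parameters are hypothetical stand-ins, chosen for illustration.

def runway_months(cash_m, spend_m, volume_m, fee_rate):
    """Months until cash runs out at flat volume; inf if fees cover spend."""
    burn = spend_m - volume_m * fee_rate
    return float("inf") if burn <= 0 else cash_m / burn

CASH = 1_000       # $M on hand
SPEND = 120        # $M/month on sponsorships, ads, headcount
FEE_RATE = 0.0007  # ~7 bps blended take rate

for volume in (150_000, 75_000, 30_000):  # $M traded per month
    months = runway_months(CASH, SPEND, volume, FEE_RATE)
    print(f"monthly volume ${volume:>7,}M -> runway {months:5.1f} months")
# 66.7 -> 14.8 -> 10.1: a 5x drop in trading volume shrinks a five-year
# runway to under a year, which is why any pause in crypto was a threat.
```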
I don't think so. The EA work of FTX/SBF was rather incidental to the main scam activities, and I don't see much input from the EA/rat circles into the scam. They got some outputs from the scam, and maybe lent it a small measure of credibility, but I don't see any way they were an inherent part of the scamming or had any substantial input into the decisions FTX investors and clients took.
You would be wrong. Here is Caroline explaining her trading philosophy, literally in reply to Scott Alexander.
The money quote: "Those people are lame and not EAs; this blog endorses double or nothing coin flips and high leverage". That's the smoking gun.
Maybe you misunderstood me. That person is the CEO of FTX-affiliated hedge fund Alameda Research. She's the one making the trades (with Sam's approval).
OK yes, I didn't realize whose messages those were, my mistake. It's more of a smoking gun than I realized, I'll probably need to research it deeper.
It's not. Double or nothing with leverage is just normal finance with EA goals. The scam is what they did when they hit 'nothing' - rather than take the L, they siphoned off customer money for one last big bet. (And it might have worked, if they got another six months.)
I don't agree this is "normal finance". It's a high-risk game that is sometimes played in finance, but they committed fraud in the way they played this game and the way they presented it. My question was about whether the EA/rat angle played any role in it. Initially I thought the connection was purely coincidental; now, reading more about it, I am not so sure. It looks like there are aspects both of a con aimed at the EA/rat world and of using EA/rat premises as justification for doing evil things, or at least for tearing down all the fencing that usually keeps one from doing evil things - which eventually, inevitably, led to doing evil things.
In general, the whole thing with EA seems similar to many other things that appear ridiculous about rationalism. You take a simple idea that is eminently sensible when you put it in a few words. Charitable giving is often inefficient - what if we start evaluating charitable giving by how much bang for the buck you get? Very sensible!
Then you put it in a crowd of people with a few well-known features - love of Big Ideas, addiction to the novelty of new ideas or revisionist takes on existing ones, an almost comical belief in the power of Reason compared to tradition/law/taught ethics/societal approval/etc. (it's right there in the name of the crowd), and a tendency for constant iteration - and soon the original idea starts mutating into new forms, so that before long you're giving all your money to the Computer God, or becoming an utter caricature of utilitarianism straight from philosophical debates that have run for decades and centuries, or banking on gee-whiz businesses as long as they're aligned with the cause, or just opening yourself up to all manner of grifters and fast talkers in general.
The same applies to polyamory, or nootropics, or all manner of whack-ass political ideologies beloved by rationalists - not that the simple idea behind them is necessarily good to begin with, but even then it just all seems to get worse and worse, and doing so quite fast.
What one seems to need is stopgaps, intellectual speed bumps - but even then, what would these be, who would set them, and how would you make sure the movement doesn't just barge through them with the power of Reason, like it does with everything else?
Rationality at least has an ethos you can boil it down to, to determine whether you're doing it 'correctly.'
"Rationality is About Winning."
If you are trying to practice rationality and find yourself losing, or constantly being defected against, or otherwise in a disadvantageous Nash equilibrium, you know you're doing something wrong and need to step back and reassess. Whether most rationalists do this, I dunno.
But EA seems to have a much squishier set of metrics by which it measures success, and I think the strongest critiques against it are mostly around the lack of serious tracking of outcomes in a way that is truly reliable and legible.
Which rationalists can be comfortable with since merely increasing the probability that people's lives are improved is just as good as 'actually' improving them.
But in practice, I would guess that fraud and simple inability to measure outcomes accurately interferes with the amount of good they're truly doing in the world, and they seem to lack what I would call 'defensive cynicism' about the world and the people they end up trusting.
I've written enough on it already. It's arguably (hopefully) EA's Chernobyl, but Chernobyl did not doom the USSR, nor even civilian nuclear power. And EA will make do without SBF. IARPA/RAND/natsec ties plus remaining SV allies are enough.
I find EA sympathizers' fretful objections funny, though. He who pays for the girl dances with her; the identity of sponsors is of crucial importance, much more so in this case than with Epstein, and it is, IMO, unthinkable hubris to assert the pose of philosophers in the ivory tower who were just gracefully accepting SBF's donations. He was an EA, and EAs will have to own the lion's share of the reputation fallout.
So this "Effective Altruist" was funneling money to the Democratic Party and has a multi-million dollar bet against Trump 2024.
His promo video talks about "climate change" and "pandemic preparedness", literally the two most over-promoted fashionable causes, where almost all funding has been a waste or, worse, has exacerbated problems for the people who most need help around the world.
Sorry, SBF is a joke using the EA label as a shield from scrutiny and to bolster his brand.
But we have to admit SBF and Caroline seem to fit the rationalist community's stereotypes. Honestly, I see a bit of myself in them: eschewing materialism and living an apparently humble lifestyle, the chill math-nerd vibes, the passion for technology and making the world better, the girl looking ten years younger than she is - she could be getting fisetin from the same source as me, for all I know.
To me this is one of the best things about this little ideological pocket of the internet. We can approach the world with the same sort of curiosity and skepticism, read a lot of the same material from SSC and elsewhere, and yet end up in wildly different places in terms of our beliefs and values. And then come together to try to understand each other.
That's one of the key things that made the Motte (and other rat-adjacent communities) valuable and interesting. We are not all the same. Although the Motte's participants seem to skew rightward, there are a lot of obedient conformist woke types that follow a lot of the same content.
So now we have a situation where one of the most successful people to be associated with this sphere turns out to have ruined a bunch of people's lives and defrauded major investors of billions. If rationalism were some kind of united community, this would be a Chernobyl. But we're not. Some of us believe that the elements of rationalism that think "we know so much better than everyone else" needed to be taken down a notch, and this event only helps.
It's high time we moved to post-rat anyway. The idea of rationalizing your way through the world's most complex problems was a dead end. EA might have made sense in a stable world with clear optimization pathways. What we are confronting now is a profound social and cultural shift where anything with positive connotations seems to get usurped for evil purposes under the guise of doing good. The collective actions of the world's elite display an almost schizophrenic form of control: e.g., malaria nets to save lives while warning about overpopulation.
I couldn't care less about how this affects the rationalist community. What we're really trying to do here is avoid the bullshit on the rest of the internet, where you submit a post with relevant information trying to add nuance to a debate and it turns into just trying to figure out which "side" you're supporting. We can continue that regardless of what happens, and perhaps over time more people will see value in it.
Literally no one is working on pandemic preparedness seriously. None of his funding was for COVID, it was for biosecurity work which could stop the next pandemic. We haven't even gotten gain of function research to stop receiving federal funding; this was and is desperately needed.
While lots of people talk a ~~good~~ kinda-mediocre game about climate change, almost none of them are working on actual fixes. Money going toward carbon capture would be fantastic. (And TBH I think if this hadn't flamed out, some of FTX or Alameda's people would have eventually taken the money and eaten the bullet of dumping iron filings in international waters, previously the highest-profile potential EA Crime.)

Observe the actual post-rats. They're just non-rationalists with a couple cultural signifiers; no more effective, and in general much less, than anyone else in the community or the adjacency. They have absolutely reverted to the condemnable behavior you're mentioning. Post-rationalism is a crock of shit and has been since the minute it started.
Okay, I'm curious, what's this supposed to do? Stimulate algae growth?
Yeah, basically. Dust erosion from continents feeds the ocean ecosystem, and there are certain bottleneck elements, the same way there usually are with gardening, where a little more of one thing boosts growth tremendously. Iron is a big one. You can get a whole bunch more calcium carbonate dropping to the bottom of the ocean that way.
Back when I was in school there was a lot going on with studying continental formation/drift and global wind patterns having feedback loops for temperature on multi-million year timescales, but I don't know what came of it.
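For a sense of the leverage involved, here's a rough ceiling calculation using the ~100,000:1 carbon-to-iron atom ratio often quoted for Martin's "iron hypothesis"; that ratio and the assumption of complete uptake are the big hedges:

```python
# Ceiling estimate for iron fertilization, using the ~1e5:1 carbon-to-iron
# atom ratio often quoted for the "iron hypothesis". Real efficiency is far
# lower (much of a bloom is remineralized before it sinks), so treat this
# as an upper bound, not a forecast.

FE_G_PER_MOL = 55.8
C_G_PER_MOL = 12.0
C_ATOMS_PER_FE = 1e5  # order-of-magnitude uptake ratio in iron-limited water

def max_carbon_tonnes(iron_tonnes):
    mol_fe = iron_tonnes * 1e6 / FE_G_PER_MOL
    return mol_fe * C_ATOMS_PER_FE * C_G_PER_MOL / 1e6

print(f"1 t of iron -> at most {max_carbon_tonnes(1):,.0f} t of carbon fixed")
# ~21,500 t of carbon per tonne of iron: that leverage is what makes a
# single trace element a bottleneck, gardening-style.
```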
It's more of a hit to EA than to rationalism, but it's still a hit to rationalism, because a fair number of rats glommed onto EA's glow-up, and it's a somewhat reciprocal relationship (the EAs are fine with rats lending a hand as long as they don't generate bad press, in which case: disavow - that's the +EV move, after all).
I can only offer advice as I will not profess to be a devoted adherent to rationalism. I enjoy the commitment to spirited debate their communities foster and Scott in particular is (was?) a good welcome mat for the community.
Bankman-Fried and the EA crew are the reverse. This is preaching to the choir here, but the Carrick Flynn fundraising campaign exposed the movement as a pretty naked political project of more or less standard Democrats with a slightly refocused issue set. The flood of donations into a low-importance primary - including donations made late enough in the campaign that they likely had minimal impact - demonstrated none of their professed values and was instead a rather desperate grasp for power.
Bankman-Fried embodied this more explicitly, as an institutional insider following in the steps of "EA" outsiders trying to wedge their foot in the door. He leveraged his connections to create this fraud, and would have continued to get away with it as long as his laundered reputation for EA work kept him Cathedral-approved to the institutional actors he partnered with.
Apply your normal caveats to anecdata here, but as a once decently involved crypto user/trader, precisely zero people I am in touch with kept money on FTX. Why? To people in the space, myself included, the guy radiated inauthenticity. This is a world full of scammers, hackers, and overall shady individuals. Anyone selling you that they're anything but nakedly self-interested in making a buck is probably lying to you and untrustworthy. That's where mantras like "the code is the law" come in. This is poison to retail investors, but the idea is that you're buying exactly what you can see, whether it's a smart contract, an app, or a wallet. The libertarians in this space are not Scott's "grays", who we now know were just a shallower shade of blue wishing the rest would slow down the swim leftward into the deep blue of the Pacific (note those guys were all from California) - they're hardline libertarian econ types with a side of gun nuttery and no snake-stepping.
Bankman-Fried's ultimate goal was to buy enough government and institutional power to legislate a favorable environment for himself and a poor one for his main competition: an exchange run by a Chinese-Canadian who has had to dodge his own government a couple of times, which the hardline libertarians will still tell you not to trust (their hard line is Bitcoin only, with your own keys), but where I actually know people with money on it, because its motives are less vague.
Now, you're going to ask me who actually lost the $10 billion here, if I'm Pauline Kael-ing away all these people and the Bitcoin libertarians were never on board to begin with? Well, some long-time crypto whales did trust them, as did a lot of those institutional investors (VC types, or, as shorthand, basically anyone who owned Solana) and the celebrities they got in bed with, it seems, and probably a wave of retail that came after me. All of them are fucked, which means they're gone from the space for quite some time if not permanently. Does Sam care? Nah, you can read his takes on betting it all for a bigger win down the thread. And he could have gotten away with it, too, if his grift had kept going just long enough to legislate away his meddling competition before they could catch him with his pants down!
Is this an Effective Altruist or rationalist mindset? No, it's just what every other corporate oligopoly/monopoly in the United States tries to do. Maybe he could justify it with some ends-justify-the-means consequentialism (that once he dominated the industry he could gather all these resources and better utilize them), but that just lays bare my issue with EA in general. If all you want is power to effect political action to reshape the world to best fit your interest, you don't have a new or unique movement. Lots of people now and throughout history have wanted that!
A company can be composed of 99% geniuses and 1% idiots at the top and still fail.
All it takes is one idiot.
I was at Nokia, in its skunk works, with the most elite team of open-source programmers and hardware engineers I've ever seen, at the height of Nokia's success. Our software was way better than Android and had features many phones didn't get for more than a decade, and some they still don't have. The future was bright.
It didn't matter: one person at the top ruined everything.
I'd have loved to see some of the successors to the 6600.
We managed to finish one product against all odds, even with many people jumping ship right before the launch: Nokia N9.
I was really curious about that one, but they seemed to be impossible to find when I was shopping for a smart phone at that time.
Well, thank you for making this post so I didn't have to.
I would hope that this doesn't blow back on rationalism, but Twitter people are making a big deal about his connections to EA and Rationalism, however tenuous they may be.
This would be a wakeup call for me if I thought all aspiring rationalists were operating at a level of rationalism too high to dig themselves into a hole like this, but I did not in fact have such a high opinion of this community (which I do like very much).
This post is silly. Effective Altruism != Rationalism. Rationalism doesn't say people need to place a huge emphasis on charity or helping others. At most, there's a peripheral link since some big names like Scott are involved in both.
There's also no direct link to Rationalism since crypto isn't intrinsically pro-Rationalism either. Perhaps this SBF guy considers himself a Rationalist (I don't know, this FTX blowup is the first I'm really hearing of him), but even if that was the case it doesn't impugn Rationalism any more than WW2 impugned facial hair since Hitler and Stalin happened to both have moustaches.
SBF is very much a rationalist.
Do you have a rundown on how he is one? Maybe a link or something? Again, I'm not familiar with this guy.
Was he justifying his actions in the name of Rationalism or something? People are acting like this is a huge deal that all Rationalists need to reflect on, but as far as I can tell this is just a guy who donated to EA who went down for fraud in his business. There's not much literature in the Rationalist movement that says it's a great idea to commit fraud, so I really don't see why people are doing large mental updates on this.
they have 90%+ overlap though. The line is very blurry.
Edit: it's pretty funny to me that one of the replies claims I'm wrong because EA is a subset of rats, and the other reply claims I'm wrong because rats are a subset of EA.
You can see how this would be confusing to an outsider, right?
It's a squares and rectangles thing. Most EA proponents are likely Rationalists, but Rationalism is much larger than EA. Again, nothing in Rationalism particularly predisposes people towards altruism, it's just that if you're conspicuously charitable then EA gives you a framework for determining the effectiveness of your contributions.
Not all (American) Christians are pro-life, but nearly all (American) pro-lifers are Christian. And if you're pro-choice, this gives you a reason to dislike Christianity. So while we can agree that Rationalism != EA, if EA starts getting bad press because of the FTX implosion, it will start giving non-Rationalists a reason to distrust Rationalism.
Rationalism has a big overlap with EA, but EA does not have a big overlap with rationalism. EA has grown significantly beyond its origins in the rat-spheres.
EA is the subset of rationalism that takes utilitarianism Very Seriously.
There's much less overlap with TheMotte specifically, which seems pretty critical of utilitarianism. One of the main criticisms is this exact failure mode: because the numbers are all made up, a smart enough person can justify doing whatever it is they wanted to do anyway. "Why yes, of course I'm defrauding thousands of people, because I can better direct their money towards things which satisfy my utility function such as..."
I think the most likely explanation is that at some point in the past they were a legitimate business that ran out of legitimate funds, probably due to their known penchant for highly leveraged bets. Then they deluded themselves into believing that if they dipped into customer accounts they could gamble their way out, return the customers' money, and have nobody be the wiser. Cut forward some undefined span of time, and the hole gradually grew to $8 billion and the whole thing collapsed.
I mostly say this because most people aren't sociopaths, and this seems like the most likely route by which this could have happened if Bankman is not a sociopath. If he is a sociopath and planned the elaborate fraud from the start, I guess never mind. Feels less likely, though.
Anyway, I don't think we're looking at anything more or less than a polycule of stim-abusing rationalists with a gambling problem, good PR, and access to several billion dollars with which to gamble.
I think that the main lesson here is that you can't trust people just because they use lots of ingroup shibboleths and donate lots of money to charity, even though (to be honest) that would be kinda my first impulse.
There's a LOT more to it than that. His extensive anti-DeFi lobbying and his donation history (donating money he didn't have) point to a much deeper rabbit hole than a Bernie Madoff situation. Between this and the possibly related murder of Nikolai Mushegian, it's a very strange time to be in the DeFi sphere. This is like our Epstein.
I know this sounds like a "just trust me bro" post but there isn't a lot written about it that's up to date that I can reference and it's unlikely the media will ever dig deeper.
I don't think I can have an educated opinion on whether the opposition to DeFi was (a) principled advocacy for something he genuinely believed, (b) basic self-interested moves typical of big players in most industries, or (c) nefarious shit that should tank his credibility among honest folk. My money ordinarily would be on (b), but that's just priors.
Agree with all of this, seems pretty clear (as much as anything is clear at this point) that Alameda Research was deep in the hole with bad trades and SBF decided to try to help them gamble their way out of the hole with FTX customer money.
I do think there's a genuine EA angle here, though. SBF did not believe in the declining marginal utility of money, because he was going to use it to do good in the world. Saving ten lives in the developing world is ten times better than saving one life, in a way that buying ten fancy cars is not ten times better than buying one fancy car. SBF was willing to take this to the extreme, even biting the bullet on the St. Petersburg paradox in his interview with Tyler Cowen:
So yeah -- he sees literally no moral limits to this style of gambling in our finite universe.
This is both a worldview that (a) is distinctly consistent with EA, and (b) encourages you to double or nothing (including, as in the hypothetical, with other people's stuff) until you bust. And now he took one too many double-or-nothing bets, and ended up with nothing.
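You can see both halves of that argument in a small simulation; the win probability and payoff below are made-up stand-ins for "a bet with positive expected value", not FTX's actual numbers:

```python
import random

# Repeated double-or-nothing with a small edge: each round, stake
# everything to double it with probability 0.51, else lose it all.
P_WIN, ROUNDS, PATHS = 0.51, 10, 100_000

total, busts = 0.0, 0
for _ in range(PATHS):
    wealth = 1.0
    for _ in range(ROUNDS):
        if random.random() < P_WIN:
            wealth *= 2.0
        else:
            wealth = 0.0
            busts += 1
            break
    total += wealth

print(f"mean final wealth:    {total / PATHS:.2f}")  # ~1.02**10 = 1.22
print(f"paths ending at zero: {busts / PATHS:.1%}")  # ~99.9%
# Linear utility looks at the mean and says "keep betting", but the mean
# is carried by a sliver of lucky paths. Log utility scores the same bet
# 0.51*log(2) + 0.49*log(0) = -infinity and refuses to stake everything.
```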
I think the honest response to this disaster is to say "yeah, I gambled with customers' money, and it was the right thing to do because I had a better than even chance of pulling it off, and I would have used that money to do good in the world so there's no declining value to each dollar. Sure, I gambled with other people's money, but wouldn't you dive in the pond to save the drowning child even if your expensive suit were borrowed from an acquaintance? Well that's what I did, with a lot of people's suits, and it was the right thing to do."
Of course, utilitarians don't believe in honesty -- it's just one more principle to be fed into the fire for instrumental advantage in manufacturing ~~paperclips~~ malaria nets.

Now, who knows -- maybe he would have committed the same kind of fraud even if he had never heard of EA and were just a typical nerdy quant. But, when your whole ideology demands double-or-nothing bets with other people's interests, and when you say in interviews that you would follow your ideology and make double-or-nothing bets with other people's interests, and then you do make double-or-nothing bets with other people's interests, and one of those bets finally goes wrong... yeah, I think one can be forgiven for blaming your ideology.
If you ignore the caveat he gave up front, which is that this reasoning only applies if the universes are non-interacting?
That caveat is only to establish that you actually double your utility by duplicating the earth -- and not just duplicating it onto a planet that we would have settled anyway. He is explicit about that. The point is that in raw utility calculations short of the infinite, he is willing to gamble with everyone's utility on any bet with positive expected value.
Nah, that was a trivial objection on his part. Most of the value of the earth comes from the fact that we might colonize the universe, so he wanted to make sure that the "double" in "double or nothing" truly meant doubling the value of the earth; if the second earth appeared close by to us, it wouldn't really be doubling the value, since there's still only one universe to colonize. But if one is assured that the value can indeed double, then SBF was completely on board with betting it all.
There's a bunch of argument about what utilitarianism requires, or what deontology requires, and it seems sort of obvious to me that nobody is actually a utilitarian (as evidenced by people not immediately voluntarily equalizing their wealth), or actually a deontologist (as evidenced by our willingness to do shit like nonconsensually throwing people in prison for the greater good of not being in a crime-ridden hellhole.) I mean, really any specific philosophical school of thought will, in the appropriate thought experiment, result in you torturing thousands of puppies or letting the universe be vaporized or whatever. I don't think this says anything particularly deep about those specific philosophies aside from that it's apparently impossible to explicitly codify human moral intuitions but people really really want to anyway.
That aside, in real life, self-described EAs universally seem to advocate for honesty, based on the pretty obvious point that the ability of actors to trust one another is key to getting almost anything done, ever, and is what stops society from devolving into a Hobbesian war of all against all. And yeah, if you're a good enough liar that nobody finds out you're dishonest, then I guess you don't damage that; but really, if you think for like two seconds, nobody tells material lies thinking they're going to get caught, and the obvious way of not being known for dishonesty long-term is to be honest.
As for the St. Petersburg paradox thing, yeah, that's a weird viewpoint and one that seems pretty clearly false (since marginal utility per dollar declines way more slowly on a global/altruistic scale than on an individual/selfish one, but it still does decline, and the billions-of-dollars scale seems about where it would start being noticeable). But I'm not sure that's really an EA thing so much as a personal idiosyncrasy.
There's a problem with that: a moral system that requires you to lie about certain object-level issues also requires you to lie about all related meta-, meta-meta- and so on levels. So, for example, if you're intending to defraud someone for the greater good, not only should you not tell them that, but if they ask "if you were in fact intending to defraud me, would you tell me?" you should lie, and if they ask "doesn't your moral theory require you to defraud me in this situation?" you should lie, and if they ask "does your moral theory sometimes require lying, and if so, when exactly?" you should lie.
So when you see people espousing a moral theory that pretty straightforwardly says it's OK to lie if you're reasonably sure you won't get caught, who when questioned happily confirm that yeah, it's edgy like that, but who then seem to realize something and walk it back without providing any principled explanation - as Caplan claims Singer did - then the obvious and most reasonable explanation is that they are now lying on the meta-level.
And then there's Yudkowsky, who actually understood the implications early enough (at least by the point SI rebranded as MIRI and scrubbed most of the stuff about their goal being to create the AI first) but can't help leaking stuff on the meta-meta-level, talking about this Bayesian conspiracy where, like, if you understand things properly, you must understand not only what's at stake but also that you shouldn't talk about it. See Roko's Basilisk for a particularly clear-cut example of this sort of fibbing.
That's like saying that Christians don't actually believe that sinning is bad because even Christians occasionally sin. You can genuinely believe in moral obligations even if the obligations are so steep that (almost) no one fully discharges them.
Why on earth would a deontologist object to throwing someone in prison if they're guilty of the crime and were convicted in a fair trial?
Well it sure seems like Caplan has the receipts on Singer believing that it's okay to lie for the greater good, as a consequence of his utilitarianism.
Sure, except for when it really matters, and you're really confident that you won't get caught.
Fair enough! I suppose it depends on whether you view the morally relevant action as "imprisoning someone against their will" (bad) vs "enforcing the law" (good? Depending on whether you view the law itself as a fundamentally consequentialist instrument).
I think the relevant distinction here is that not only do I not give away all my money, I also don't think anyone else has an obligation to give away all their money. I do not acknowledge this as an action I or anyone else is obligated to perform, and I believe this is shared by most everyone who's not Peter Singer. (Also, taking Peter Singer as the typical utilitarian seems like a poor decision; I have no particular desire to defend his utterances, and nor do most people.)
On reflection, I think that actually everyone makes moral decisions based on a system where every action has some (possibly negative) number of Deontology Points and some number (possibly negative) of Consequentialist Points and we weight those in some way and tally them up and if the outcome is positive we do the action.
That's why I not only would myself, but would also endorse others, stealing loaves of bread to feed my starving family. Stealing the bread? A little bad, deontology-wise. Family starving? Mega-bad, utility-wise. (You could try to rescue pure-deontology by saying that the morally-relevant action being performed is "letting your family starve" not "stealing a loaf of bread" but I would suggest that this just makes your deontology utilitarianism with extra steps.)
I can't think of any examples off the top of my head where the opposite tradeoff realistically occurs, negative utility points in exchange for positive deontology points.
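That points model is concrete enough to sketch; all the weights and scores below are invented purely for illustration:

```python
# Toy version of the "points" model above: every action gets a deontology
# score and a consequences score, we weight and sum them, and act iff the
# total is positive. Weights and scores are made up for illustration.

W_DEON, W_CONSEQ = 1.0, 1.0

def should_do(action, deon_points, conseq_points):
    total = W_DEON * deon_points + W_CONSEQ * conseq_points
    print(f"{action:<28} {total:+.1f} -> {'do it' if total > 0 else 'skip'}")
    return total > 0

should_do("steal bread, feed family", -1.0, +5.0)  # a little bad, mega-good
should_do("defraud customers for EA", -8.0, +3.0)  # the weighting most people
                                                   # actually use rejects this
```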
I mean... yeah? The lying-to-an-axe-murderer thought experiment is a staple for a reason.
Fair in general, but he is a central figure in EA specifically, and arguably its founder.
How about stealing $1000 of client funds to save a life in a third world country? If they'd be justified to do it themselves, and indeed you'd advocate for them to do it, then why shouldn't you be praised for doing it for them?
The fatal flaw of EA, IMO, is extrapolating from (a) the moral necessity to save a drowning child at the expense of your suit to (b) the moral necessity to buy mosquito nets at equivalent cost to save people in the third world. That syllogism can justify all manner of depravity, including SBF's.
Yeah, fair, I'll cop to him being the founder (or at least popularizer) of EA. Though I declaim any obligation to defend weird shit he says.
I think one thing that I dislike about the discourse around this is it kinda feels mostly like vibes-- "how much should EA lose status from the FTX implosion"-- with remarkably little in the way of concrete policy changes recommended even from detractors (possible exception: EA orgs sending money they received from FTX to the bankruptcy courts for allocation to victims, which, fair enough.)
On a practical level, current EA "doctrine" or whatever is that you should throw down 10% of your income to do the maximum amount of good you think you can do, which is as far as I can tell basically uncontroversial.
Or to put it another way -- suppose I accepted your position that EA as it currently stands is way too into St. Petersburging everyone off a cliff, and way too into violating deontology in the name of saving lives in the third world. Would you perceive it as a sufficient remedy for EA leaders to disavow those perspectives in favor of prosocial varieties of giving to the third world? If not, what should EAs say or do differently?
I don't have a minor policy recommendation as I generally disagree with EA wholesale. I think the drowning child hypothetical requires proximity to the child, that proximity is a morally important fact, that morality should generally be premised more on reciprocity and contractualism and mutual loyalty than on a perceived universal value of human life. More in this comment.
Is there, do you think, any coherent moral framework you'd endorse where you should donate to the AMF over sending money to friends and family?
I don't understand your point. Are you claiming that it's impossible to believe that you have a moral obligation if you aren't living up to it? That obligations are disproved by akrasia?
I'm not saying I could predict the future or anything, but I always confused Bankman-Fried and Bernie Madoff in my head.
There is some nominative-determinist element here: I keep reading his name as "Bankman-Fraud".
Bankman-Fried is almost good enough on its own.
The man got cooked!
A giant crypto platform has collapsed
Lots of the people involved were prominent figures in EA
EA is popular among rationalists, or postrationalists, or neorerationalists, or whatever
Therefore...
I don't buy it. As much as we like to talk/praise/complain about EA, it's not synonymous with this community - not enough to justify that guilt by association. How many people here actually gave financial support to anything EA? How about FTX? I'd never actually heard of it before this, despite hanging out in these spaces since 2014, because I don't want to mess around with crypto.
Uh...sure. My read is that you're overvaluing EA to the rationalists as well as FTX to EA.
EA obviously deserves some egg on its face, especially outlets which were evangelizing for SBF (like @TheDag's Sequoia article). Is this supposed to be so surprising as to sour people on the idea? "Breaking news: Silicon Valley company commits financial crimes!"
This is akin to seeing Nixon go down and posting "haha conservatism sure is discredited. Perhaps they were moral frauds just as they were political ones?"
For what it's worth, I selected AMF as my charity of choice on Amazon Smile. My original choice was charity:water, which TotalBiscuit once did some fundraising for, IIRC.
Yeah, this is much more a problem with crypto than it is anything involving EA or Rationalism.
I'll say it again: as it stands right now, it's a Ponzi scheme, more or less, hoping that eventually something will mainstream it into broad, basically universal usage. I'm not saying that as a moral criticism, FWIW; I'm sure the people who engage in it believe that some day it'll actually get there. But until that day, it really is just a house of cards waiting to collapse.
Sequoia Capital is an ordinary venture capital firm with zero relation to EA.
https://en.wikipedia.org/wiki/Sequoia_Capital
As I mentioned in response to his post, a search of the EA forum and Lesswrong doesn't find any evangelizing for SBF or FTX. He got occasionally discussed as a major EA donor and EA-supporting billionaire, and that's pretty much it.
From what I've heard, senior figures in EA (Will MacAskill etc) knew him well for years, associated with him, and promoted him as a big success of EA.
I still don't understand a lot of people's hatred of this guy. From what I can tell, he made a dumb mistake in a very complicated system that went against him. His main transgression was playing it too risky, and I can't tell that, other than whatever that implies, there's any bad morality at play here.

Edit: information revealed since I made this comment makes it clear that he was an overconfident asshole and there are definitely lots of reasons to hate him. (Though I maintain that the early kvetching reeked too much of "his company crashed so he must be doing something wrong", which reduces to "successful = good".)
He gambled with his customers' funds without asking or telling them, which is big-time fraud.
Replace "gambled" with "traded with", and it's now basically the same as what any major modern bank does, I think. (Outside of the idiotic TOS, which I agree they effed up on, since the problematic phrase wasn't even in there until May.)
Except real banks are heavily regulated (the ratios they can lend out, what they're allowed to invest in, what they have to disclose, etc.), they can borrow money from the Fed, they're required to keep a certain amount of cash in reserve, and they have customers' money insured by the federal government (and customers know this, which psychologically guards against the bank runs that cause this sort of catastrophe).
I swear, this crypto shit is just a speed-run of the last 200 years of issues in banking and relearning all the same lessons over again.
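The difference that list describes can be compressed into one toy liquidity check; all the ratios here are illustrative assumptions, not actual regulatory minimums:

```python
# Toy liquidity check for the contrast above: reserves, a lender of last
# resort, and deposit insurance. All ratios are illustrative assumptions.

def survives_run(deposits, reserve_ratio, run_fraction,
                 fed_backstop=False, insured=False):
    if insured:
        run_fraction *= 0.25       # assumption: insurance mutes the panic
    liquid = deposits * reserve_ratio
    if fed_backstop:
        liquid += deposits * 0.30  # assumption: 30% of assets pledgeable at the Fed
    return liquid >= deposits * run_fraction

# A regulated bank facing a 25% withdrawal shock...
print(survives_run(100, 0.10, 0.25, fed_backstop=True, insured=True))  # True
# ...vs. an exchange that quietly lent out customer funds, with no backstop.
print(survives_run(100, 0.10, 0.25))                                   # False
```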
From what I've read, it was specifically in the terms that they wouldn't...
It was definitely fraud... but considering their 'political donations' it'll be interesting to see what comes of this.
EA is how smart people who understand that they are being lied to and exploited by the MIC/DNC/MSNBC confluence sometimes referred to as The Cathedral justify supporting it nonetheless.
https://www.opensecrets.org/outside-spending/donor_detail/2022?id=U0000004705&type=I&super_only=N&name=Bankman-Fried%2C+Sam
How so?
Alternatively, SBF came from a "democratic Deep State" family and just used EA talking points as a skin to wear in service of fooling gullible outsiders. I mean, if you look at his family's history and connections they were knee-deep in the DNC machine. So I don't think it says much about EA. I think we are simply dealing with a corrupt fraudster - potentially even a psychopath - who is highly adept at manipulation and gaining people's trust. The fact that he chose EA as his vector of infection probably flatters EA in some bizarre way, because it implies it is high-status with people who matter.
SBF definitely believed in EA from his college days.
Doesn't change the fact that his family background is deeply connected to Democratic elites. Which also probably tells us why he was such a huge backer (second after Soros) of that party. The EA thing perhaps was genuine, but he still used it cynically to advance other political objectives.
I don't know if I'd say that exactly, but I do think it's one big red flag for the EA movement, where they do seriously need to sit down and reconsider what their goals and aims are, what they are doing to achieve those, and what their current slate of recommendations/deeds have been.
The first big red flag was trying to get involved in politics, with the Carrick Flynn election. Though the EA movement seems to have been on a trajectory over the past few years of moving away from their initial goals (e.g. "bednets save more lives, donate to bednets" persuasion) into "okay, now we have got so much money committed to this, there aren't enough [bednets or whatever] to spend it on, now what do we do?"
And that's a good question, but their answer seems to have been "start taking in one another's washing" - conferences that are all networking and 'how to get a job at an organisation running conferences about EA' and talking-shops, rather than the concrete "we bought 10,000 bednets last year and sent them on to So-and-So distribution centre".
EA has gotten big and influential, and this is the trap they fell into: becoming one of the same old political lobbying groups. Sam Bankman-Fried and his band of merry persons was a symptom of this, not a cause.
Interesting post, much of this mirrors my thinking in the last few days. The members of this grim little cabal are rationalists in the first degree, each one of them a type specimen. And though they claimed to be effective altruists - and would have been proudly lauded as such up until two weeks ago - it turns out they're degenerate gamblers and crooks whose amoral recklessness has hurt millions of people. It is statistically nearly certain that some victims of FTX will kill themselves, if they haven't done so already.
You can't delegate morality to mathematics. All it leads to is arrogance, and the 'freedom' to always be able to justify your own behaviour to yourself, even when your actions are those of a base criminal. Rationalism is not a wholly failed project; effective altruism is an important and useful dogma - but these ideas must be alloyed with traditional morality to be effective at inducing virtue.
You can't delegate morality to non-mathematics either, for the same reasons - whether it's telling an axe murderer where your friend is, the numerous contradictory things Christians are taught, or reporting your own family to Mao to preserve Communism.
Proponents of every moral system can justify plenty of terrible things. This has nothing to do with mathematics, which constrains the types of arguments you can make. That's not to say mathematics solves the problem, but it's really hard to believe it exacerbates it. Jesus literally says things like
And virtually every Christian finds it easy to ignore Him (He's not really being ambiguous here...).
People justifying whatever is convenient isn't a problem with a particular moral system, it's a problem with people.
I have spent a lot of my time in neckbeard libertarian circles, so I have heard a lot about cryptocurrencies in the past 10 years.
I made an early decision that I would not touch them with a bargepole for the foreseeable future. No subsequent event has made me regret that decision.
I like my investments like my taste in paintings: safe, traditional, and matched to my low time preference.
I'm in the same boat as you. A friend explained bitcoin to me in 2012. How it works makes sense but I kept circling around to "But what is this for?". The basic arguments:
1. No government control of money.
2. Making transactions easier.
3. It's scarce.
(1) is a pipe dream and just makes it a big flashing neon target for government capture or destruction.
(2) is just false. Crypto is an immense PITA to actually transact with, while debit/credit cards are almost too easy to use. Add in scams and complicated passwords/keys that can be easily stolen or lost, and widespread usage seems like a joke.
(3) is meaningless without some underlying value. Lots of things are scarce that nobody wants.
Everything about it screams "speculative asset"/"baseball cards". I guess I would be much better off today had I not been skeptical and set up a miner or ten.
If you had bought bitcoin or ethereum in 2012, you would likely have much greater wealth now. So this seems a very odd statement, unless you don't care at all about money.
You can say this about virtually any security you can think of, but a few landmines would have taken you to zero. It does not a compelling argument make. If I'm going to gamble money for 10:1, 100:1, or 1000:1 returns, I can already do that with conventional regulated securities, and not be at risk of getting hacked and losing everything.
If you had bought $100 worth of bitcoin you could have been a millionaire. No other investment has been quite so explosive.
I care about money, but I also care about risks.
Think of it with ex post/ex ante. I don't regret my ex ante decisions in 2012, because I couldn't have rationally anticipated that (thus far) I would be making money buying bitcoin.
Similarly, if I'd put the right numbers into a lottery ticket, I could have won all sorts of lotteries, but I don't regret not doing that.
Same here, and I DID put money into Crypto back then, and it DID pay off in numerous ways, but my risk tolerance was low so I never made any bets I wouldn't have been willing to make at, say, a poker table at a casino.
People don't realize how many crises early crypto endured, and how many points it hit where any normal person would have pulled out their money either because they wanted to take profit on their absurd gains or they lost too much to stomach.
I don't know how many people got wiped out by losing private keys or from sketchy exchanges collapsing or bad altcoin bets in the last 6 years. The deeper you got into the space (beyond just BTC and ETH) the higher chance you'd get burned HARD.
So ex post it is entirely possible to have zero regrets about not jumping on the train. Ex post I am happy with how well my Crypto career paid off (legitimately the main reason I could buy my house when I did) but also well aware it could have gone differently had I decided to put in more money on riskier bets.
I have almost entirely released all my Crypto holdings and put more money into tangible things where the risks are much more straightforward.
Good points. And to make it clear, I was joking when I said above that I felt "superior" to anybody. My aversion to crypto, like my aversion to gambling, is a subjective preference. It is as subjective as my taste in art, which is why I mentioned that in my first post.
As it happens, I am very sympathetic towards crypto as a political mission (non-gov cash is cool and anonymous non-gov cash would be very cool) but that's different from my investment choices.
I mean, when I've been perusing the Crypto subreddits the past few years, a sense of superiority is hard to avoid when you see newbies making the same mistakes over and over again, whilst you've learned your lesson (hopefully by observation and not experience) in the early days. And I'll speak up and be like "Guys I've seen this exact scenario develop before and here's what you should watch out for" only to be drowned out by "FUD, WHEN MOON? NGMI, HODL HODL HODL!"
Bingo. I'm a crypto-optimist, but at some point during the NFT craze I realized how hard 'we' had lost the plot.
huh? What does this have to do with the topic?
I guess the important part is that you feel superior :marseyeyeroll:
This was unnecessarily antagonistic, don't do this.
Neckbeard libertarian circles have a lot of crypto talk. I was hearing about crypto in such circles more than 3 years before anywhere else.
Exactly, that was precisely the point I was making. I also feel superior for taking my waterproof jacket on a day when it might rain and consider all people who get wet as inferior human beings.
https://www.merriam-webster.com/dictionary/hubristic
I just want to see one that develops a real transaction ecosystem, where the use case is millions of transactions processed.
Yeah, it's pretty bad. I'm fairly involved with EA, and while I knew SBF was a big donor, I had no idea how bad the hero worship had gotten, both among EA and among big financial institutions. To my eyes this reflects even more poorly on VC funders and high finance/trading in general: they were supposed to have done due diligence on FTX (which presumably they did, and the whole using-$10b-in-customer-funds thing came later), but they didn't see this coming either.
For instance look at this archived profile on SBF from Sequoia, a VC fund that made FTX happen and memoryholed this article after the disaster. The hero worship in there is cringey, and in retrospect it's horrifying:
and
and
it goes on
Of course the answer to that last question is: SBF. The blatant pedestalizing of the man in there is inherently disgusting to me, and the fact that it comes from a well-respected VC firm really lowers my faith in that entire class of folks, especially after the WeWork/Adam Neumann debacle and all the other startup-founder disasters.
Either way, I've been trying to beat the drum in EA spaces for a long time that EAs put far too much focus on credentials. It's ironic that so many folks in the movement will tell you to your face they don't care about credentials, only impact, and yet the entire leadership is nothing but blue-blooded Ivy League grads and other well-connected elites. It's a shame because I think most people in EA have their hearts in the right place, they just can't take off the credential blinders and see that most of the academic/elite class is specialized in bullshitting and not much else.
Isn't that what VC is supposed to do? Invest 100 million in 100 little companies, see 98 of them fail, one make the investment back, and one grow to 300 million over the next decade? Of all the players, the VC firm seems to have strayed least from its normal line of business.
Yeah, though the current ratio is more like 24 out of 25 investments failing, with the 25th blowing up at 30:1 (or so I've been told by friends in the industry).
The fawning article is a bit nauseating, but it's really just marketing, a VC doing VC things.
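To put rough numbers on the two stylized models above, here's a minimal expected-value sketch (Python; all figures are illustrative readings of the comments above, not real fund data):

```python
# Back-of-the-envelope expected return multiples for the two stylized
# VC portfolio models described above. All numbers are illustrative.

def expected_multiple(outcomes):
    """Expected return multiple, given (probability, multiple) pairs."""
    return sum(p * m for p, m in outcomes)

# Model 1, read charitably: 100 equal checks; 98 go to zero, one merely
# returns its own stake (1x), one is a 300x outlier.
model_1 = [(98 / 100, 0.0), (1 / 100, 1.0), (1 / 100, 300.0)]

# Model 2: 24 of 25 investments fail, the 25th pays off at 30:1.
model_2 = [(24 / 25, 0.0), (1 / 25, 30.0)]

print(expected_multiple(model_1))  # ~3.01x over the decade
print(expected_multiple(model_2))  # 1.2x, a much thinner margin
```

If ratios like these are anywhere near right, the model only works when the rare winner is enormous, which goes some way toward explaining why a fund would write a fawning profile of any founder who might be that winner.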
I don't blame them for investing in a fraudster; like you say, it happens all the time. The fawning in the article is what turns me off.
You say both but link only a venture capital fund. Doing a Google site-search of effectivealtruism.org and LessWrong, excluding the past couple weeks, I don't really see any hero-worship. There are some mentions of him as a guy who donates a lot of money to effective-altruist causes, some mention of the FTX Future Fund, and some discussion that maybe his example means sufficiently talented EAs should pursue "become billionaire entrepreneurs" as a strategy more often. He wasn't some kind of thought leader within the EA community; he was a guy who donated a bunch of money and came up in discussions of ultra-rich EA supporters. I think people, including effective altruists themselves, are overstating his involvement in retrospect.
Agreed, which is why I specified that yes it was bad EA was linked to him, but venture capital and high finance are the worse perpetrators. Sequoia specifically.
I don't think EA has a huge problem with hero worship; generally they are good at tearing down the egos of overconfident folks. I think the issue that blinded EA to the whole SBF fiasco is, again, that they put far too much weight on the value of traditional credentials and connections.
It's the worship of intellect in rationalist circles. They are very smart, in general; they do tend to be well-intentioned, in general; and they do think they are working on "how to win" where that means putting their smarts and good hearts to work for the benefit of all humanity. They forget that brains alone are not enough, and you do need a leavening of common sense or practical experience.
Whereas old-fashioned types like me were pointing out all along that thinking you know how to do charity better than all the groups that have ever done it over the history of humanity is boundless conceit, and no, it doesn't matter if you use financial analysis and statistics and all the rest of the jargony tools. They dug a pit, and fell into it themselves.
I'm not happy this happened, but I think a little chastening about "all those other sets of individuals did it wrong and made dumb mistakes, but not us" is no harm in the long run - if they learn the correct lessons from this about not believing their own hype and that even if Sam or Caroline or whomever were your good old pals from college, that doesn't mean a tap when it comes to running a billion-dollar business with no real experience.
I'm not sure how this really relates to SBF. Is it a tenet of EA that they are better at divining sources of ethical funds than normal charities? From what I can tell, the purpose of EA has always been that they would be better at spending funds effectively, not sourcing funds. That a big donor proved to be engaging in criminal actions doesn't really have anything to do with EA, does it?
I don't think this is a good example, considering it was skewered on LessWrong itself.
There's an obvious and inevitable problem of self-awareness if the way you approach a problem is "as a rationalist..." It would be like if I created a new school of thought in ethics called Obviously Morally Correctism. As someone who is Obviously Morally Correct, I could fix all the world's problems, and you should trust me with your billions.
Couldn't have said it better myself. A big weakness of EA is that older folks are almost nowhere to be found in the movement, and despite the fact that retirees make up a ton of the volunteers out there, EA tends to scoff at the idea of reaching out to retirees. I've heard various different reasons but it seems to boil down to "old people aren't cool and don't have interesting ideas."
I think EA's novel way of looking at things is valuable and will take them far, but yeah, the movement, especially at the higher levels of power, really needs to start courting older, more experienced folks.
Yeah. When I saw the photo of Caroline Ellison, my reaction was "bloody hell, is she still a teenager???" and even though I found out that she's twenty-eight, she doesn't look it. Even worse that she went straight from college to some quant fund for six months, then was put in charge of a parallel company run by her boyfriend. Take her on as a mid-level person, sure, but CEO of the whole shebang?
There badly needed to be a few fifty-plus-year-old guys in charge, even if that meant asking Evil Privileged White Cis Males for help. The other part of the problem is that they were well-connected, so they got the worst of both worlds: Daddy's influence and connections gave them a whole heap of push up the ladder, but there were no corresponding "friends of Dad" in charge, so the kids were handed a fistful of dollars and let loose in the candy store.
Then when they ran into trouble, there was nobody senior enough to take the wheel, and we see what happened. This latest story of the alleged hack is just the icing on the cake; I don't know what rumour to believe next. I could give credence to it being somebody down the chain at FTX who saw everything collapsing around them, knew that Sam wouldn't save their neck, and decided to help themselves to whatever was left in the piggybank, because at this stage it's every man for himself. But it could be an outsider altogether, which would just be the finishing touch on how badly this entire set-up was run.
It is not that the movement was exploited; it is that the organizations around EA were almost entirely funded by Sam Bankman-Fried, who seems to be as true a believer as it gets, despite his highly unethical and illegal business practices. It was effectively HIS movement after he bankrolled it, and everyone else was a useful idiot who gladly took paychecks from a billionaire.
Whenever malfeasance is revealed in an organization or institution, I remember Jesus describing the kingdom of God growing from a small mustard seed to a giant tree, so great that the birds of the air can nest in it. While this sounds lovely, in another parable nearby (and probably delivered on the same occasion), Jesus warns that the birds of the air (the evil one) can steal the farmer's seeds (seeds of faith/words of God) when he sows them.
Together, these remind me that any institution founded with noble intentions can easily grow to hide all sorts of darkness within its branches. The obvious example was the Roman Catholic Church at its most decadent: powered by tithes-as-tax, colonizing the world, demanding indulgences, and so on.
FTX, with its billions and its devotees donating ten percent of their income, sounds like exactly the kind of institution which needed to be more ornithophobic.
Yeah. I'm not saying they're immoral or degenerate, but they are very into non-traditional modes of living, shall we say, so breaking old taboos around caution in business and the like is also part of the mindset. Scott at least realises the importance of Chesterton's Fence; most of the EA/rationalist overlap think fences are all part of the musty old society that needs to be cleared away for the beauty of the technocratic utopia to flourish.
Well, now they know why that fence about "don't take money out of one account to prop up another failing one" was there, along with the rest of the financial regulations that red-tape bureaucrats put in place to stop precisely what Sam was doing when it all went belly-up.
That doesn't quite follow, since the alternative to quokkas is to have virtually nothing but bad actors vying for control.
Whereas in quokka-land the number of bad actors is on average lower, so we'd expect fewer of them to rise through the ranks, in aggregate, and it would be all the more notable when one did because it bucks a trend.
The alternative to "quokkas" is people who are aware that their big pot full of gold will attract bad hombres and take some precautions; people who will not welcome as a brother anyone who merely repeats their lingo.
We are talking about charities accepting donations; they're not the ones providing the gold. We're not talking about Sequoia Capital, the 50-year-old venture-capital firm that gave FTX hundreds of millions of dollars, had access to internal information, and actually had a duty to their investors to try to avoid this sort of thing. We're not talking about any of their other institutional investors like the Ontario Teachers' Pension Plan, Tiger Global Management, Third Point, Altimeter Capital Management, or SoftBank. Since when has it been the job of charities to investigate the businesses of the people donating them money? "Failed to do unpaid amateur investment analysis trying to beat institutional investors at their own jobs for the sake of refusing donations that might turn out to be from a criminal" isn't exactly a test of quokkahood, especially if the label isn't being applied to the institutional investors who actually invested and lost enormous sums of money.
Yes, and in most such communities there are already "bad hombres" in control, and they spend an excessive amount of time combating other bad hombres to stay on top of the pile.
Quokka-land isn't going to be more likely to have bad actors; indeed, the whole reason quokkas exist is that they lucked into a habitat with no predators. The predators would have to be introduced from outside the community.
I have no idea what you're talking about and a quick googling only tells me something about a hacked cryptocurrency platform. What does that have to do with rationalism or EA and why should anyone care?
The main person behind the FTX situation is Sam Bankman-Fried, a very influential donor and prominent figure within the Effective Altruism movement. His exchange blew up due to his financial crimes, and the movement he is tied to is taking heat. He allocated millions to Effective Altruism while embezzling from his exchange. What is currently happening is a Bernie Madoff-level collapse: a company that ran a Super Bowl ad with Larry David this year is imploding overnight, and all its customers are turbo fucked.
What exactly do you think his crimes were? I haven't seen anything that says he was operating any differently from a normal bank, except doing it with much riskier assets.
He gambled with customers' money (by giving lines of credit to his hedge fund, amongst other things), when the terms of the exchange explicitly said it wouldn't.
I thought this video did a pretty good job of explaining it, with some receipts.
https://youtube.com/watch?v=MWfuDeO9thk
I don't think that's anything a modern international bank wouldn't do? Like, surely JP Morgan Chase extends lines of credit to their own hedge funds, using bank deposits to fund them. (Granted, they prolly don't extend 50% of their assets to them. But I think that's a difference in appetite for risk rather than any real ethical boundary.)
Putting that promise in the terms was dumb. I'm not sure they exactly violated it, since they didn't loan the assets to FTX Trading; they just allowed another client (Alameda) to trade on margin on shitty terms. But they definitely violated the spirit of it. (Though I don't really know what "title" is supposed to mean in this context.)
The terms of service of FTX said that money held on the exchange was backed 1:1 with assets. He was using customers' funds to make bets with his hedge fund, which is illegal.
He also attested to this type of thing in front of Congress and worked frequently with legislators, so he may have committed perjury on top of that.
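To make the 1:1 claim concrete, here's a toy sketch of the invariant a fully-backed exchange promises, and how a loan to an affiliated trading desk silently breaks it (Python; the figures are hypothetical, not FTX's actual books):

```python
# Toy custody ledger for a hypothetical "fully backed" exchange.
# All figures are made up for illustration.

customer_liabilities = 10_000_000_000  # what customers believe they hold
exchange_assets = 10_000_000_000       # what the exchange actually holds

def fully_backed():
    # The 1:1 promise: assets always cover customer balances.
    return exchange_assets >= customer_liabilities

assert fully_backed()  # holds at deposit time

# A line of credit to an affiliated trading firm, "collateralized"
# by the firm's own illiquid token:
loan_to_affiliate = 8_000_000_000
exchange_assets -= loan_to_affiliate

print(fully_backed())  # False: customers are now unsecured creditors
# Nothing looks wrong from the outside until withdrawals exceed the
# remaining assets, i.e. a run, at which point the hole is revealed.
```

The point is that the broken invariant is invisible in day-to-day operation, which is why the ToS promise and the testimony mattered: they were the only assurance customers had.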
Ah, that would definitely be something if it were true. That said, it seemed to me like a strange thing for a company to put in the TOS, since those things are about protecting the company. So I pulled the FTX TOS from January and don't see anything in there promising that. Edit: I was wrong; January was too early to pull the TOS. You say he made that claim publicly?
Turns out January was too early. They added the section about ownership in May. So I retract what I said before.
He tweeted, "FTX has enough to cover all client holdings. We don't invest client assets."
He then deleted that tweet, as it was a bald-faced lie.
Ah yep. Hadn’t seen that one. That’s idiotic and effed up.
Well, his exchange blew up because of unethical dealings that would be crimes if an American financial institution carried them out. And while a federal probe is starting up, the WSJ has said it's as yet unclear what the charges might be, given that crypto is far less regulated and some of FTX's legal entities were offshore. Which hints at the reality behind the spergy, libertarian dream of deregulating financial markets.
Thanks!
https://youtube.com/watch?v=WHgGSeNY7Dw
Here's a good recap of it so far. There's a lot more that has happened since but it's not settled yet.
I appreciate the breakdown of what happened, and man, Changpeng Zhao is right now living the best life according to Conan:
All the shade he's throwing at FTX and SBF makes sense now, and I understand why he first issued and then withdrew that letter of intent about buying/taking over FTX: it was to steady the market before people started a run on Binance, and to signal that Binance was okay, was strong, and had no problems, unlike FTX. Smart guy; his strategy seems to be working (so far).
Or maybe just taking the maximum expected-value path towards becoming insanely rich?
The collapse was dramatic, but it's a consequence of the same high-risk strategy that had FTX valued at $32 billion a few months ago. If that's the upside, then their actions can be rational even if the chance of success was quite low.
EDIT: Also SBF may have lost most of his money but according to this article he's still worth around $600 million US. So even if the company failed, rationalism still seems to have paid off for him personally.
He's going to be subject to decades of personal lawsuits, fines, and forfeitures. This admission also might be big enough to pierce the corporate veil and will certainly mean prosecution for everyone who knew.
Even if he is sued into oblivion, by his own metrics he can still believe he made the right choice. He was very up front about the whole thing being a huge gamble, with a low chance of success.
"Guys, we had a 51% chance of duplicating the Earth, the algorithms say we should take that chance! Newcomb's paradox! One-boxing! Maximise utility!"
Yep. The next ten years of this guy's life now belong to lawyers, both his and his victims'. Maybe longer, depending on how the liability shakes out.