I made this a top level post because I think people here might want to discuss it but you can remove it if it doesn't meet your standards.
Edit: removed my opinion of Scott from the body
I'd say about 1:1, at minimum, of avoiding donors who a) will make up a charity-redefining portion of the funds, and b) are more than slightly likely to collapse.
Yes, you can dial up the specificity until this exact problem is very rare, but the broader class of scuzzy rich or 'rich' people with reputational and financial risks that will disrupt your organization isn't even specific to charity, and has tremendous downsides that at best set back causes and finances by years, if not simply scuppering entire organizations. And let's be direct: this will kill a good few charities, including a few pretty good ones. And risky donors, at least at levels of donation that can close an org, are a high enough percentage of the field that I don't think this runs into the joint under- and over-diagnosis problems.
I don't want to make too much of past problems internal to LessWrong-sphere charities (and tbf, many, like the embezzlement at early SIAI, are genuinely different), or crypto (although there are so damn many), or the specific combination of the two (tbf, far more marginal 'criminality'... and still a big financial risk). But this is not some alien problem teleporting in from another dimension that no one could have expected. Nor are we in the heady days when a crypto rug pull was unprecedented: you can look at the stats from a week ago or from three years ago. And the guy was regularly taking interviews and talking up his willingness to take very high-risk bets!
Maybe people were making these evaluations internally -- a 2% chance of fucking over your entire charitable giving organization against a 95% chance of having twice or five times the funding is an attractive bet for some people! -- and just rolled snake-eyes, and don't want to publicly admit it. Or maybe people were making these evaluations and guessed wrong, say, coming up with a 0.5% chance of FTX collapsing (whether as here, or just from legal investment risks). But if you were getting a large portion of your funding from any crypto organization, this wasn't just a risk someone somewhere was warning about: it was the default common knowledge. It's acceptable to be wrong; but refusing to recognize that you were loses a lot of potential for growth.
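To make the bet above concrete, here is a toy sketch of the expected-value arithmetic, using only the illustrative probabilities from this comment (the function name and the residual "status quo" assumption are mine, not anything any EA org actually computed):

```python
# Rough expected-value arithmetic for accepting risky funding.
# All numbers are illustrative hypotheticals from the comment above.

def expected_funding(p_collapse, p_windfall, baseline=1.0,
                     windfall_multiple=2.0, collapse_value=0.0):
    """Expected funding relative to a baseline of 1.0.

    p_collapse: chance the donor implodes and takes the org down with it
    p_windfall: chance the donation lands and funding multiplies
    Residual probability is treated as status quo (baseline funding).
    """
    p_status_quo = 1.0 - p_collapse - p_windfall
    return (p_collapse * collapse_value
            + p_windfall * windfall_multiple * baseline
            + p_status_quo * baseline)

# 2% collapse / 95% doubled funding / 3% status quo:
ev = expected_funding(0.02, 0.95)  # 1.93x baseline funding in expectation
```

On these numbers the gamble looks clearly positive in expectation, which is exactly why "attractive bet, rolled snake-eyes" is a coherent story; the argument is about owning that calculation afterward, not about whether it could ever be made.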
That's a fair defense, if it turns out we're talking 50% or 25% or 10% of randomly selected EA charities, or the 10% thirstiest ones, or the 10% most political-candidate-focused ones. I don't think that turns out to be the case: Scott himself has to change some plans for future grants because of this! This estimate says 35% of 2022 EA (organizational?) funding!
That's a fairer defense, and the Ontario Teachers' Pension Plan admins should be facing serious scrutiny, if not a potential review of their licensing (if they have any), as should anyone who makes serious crypto investments with other people's money and without extremely clear disclosure. And for people who are solely Effective Altruists, I can accept a certain level of naivete.
But a lot of the EA movement is LessWrong-sphere, or part of the broader rationality sphere. Nor were affected charities limited to EA-normie focuses like lead abatement: the Good Judgement Project is marked for a 300k grant, and Manifold Markets for half a million. That is, the very people working to make exactly this class of bet accurately. Now, that's necessarily going to be an aspirational thing... but it's good to notice when you've failed to meet your aspirations.
((The sportsball stadium thing is funny, but I don't think it's very wise to admit 'our movement for applied rationality is no better at predicting massive risks to our own interests than the corrupt organization that we've spent a decade using as an example of bad spending and quick ways to get a concussion'.))
And more importantly: EAs can walk away from the money. Someone who knows more than the market has to spend significant amounts of cash, at serious risk, over long periods of time, to make the millions that everyone mentions when motioning around the EMH. Even the pensioners' funds, as wildly unethical as that was, have contractual or social obligations to hit certain levels of increasingly hard-to-achieve return. Not to defend them, but this is a reason you can't take their judgement at simple face value.
An EA organization has to... look at other funders? Which, according to Scott's claims here, were already thirsty for good causes to give money to? Scale down operations for a year? Maybe if you were talking a Pink Ribbon knockoff, where your failure means a thousand clones will come to fill the spot after you: Effective Altruism is supposed to, by definition, not be targeting the crowded fields. Risk aversion makes sense in a case where catastrophic failure has longer-term risk.
Well... there are two at-scale donors (Moskovitz and SBF)... so 1:1 refusal means zero donors?
Or indeed, "turns out our much-lauded donor and funder was just as happy to have his company's name splashed all over the places we've used as examples of bad ways of donating money". Ouch, indeed.
Should they though? I think the standard should be higher for institutional investors than for charities accepting donations, but that doesn't necessarily mean the standard for investors should be significantly higher than it already is. They're an easy target because they're partially sponsored by the government, but they were just doing the same thing that the entirely private investors were doing. And the private investors have an appetite for risk because ones that were too risk averse would get outcompeted and replaced in their roles by ones that pursued a more successful strategy. Sequoia Capital is a 50-year-old firm managing $85 billion, and while you could speculate that their employees have recently become less competent or too reckless it seems perfectly plausible that their decision-making here was just the same kind of decision-making that led to these investments:
Meanwhile, charities accepting donations both have less to lose, since rather than outright losing an initial investment there's just whatever money/time was wasted planning around future funding that doesn't come, plus vague reputational concerns potentially affecting future donations, and more to gain, since you're outright getting money for nothing rather than trying to get a return on money you already have. There's a direct tradeoff between the two: if it's 35% of your funding you risk having wasted more money if it evaporates, while Sequoia obviously doesn't invest that much in a single company, but if you refuse you know you're out a whole 35% of your potential funding, whereas Sequoia can just invest their money in something else. If it's 100% of your funding because you've been soliciting funding for your new charity and they're the first donors to say yes, there's certainly a risk the money will dry up and destroy your charity if you can't find a substitute, but if you refuse there's a risk you won't find enough donations to begin with. You talk about it killing charities, but if a sudden loss of funding can do that, how much is because of "less funds than expected" vs. just "less funds, same as if you had refused"?
The reason why there was more funding than EA charities knew what to do with in the short term was because of FTX suddenly showing up and throwing around a bunch of money, if everyone had refused that wouldn't have been the case. If those other donors don't materialize for the current funding crunch would they have done so to begin with?
It seems like the tradeoffs here pretty strongly favor not being particularly picky about who you accept donations from. Sure, if you know someone obtained money from criminality you don't accept the money, but if a dozen institutional investors and the police/SEC don't have a problem, then why should you? Now, you could try to mitigate risk in other ways than refusing money outright, like saving more of the money rather than finding ways to spend it immediately, or better yet persuading them to give you a larger endowment rendering you more self-sufficient. But obviously this might not be possible and carries significant disadvantages: for one, donors (especially EA donors) want to see actual results from their donations and evaluate your performance, not "we'll do some charity with this money someday". It transfers the risk of the donor having problems to a risk of the charity having problems, like becoming the Wikimedia Foundation, with an enormous pile of cash and a huge stream of donations coming in while only a tiny fraction gets spent on anything of value. That is, after all, one of the big problems EA sought to address, and unlike an incompetent/fraudulent for-profit company, which eventually collapses to remove the problem, an incompetent/fraudulent charity can continue to waste people's donations indefinitely. I'm not saying that no improvement is possible, for instance maybe there are measures to be more resilient in case funding is lost, but I don't think it justifies extremely costly measures like outright refusing funding because the donor is in a risky field, and I don't think it reflects some deep problem with EA.
Part of my argument is that this failed to meet the existing standards for institutional investors, at least for the pension funds -- I'm not (just) encouraging delicensing pour encourager les autres, but because I'm extremely skeptical that this investment could have been made while honestly and completely complying with the Canadian principles for pension fund investments. (Though I am not a lawyer, this is not legal advice, and so on.)
Funny example! The current Sequoia Capital page on FTX is iso-standard lawyer speak, but the previous page was hilarious. I think I can make a pretty strong argument that the people who put money from an 85-billion-USD fund into a guy whose business plan was worthy of the underpants gnomes, and who plays League of Legends, were not competent or prudent.
((During an investment meeting, specifically? Well, that too.))
If you refuse 35% of your potential funding, you can go look for other funders. Very few charities are in a field where they just need or are capable of taking in infinite dollars, because either their goals are limited, or their ability to scale up is limited. This is pretty well-understood in EA and GiveWell circles: indeed, a number of GW top charities have been 'capped out' for years.
This also strikes the other direction. Sequoia isn't just investing less in any individual company than a charity was receiving from FTX Future; it devoted a lot less to FTX specifically. Sequoia put something like 210 million USD into FTX, which is absolutely embarrassing, and also well less than 1% of the specific Sequoia fund. The Ontario Teachers' Pension is almost an order of magnitude smaller on a percentage-of-assets measure. I don't think they did adequate due diligence at this level of expenditure, and there are significant limitations to using models like the Kelly criterion even for investment funds, never mind for charity, but the scope of the difference is a marker.
((Though not all 'conventional' hedge funds were that sober: Galois put nearly half of their bank on FTX, which would have been irresponsible even for something like Apple.))
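For a sense of why "fraction of the fund" is the relevant yardstick, here's a minimal sketch of the standard Kelly-criterion stake for a binary bet. The function name and the 20%/10x venture-style numbers are my illustrative assumptions, not figures from any of these funds:

```python
# Kelly-optimal stake for a binary win/lose bet, f* = (p*b - q) / b.
# Illustrative only; none of the funds discussed published such numbers.

def kelly_fraction(p_win, net_odds):
    """Fraction of bankroll to stake on a binary bet.

    p_win:    probability the bet pays off
    net_odds: net payout per unit staked on a win (the 'b' in f* = p - q/b)
    Returns 0.0 when the edge is negative (i.e., don't bet at all).
    """
    edge = p_win * net_odds - (1.0 - p_win)
    return max(0.0, edge / net_odds)

# Even a generous venture-style bet -- 20% chance of a 10x net return --
# caps the Kelly stake at a modest slice of the bankroll:
f = kelly_fraction(0.20, 10.0)  # 0.12, i.e. 12% of the fund
```

Against that kind of benchmark, sub-1% of a Sequoia fund is defensible sizing even for a bad pick, while staking nearly half a bankroll on a single counterparty is far beyond what any plausible win probability would justify.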
There are serious problems specific to having committed funds disappear on short notice that are not present when you simply never had the funds to start with. In the extremes, you've already bought stuff from vendors and have to pay for it, signed leases, hired and/or moved people. Short of that, there's still a bunch of softer commitments that carry significant reputational or financial costs.
If we were talking about FTX-screwed charities as those which focused more on the Earning to Give side than the Rationality one, I think this would be a more plausible argument. My worry is not that someone got snookered; my worry is not even that someone focused on avoiding these risks made a bad bet. But right now we're combining all of that with a 'whoops, shit happens, nothing we could have done'.
I do find it funny how LoL is catching so many strays in this scandal. Like, it's a popular video game. Some hopped-up fake billionaire likes it, like the rest of the world of hopped-up people.