The most common response to "AI took my job" is "don't worry, soon the AI will take everyone's jobs, then we'll all have UBI and we won't have to work anymore." The basic thesis is that after the advent of AGI, we will enter a post-scarcity era. But we still have to deal with the fact that we live on a planet with a finite amount of space and a finite supply of physical resources, so it's hard to see how we could ever reach true post-scarcity. Why don't more people bring this up? Has anyone written about this before?
Let's say we're living in the post-scarcity era and I want a Playstation 5. Machines do all the work now, so it should be a simple matter of going to the nearest AI terminal and asking it to whip me up a Playstation 5, right? But what if I ask for BB(15) Playstation 5s (where BB is the Busy Beaver function, which grows faster than any computable function)? That's going to be a problem, because the machine could work until the heat death of the universe and still not complete the request. I don't even have to ask for an impossibly large number - I could just ask for a smaller but still very large number, one that is in principle achievable but would tie up most of the earth's manufacturing capacity for several decades. Obviously, if there are no limits on what a person can ask for, then the system will be highly vulnerable to abuse from bad actors who just want to watch the world burn. Even disregarding malicious attacks, the abundance of free goods will encourage people to reproduce more, which will put more and more strain on the planet's ability to provide.
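Here's a rough back-of-the-envelope sketch of that "in principle achievable" case. The production rate is a made-up round number purely for illustration, not a sourced figure:

```python
# Hypothetical back-of-the-envelope: how many years of the entire world's
# console manufacturing capacity a single large request would consume.
# ASSUMED_GLOBAL_RATE_PER_YEAR is an illustrative round number, not real data.

ASSUMED_GLOBAL_RATE_PER_YEAR = 25_000_000  # consoles per year (assumption)

def years_to_fulfill(quantity: int, rate_per_year: int = ASSUMED_GLOBAL_RATE_PER_YEAR) -> float:
    """Years of full global capacity needed to build `quantity` consoles."""
    return quantity / rate_per_year

for qty in (10**8, 10**9, 10**10):
    print(f"{qty:>14,} consoles -> {years_to_fulfill(qty):,.0f} years of global capacity")
```

Under that assumption, even the middle case - a billion consoles - eats roughly forty years of the planet's entire output. No uncomputable numbers or malice required; one greedy request is enough.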
This leads us into the idea that a sort of command economy will be required - post-scarcity with an asterisk. Yes, you don't have to work anymore, but in exchange, there will have to be a centralized authority that will set rules on what you can get, and in what amounts, and when. Historically, command economies haven't worked out too well. They're ripe for political abuse and tend to serve the interests of the people who actually get to issue the commands.
I suppose the response to this is that the AI will decide how to allocate resources to everyone. Its decisions will be final and non-negotiable, and we will have to trust that it is wise and ethical. I'm not actually sure if such a thing is possible, though. Global resource distribution may simply remain a computationally intractable problem into the far future, in which case we would end up with a hybrid system where we would still have humans at the top, distributing the spoils of AI labor to the unwashed masses. I'm not sure if this is better or worse than a system where the AI is the sole arbiter of all decisions. I would prefer not to live in either world.
I don't put much stock in the idea that a superhuman AI will figure out how to permanently solve all problems of resource scarcity. No matter how smart it is, there are still physical limitations that can't be ignored.
TL;DR the singularity is more likely to produce the WEF vision of living in ze pod and eating ze bugs than whatever prosaic Garden of Eden you're imagining.
Yep. Perfectly succinct way of putting it.
True. If the rich (or whoever develops and gains control of AGI first) can just rely on AGI for all their labor needs, then why should they even bother to keep the peons alive with UBI? They could just as easily let everyone else starve - what motivation would they have besides pure altruism?
It won't be the rich. Not nasty enough as a class.
The moment you invent a replicator that can fit in a large building or a big ship, some sect or ethnic group somewhere is going to think, "gee, I don't need the rest of you, I'm going to wipe you out with weaponised smallpox and mop up with kill bots, and you won't know what hit you".
Countries can't do this; they'd get nuked if they tried something funny with biowarfare. Small groups, on the other hand.
I suspect that any hopes that pure altruism would be enough aren't terribly well-founded. Counting on cultural inertia to overcome changes in incentives doesn't seem to have a good track record; I can easily envision moral arguments that it's best to euthanize the excess population "for their own good" flipping from unthinkable to unquestionable in not too much time.