This weekly roundup thread is intended for all culture war posts. 'Culture war' is vaguely defined, but it basically means controversial issues that fall along set tribal lines. Arguments over culture war issues generate a lot of heat and little light, and few deeply entrenched people ever change their minds. This thread is for voicing opinions and analyzing the state of the discussion while trying to optimize for light over heat.
Optimistically, we think that engaging with people you disagree with is worth your time, and so is being nice! Pessimistically, there are many dynamics that can lead discussions on Culture War topics to become unproductive. There's a human tendency to divide along tribal lines, praising your ingroup and vilifying your outgroup - and if you think you find it easy to criticize your ingroup, then it may be that your outgroup is not who you think it is. Extremists with opposing positions can feed off each other, highlighting each other's worst points to justify their own angry rhetoric, which becomes in turn a new example of bad behavior for the other side to highlight.
We would like to avoid these negative dynamics. Accordingly, we ask that you do not use this thread for waging the Culture War. Examples of waging the Culture War:
- Shaming.
- Attempting to 'build consensus' or enforce ideological conformity.
- Making sweeping generalizations to vilify a group you dislike.
- Recruiting for a cause.
- Posting links that could be summarized as 'Boo outgroup!' Basically, if your content is 'Can you believe what Those People did this week?' then you should either refrain from posting, or do some very patient work to contextualize and/or steel-man the relevant viewpoint.
In general, you should argue to understand, not to win. This thread is not territory to be claimed by one group or another; indeed, the aim is to have many different viewpoints represented here. Thus, we also ask that you follow some guidelines:
- Speak plainly. Avoid sarcasm and mockery. When disagreeing with someone, state your objections explicitly.
- Be as precise and charitable as you can. Don't paraphrase unflatteringly.
- Don't imply that someone said something they did not say, even if you think it follows from what they said.
- Write like everyone is reading and you want them to be included in the discussion.
On an ad hoc basis, the mods will try to compile a list of the best posts/comments from the previous week, posted in Quality Contribution threads and archived at /r/TheThread. You may nominate a comment for this list by clicking on 'report' at the bottom of the post and typing 'Actually a quality contribution' as the report reason.
Because they aren't. They're collectively deluding themselves into believing in the 'soul' and that programming will never be automated by AI. Just like certain artists are.
I am a programmer. OpenAI scares me. I'm putting every effort I've got into the Grind, because I think the industry's due for a phenomenal crash that'll leave the majority in the dumps. You are free to disagree.
There is no problem humans face that cannot be reframed as a programming or automation problem. Need food? Build a robot to grow it for you, and another to deliver it to your house. Need to build a robot? Make a factory that automates robot fabrication. Need to solve X medical issue? Write a program that figures out, using simulations or whatever, how to synthesize a chemical or machine that fixes it. Given this, the question of "what happens to programmers when computers can write code for arbitrary domains just as well as programmers can" answers itself.
I expect that fully automating coding will be the last job anybody ever does, either because we're all dead or because we've achieved Fully Automated Luxury Space Communism.
Is there something misleading with the way I phrased my comment? I don't understand why multiple people have succeeded in reading "programmers will be completely replaced by AI" into my words.
And this isn't a nitpicking thing. It is an extremely important distinction; I see this in the same way as the Pareto Principle. The AI labs are going to quickly churn out models good enough to cover 95% of the work the average software engineer does, and the programming community will reach a depressive state where everyone's viciously competing for that last 5% until true AGI arrives.
Your first paragraph misses how hard it is for human programmers to achieve those things, if it is even possible under current circumstances (find me a program that can acquire farmland & construct robots for it & harvest everything & prepare meals from raw materials). Even hiring an army of programmers (AI or no) would not satisfy the preconditions necessary for getting your own food supply, namely having an actual physical presence. You need to step beyond distributed human-level abilities into superhuman AI turf for that to happen.
There is a sense in which the job of coding has already been automated away several times. For instance, high-level languages enable a single programmer to accomplish work that would be out of the grasp of even a dozen assembly-language programmers. (This did, in fact, trash the job market for assembly-language programmers.)
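A toy illustration of the gap (Python here, but any modern high-level language makes the point; "log.txt" is a made-up input file):

```python
# A couple of lines of a high-level language: read a file and count
# word frequencies. The hand-written assembly equivalent would mean
# managing file descriptors, buffers, parsing, and a hash table by hand.
from collections import Counter

with open("log.txt") as f:  # "log.txt" is just an example file
    counts = Counter(f.read().split())

print(counts.most_common(3))
```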
The reason this hasn't resulted in an actual decline in programmer jobs over time is that each time a major tool is invented that makes programming easier (or eliminates the necessity for it in particular domains), people immediately set their sights on more-difficult tasks that were considered impractical or impossible in the previous paradigm.
I don't really see the mechanism by which AI-assisted programming is different in this way. Sure, it means a subset of programming problems will no longer be done by humans. That just means humans will be freed to work on programming and engineering problems that AI can't do, or at least can't do yet; and they'll have the assistance of the AI programmers that automated away their previous jobs.
And if there are no more engineering or programming problems like that, then you now have Automated Luxury Space Communism.
Roughly speaking, I see your point and agree that it's possible we're just climbing a step further up on an infinite ladder of "things to do with computers".
But I disagree that it's the most likely outcome, because:
1. I think the continued expansion of the domain space for individual programmers can be partially attributed to Moore's Law. More Is Different; a JavaScript equivalent could easily have been developed in the 80s but simply wasn't, because there wasn't enough computational slack at the time for a sandboxed, garbage-collected, asynchronous scripting language to run complex enterprise graphical applications. Without the regular growth in computational power, I expect innovations to slow.
2. Cognitive limits. Say a full-stack developer gets to finish their work in 10% of the time. Okay, now what? Are they going to spin up a completely different project? Make a fuzzer, a GAN, a SAT solver, all for fun? The future ability of AI tools to spin up entire codebases on demand does not help with the human process of figuring out what actually needs to be done. And if someone makes a language model to fix that problem, then domain knowledge becomes irrelevant and everyone (and thus no one) becomes a programmer.
3. I think, regardless of AI, that the industry is oversaturated and due for mass layoffs. There are currently weak trends pointing in this direction, but I wouldn't blame anyone for continuing to bet on its growth.
For (1), what you're saying is certainly true; the better abstractions and better tooling have been accompanied by growth in hardware fundamentals that cannot reasonably be expected to continue.
(2) is where I'm a lot more skeptical. A sufficient (though certainly not necessary) condition for a valuable software project is identifying a thing that requires human labor that a computer could, potentially, be doing instead.
The reason I called out robotics specifically is that, yeah, if you think about "software" as just meaning "stuff that runs on a desktop computer", well, there are lots of spheres of human activity that occur away from a computer. But the field of robotics represents the set of things that computers can be made to do in the real world.
That being so, if non-robotics software becomes trivial to write, I expect we are in one of four possible worlds:
World one: General-purpose robotics-- for example, building robots that plant and harvest crops-- is possible for (AI-assisted) human programmers to do, but it's intrinsically really hard even with AI support, so human programmers/engineers still have to be employed to do it. This seems like a plausible world that we could exist in, and seems basically similar to our current world except that the programmer gold rush is in robotics instead of web apps.
World two: General-purpose robotics is really easy for non-programmers if you just make an AI do the robot programming. That means "programming" stops being especially lucrative as a profession, since programming has been automated away. It also means that every other job has been (or will very soon be) automated away. This is Fully-Automated Luxury Space Communism world, and also seems broadly plausible.
World three: General-purpose robotics is impossible at human or AI levels of cognition, but non-robotics AI-assisted programming is otherwise trivial. I acknowledge this is a world where mass layoffs of programmers would occur and that this would be a problem for us. I also do not think this is a very likely scenario; general-purpose robotics is very hard but I have no specific reason to believe it's impossible, especially if AI software development has advanced to the point where almost all other programming is trivial.
World four: World two, except somebody screwed up the programming on one of their robot-programming AIs such that it murders everyone instead of performing useful labor. This strikes me as another plausible outcome.
Are there possibilities I'm missing that seem to you reasonably likely?
For your point (3), I have no particular expectations or insight one way or another.
Hi, I just want to leave a stub response: you seem right, and after reading this two days ago I never got around to typing a proper reply.
We've been trying to innovate ourselves out of a job since the very beginning. I work with high-powered business people who frequently can't even manage the most basic computer tasks, let alone automate them. We'll have a niche so long as people continue to work. What a glorious day it will be when all that intelligence and ingenuity is put to tasks other than making ads serve 0.2% faster.
You really think AI is going to replace programmers? If it does then it will be smart enough to self-modify, and then career concerns are the least of our worries.
This does not work out the way you think it will. A p99-human-tier, parallelised, unaligned coding AI will be able to do the work of any programmer, and will be able to take down most online infrastructure by dint of its security expertise, but it won't be sufficient for a Skynet Uprising, because that AI still needs to solve for the "getting out of the digital box and building a robot army" part.
If the programming AI were a generalised intelligence, then of course we'd all be fucked immediately. But that's not how this works. What we have are massive language models that are pretty good at tackling any kind of request that involves text generation. Solve for forgetfulness in transformer models and you'll only need one dude to maintain that full-stack app instead of 50.
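(By "forgetfulness" I mean the fixed context window: the model only attends over its most recent N tokens, and anything older simply drops out. A toy sketch, with a made-up 8-token budget:)

```python
# Toy illustration of a transformer's fixed context window: the model
# only "sees" the most recent N tokens; older history is simply gone.
# The 8-token budget here is made up for the example.
CONTEXT_WINDOW = 8

def visible_context(history: list[str]) -> list[str]:
    """Return only the tokens the model can still attend to."""
    return history[-CONTEXT_WINDOW:]

history = "fix the login bug then refactor auth then update docs".split()
print(visible_context(history))
# ['login', 'bug', 'then', 'refactor', 'auth', 'then', 'update', 'docs']
# -> "fix the" has fallen out of the window and is forgotten.
```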
What I'm saying is that AIs are made of code. If they can write code then they can improve themselves. An AI able to code better than people can also code a better AI than people can. Maybe you don't think that will lead to recursive self-modification (I think there's at least a good chance that there are diminishing returns there), but just consider the advances we've made in AI in the last year, and you're supposing a future where not only have we gotten farther but then there's another entity capable of going farther still. At a bare minimum I think an AI capable of doing that is capable of replacing most other careers too.
I too am a programmer, but fortunately the German software industry is so far behind the times and slow to evolve, and German employment law is so strictly tilted in favor of employees, that I think I can safely coast halfway to retirement before I feel any market pressure.
That depends on how much of a difference AI will make, doesn't it? If advanced AI enables big American corps to churn out absurdly efficient code or highly advanced machine designs in minimal time, what will sclerotic German companies do?
I used to work at Siemens, and half the people employed as programmers there thought that automating things in Excel was black magic, never mind doing basic things in Python with libraries like pandas. The difference in productivity compared to its rivals is small enough that coasting on the momentum of past strengths might be sufficient to stay relevant in the present, but strong AI could plausibly make a lot of crusty German institutions obsolete in a way that our lawmakers won't be able to compensate for.
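For concreteness, the kind of "basic thing" I have in mind is a three-line pandas job like this sketch (the file and column names are invented for the example), which replaces an afternoon of manual Excel pivoting:

```python
import pandas as pd

# Hypothetical example: summarize a CSV export instead of pivoting it
# by hand in Excel. "sales.csv", "region", and "revenue" are made up.
df = pd.read_csv("sales.csv")
summary = df.groupby("region")["revenue"].sum().sort_values(ascending=False)
summary.to_excel("revenue_by_region.xlsx")
print(summary)
```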
Stop scaring me. If the statists are going to tax me anyways then the least I expect to receive in exchange is the illusion of security.