
Small-Scale Question Sunday for July 21, 2024

Do you have a dumb question that you're kind of embarrassed to ask in the main thread? Is there something you're just not sure about?

This is your opportunity to ask questions. No question too simple or too silly.

Culture war topics are accepted, and proposals for a better intro post are appreciated.

What are the specific jobs most likely to be eliminated by AI?

To my mind, the obvious answer, based on currently demonstrated capabilities, is creative types who follow orders, e.g., graphic artists and ghostwriters.

Secondly, I'd put call center workers. An LLM can already carry on a conversation and stick to a script, and presumably loading a flow chart into it isn't that hard.
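To make "loading a flow chart" concrete, a call-center script can be encoded as a simple state machine and handed to the model. The sketch below is purely illustrative (the states, answers, and schema are made up, not any vendor's format):

```python
# A toy call-center flow chart as a dict-based state machine.
# Each state has a line for the agent to say and the transitions
# a caller's answer can trigger. All names here are hypothetical.
FLOW = {
    "greeting": {
        "say": "Thanks for calling. Is this about billing or a technical issue?",
        "next": {"billing": "billing", "technical": "tech_support"},
    },
    "billing": {
        "say": "I can help with billing. Do you want to dispute a charge?",
        "next": {"yes": "dispute", "no": "end"},
    },
    "tech_support": {
        "say": "Let's troubleshoot. Have you tried restarting the device?",
        "next": {"yes": "escalate", "no": "end"},
    },
    "dispute": {"say": "Transferring you to our disputes team.", "next": {}},
    "escalate": {"say": "Escalating to a technician.", "next": {}},
    "end": {"say": "Glad that's resolved. Goodbye!", "next": {}},
}

def next_state(state: str, answer: str) -> str:
    """Follow the flow chart; stay in place on unrecognized answers,
    exactly like a script-bound agent who repeats the question."""
    return FLOW[state]["next"].get(answer, state)
```

The point is that the "hard" part of the job is already a lookup table; the LLM only has to paper over the free-form parts of the conversation.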

Thirdly, there are a few low-level clerical tasks that LLMs can probably take over, like data entry.

Beyond that, though, a lot of the projections for AI-driven job losses seem to depend on other technologies (e.g., truck driver) or to be delusional about what LLMs can actually do (e.g., lawyer).

I'm close to an absolute believer in "AI will never truly eliminate people from the workforce who want to stay in the workforce." This is because AI (LLMs) is just another technology. Technology makes certain specific tasks obsolete, but it doesn't obsolesce the people executing those tasks. The classic example is that automobiles drastically reduced the need for farriers, but any competent or sufficiently motivated person from any walk of life could easily become a mechanic. Consumer automotive mechanics have enjoyed employment for over a century now.

What I think will change in a meaningful way is the relative balance of power (and, following that, compensation) for people who have naturally higher competency in soft skills, aka "people skills." Most PMC jobs today are a mixture of technical or semi-technical capability and soft skills. For example, lawyerin' is partially pseudo-code review, and doctors have to actually be able to make a medically sound diagnosis and design a course of treatment. Consultants and bankers less so, but they have their pseudo-technical jargon signals that one has to be familiar with. All of these occupations, however, require a healthy dose (haha, doctors!) of emotive human interaction.

In many cases, the "top" of these careers is 90% or more human interaction. The "top" bankers, lawyers, consultants, and (yes, probably) doctors are effectively very fancy salespeople. They delegate the actual work to their underlings and collect part of the fees charged for that work.

AI (LLMs) means that one person with a little technical knowledge can "in-source" a lot of the semi-technical work to the AI (which is still close enough to free compared to an actual employee or subcontractor) while doubling down on their sales abilities. This is actually even true for hard tech careers like software development: I know contract developers who have increased their on-paper workloads by 50% or more but are actually working the same or fewer hours, because Claude is handling a lot of code boilerplate and documentation for them.

That being said, not all soft skills are created equal. One PMC soft skill I've seen for my entire career is something you might call "fast-following politicking." These are folks who can gauge the relative social standing of various people and groups within a larger org and quickly align themselves with those on the way up while avoiding those on the way down (or out). They cannot, however, actually make their own decisions, set a course for others to follow, or demonstrate anything even close to "leadership." They used to be relegated mostly to HR and marketing positions, but in recent years they have metastasized into "Product Management" (aka the sociology of the corporate world).

I am optimistic these people will find themselves without jobs sooner rather than later. While their shenanigans are usually readily apparent to anyone who matters, they pay their rent to the company by being process automatons. These are the folks who have incredibly well-formatted spreadsheets and PowerPoints (with almost no content within that formatting). Yet, up until now, some of that basic information processing had to be done by someone ... and they were there. I believe AI will fix that.

So far, bigger models have not fixed the serious flaws that all LLMs share: they have no common sense and make boneheaded errors. Importantly, they don't learn from these errors without more training, so when your production LLM is messing up, there's often no way to give it feedback.

Maybe the next model will solve this problem. The progress has been fast enough that I wouldn't doubt anything. But assuming this isn't fixed right away...

Not replaceable: Truck drivers, call center workers, data entry

Replaceable: Lawyers and paralegals

What are LLMs good at? Searching for information and creating boilerplate. Instead of lawyers reading hundreds of pages of legal documents, just load them into an LLM and then ask it questions. Want to create 10 pages of fancy legalese? Just write it in plain English and have the LLM make it sound like a lawyer wrote it.
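As a sketch of what "load it into an LLM and ask it questions" involves in practice: a hundreds-of-pages filing usually exceeds a model's context window, so the common workflow splits it into overlapping chunks, queries the model per chunk, and combines the answers. A minimal, hypothetical chunking helper (function name and parameters are illustrative, not any library's API):

```python
def chunk_document(text: str, chunk_size: int = 2000, overlap: int = 200) -> list[str]:
    """Split a long document into overlapping character chunks so each
    fits in an LLM's context window. Overlap keeps sentences that straddle
    a chunk boundary visible in both neighboring chunks."""
    if overlap >= chunk_size:
        raise ValueError("overlap must be smaller than chunk_size")
    chunks = []
    start = 0
    while start < len(text):
        chunks.append(text[start:start + chunk_size])
        start += chunk_size - overlap
    return chunks

# Each chunk would then be sent to the model alongside the user's question,
# and the per-chunk answers summarized in a final pass (the "map-reduce" pattern).
```

Nothing here is lawyer-specific, which is exactly the point: the "reading hundreds of pages" part of the job reduces to plumbing.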

LLMs are probably already capable of doing 90% of what lawyers do.

On the other side, there's a good chance lawyers will use this new efficiency to create longer and longer legal documents that require an LLM to parse.

In criminal law the state pays the bills, and a hugely disproportionate percentage of senior government officials and politicians are (former) lawyers, so I doubt their sinecures end any time soon. In commercial/corporate law the profession as a whole is already hugely overstaffed and overpaid; the issue is that it's an endless defection game where you have to pay a ton of money to get 'the best' lawyers so you don't get fucked by various sneaky techniques employed by the other side, who also have 'the best' lawyers, and so on (both sets of lawyers went to the same institutions and are friends, and indeed even coworkers on other cases). It's essentially a tax on commercial activity facilitated via the legal profession.

In many ways a lot of finance is similar, so I sympathize, but the incentives lean against automation as long as the people making the decisions can hand out money to their friends, and that describes an extremely high percentage of PMC jobs in professional services (law, finance, consulting, accounting).

Lawyers using LLMs has mostly not gone well for them thus far.

Are we sure about that? There are surely some good examples of mistakes, but what we don't see are the millions of legal documents that are already searched or created with LLMs.

What I am proposing is not an LLM lawyer, but that an LLM can be a force amplifier for existing lawyers.