
Friday Fun Thread for September 27, 2024

Be advised: this thread is not for serious in-depth discussion of weighty topics (we have a link for that), this thread is not for anything Culture War related. This thread is for Fun. You got jokes? Share 'em. You got silly questions? Ask 'em.


AI may invert the common wisdom that studying English is worthless and studying computer science is the wise decision. If AI takes off as anticipated, employers will look for word people who are trained in analyzing prompt replies

Look I'm one of the most strident defenders of the value of the humanities on TheMotte, but I have no idea what this is supposed to mean.

If AI is actually generally intelligent then you won't need to "analyze" its replies. You won't need to do prompt engineering, you won't need to do any of that. You'll just tell it to do something and it'll do it. Like any ordinary human. STEM professionals aren't all walking around in an autistic haze where they're unable to have basic interactions with other people. They're quite capable of telling subordinates what to do and verifying that the task was completed, using natural language. Hell, if it really came down to it, you could get the AI to analyze its own replies and do prompt engineering for you! That's what general intelligence entails.

And to the extent that AI falls short of general intelligence, it will likely continue as it does today as essentially a tool for domain experts. In which case, the most important factor for a human will still be their domain expertise and their ability to actually do the job at hand, rather than their AI whispering skills.

Of course there is something to be said for the skill of people management in general - being able to motivate people and keep them on task, playing office politics, things like that. Those are real skills that not everyone possesses. But if we're at the point where we have to wrangle our AIs because they're too moody/lazy/rebellious to fulfill our requests, that's a bigger problem - one that is more appropriate for the engineers (or the military) to solve, not English majors.

You'll just tell it to do something and it'll do it. Like any ordinary human.

Have you ever actually worked with humans? :-)

You'll just tell it to do something and it'll do it

??? What is this supposed to mean?

You convey to the AI what you want to see using precision in language. There is no way for the AI to know what you want without you supplying information to it. It's like an architect telling the builder what to build, or a sketch artist asking probing questions of a witness, or any other basic way in which humans use language to obtain what they want from other humans.

You convey to the AI what you want to see using precision in language. There is no way for the AI to know what you want without you supplying information to it.

Sure. And the limiting factor there is the person's technical domain expertise. Generic "communication" skills are of no help. A programmer can explain much better to another programmer how a piece of software should be constructed than an English major could.

invert the common wisdom that studying English is worthless and studying computer science is the wise decision.

employers will look for word people who are trained in analyzing prompt replies, using specified and nuanced language, [...]

This seems to presuppose that fluency with a language confers understanding of the concepts to which that language and its words can refer. Since you brought up computer science, I have doubts that an English major, without sufficient study of the prerequisites (formal logic, some calculus, basic algorithms, data structures, graph theory - i.e. "studying computer science"), would be able to understand a proof that an algorithm is correct and has certain performance characteristics, let alone judge its veracity. Complaints of "you never need that on the job" notwithstanding, an understanding of the actual problem domain under discussion in (for example) the English language is necessary to walk the AI through certain tradeoffs and designs and eventually to make decisions - or even to know that those tradeoffs exist at all. Further, being able to supply a coherent rationale for why a particular decision was made, beyond "God AI told me so", would also entail an understanding of those domain-specific complexities.

I suppose I can conceive of an AI so powerful that it will understand, weigh, measure, and decide upon all of these factors on your behalf, while simultaneously being able to discern your intent despite you not having a sufficient understanding of the vocabulary to express it, but in this case the English student would himself also become obsolete.

Phrasing your desired outcome and precisely specifying it requires not just fluency in a language and a mere acquaintance with a domain's vocabulary, but also an understanding of the concepts to which that vocabulary refers and the relationships between those concepts. Moreover, in order to have an intelligible and productive conversation with the AI - that is, to respond to its replies with follow-up questions, or with redirections in case it gets off track - one must understand the semantics, the meaning of those letters on your screen: the things to which they point, and the sense of the statement. Otherwise, you may as well be Searle's Chinese Room, "unintentionally" shunting around meaningless symbols without any understanding of what they mean, all the while maintaining the pretense that you are actually having a sensible dialogue and are capable of navigating and making decisions in the space of solutions and tradeoffs.

Language is merely the technology we use, the medium through which information is serialized and conveyed across minds (including past and future instances of my own brain). As a tool or "bicycle" of the mind, it is a good multiplier of one's cognitive capacity, in the same way a pickaxe is a good multiplier of your ability to mine rocks. However, knowing how to swing one will not give you any expertise in prospecting or any insight into where to mine for gold. Expertise with a language does not imply expertise with all the possible landscapes, concepts, and ideas that can be expressed within it.

This is how computers already work. English majors turned out to be pretty much the worst type of people at actually communicating clearly. Even now that AI knows some human language it's mostly CS types communicating successfully with it.

What makes you think this trend will reverse so drastically?