
Friday Fun Thread for September 27, 2024

Be advised: this thread is not for serious in-depth discussion of weighty topics (we have a link for that), nor for anything Culture War related. This thread is for Fun. You got jokes? Share 'em. You got silly questions? Ask 'em.


AI may invert the common wisdom that studying English is worthless and studying computer science is the wise decision. If AI takes off as anticipated, employers will look for word people who are trained in analyzing prompt replies, using specified and nuanced language, and consuming hundreds of pages of written text per day. English and history students would be great at this.

AI may invert the common wisdom that studying English is worthless and studying computer science is the wise decision. If AI takes off as anticipated, employers will look for word people who are trained in analyzing prompt replies

Look I'm one of the most strident defenders of the value of the humanities on TheMotte, but I have no idea what this is supposed to mean.

If AI is actually generally intelligent then you won't need to "analyze" its replies. You won't need to do prompt engineering, you won't need to do any of that. You'll just tell it to do something and it'll do it. Like any ordinary human. STEM professionals aren't all walking around in an autistic haze where they're unable to have basic interactions with other people. They're quite capable of telling subordinates what to do and verifying that the task was completed, using natural language. Hell, if it really came down to it, you could get the AI to analyze its own replies and do prompt engineering for you! That's what general intelligence entails.

And to the extent that AI falls short of general intelligence, it will likely continue as it does today as essentially a tool for domain experts. In which case, the most important factor for a human will still be their domain expertise and their ability to actually do the job at hand, rather than their AI whispering skills.

Of course there is something to be said for the skill of people management in general - being able to motivate people and keep them on task, playing office politics, things like that. Those are real skills that not everyone possesses. But if we're at the point where we have to wrangle our AIs because they're too moody/lazy/rebellious to fulfill our requests, that's a bigger problem - one for the engineers (or the military) to solve, not English majors.

You'll just tell it to do something and it'll do it. Like any ordinary human.

Have you ever actually worked with humans? :-)

You'll just tell it to do something and it'll do it

??? What is this supposed to mean?

You convey to the AI what you want to see using precision in language. There is no way for the AI to know what you want without you supplying information to it. Like, if you’re an architect telling the builder what to build, or a sketch artist with probing questions to a witness, or any other basic way in which humans use language to obtain what they want from other humans.

You convey to the AI what you want to see using precision in language. There is no way for the AI to know what you want without you supplying information to it.

Sure. And the limiting factor there is the person's technical domain expertise. Generic "communication" skills are of no help. A programmer can explain much better to another programmer how a piece of software should be constructed than an English major could.

invert the common wisdom that studying English is worthless and studying computer science is the wise decision.

employers will look for word people who are trained in analyzing prompt replies, using specified and nuanced language, [...]

This seems to presuppose that fluency with a language confers understanding of the concepts to which that language and its words can refer. Since you brought up computer science, I doubt that an English major, without sufficient study of the prerequisites (formal logic, some calculus, basic algorithms, data structures, graph theory - i.e. "studying computer science"), would be able to understand, let alone judge, a proof that an algorithm is correct and has certain performance characteristics. Complaints of "you never need that on the job" notwithstanding, an understanding of the actual problem domain under discussion in (for example) the English language is necessary to walk the AI through certain tradeoffs and designs, and eventually to make decisions - or even to know that those tradeoffs exist at all. Further, being able to supply a coherent rationale for why a particular decision was made, beyond "God AI told me so", would also entail an understanding of those domain-specific complexities.
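As a toy illustration of the point (my own example, not from the post above): suppose an AI offers two answers to "tell me whether any two numbers in a list sum to a target". Both are correct English, and both are correct code - but judging which one to accept requires knowing about time complexity, which no amount of fluency in English supplies.

```python
def has_pair_quadratic(nums, target):
    """O(n^2): checks every pair. Correct, but slow on large inputs."""
    for i in range(len(nums)):
        for j in range(i + 1, len(nums)):
            if nums[i] + nums[j] == target:
                return True
    return False

def has_pair_linear(nums, target):
    """O(n): one pass, remembering previously seen values in a set."""
    seen = set()
    for x in nums:
        if target - x in seen:  # have we already seen x's complement?
            return True
        seen.add(x)
    return False
```

Asked to choose between these for a list of a hundred million entries, "precision in language" doesn't help; knowing what a nested loop costs does.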

I suppose I can conceive of an AI so powerful that it will understand, weigh, measure, and decide upon all of these factors on your behalf, while simultaneously being able to discern your intent despite you not having a sufficient understanding of the vocabulary to express it, but in this case the English student would himself also become obsolete.

Phrasing your desired outcome and precisely specifying it requires not just fluency in a language or a mere acquaintance with a domain's vocabulary, but an understanding of the concepts to which that vocabulary refers and the relationships between those concepts. Moreover, in order to have an intelligible and productive conversation with the AI - that is, to respond to its replies with follow-up questions, or redirections in case it gets off track - one must understand the semantics, the meaning of those letters on your screen: the things to which they point, and the sense of the statement. Otherwise, you may as well be Searle's Chinese Room, "unintentionally" shunting around meaningless symbols without any understanding of what they mean, all the while maintaining the pretense that you are actually having a sensible dialogue and are capable of moving around and making decisions in the space of solutions and tradeoffs.

Language is merely the technology we use, the medium through which information is serialized and conveyed across minds (including past and future instances of my own brain). As a tool or "bicycle" of the mind it is a good multiplier of one's cognitive capacity, in the same way a pickaxe is a good multiplier of your ability to mine rocks. However, knowing how to swing one will not give you any expertise in prospecting or insight into where to mine for gold. Expertise with a language does not imply expertise with all the possible landscapes, concepts, and ideas that can be expressed within it.

This is how computers already work. English majors turned out to be pretty much the worst type of people at actually communicating clearly. Even now that AI knows some human language it's mostly CS types communicating successfully with it.

What makes you think this trend will reverse so drastically?

Will they? I can see some of those skills transferring, but the patterns of behavior and the quirks and nuances that AIs have are going to differ from those a traditional English student is used to. I would think that a healthy amount of computer science would help one understand the underlying mechanisms of AI, and thus give a better idea of how and why it misbehaves when it does, and how to adjust prompts to fix that.

I expect that the value of an English degree will go up, but not enough to surpass that of a CS degree, since the value of those will also go up. Probably the best case would be people who double majored in English and CS, but I believe those are rare.

using specified and nuanced language

This has always been useful, and it's less about studying English than about precision of thought. Linguistics, maybe. End of day, it's about writing crisp documents.

The best software architects write the best requirements. The best managers drive concise meetings. The best lawyers are linguistic magicians. Don't get me started on politicians, tv experts and the entertainment business. Even doctors have to be insanely precise in how they communicate the specifics of a disease.

It's no surprise that Stanford's Symbolic Systems program has produced a significant number of tech billionaires. It is a field studying exactly what you're talking about.

End of day, it's about writing crisp documents.

Honestly, I do yearn for a job where this is highly valued. Too often I've found myself in places where it's just bull-in-a-china-shop stuff: doing things mindlessly and then gathering up the debris mindlessly. I recently wrote a 6-page memo to propose a project and outline its week-to-week operating mechanisms, medium- and long-term anticipated benefits, and foreseen challenges. May as well have scratched my scrotum for 3 hours for all the good it did.