
Friday Fun Thread for September 27, 2024

Be advised: this thread is not for serious in-depth discussion of weighty topics (we have a link for that), and it is not for anything Culture War related. This thread is for Fun. You got jokes? Share 'em. You got silly questions? Ask 'em.

You'll just tell it to do something and it'll do it

??? What is this supposed to mean?

You convey to the AI what you want to see using precision in language. There is no way for the AI to know what you want without you supplying information to it. It's like an architect telling the builder what to build, a sketch artist asking a witness probing questions, or any other basic way in which humans use language to obtain what they want from other humans.

You convey to the AI what you want to see using precision in language. There is no way for the AI to know what you want without you supplying information to it.

Sure. And the limiting factor there is the person's technical domain expertise. Generic "communication" skills are of no help. A programmer can explain much better to another programmer how a piece of software should be constructed than an English major could.

invert the common wisdom that studying English is worthless and studying computer science is the wise decision.

employers will look for word people who are trained in analyzing prompt replies, using specified and nuanced language, [...]

This seems to presuppose that fluency with a language confers understanding of the concepts that language and its words can refer to. Since you brought up computer science, I doubt that an English major, without sufficient study of the prerequisites (formal logic, some calculus, basic algorithms, data structures, graph theory - i.e. "studying computer science"), would be able to understand a proof that an algorithm is correct and has certain performance characteristics, let alone judge whether the proof holds. Complaints of "you never need that on the job" notwithstanding, an understanding of the actual problem domain under discussion in (for example) the English language is necessary to walk the AI through certain tradeoffs and designs and eventually make decisions - or even to know that those tradeoffs exist at all. Further, being able to supply a coherent rationale for a particular decision beyond "God AI told me so" would also entail an understanding of those domain-specific complexities.
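To make the tradeoff point concrete, here's a toy Python sketch (my own illustration, not something from the thread): both functions below give the same correct answer, but knowing which one to ask for - and why - takes an understanding of time and space complexity, not just fluent English.

```python
def has_duplicates_quadratic(items):
    # Compares every pair of elements: O(n^2) time, O(1) extra space.
    for i in range(len(items)):
        for j in range(i + 1, len(items)):
            if items[i] == items[j]:
                return True
    return False


def has_duplicates_linear(items):
    # Tracks seen values in a set: O(n) time, O(n) extra space,
    # and requires the elements to be hashable.
    seen = set()
    for item in items:
        if item in seen:
            return True
        seen.add(item)
    return False


print(has_duplicates_quadratic([1, 2, 3, 2]))  # True
print(has_duplicates_linear(["a", "b", "c"]))  # False
```

A prompter who can't articulate that distinction can't tell the AI which constraint matters for their input sizes, or recognize when the AI has picked the wrong one.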

I suppose I can conceive of an AI so powerful that it will understand, weigh, measure, and decide upon all of these factors on your behalf, while simultaneously being able to discern your intent despite you not having a sufficient understanding of the vocabulary to express it, but in this case the English student would himself also become obsolete.

Phrasing your desired outcome and precisely specifying it requires not just fluency in the language and an acquaintance with a domain's vocabulary, but also an understanding of the concepts that vocabulary refers to and the relationships between those concepts. Moreover, in order to have an intelligible and productive conversation with the AI - that is, to respond to its replies with follow-up questions or redirections when it gets off track - one must understand the semantics, the meaning of those letters on the screen: the things they point to, and the sense of the statement. Otherwise, you may as well be Searle's Chinese Room, "unintentionally" shunting around meaningless symbols without any understanding of what they mean, all the while maintaining the pretense that you are actually having a sensible dialogue and are capable of moving around and making decisions in the space of solutions and tradeoffs.

Language is merely the technology we use, the medium through which information is serialized and conveyed across minds (including past and future instances of my own brain). As a tool or "bicycle" of the mind it is a good multiplier of one's cognitive capacity, in the same way a pickaxe is a good multiplier of your ability to mine rocks. However, knowing how to swing one will not give you any expertise in prospecting or insight into where to mine for gold. Expertise with a language does not imply expertise with all the possible landscapes, concepts, and ideas that can be expressed within it.

This is how computers already work. English majors turned out to be pretty much the worst type of people at actually communicating clearly. Even now that AI knows some human language, it's mostly CS types communicating successfully with it.

What makes you think this trend will reverse so drastically?