Culture War Roundup for the week of November 7, 2022

This weekly roundup thread is intended for all culture war posts. 'Culture war' is vaguely defined, but it basically means controversial issues that fall along set tribal lines. Arguments over culture war issues generate a lot of heat and little light, and few deeply entrenched people ever change their minds. This thread is for voicing opinions and analyzing the state of the discussion while trying to optimize for light over heat.

Optimistically, we think that engaging with people you disagree with is worth your time, and so is being nice! Pessimistically, there are many dynamics that can lead discussions on Culture War topics to become unproductive. There's a human tendency to divide along tribal lines, praising your ingroup and vilifying your outgroup - and if you think you find it easy to criticize your ingroup, then it may be that your outgroup is not who you think it is. Extremists with opposing positions can feed off each other, highlighting each other's worst points to justify their own angry rhetoric, which becomes in turn a new example of bad behavior for the other side to highlight.

We would like to avoid these negative dynamics. Accordingly, we ask that you do not use this thread for waging the Culture War. Examples of waging the Culture War:

  • Shaming.

  • Attempting to 'build consensus' or enforce ideological conformity.

  • Making sweeping generalizations to vilify a group you dislike.

  • Recruiting for a cause.

  • Posting links that could be summarized as 'Boo outgroup!' Basically, if your content is 'Can you believe what Those People did this week?' then you should either refrain from posting, or do some very patient work to contextualize and/or steel-man the relevant viewpoint.

In general, you should argue to understand, not to win. This thread is not territory to be claimed by one group or another; indeed, the aim is to have many different viewpoints represented here. Thus, we also ask that you follow some guidelines:

  • Speak plainly. Avoid sarcasm and mockery. When disagreeing with someone, state your objections explicitly.

  • Be as precise and charitable as you can. Don't paraphrase unflatteringly.

  • Don't imply that someone said something they did not say, even if you think it follows from what they said.

  • Write like everyone is reading and you want them to be included in the discussion.

On an ad hoc basis, the mods will try to compile a list of the best posts/comments from the previous week, posted in Quality Contribution threads and archived at /r/TheThread. You may nominate a comment for this list by clicking on 'report' at the bottom of the post and typing 'Actually a quality contribution' as the report reason.

I am not a historian, but I can see parallels between your art-cyborgs and computer programmers throughout the history of programming. I am told, through peers, media, professors, and culture, that in the early days all programs were written by hand, every operation scrutinized and thought out. But as computer time became cheaper and programmer time became more expensive, new programming languages were created as abstractions over the previous ones. C simplifies the construction of loops (which seem horrible in assembly). Python removes pointers (which are a pain-point for many programmers). GitHub Copilot and GPT-3 can remove a lot of boilerplate code with good prompts (though I've never used them, so they may not be that wonderful). Matlab, SPARK, Lisp, and others can probably fit into the progression, but I am unfamiliar with them.

It seems inevitable that programming will be abstracted further and further away from its origins and from what the computer is actually doing. People of course still do some work in assembly (though it is usually niche, from my understanding). People still use C (sometimes the problem really is a nail). And programming in general hasn't yet been overrun with computer-generated functions that can produce complex elements from plain English.

I expect, though, that the next wave of programmers is going to be like the art-cyborgs. They are going to be adept at automating 80% of the work with AI; the rest will be manual work to fix errors or edge cases and to combine multiple functions into a complete program. This AI automation is just another layer of abstraction between the programmer and what the computer is actually doing. If programming used to be the translation of English to machine instructions, and now it's the translation of English to code, eventually we will have the machines translate for us and programming will be the art of typing the right words into the computer machine.

i.e. The next programming language is English and programmers are just the ones skilled at putting the right words and symbols into the computer to get the right answers out.

But to not wander too far away, I feel the same way about art, or whatever other previously unfathomably, inherently human skill is now being done by AI (perhaps music next?). AI-generated images are a tool that future artists might use to enhance or speed up the translation of idea to canvas, not too dissimilar from the transition from physical to digital art (but as I said, I'm neither an artist nor a historian). Even if the art is purely collage-style, I don't think that makes it any less art. Making a collage still requires skill; given a magazine and scissors, it is hard to stumble into a good creation by pure chance. Even so, can a sufficient description not be art on its own merit? Books can paint wondrous pictures by leveraging your imagination (unless you suffer from aphantasia). Maybe it's not fair to compare writers to painters and illustrators, but that is a different argument than saying it's not art.

To summarize, because I need a conclusion and am not a writer: AI art is a tool that will be used by artists to quickly iterate on ideas. This parallels computer programming, where more abstract languages helped programmers quickly iterate on ideas.

The programming parallel makes sense to me, and there are plenty of similar parallels throughout history: humans using technology to make arduous processes easier, giving each human much more leverage, until that level of leverage becomes the norm and additional technologies are built on top of it to make those processes easier still, and so on and so forth. In the past it might have been the wheel, a cart, a bow, a plow, or a car; right now AI is one of them.

And at each step in the process, it seems like there have been people who were accustomed to the old norm decrying the new one as some abomination that lacked the "soul" or "essence" of the thing. A digital artist today is standing on the shoulders of giants, relying on the hardware and software development of engineers to contribute to their art, down to the specific brush strokes that the software developers programmed in. They would balk at traditional artists who insist that you must actually put paint on canvas using a brush you control physically, in order to capture the subtle nuances of muscle movement that result from the artist's unique training. And those artists would balk at even more traditional artists who insist that you must construct your own brushes by gluing together hair you gathered yourself and mix your own paint, in order to capture the subtle nuances of the choices made in constructing the tools, which show up in the final result through the tools being used. And those artists would balk at even more traditional artists who insist that you must raise the animal whose hair goes into the brush and tend the tree that supplies the wood for the brush or the painting surface, in order to capture the subtle nuances of the choices you made when prepping the raw material for the tools.

And each of these people would have a point. A very good point worth making. But the point would largely be lost on the person listening, who doesn't see those nuances as worth the trade-off of giving up an immense amount of efficiency and creative freedom. After all, with the additional efficiency, they can now create far greater, broader, and deeper works of art than with the previous methods. But to someone who's used to the old norms, these efficiency gains just look like shortcuts taken by people with only a cargo cult understanding of the process.

AI-generated imagery is different in just how much of a leap in abstraction it is compared to the earlier ones. A digital artist still has skills that would transfer very well to painting on canvas, while an AI tool user's skill doesn't need to go far beyond basic Photoshop and basic artistic composition. It's still so early that I don't think it's possible to tell, but based on how I've seen actual artists use AI-generated images over the last few weeks, I suspect it will be more similar to the other shifts than different; that in time, we'll see it as just another tool to increase an artist's leverage in expressing themselves.

Python removes pointers (which are a pain-point for many programmers)

Just to nitpick this sentence and say nothing about the rest of your essay: Python hardly removes pointers. It just removes the ability to do pointer arithmetic, like JavaScript, Go, C#, and many other newer languages do. The essence of a pointer, a cheap way to refer to a load of data elsewhere, is still there. If anything, Python removes the ability to refer to data in any other way: you can't pass a class instance by value, and you can't inline a struct inside another like you can in C.
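
To make the reference semantics concrete, here's a minimal sketch (the variable names are just illustrative):

data = [1, 2, 3]
alias = data        # no copy is made; both names refer to the same list object
alias.append(4)
print(data)         # [1, 2, 3, 4] -- the "pointer" behaviour is still there
clone = list(data)  # an explicit copy is the only way to get value semantics
clone.append(5)
print(data)         # still [1, 2, 3, 4]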

And Python even gives you a new way to subtly shoot yourself in the foot over pointers, which is the is operator. 5 is 5 and True is True, but 5**55 is not 5**55 (on my system). In fairness, the only real use case for this is thing is not None checks when dealing with optional arguments, or, perhaps more niche, a is b checks when you're absolutely sure a and b come from the same finite pool of objects, as a way to optimize away a more expensive deep a == b check.
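
To illustrate both the foot-gun and the legitimate use, a small CPython-specific sketch (the ints are built at runtime so constant folding doesn't muddy the results; exact interning behaviour is an implementation detail):

a = int("5")
b = int("5")
print(a is b)      # True in CPython: small ints (-5..256) are cached, so both names hit the same object
x = int("5") ** 55
y = int("5") ** 55
print(x is y)      # False: two distinct big-int objects
print(x == y)      # True: equality compares values, not identity

def greet(name=None):
    if name is not None:   # the idiomatic use: identity check against the None singleton
        return "Hello, " + name
    return "Hello, stranger"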

If programming used to be the translation of English to machine instructions, and now it's the translation of English to code, eventually we will have the machines translate for us and programming will be the art of typing the right words into the computer machine.

This is practically already the case for a lot of programming. The issue isn't actually writing the code, it's understanding what the problem even is.

People are both bad at and unused to thinking at a higher level of specificity and logic. This is apparent when you read anything people write that is longer and more complex than a one-or-two-paragraph comment. Their texts are rife with dangling modifiers that make even their central points unclear, and this is rarely a one-off slip either; they don't even realize the modifiers are there or why it matters.

C simplifies the construction of loops (which seem horrible in assembly)

This isn't hugely relevant to your point, but loops really aren't that bad in assembly. A basic for loop from 0 to 100 in assembly will be something like:

mov rcx, 0       ; set counter to 0
mov rax, 100     ; target number
.loop:
    ; whatever you want to do in your loop goes here
    inc rcx          ; counter += 1
    cmp rcx, rax     ; compare the counter to the target...
    jne .loop        ; ...and loop if the target isn't reached


That really isn't particularly bad, though certainly not quite as nice as C or another higher-level language.

;whatever you want to do in your loop goes here

^ Is doing all the heavy lifting of "isn't particularly bad".

Nah, not really. Sure, whatever you put in the loop body will certainly be more verbose and harder to write than if you wrote it in C. But that isn't relevant to the notion that loops in and of themselves are hard to write in assembly. That's why I didn't include anything inside the loop: to show that the loop itself isn't hard to write.

Now, if we're saying assembly in general is harder to write? I will totally agree. I wouldn't go so far as to say it's super horrible or anything, but it is harder for sure. There is a reason we invented higher level languages. My point here was simply that loops are not particularly bad.

Maybe loops were a bad example. To be fair, I never wrote anything more than the bare minimum in assembly; I don't have deep knowledge of it and went for the first thing I could think of. The main point was that C provides a bunch of niceties on top of assembly, and Python provides a lot of niceties over C. My mathematician peers would probably love to write their for loops in the style of for i,j,k ∈ +Z^3; i,j,k <= 10 { }, which is a lot more abstract than loops in previous languages. Eventually it might become For all positive integer 3-tuples with each value at most 10 { }. Heck, maybe even For all the points in a cube with side length 10 { }. We lose some specificity, like choosing which direction to iterate over first, but we are rewarded with reduced conceptual load.
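
For what it's worth, Python with itertools already gets close to that style; a minimal sketch using the bounds from your example:

import itertools

# Roughly "for i,j,k ∈ +Z^3; i,j,k <= 10": every positive-integer 3-tuple with
# each value at most 10, without writing the three nested loops yourself.
# (The iteration order is fixed for you -- the lost specificity mentioned above.)
for i, j, k in itertools.product(range(1, 11), repeat=3):
    pass  # whatever you want to do with the point (i, j, k) goes here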

Maybe what you mean is "assembly is bad at scopes" (in that the "whatever you want to do in your loop" has to remember which registers you've already used for other purposes outside of the loop; the same problem arises for procedure calls)?

Seems fair, but I agree with @SubstantialFrivolity that it's weird to characterize that as assembly being bad at loops. It's bad at naming. And getting the name of your complaint right is half the battle -- the other half is cache invalidation and the other half is off by one errors.

hey now

Our power (and income) relies on the kids trained on Java and Python viewing assembly as some sort of dark magic; I can't let you just hand out eldritch knowledge willy-nilly ;-)

The new programming literacy test: Write Fizzbuzz. In assembler. 6502 assembler. And it has to accept values up to 100,000. You may output a character by calling a subroutine at 0xFDED with the character in the accumulator, after which all register contents are lost.

(it will surprise nobody familiar with the subject that Fizzbuzz in 6502 can be found on the net)
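
(And for contrast, the same literacy test one abstraction level up; a throwaway Python sketch, since the whole point is how little it demands at this level:)

def fizzbuzz(n):
    # Multiples of 3 print Fizz, multiples of 5 print Buzz, both print FizzBuzz,
    # everything else prints the number; no registers or output subroutine to juggle.
    for i in range(1, n + 1):
        out = ("Fizz" if i % 3 == 0 else "") + ("Buzz" if i % 5 == 0 else "")
        print(out or i)

fizzbuzz(100_000)  # "values up to 100,000" comes for free here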

6502 is one of the instruction sets that were actually kind of nice to write by hand, though. I guess you'd do something that amounts to keeping your loop counter mod 15 in the low 4 bits of X and then use the remaining 13 bits in X and Y plus some flag you don't touch to get to 100k? I'd rather do this exercise than MIPS or some nasty SIMD and/or RISC special-purpose core...

Yes, all the 8-bit processors were pretty easy to write for by hand; the trick is that they don't have multiplication, division, or 24-bit numbers (most can do limited 16-bit arithmetic). Getting that right isn't hard, but it probably requires that you've done low-level work before.

MIPS isn't hard either, provided you just throw a no-op in the delay slot.

I'd rather use the 68k, which had a bit more to work with (and I still have a physical reference manual) and more consistent behavior than the fun quirks of the 6502.

Yeah, the 68000 series was probably the peak of hand-writable assembly language. But it's a 32-bit processor with multiplication and division, way too easy.

While you're at it, mine your own silicon for the CPU.

PrimitiveTechnology has entered the chat.

Sorry, I didn't mean to let guild secrets out into the open like that.

Honestly I find it kind of depressing just how many of our new hires now seem to view even C and basic command line functions as eldritch knowledge. Like come on, what do you guys even do in school these days?

Given that I just spent two days dealing with Linux kernel driver conflicts even though that is ostensibly not my job, I think you're right ;-)