This weekly roundup thread is intended for all culture war posts. 'Culture war' is vaguely defined, but it basically means controversial issues that fall along set tribal lines. Arguments over culture war issues generate a lot of heat and little light, and few deeply entrenched people ever change their minds. This thread is for voicing opinions and analyzing the state of the discussion while trying to optimize for light over heat.
Optimistically, we think that engaging with people you disagree with is worth your time, and so is being nice! Pessimistically, there are many dynamics that can lead discussions on Culture War topics to become unproductive. There's a human tendency to divide along tribal lines, praising your ingroup and vilifying your outgroup - and if you think you find it easy to criticize your ingroup, then it may be that your outgroup is not who you think it is. Extremists with opposing positions can feed off each other, highlighting each other's worst points to justify their own angry rhetoric, which becomes in turn a new example of bad behavior for the other side to highlight.
We would like to avoid these negative dynamics. Accordingly, we ask that you do not use this thread for waging the Culture War. Examples of waging the Culture War:
- Shaming.
- Attempting to 'build consensus' or enforce ideological conformity.
- Making sweeping generalizations to vilify a group you dislike.
- Recruiting for a cause.
- Posting links that could be summarized as 'Boo outgroup!' Basically, if your content is 'Can you believe what Those People did this week?' then you should either refrain from posting, or do some very patient work to contextualize and/or steel-man the relevant viewpoint.
In general, you should argue to understand, not to win. This thread is not territory to be claimed by one group or another; indeed, the aim is to have many different viewpoints represented here. Thus, we also ask that you follow some guidelines:
- Speak plainly. Avoid sarcasm and mockery. When disagreeing with someone, state your objections explicitly.
- Be as precise and charitable as you can. Don't paraphrase unflatteringly.
- Don't imply that someone said something they did not say, even if you think it follows from what they said.
- Write like everyone is reading and you want them to be included in the discussion.
On an ad hoc basis, the mods will try to compile a list of the best posts/comments from the previous week, posted in Quality Contribution threads and archived at /r/TheThread. You may nominate a comment for this list by clicking on 'report' at the bottom of the post and typing 'Actually a quality contribution' as the report reason.
Notes -
Kathleen Booth, early British computer scientist, died one month ago on Sept 29 at the age of 100. The Register published an obituary for her titled "RIP: Kathleen Booth, the inventor of assembly language."
One month later, yesterday, a link to the obituary was (I think) the top post on both Hacker News and Reddit's programming subreddit.
Many of the commenters lamented that they had never heard of this highly influential person, and other commenters suggested that the reason most people hadn't heard of her is because she was a woman.
Ironically, I would contend, the only reason we are hearing about her is because she was a woman.
Calling her the inventor of assembly language may be a stretch. One HN comment points out that the IEEE has already given a computer pioneer award to David Wheeler for inventing the first assembly language in 1949.
You can read her 1947 paper and decide if the table at the end counts as the first assembly language. It is a numbered list of 25 operations, a symbolic description of their action, and in a few cases an English description of their operation. It is, at least superficially, similar to the list of 30 operations Wheeler created for the EDSAC.
Ran out of time to delve into:
Grace Hopper being falsely credited with inventing COBOL
If Ada Lovelace invented programming, and if she did but no one knew about it and it didn't influence anyone else, should we credit her?
Booth's credit being recently discovered/promoted in a 2018 Hackaday article.
Margaret Hamilton being the only programmer popularly known for Apollo work, despite leading a small team of 3 people.
Hamilton and Booth both marrying their bosses.
All of these women being impressive in their own right, and exaggerating their contributions for Girl Power is a disservice to them.
Giving proper credit for genuine insight in science and tech is often completely broken. I wouldn't be surprised to see "secret issues" with nearly every name that is famous enough to be known to this forum.
For Ada to be the first programmer, Charles Babbage must have designed the first programmable computer without ever coming up with any test programs for it.
He decided what instructions you'd need, and that you'd need multiple ones, then never thought about combining 2 or more together to do anything, ever.
Then lucky for him, a programmer appears!
I think this is disingenuous. Babbage certainly created the first design for a programmable computer as we know it, and clearly would have given considerable thought to combining instructions together. But if Lovelace was the first person to actually spend a significant amount of time constructing lists of instructions to solve particular problems then I don't think it's unreasonable to describe her as the first programmer.
By way of (concrete) analogy: in the fourth year comp eng CPU Design course I took as an undergrad, we all created pipelined RISC CPU designs in VHDL, and used an emulator to test them. To that end I did input several sequences of instruction to ensure that the (emulated) hardware was operating as it should, with the ALU generating the correct results, the pipeline correctly handling various hazards, etc., and while these might technically be "programs" I was not "programming" in any meaningful way. Like Babbage (and thousands of students before me) I created a design for a CPU which will never physically exist. Unlike Babbage, my non-existent CPU would never attract even a single programmer.
You designed your CPU for a class, long after people had come up with the idea of a programmable computer and determined what instructions are needed.
Every instruction that needs to be put in hardware adds complexity, so knowing that some operation can be achieved by another one is very valuable. Babbage's machine had memory, if statements and loops - quite an achievement to know these are needed to write algorithms without writing any.
Making a device programmable, rather than hardcoded, is an amazing mental leap and also a massive jump in complexity - one that I believe you would only make if you had at least 2 programs you wanted to run.
I'd also add that it's not the case that Lovelace was the first programmer.
https://www.bbvaopenmind.com/en/technology/visionaries/ada-lovelace-original-and-visionary-but-no-programmer/
"[T]oday it seems clear that it was Babbage, not Lovelace, who was the first programmer. The computer historian Doron Swade, a prominent world expert on the work of Babbage, settles the controversy with new data presented at the symposium now taking place at the University of Oxford to mark the 200th anniversary of the birth of Lovelace, and reveals a sneak preview to OpenMind: “I confirm that the manuscript evidence clearly shows that Babbage wrote ‘programs’ for his Analytical Engine in 1836-7 i.e. 6-7 years before the publication of Lovelace’s article in 1843. There are about 24 of such ‘programs’ and they have the identical features of the Lovelace’s famous ‘program’,” adds Swade. The historian says that the new tests are “unarguable” and that they “do not support, indeed they contradict the claim that Lovelace was the ‘first programmer’.”"
This is not to say that Lovelace wasn't impressive, but there's generally a tendency to vastly overinflate women's contributions to the field for political purposes and then to subsequently claim that they had their accomplishments stolen from them when plenty of these supposed under-recognised "accomplishments" are actually fictitious. In fact, one can claim that the accomplishment actually attributable to Babbage was "stolen" by Lovelace, and that the willingness to accept this was in large part a result of the fact that she was a woman.
Wow, thanks for sharing. I would love to see you post it on the Reddit adoration thread in /r/programming!
Yeah, this one is so common and so annoying. As you note there are many patents predating Lamarr and Antheil's: for example by Purington (in 1940 and 1935); by Broertjes in 1929; by Chaffee in 1922; by Blackwell, Martin, and Vernam in 1926, and even by Tesla in 1903. Jonathan Zenneck's text Wireless Telegraphy (which appeared in Germany in 1909) also contains a section on frequency hopping, and adds in a footnote that “This method was adapted by the Telefunken Co. at one time,” showing that the core principle was applied as early as the opening of the 20th century. Frequency hopping was most certainly not pioneered by Lamarr.
Additionally, she was at one point married to an Austrian munitions manufacturer whose work she had access to and interest in, and German engineers before World War II were aware of frequency hopping - it is not beyond the realm of possibility that the idea could've been derived from her former husband's colleagues.
Edit: added more
Booth's list is not an assembly language. It is a suggested list of operations for a machine that has not been built. An assembly language is a machine readable mnemonic representation of low level instructions. Wheeler's assembly language could be punched onto tape, which the machine would read and then translate to the proper bit patterns. Wheeler's 1950 Programme Organization and Initial Orders for the EDSAC contains the assembler (in assembly language).
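To make the distinction concrete, here is a toy sketch of what "machine-readable mnemonic representation" means in practice: the mnemonic text is itself input to a program that emits bit patterns. The opcodes and the 8-bit instruction format below are invented for the example; they do not reflect the EDSAC's or the ARC's actual instruction sets.

```python
# Toy assembler for an invented 8-bit instruction word
# (3-bit opcode, 5-bit operand). Purely illustrative.
OPCODES = {"ADD": 0b001, "SUB": 0b010, "LOAD": 0b011, "STORE": 0b100}

def assemble(source):
    """Translate lines like 'ADD 5' into machine words."""
    words = []
    for line in source.strip().splitlines():
        mnemonic, operand = line.split()
        words.append((OPCODES[mnemonic] << 5) | (int(operand) & 0b11111))
    return words

program = "LOAD 4\nADD 5\nSTORE 6"
print([format(word, "08b") for word in assemble(program)])
# → ['01100100', '00100101', '10000110']
```

The point is the pipeline, not the encoding: this is (very roughly) what Wheeler's initial orders did on the EDSAC, and what a table in a paper, however suggestive, does not do.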
The hackaday article suggests the assembly language and assembler is found in an unavailable document called "Coding for A.R.C." However, this document is in fact available, and while it contains a table mapping "order numbers" (bit patterns) to more human-readable instructions, there remains no indication that this language is machine-readable.
The other issue with using those reports to support the claim of "a woman invented assembly language" is they were written by Andrew Booth and Kathleen Britten (later Kathleen Booth). I'm fairly sure Andrew wasn't a woman.
The hackaday and other articles refer to "autocode", which doesn't appear in either those papers or the later "Mathematical Tables and Other Aids to Computation".
Autocode is mentioned here. It appeared much later (in the mid-1960s) and was a high-level language, not an assembly language at all.
The etiology of the gender achievement gap has been debated considerably. Boys and girls test roughly the same in terms of IQ, and in elementary school and even up through college, females tend to outperform males. Yet something changes after college. Parental obligations are the obvious factor, but I wonder if it can be explained by other factors, such as personality. Some have posited that men are more interested in 'things' whereas women are more interested in people. Maybe men are better at entering flow states and concentrating, which is necessary for success at competitive, cognitively demanding activities, whereas women get distracted too easily.
Yes, that men and women have different personalities and interests is one of the largest and most replicated results in psychology. I realize this is somewhat discouraged knowledge, but it's not too hard to find, e.g.:
Men and things, women and people: a meta-analysis of sex differences in interests
Why can't a man be more like a woman? Sex differences in Big Five personality traits across 55 cultures.
The Distance Between Mars and Venus: Measuring Global Sex Differences in Personality
Note that the differences tend to be larger than many of these suggest at first glance, as there tend to be multiple, at-least-partially-independent differences; if you take multiple traits at once, the means move further apart.
Scott also has a great discussion on it in Contra Grant on Exaggerated Differences
I also think the 'greater variability hypothesis', namely that men tend to have greater variability in most traits, is both true, and explains a lot of the differences we see (more homeless men, more Nobel prize winners), because it means many more men at the extremes.
If you look at top scores in the math SATs, for example (over 750?) you see many more men than women. Sorry, I don't have a source easily on hand for this one, but I've verified it a few times, and welcome you to do so. (Women tend to outscore on the verbal, and their scores tend to be more correlated, which has implications for chosen careers.)
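The arithmetic behind "many more men at the extremes" is easy to check: with equal means, even a modest variance gap produces lopsided tails, and the imbalance grows the further out the cutoff is. A quick sketch using illustrative standard deviations (the real size of any SD gap is itself contested):

```python
from statistics import NormalDist

# Hypothetical SDs chosen only to show the shape of the effect.
higher_var = NormalDist(mu=100, sigma=15)
lower_var = NormalDist(mu=100, sigma=14)

for cutoff in (130, 145, 160):
    # Ratio of the two groups' tail masses above the cutoff.
    tail_ratio = (1 - higher_var.cdf(cutoff)) / (1 - lower_var.cdf(cutoff))
    print(f"above {cutoff}: {tail_ratio:.2f}x overrepresentation")
```

The ratio climbs as the cutoff moves out, which is why identical means are compatible with very skewed representation among, say, top SAT-math scorers or the homeless.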
IQ tests are gender-normed. They're the same for boys and girls because they're forced that way.
Is Greater Male Variability a property of the real world, independent of the IQ norming process? I would be astounded to learn how our genetics or socialization could somehow cause GMV.
Really? It makes sense, so to speak, from an evolutionary standpoint, in that one male can impregnate many females, so a species can afford to produce a bunch of 'waste' men, as long as some turn out well. It's like VC investing in companies. Women, on the other hand, cannot be easily replaced -- if a tribe loses half its women, it loses half its next generation, more or less. If it loses half its men, it has labour and fighting problems, but no problems producing enough children.
There's also the detail that women have two X chromosomes, but men have an X & Y. That's why many diseases hit men more frequently (e.g. color blindness): they only need one defective X chromosome for it to hit. Similarly, if they get a helpful mutation, it isn't drowned out by the partner gene.
And for what it's worth, apparently you do see more male variance in the world, although I think in the mental domains it's not as clear-cut.
Wait, please explain. How are things gender normed? How are the scores manipulated?
The people who designed the original IQ tests included a balance of (I forget the specifics, take these as possible examples) spatial-reasoning tasks which men are better at, and conditional hypothetical verbal problems which women are better at. The number of each kind of question is calibrated such that men and women will both get an average score of 100. But it's a kludge; IQ tests could be more sensibly understood as "intelligence test to grade women against other women" and "intelligence test to grade men against other men" smushed into the same paper.
They did this because they couldn't be bothered to do two separate normalisation operations, but their laziness has had the tragic knock-on effect of giving entire generations the chronic misconception that women are as smart as men.
From what you've said it sounds like IQ is a grab-bag of different things. What does "women are as smart as men" mean? Are you saying:
- men & women have "different" intelligences (your post is phrased far more controversially), or
- IQ tests weight verbal questions more than spatial ones? (are there any other interpretations than the one I'm thinking of??)
I mention question weights because this guy sounds knowledgeable.
Might this be testable?
IQ is valuable as it tracks actual intelligence fairly closely; if you are good at IQ-test-style puzzles you're very likely actually intelligent. This will be detectable in other things like complex creative tasks, and even practical areas like average income, crime rate, and so on. If the test was biased towards one gender by adding questions that the gender did better at, but weren't as strongly correlated with actual intelligence, you'd expect their IQ scores to be less correlated with the downstream effects of intelligence itself than it is for the other gender.
(Theoretical example: If men were a lot dumber than women in general, one could add "ability to lift weights" or the like to the IQ tests until both genders had the same average score. This would however result in IQ no longer being able to predict the ability of men to e.g. lead a company, like it still would for women.)
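A minimal Monte Carlo sketch of that prediction, with every distribution and noise level invented for illustration: padding a score with items uncorrelated with the underlying trait dilutes its correlation with downstream outcomes.

```python
import random

random.seed(0)

def corr(xs, ys):
    """Pearson correlation, computed by hand to stay stdlib-only."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

# Latent ability drives a downstream outcome (e.g. job performance).
ability = [random.gauss(0, 1) for _ in range(5000)]
outcome = [a + random.gauss(0, 1) for a in ability]

# A 'pure' test measures ability with modest noise; a 'padded' test
# adds items uncorrelated with ability (the hypothesized kludge).
pure_score = [a + random.gauss(0, 0.5) for a in ability]
padded_score = [s + random.gauss(0, 2.0) for s in pure_score]

print(round(corr(pure_score, outcome), 2))    # noticeably higher
print(round(corr(padded_score, outcome), 2))  # diluted by the padding
```

If real IQ tests were padded this way for one sex, their scores should predict outcomes for that sex more weakly, which is in principle checkable against validity data.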
I don't follow, why give preference to spatial-reasoning tasks when defining who is smarter? Why not the one at which women are better?
I don't follow, who exactly is preferencing [whichever one men were better at] tasks? The IQ test designers didn't preference them, they finely balanced them against the [whichever one women were better at] tasks.
I'd be fascinated to know how men-smart (spatial) the average woman is, and how women-smart (verbal) the average man is.
Also to what degree each correlates to the positive traits IQ is usually held to be associated with (health, income, etc.)
My understanding is that on IQ tests and proxies for it, men and women score about the same (men maybe a bit lower on average), but the standard deviation is higher. At the tails, the representation due to greater variance dominates. (Can someone well-versed in IQ comment on this?)
Your understanding of Margaret Hamilton's role in the Apollo program is still closer to what activists want you to believe than to the actual truth.
She was the lead of the team that wrote the Apollo lander program; this much is true. What is less commonly known is that she joined that team as its most junior member, and only became lead after the code had already been written and the actual leads (whose names, ironically, basically nobody knows today) had moved on to more important projects.
It is so tiring. And then combined with people saying things like "women dominated early programming and computer science". Agh. I'm sure there were a number of important and talented women in computer science. I'm also pretty sure, from what I know of history, interests by sex, and the breakdown by sex over the 30 years I've been in the field, that it's pretty likely there were more men than women. (Yes, I know much of the original 'programming' work, which was like connecting cables in a telephone switchboard, was mainly done by women, as were many of the clerical calculations. That's something different.)
The Lovelace one is particularly annoying, since it appears both Babbage and someone else had made algorithms (a word from Arabic from waaay back) before her. But sure, all of the credit is hers, and men have just been stealing the idea from her.
Grace Hopper seems to have been pretty kick-ass. Ada Lovelace too. We don't need to make up shit so that they are even more kick-ass. It's deceptive and sad.
Still, though, (I'm terribly sorry) whenever I hear her name, I can't help but think something like "secret identity of superheroine The Locust."
'Lovelace invented algorithms' is obviously incorrect, but she did appear to have insight into the applications of the analytical engine that its designer didn't. I think it's fair to characterize her as the first person to look at the capabilities of a general-purpose computer and realize the limitless possibilities.
'Hopper invented COBOL' is a smear; COBOL is perhaps the most hated programming language and she didn't invent it, but Sammet et al., who did invent COBOL, heavily referenced Hopper's work.
link does not work
It's the same recurring pattern: people held up as geniuses whose contributions turn out to be manifestly mediocre, while people today who are objectively more talented get no recognition due to greater competition. As with everything in life, being early helps.
There does seem to be truth to the meme, though. If we use paper length as a proxy for difficulty, papers have gotten considerably harder: https://jeffbloem.files.wordpress.com/2018/07/screen-shot-2018-07-26-at-10-45-41-am.png I have observed this myself. Math and econ papers published in the 50s and 60s are way shorter and also seem conceptually easier compared to today, when econ papers have multiple authors and tons of dense, complicated stats methods. At the same time, acceptance rates for top journals have fallen by half since 1980: https://cepr.org/voxeu/columns/nine-facts-about-top-journals-economics It seems like a problem of squeezing water from a stone.
When I was reading Adam Smith I kept thinking, "dude, stop writing like that, this is not a special ed book".
That is an uncharitable interpretation. Adam Smith's writings are much easier to follow than more recent stuff.
As someone who has to write scientific papers for a living, I assure you it's all padding to bore / shame the journal editors into accepting it.
O, for the halcyon days when you could just send a bottle of gin along with your monograph and thereby be assured of publication in the August Annals of the Royal Society of Natural Philosophy.
Most non-economists can easily read, understand, and apply "The Nature of the Firm" and "The Problem of Social Cost" to their own economic activities. I don't think most economists are as likely to do the same with the latest econometric monstrosity.