hooser

0 followers   follows 0 users   joined 2022 October 02 12:32:20 UTC

User ID: 1399

No bio...

I don't know about all flavors of communism, but the Soviet Union's ideal was that everyone eat at canteens.

Where before the expectation was to dress formally in the office, now "smart casual" rules the day (if that)

It's useful to have a bit of historical perspective on what's considered acceptable or necessary. For example, the tuxedo--currently the most formal of men's wear--was originally casual wear among the upper class:

The tuxedo ... traces its origins back to 1865 when Prince Edward VII introduced it as a stylish alternative to the traditional tailcoat. This groundbreaking garment, initially referred to as a "dinner jacket," was tailored by Henry Poole & Co. and featured a sleek black jacket paired with matching pants, which made it ideal for dining and more casual occasions.

It took about two decades for the tux to be accepted as formal wear--in the US, which then as now was far more into being informal:

The tuxedo gained popularity in the United States in 1886, thanks to James Brown Potter and his wife Cora, who famously wore it to the Autumn Ball in Tuxedo Park, New York. This event marked a pivotal moment in fashion history, as the tuxedo began to shift from informal evening wear to an accepted form of formal dress.

Another example: the corset. I remember watching a Perry Mason episode (though I can't remember which one) where the female witness gets scolded for not wearing a corset to court. The exact quote: "Save the jingle for the husband." That's either the late 1950's or early 1960's.

Another example: jeans, which were workers' clothes.

I am very happy that, when I go out in public, I am not expected to put on a corset and stockings and wear heeled pumps, à la the 1950's. My knees thank me that I can wear sneakers; my legs are much warmer in the winter in jeans or warm cargo pants; and it's nobody's business what underwear--if any--I choose to wear. If that means that I have to encounter people who chose to go out in Crocs, sweatpants, and a tube top, then that's a trade-off I am willing to make.

While we are at it, I will also throw in the Chinese foot binding:

It has been estimated that by the 19th century 40–50% of all Chinese women may have had bound feet, rising to almost 100% among upper-class Han Chinese women.

There are many historical examples of norms and expectations that are either arbitrary or actively counterproductive. Therefore, when a current norm or expectation is getting relaxed, I would examine it on its own merits before decreeing the relaxation bad. Is a business suit really superior to "smart casual" for all white-collar work?

(Personal anecdote: I know an NVIDIA software engineer whose boss explicitly warned him not to wear a tie to work. In software engineering lore, the shirt-and-tie is associated with the famous IBM dress code for its engineers, and therefore with a stodgy, inflexible corporate ethos.)

On the rare days that I am interested in an update on wars / conflicts, I go to the Institute for the Study of War. They provide in-depth analysis (blog-post length) based on available public info, and they have pretty good interactive maps.

Do any women here have any advice for how they'd like this subject broached if they were on the receiving end of the conversation

OK, I qualify. It's much more effective to ask what her body can do (in terms of physical activity) rather than what her body looks like. If, despite her skinniness and near-veganism, she's strong as an ox and endures like a camel, then there is no problem with her diet.

Do the two of you do any physical activity together (other than intimacy) which puts her strength and endurance to the test? Do you go for long hikes? Swim? Play tennis? Climb? Go hunting? Clear debris? Dig ditches? If she does any physical activity that brings her to the brink of her strength or endurance, then improving her strength / endurance is a motivator. If, on the other hand, she avoids any activity that requires strength / endurance because she has none, you have an opportunity to start doing an activity with her at the beginner level. Not long hikes, but going for a walk. Not swim but splash in the pool. Not climb but gentle scramble. Once she starts doing the physical activity (with the motivator being your shared company), and she gets to enjoy doing the activity (positive reinforcement is a hell of a trainer), then she'll be in the position where she'd value increasing her strength / endurance.

As for looks: once your girl is into a physical activity, she'll put on the muscle, look less skinny, and (yes) feel better.

If you value these questions as highly as I do, and you value high-quality work on these questions, then there is a tangible ROI in paying people to work on this stuff full time.

Only if the incentives of that work align with not only producing high-quality work on these questions, but also effectively disseminating the results. Current incentives in academia do not.

Yes, some academics still produce great work (aimed at others in their sub-field). Even disseminating their results among their sub-field peers is a challenge because of the deluge of poor-quality stuff that everyone (including them) puts out to inflate their publication record.

I have been on enough hiring and promotion committees to witness first-hand that most committee members (a) will count the number of publications, taking into account their frequency and recency and the quality of the journals based on SJR metrics, and (b) will not even bother reading any of the works if the applicant is in even a slightly different sub-field, but will instead rely on the blurbs in reference letters / external reviews, which (b1) tend to be way too nice and uncritical, and (b2) tend to do about as good a job conveying the actual qualifications of the candidate in their field as we professors do when we write a letter of recommendation for a student's grad school application.

(And gods forbid the candidate tries something interdisciplinary and we can't find a reviewer with decent knowledge of both fields. Or collaborates with someone outside their field. In math at least, that tends to look like this: the mathematician uses some low-level mathematics to make a reasonable model in the context provided by the other collaborators; if the reviewer is a mathematician without much knowledge of the other field, the reviewer isolates the mathematical model, realizes that it's pretty low-level math, and reports that in the review. The hard part of the collaboration is the endless back-and-forth with the non-mathematicians to get them to elucidate what, specifically, they want to model, and to commit to particular measurements and parameters. None of that work comes through in the review of the final polished publication, and it is certainly not apparent to any pure mathematician.)

As a result, those who rise in an academic field must go through several such filters: at least one successful tenure-track hire; successful tenure review; successful full-professor review, and any reviews in-between. The process selects for those who stay firmly in the confines of their sub-field, making numerous and safe publications. By the time one gets through these filters, one might as well stay in that lane where it's safe and comfortable, and where one has already achieved some level of prominence and prestige.

At that point one becomes the cog that perpetuates the system: one gets swamped by requests for reviews (manuscripts submitted to a journal that published your work; external reviews of tenure / promotion candidates; letters of recommendation for junior colleagues; letters of recommendation for students). That's a shit-ton of work, and one feels obliged to take on some of it (to keep one's connections), so one develops streamlined methods for quickly writing those reviews. Which results in more bland, overly-positive-while-saying-little-of-substance reviews that others then rely on for admittance/publication/hiring/tenure/promotion. And because they know (and you know) the worth of those reviews, everyone falls back on something concrete like the SJR metrics, which feeds Goodhart's law and further dilutes the few high-quality works that do indeed get produced and published.

So no, the current academic system's incentives do not align with producing a few but high-quality explorations into important questions.

Similarly, pulled from recent news, why would anyone spend a million dollars on studying if cocaine makes Japanese quail more sexually active?

Because both quail and quail eggs are delicious. Farmers raising quail for food, or quail hens for eggs, are definitely interested in what makes quail more sexually active, especially if it can be made economically viable to incorporate into feed.

The replication crisis, while bad in itself,

The replication crisis (e.g., in psychology) is very good for the field and for humanity: it more accurately reflects the true state of the field, compared to what we thought. The theory of replication is why psychology bills itself as a science; the root problem was that replication wasn't done in practice. If every new result required two replications before being tentatively accepted as possibly describing something real, then psychology wouldn't have a replication crisis, it would just have replication, as a science should.

(On the contrary, beware any field that claims the status of science and either doesn't have the practice of replication baked in, or isn't having a replication crisis. I am looking at you, Sociology. Away to the humanities with you.)

The statistic I like to keep in mind is: 6%. That's the proportion of proposed medical treatments entering FDA stage-I trials that successfully make it past stage III to FDA approval. It takes serious financial backing to start stage I (which is when one tests the treatment on a handful of healthy adults to check for adverse effects), so only the most promising treatments--those with solid theory for why they should work, and which have been extensively tested in the lab and (if appropriate) on animals--even start the FDA approval process.

So I reckon the strongest academic theories in psychology are maybe epistemologically on par with pre-FDA-stage-I medical theories. If someone were to actually put serious money into backing as rigorous a test of an application of such a theory as the one required by the FDA, then I expect that only 6% would make it.

I was gonna read the philosophical paper and scoff at its navel-gazing, but it turns out it's quite interesting and got me thinking about applications of its ideas to AI.

To argue his thesis (that just because you "obviously" feel stuff (the generalized Moore argument) doesn't necessarily mean that you actually subjectively experience it in the moment), he distinguishes between the subjective experience (phenomenal), the behavioral aspects associated with the experience (functional), and the value we assign to the experience (normative).

I don't know what it's like to be you (or anybody other than myself). So even if the generalized Moore argument feels compelling to me when applied to myself (I feel stuff, so obviously I have phenomenal experiences), it takes a leap of generalization for me to also apply it to you (I am human, and others are human, so their experiences are probably like mine). That's even though I have lots of evidence that other people don't feel like me and don't experience the world like I do. Still, it's safer to err on the side of treating everyone like Player Characters in their own right and assume that they also feel stuff (phenomenal), because otherwise they'll think badly of me (normative) and gang up against me (functional).

But what about AI? It's not going to think badly of me and gang up on me if I treat it like it doesn't have feelings. I can adjust levels of politeness in my prompts if I think it will make a difference in the output (functional), and disregard the normative notions of proper communication.

(Of course, the same idea applies to animals. Well, I wasn't going to donate to PETA anyway.)

Come to think of it, I have heard versions of these ideas before... in Theravada Buddhism. Does it count as being "state-sponsored" if the founder was a prince?

I agree that such critical reflection is important; I disagree that government funding is either necessary or sufficient to promote such reflection. If anything, it seems to me that government funding is more likely to corrupt either the critical or the reflective part of it. Such corruption can happen by the State funding its apologists. See, for example, just about anything officially published in the USSR on the well-being of the Soviet people.

Such corruption can also happen by elite-group capture, which is what is happening now. While I don't know how, specifically, the Marsden Fund was administered, I know how other such funds work, and I don't expect anything different here. If they give grants in [$academic field] for [$purpose], they get some prominent people in [$academic field] (as prominent as they can get, at least) to evaluate applications for their worthiness in [$academic field] and their adherence to [$purpose]. So in fact all such funds purposely start out as elite-group capture: who else would you ask to evaluate a chemistry proposal but chemists? And that's fine, so long as you can trust [$academic field] to fruitfully pursue [$purpose]. But once the field gets an influx of members who are diverting the field from [$purpose], and they rise to prominence within the field, then they will become the evaluators who determine where funding goes, and it will go away from [$purpose].

At that point, if you care about [$purpose], start by turning off the funding spigot.

The Science post screwed up the link to the announcement; here's one that works. Despite Science's spin, the overall reporting is accurate. Let me de-spin it a bit, with quotes from the original announcement:

“The Government has been clear in its mandate to rebuild our economy. We are focused on a system that supports growth, and a science sector that drives high-tech, high-productivity, high-value businesses and jobs,” [says the Minister of Science, Innovation and Technology]. “I have updated the Marsden Fund Investment Plan and Terms of Reference to ensure that future funding is going to science that helps to meet this goal.”

An elected government chooses a popular priority--economic growth--and a ministry aligns with that priority.

The new Terms of Reference outline that approximately 50 per cent of funds will go towards supporting proposals with economic benefits to New Zealand. “The Marsden Fund will continue to support blue-skies research, the type that advances new ideas and encourages innovation and creativity and where the benefit may not be immediately apparent. ..."

So applications to this fund should make a reasonable case either that they will benefit NZ economically or that they have some potential to lead to that. That's in line with the priority the elected government has established for itself (economic growth).

“The focus of the Fund will shift to core science, with the humanities and social sciences panels disbanded and no longer supported. ..."

I can see why humanities scholars and social scientists would be upset: nobody likes to have their source of funding taken away. I have but two questions: (1) do they disagree with the current elected government prioritizing economic growth, or (2) do they argue that the humanities and social science projects funded by this fund lead to economic growth as well as the core science projects do?

If the disagreement is with the first question, then the response is: elections have consequences. New Zealand's economy is doing poorly, people are worried, so they elect a government with a mandate to grow the economy. While other goals have value, they have lost priority.

Is there any argument on the second front? The Science article hints at the possibility:

The cuts and priority changes suggest officials don’t realize commercially viable research is often underpinned by discoveries in fundamental science, says Nicola Gaston, co-director of the MacDiarmid Institute for Advanced Materials and Nanotechnology at the University of Auckland.

... but there is absolutely no follow-up or development of this argument. In fact, it's clear that "fundamental science" of the kind that an Institute for Advanced Materials and Nanotechnology is likely to do will indeed continue to be funded, and likely at a higher rate than before, now that the funds are not going towards social science / humanities. Unless, despite the name, that institute is pursuing non-core, non-fundamental-science projects (e.g., "How would an advance in nanotech affect [$historically-disadvantaged-minority]?" or "Indigenous knowledge of microchips").

That brief hint of a beginning of an argument is followed by a conflation of economics and social cohesion, and then by how this will impact Maori-led research. So bupkis.

Your argument is at least more developed: you think that growing the economy through pursuing advances in science and tech leads to a decrease in the well-being of the population. I wonder, though: New Zealanders adopt science and tech products made elsewhere, and (let's take your claim at face value for the moment) suffer the social consequences anyway. Isn't that strictly worse than having NZ companies develop the products domestically and at least capture the economic benefits?

Oh, I see. Yes, I think so. Many of the congregations around where I live are very welcoming of newcomers, and seem even more so with people who were never religious. The devout Protestants I know seem especially susceptible to simple redemption narratives ("I grew up an atheist, but now..."), and would have fewer questions for someone like that who wants to join their congregation. With someone like me, they'd want to know how I came to grok that the denomination of my youth isn't the right Christian faith while theirs is.

Imagine growing up irreligious, with parents who don't attend a church of any kind. Would "church first, then government safety net" still be your ordering in seeking help?

If I imagine that I didn't know that a church is more responsive than the government, then indeed I wouldn't have that mental ordering. Then again, I am probably missing ideas about other resources that are more responsive than the government, because I don't have prior experience in them.

and we have not managed to get past the standing quietly for two hours part of being to church with young children.

Huh, the Sunday service is only two hours now! I remember it being three. (I love being old enough to say "back in my day...")

Fortunately, church people are very understanding of kids' limitations. I remember the parents taking their toddlers outside (and, discreetly, their tweens as well) once their progeny started fidgeting.

I don't really understand your point here. you seem to be agreeing with me that education is not something generally unnecessary, so it doesn't explain the bimodal distribution mentioned by OP.

I think I see: OP conflated (or rather, placed in extreme proximity) education as getting-credentials and education as reading books. The getting-credentials side has a coming-together pattern (more people are going for education credentials, so there is more of a continuum in the types of credentials and their quality), but the reading-books side has a coming-apart pattern (the majority read practically no books; a small minority read lots and lots of books).

the vast majority of people today will have to get to grips with how to work their smartphones and smart watches and smart TVs and Fitbits and so on and so forth, ...

That fits the coming-together pattern, but with an extra feature: because many more people need to grapple with situations that require some of the skill, the market responded by making such situations easier to handle.

This is similar to the pattern in education credentialism: because many more people are playing the education credentialism game (e.g., getting a Bachelors degree), the market responded by making it easier to accomplish.

I gotta say, though, sometimes it's not just the market. Take set theory. Reading Cantor's original work is challenging even for a professional mathematician. But take about a century of iterations of people communicating the essentials to ever-broader audiences. By the 60's we have "New Math" books for elementary-school kids, which confuse the crap out of most math teachers but which the top 10% grok and love. And a few decades later, Venn diagrams become essential components of memes.

... but the actual knowledge of how computers, operating systems, and actual physical electronics in general work has arguably declined.

And that's the coming-apart pattern.

There is a scene in Star Trek IV where Scotty tries to operate an 80's computer by talking into the mouse. After realizing his mistake, he looks at the keyboard, says "How quaint!", and then proceeds to speed-type. It's a funny scene, but it has always rubbed me the wrong way: why would anyone who never needs to type pick up that skill? Or, for that matter, the skill of operating whatever chemistry-model software that company was using? Not even the assumption that Scotty is the-best-of-the-best geeks can patch this hole.

I respectfully disagree.

Physical appearance in the post I responded to refers specifically to physical fitness. Half a century ago, general physical fitness was broadly necessary (e.g., many people had to walk or do physical labor), and now it is much less so (e.g., a much smaller proportion of people have to walk or do physical labor, and for the latter OSHA mandates all kinds of supportive equipment).

Sex in the post I responded to refers primarily to marriage and its dissolution, so "how-to-get-and-stay-married" is the relevant skill here.

Finally, playing-the-game-of-credentialism (a.k.a. "education") is without a doubt a more widely practiced skill now than it was fifty years ago. About 90% graduate high school; of those, half go to college; of those, about half graduate with a degree. Fifty years ago, a much higher percentage of people dropped out of high school, and less than 10% of those who graduated went on to college. (These stats are approximate but broadly correct.)

The credentialism game has changed to accommodate the large influx of people seeking credentials.

I especially like your Christianity-as-skill idea, because it fits, and yet I hadn't thought of it that way before.

Recently, I [an atheist who grew up Eastern Orthodox] came to the conclusion that, if ever shit hit the fan in my life and my personal social network wasn't up for the task, I would head to Church--of whichever denomination is closest to Eastern Orthodox and physically proximal to me. Church first, then check what safety net the government has to offer. Because the Church tends to respond faster to any crucial need and doesn't require paperwork.

(US governments offer a pretty good safety net to anyone who is willing and capable of (a) accurately filling out lots of forms, (b) letting go of all of one's earthly possessions, and (c) waiting up to several years if necessary.)

My atheism in particular, and my non-belonging-to-a-church in general, are luxuries indicative of a life lacking in severe shocks. I recognize this. How fortunate for me, then, that so many Christian denominations share the idea of repentance and return-of-the-prodigal.

All of your examples have this pattern: $[skill] used to be not only desirable but also broadly necessary; as $[skill] became generally unnecessary, a large portion of the population has mostly abandoned it, while those who remained devoted to maintaining $[skill] became much more proficient.

E.g.: back in 1962 every home-maker was expected to bake, and a large proportion of women were home-makers. Now, fewer women are home-makers, social norms about the desirability of cakes and cookies have largely changed, and there are lots of options for buying baked goods. Thus, most women have mostly abandoned baking (or never developed the skill), while the few who do bake have vastly improved that skill.

E.g.: back in 1962, the alternatives to books (for entertainment or information) were either expensive (movies or plays in the theater), or inferior in quality or quantity (newspapers), or were on a schedule (TV and radio). Now, the alternatives to books are superior, cheap, and instantly available. So most people mostly abandoned reading books, while a smaller proportion still reads for pleasure. (Though for this example, I don't know of any metrics by which those that read books have become more proficient, except maybe a brief increase in popularity of speed-reading a decade ago in my circle.)

Let's call these the coming-apart pattern examples, and let's consider whether there are any examples with a flipped coming-together pattern: $[skill] used to be desirable but broadly unnecessary; as $[skill] became generally necessary, a large portion of the population has developed at least some competency in it. As a result, if we compare the $[skill]-ed populations now and back-in-the-day, the back-in-the-day group was much more $[skill]-ed.

E.g.: typing. Back in 1962, most professionals didn't type much themselves because they could hire a typist for a fairly low wage (mostly because typing was one of the careers for young women that had been generally acceptable for decades by then). That is, a professional could, instead of learning the skill himself, use some reasonable portion of his income to outsource the typing tasks. Now, every white-collar worker and many blue-collar workers are expected to do their own typing, and the typing tasks have only increased. As a result, at least two-thirds of the population has some typing skill, and if we compare the group whose job included typing in 1962 to the similar group now, the average 1962 typist would be much faster and make fewer spelling errors.

(The skill of spelling is another coming-apart pattern example, mostly courtesy of ubiquitous spell-checkers.)

Another coming-together pattern example: figuring out how to make a new electronic device work. Back in 1962, besides the small number of professionals who needed to work with bespoke electronic devices--and hobbyists who chose to do so--most people only needed to figure out how to make their TV and their radio work, and those were fairly straightforward. Now, most people regularly get electronic gadgets that either didn't exist a decade ago or whose user interface has changed substantially, and they keep having to figure out how they work. (The joke among us olds is that the instructions are so complicated that only a child can do it.) So a broader proportion of the population has acquired the skill of figuring out how to make a new electronic device work, but the professionals and hobbyists of yore were much better on average, because they had to understand quite a bit about the underlying electronics. (My husband salvaged many a cheap Chinese-import doo-dad with a multimeter and a soldering iron.)

To summarize:

  • When a desirable skill becomes more broadly necessary, more people acquire some level of proficiency in it, and the average level of the skill (among those that have some proficiency in the skill) drops.

  • When a desirable skill becomes less broadly necessary, fewer people acquire some level of proficiency in it, and the average level of the skill (among those that have some proficiency in the skill) rises.
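
(For a toy illustration of why the average moves that way--with made-up numbers and made-up helper names, and ignoring that practice itself improves skill--think of everyone as having a latent aptitude, and suppose people pick up the skill only when necessity pushes them over some aptitude bar. A quick sketch:)

    import random

    # Toy model (illustrative assumptions only): each person has a latent
    # aptitude for the skill; a person practices it only if their aptitude
    # clears a "necessity bar". A low bar stands for "broadly necessary",
    # a high bar for "optional / niche".
    random.seed(0)
    population = [random.gauss(0, 1) for _ in range(100_000)]

    def practitioner_stats(necessity_bar):
        practitioners = [a for a in population if a > necessity_bar]
        share = len(practitioners) / len(population)
        average = sum(practitioners) / len(practitioners)
        return share, average

    for label, bar in [("broadly necessary (low bar)", -1.0),
                       ("optional / niche (high bar)", 1.5)]:
        share, average = practitioner_stats(bar)
        print(f"{label}: {share:.0%} practice the skill; "
              f"average aptitude among practitioners = {average:+.2f}")

With these made-up thresholds, roughly 84% of people practice in the "broadly necessary" case at a modest average aptitude, versus about 7% in the "niche" case at a much higher average--the selection effect in miniature.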

I do not care that someone on the internet may actually bring to life the strawman assertion that "[A Song of Ice and Fire] is some kind of nihilistic, grimdark, pornographic deconstruction of all that is right and good in the world". Your essay remains a response to a strawman. For comparison, here's the whole text under the "Criticism" section of GRR Martin's Wikipedia page:

Martin has been criticized by some of his readers for the long periods between books in the A Song of Ice and Fire series, notably the six-year gap between the fourth volume, A Feast for Crows (2005), and the fifth volume, A Dance with Dragons (2011), and the fact that The Winds of Winter, the next volume in the series, hasn't been published since. In 2010, Martin had responded to fan criticisms by saying he was unwilling to write only his A Song of Ice and Fire series, noting that working on other prose and compiling and editing different book projects have always been part of his working process.

I have gone through all five stages of grief and have come to accept that the last two books in the series will likely never be written, and that HBO's crappy last two seasons will remain the one-and-only allowed fan-fiction. I am at peace now.

So I have to be honest with you: I did not read your essay past the first three paragraphs. The framing of your essay is I-will-put-on-full-armor-and-destroy-this-strawman, and it turned me off so badly from what you have to say about one of my favorite fantasy series that I don't want to read the rest--even as I recognize, by skimming the headings, that you may have something interesting to say about ASoIaF.

And maybe that's just me, and other people here will find the strawman a delightful hook. However, if you do get similar complaints, please consider reposting a revised version of your essay, where instead of this-nobody-thinks-X-but-I-think-Y framing it's just Y. I will gladly read and engage with that essay then.

But do children learn anything in school anyway? You can graduate from high school and then get a degree without knowing much of anything.

Surprisingly, some students do indeed learn in school. It happens to some students, on some days, and in some classes, when the perceived norm for students is to pay attention and do the work. When those norms are gone, those students who would have learned something are not paying attention and miss the opportunity, or they are paying attention but have not done the work and are thus unprepared for the moment.

This is not the most efficient way to learn. But it does happen, just not often and not to everyone.

I worked with high-school and college students before, during, and after the pandemic. The holes even in their elementary-school math (like fractions and decimals) are so much larger now than before. But what's really impressive are the holes in their expectations of what the school norms ought to be. No, the fact that you showed up doesn't mean that you will pass the class. No, the fact that you wrote 'idk' as your answer does not earn you partial credit. Yes, we are going to have an in-class exam, and no, you can't use your laptop or phone, and no, you can't work in groups. How were you supposed to know how to answer this question, you ask? Do you observe this section in your textbook that you were required to read, with a very similar example worked out in detail? Do you remember these two similar problems we did in class? Do you recall these three similar problems on the homework, which I see by your turned-in work you have done correctly? Was that perhaps not your work?

Rant over; I am just so happy I have retired.

The breadth of Hunter Biden's pardon is unprecedented, with Ford's pardoning of Nixon the closest comparison. That source goes through other presidential pardons in history for comparison. Since it's MSN, I take it as a sign that Democratic Party partisans are less-than-pleased with Biden pissing in their well.

How very Confucian:

The Duke of She said to Confucius, “Among my people there is one we call ‘Upright Gong.’ When his father stole a sheep, he reported him to the authorities.” Confucius replied, “Among my people, those who we consider ‘upright’ are different from this: fathers cover up for their sons, and sons cover up for their fathers. ‘Uprightness’ is to be found in this.”

Yes, this reflects badly on Biden as US President, and by extension on the Democratic Party. Giving such an unprecedentedly broad pardon to his own son--prior to sentencing--is hypocritical on so, so many levels. And yet, one of my favorite Borges short stories is "Three Versions of Judas":

"God became a man completely, a man to the point of infamy, a man to the point of being reprehensible—all the way to the abyss. In order to save us, He could have chosen any of the destinies which together weave the uncertain web of history; He could have been Alexander, or Pythagoras, or Rurik, or Jesus; He chose an infamous destiny: He was Judas."

This is exactly my experience, but with sports! I don't enjoy watching sports, I don't care who wins or loses, but I know that 90% of the discussion at Thanksgiving will revolve around the Notre Dame football games so I have to choose to either watch (the highlights) or be utterly out of the conversation loop.

This is not a new problem. I fully suspect that it's what drove the earlier generation to watch the nightly news, since otherwise you're out of the water-cooler chit-chat loop. (How far back do I have to go for a water-cooler chit-chat to still be a thing? 90's?)

For me, Scott's write-up is valuable because he (a) surveys the best of the field, (b) reasonably summarizes the findings in layman terms, and (c) does his best to distinguish his own editorial ideas.

Do prisons work to reduce crime? I am sort of interested--my tax dollars support the system, and I and mine are subject to the laws that have the potential to land us there--but I am not sufficiently interested to actually do my own deep dive into the matter. Thus, it's useful to know (and I will trust Scott on this) that, in the vast and varied field of criminology studying the question, there are three meta-studies that are worth a damn. It's useful to know that they (and most criminologists studying the question) share a reasonable framework of Deterrence / Incapacitation / Aftereffects. Knowing how academic research gets done, I am not at all surprised that even the three meta-studies disagree on specifics. However, it's useful to have a general synthesis of how much they agree, and an analysis of the likely roots of their disagreement. Plus, Scott provides a simple though very useful additional framework where the effectiveness of a change in the incarceration rate depends on the current level of incarceration and the current level/type of crime.

And it took me a leisurely hour or so to read Scott's post, whereas my own deep dive would have taken me days and I wasn't going to do it anyway.

That's either evidence against my hypothesis, or proof that the memeplex metastasized.