This weekly roundup thread is intended for all culture war posts. 'Culture war' is vaguely defined, but it basically means controversial issues that fall along set tribal lines. Arguments over culture war issues generate a lot of heat and little light, and few deeply entrenched people ever change their minds. This thread is for voicing opinions and analyzing the state of the discussion while trying to optimize for light over heat.
Optimistically, we think that engaging with people you disagree with is worth your time, and so is being nice! Pessimistically, there are many dynamics that can lead discussions on Culture War topics to become unproductive. There's a human tendency to divide along tribal lines, praising your ingroup and vilifying your outgroup - and if you think you find it easy to criticize your ingroup, then it may be that your outgroup is not who you think it is. Extremists with opposing positions can feed off each other, highlighting each other's worst points to justify their own angry rhetoric, which becomes in turn a new example of bad behavior for the other side to highlight.
We would like to avoid these negative dynamics. Accordingly, we ask that you do not use this thread for waging the Culture War. Examples of waging the Culture War:
- Shaming.
- Attempting to 'build consensus' or enforce ideological conformity.
- Making sweeping generalizations to vilify a group you dislike.
- Recruiting for a cause.
- Posting links that could be summarized as 'Boo outgroup!' Basically, if your content is 'Can you believe what Those People did this week?' then you should either refrain from posting, or do some very patient work to contextualize and/or steel-man the relevant viewpoint.
In general, you should argue to understand, not to win. This thread is not territory to be claimed by one group or another; indeed, the aim is to have many different viewpoints represented here. Thus, we also ask that you follow some guidelines:
- Speak plainly. Avoid sarcasm and mockery. When disagreeing with someone, state your objections explicitly.
- Be as precise and charitable as you can. Don't paraphrase unflatteringly.
- Don't imply that someone said something they did not say, even if you think it follows from what they said.
- Write like everyone is reading and you want them to be included in the discussion.
On an ad hoc basis, the mods will try to compile a list of the best posts/comments from the previous week, posted in Quality Contribution threads and archived at /r/TheThread. You may nominate a comment for this list by clicking on 'report' at the bottom of the post and typing 'Actually a quality contribution' as the report reason.
It is a bit interesting to me that very, very few educational reform proposals I hear ever mention that we should be teaching and implementing epistemics as a core, fundamental aspect of any well-rounded curriculum. It seems almost self-evident and yet...
It's cliche to say that "education should be about teaching you how to think, not what to think," but I think that's actually a pretty decent goal. I'm not saying you completely excise the 'rote memorization' aspects, but perhaps also provide the tools that make that rote memorization useful.
Seriously. Shouldn't we be able to at least ensure that someone who graduates high school has the ability to consider the truth-value of a statement and at least weigh whether they should incorporate the statement into their beliefs about the world or not? That they're able to make predictions based on limited evidence and reject falsehoods when there are actual consequences on the line?
And working off the assumption that not many students will be capable of autistically applying Bayes' Theorem to every new piece of evidence they encounter, it would still be pretty useful to teach the variety of heuristics that have a proven track record and teach the more blatant fallacies to avoid, and provide them with ample opportunities to learn in a controlled environment how to detect when people are lying or when the evidence isn't strong enough to support the purported conclusions, and to notice when someone is just trying to manipulate them.
Epistemics is like the ONE truly useful branch of philosophy, so it seems like making students slog through Ethical, Political, and Aesthetic philosophers without addressing the foundations of knowledge is a backwards approach to 'classical' education.
I say all this already knowing that even if we taught all students how to ascertain truth, the real lesson of high school is how to navigate complex social environments and to identify where you are situated in the hierarchy and, from that, what beliefs you need to adopt and which signals you need to send in order to maintain or improve your status.
And that's a core of human psychology that has been ingrained into us over millions of years, so any lessons about how to think better will, in most cases, be subordinated to the innate need to fit in with and protect the tribe.
So it's not like I expect teaching epistemics to produce a generation of enlightened thinkers; it just seems like it's a bare minimum that ought to be done to ensure education isn't merely brainwashing/propagandizing with some math and science tacked on.
(Yes, I know that from the perspective of the state and ideological actors, the brainwashing is in fact the point)
One might equally say that fabrication is about making sure the right atoms are in the right places. And equally, that would be a decent goal, but just as equally, we don't have the tools to do it.
You can absolutely teach people how to think, to at least some degree. The degree to which you can achieve this goal, in a general-population sense, using the existing tools of the educational establishment, is so extremely limited that no value is gained from trying. The system evidently works for ideological indoctrination, and it conceivably could function to teach basic skills, were it reformed. There is no evidence supporting the idea that it can actually mass-produce "well-rounded individuals", or "teach people how to think" in any meaningful sense.
Right. I'm amenable to the argument that the system can't be reformed in its current state, or that brainwashing/indoctrination is in fact the point and thus we shouldn't even care about producing well-rounded individuals.
It just seems like whatever your proposal for improving education is, there should be some component that actually addresses how to assess and learn new information and gives students the tools to learn more efficiently, in the hope that at least some of them will use them.
Even if this is literally just teaching the students to use tools like Spaced Repetition to enhance their study.
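Even a toy sketch of the scheduling idea would fit into a single lesson. Something like the following Python, where the starting interval and multiplier are purely illustrative and not taken from any real tool (Anki's actual algorithm is more involved):

```python
# A minimal sketch of the idea behind spaced-repetition scheduling:
# review intervals grow when you recall an item and reset when you don't.
# The numbers below are illustrative only, not any particular tool's values.

from dataclasses import dataclass

@dataclass
class Card:
    prompt: str
    interval_days: float = 1.0  # days until the next review

def schedule_next(card: Card, recalled: bool) -> Card:
    """Update a card's review interval after one study session."""
    if recalled:
        card.interval_days *= 2.5   # spacing grows after each success
    else:
        card.interval_days = 1.0    # forgetting resets the schedule
    return card

card = Card("Bayes' theorem: P(A|B) = P(B|A) * P(A) / P(B)")
for outcome in [True, True, False, True]:
    card = schedule_next(card, outcome)
    print(f"next review in {card.interval_days:.1f} days")
```

The whole lesson is just: successful recall earns a longer gap before the next review, and failure resets it.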
Not including this is a bit of an admission of defeat.
Yes, and? What's wrong with admitting defeat and moving on, anyway?
Nothing, but in this context it's politically unacceptable to just throw in the towel.
So we keep getting proposals to fix everything, and things continue to decline.
On one hand, I agree that classic logic and rhetoric should be included in any elite education, along with statistics and exercises in doing historical primary source research and discovery. However, if you try teaching these things to a midwit, they will just start accusing everyone of "having confirmation bias" instead of "being stupid"; they won't actually apply these concepts accurately, just use them as smart-sounding replacements for classic insults.
That said, in order to think well, it's important to simply have a lot of knowledge about the world. You can't be a good thinker in a vacuum. You need high-quality information to feed in all those Bayesian priors. If the BBC publishes an article claiming that 10% of Britain was black in the 1300s and that black women were hardest hit by the plague, then a grounding in lots of historical knowledge about that time period, including lots of primary sources and literature, will give you a much better intuition about whether that claim is likely true or absolutely ridiculous.
Chicken-Egg problem.
Yes, it is very important to have a lot of knowledge. But it is just as important to make sure that knowledge is accurate and wasn't fed to you simply to ensure you behave in a particular way to benefit some person's agenda.
For instance, I used to actually believe the "Bermuda Triangle" phenomenon was some unexplained mystery of the world because I saw that factoid repeated a lot in various contexts. Then I eventually saw someone do the basic statistical analysis to show that while, yes, a lot of boats and planes 'mysteriously' disappeared in that area, that's just because it's a particularly busy stretch of ocean, and the rate isn't really an outlier compared to any other randomly chosen location.
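The check itself is trivial once you think to normalize by traffic. A rough sketch with completely made-up numbers, just to show the shape of the argument:

```python
# A back-of-the-envelope version of the base-rate check: raw disappearance
# counts mean little until you divide by traffic volume.
# Both figures per region are hypothetical, purely to illustrate the point.

regions = {
    # region: (disappearances per decade, vessel transits per decade)
    "Bermuda Triangle":     (50, 1_000_000),
    "Quiet stretch of sea": (5,     90_000),
}

for name, (lost, transits) in regions.items():
    rate = lost / transits
    print(f"{name}: {lost} losses, {rate:.2e} losses per transit")

# With inputs like these, the 'mysterious' region actually has the lower
# per-transit loss rate; the headline count is driven by traffic volume.
```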
This gets especially important in the context of history, where it's vital to learn what 'primary sources' actually are and how to cross-reference and confirm details rather than just accepting one person's account as accurate.
That's part of my concern: if 'education' means memorizing "10% of Britain was black in the 1300s" and at no point do you learn to examine how that conclusion might have been reached, then the knowledge students are inheriting is not going to do them much good.
I'll second this one. Learning about epistemology in college was extremely helpful for me. It seems pretty core to the idea of what we think of as critical thinking. Who is telling me this information? Why are they telling it to me? Why do I believe X? What makes X true? These are all important questions to be asking, and to be able to answer, if you want to understand the world around you. Especially appreciating the distinction between why you believe a thing and what would need to be the case for a thing to be true.
I am not sure about teaching Bayes Theorem or specific fallacies, but I think teaching students the ability to reflect on their beliefs and how they formed them would be very valuable. School itself is rife with opportunities for this since most of the time you learn things by trusting the testimony of a teacher or some other expert source rather than by direct firsthand experience of the facts that establish something as true.
Yes, there's an irony in that if you do really well at traditional academics, you're basically training yourself to accept the word of an authority figure as truth.
"Teacher lectured on this topic, the textbooks confirmed their teachings, and then I was tested on my ability to accurately recite the teachings! What a useful way to discover the truth!"
It'd be interesting, for example, if teachers explicitly told students that they'd be slipping occasional falsehoods into the lessons and teaching them as true, and that there was extra credit available to anyone who could not only identify the falsehoods, but explain exactly how they figured out they were false.
It's a good exercise to test one's epistemics AND to teach that authority figures occasionally (lol) outright lie to you!
I'd say that's a good idea, and what should have been done, but these days what will happen is that someone will copy paste lecture transcripts into an LLM and have it find the error and explain it lol. I suppose that does still deserve points for diligence.
I recall a professor back in med school who did do that, and I'm particularly proud of catching several of the bugs myself, even if I suspect a handful were simply him misspeaking from memory instead of being intentional heh.
Actually, medicine might be a bad field for such tricks: plenty of things are outright counter-intuitive or edge cases where we need to memorize where the heuristics fail. If you try this before the students have a good fundamental underpinning of theory and some practice, they might well never figure out where they went astray unless they crack open a textbook and pore over every claim.
Or the more traditional pre-Internet failure mode: the student knows better than the teacher, finds "intentional" errors that are unintentional and just the teacher not knowing better, and gets punished for it.
Also probably true, but I'll also say that if we have LLMs that are reliably able to spot and correct actual falsehoods we're probably in a world that is a little epistemically safer for the average person than our current one.
But this will tip into my other concern that people will become utterly reliant on AI tools for information, and thus almost ALL of their knowledge will ultimately rest on an appeal to authority. "The AI says this was true, no need to question that."
And finally, I do think relying on authority is not the worst thing people can do! If you've found an actual reliable source of information then you can choose to simply take most of what they say as accurate! I have a handful of people I follow on Twitter who I believe are making a good faith effort to be correct about complex issues, so when they summarize things or make a prediction, I lend them a lot of credence. Because I don't have the energy to assess every single claim I encounter for accuracy, myself.
But there's gotta be some bedrock somewhere where the person(s) making certain claims actually care about getting it right.
Entirely dependent on your standards for "reliability" in my eyes. I have found SOTA models adequate for that purpose in almost everything I've cared to try, and I have checked to see whether they were providing corrections that had a basis in objective fact. It's not perfect, but I say we're past "good enough" to catch anything the teacher says that they already expect diligent students to notice.
I broadly agree with the rest of your comment, I'm happy to defer to Scott for most things, even if I do disagree with certain things he's said, and there are certainly plenty of crime-thinkers on Twitter I follow because I trust them to give me information that's both true and suppressed because it's outside the general Overton Window.
If, say, we had an aligned AGI that proved itself to be smarter and more capable in terms of answering questions I had of it, including taking into account my values where relevant, I'd have few qualms about eventually handing over my decision making to it. But if I had a route to improving my own cognition to the point where I didn't need it, being able to match it myself, I'd prefer that.
I think we should probably continue exercising caution with current LLMs due to their propensity to hallucinate, especially if given a prompt that encourages such.
And since they're able to do internet searches now, we're hinging some of their reliability/truthfulness on the accuracy of the internet at large which... well, you know why we're here on THIS site rather than on Reddit.
I suspect that I won't be ready to accept LLMs as 'oracles'/truth-sayers until they've got the ability to tap into the real world directly and explain the reasoning behind their answers.
If I ask ChatGPT "Is the sun currently shining right now?",
I don't want it to just say "Based on your location data (which I scraped from your browser) I figured out what your time zone is, and based on weather reports for your zip code it appears that it is a bright and sunny day!"
I'd want to hear something like "I've checked several camera feeds from various locations around the globe and it appears the sun is shining brightly in the following areas []."
This is definitely the future I want but ain't sure I'm gonna get it.
Something that has always ticked me off about the medical curriculum, and which is likely a global problem, is that while doctors are occasionally trained to consider potential applications of Bayesian reasoning, we're not taught it explicitly.
So we're told "if you hear hooves, think horses, not zebras" in the context of always considering the most common potential causes for a presentation, or we're confronted with questions like: disease X has a prevalence of Y%, test A has a sensitivity of B and a specificity of C; given a positive test result, what are the odds that the person testing positive actually has the disease? And so on.
But the generalized thought process? Not a fucking peep.
And there's no reason that people should even enter med school without knowing Bayes' theorem; the math isn't even complicated.
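For anyone who hasn't seen the generalized version spelled out, here is the whole calculation with hypothetical numbers (prevalence 1%, sensitivity 90%, specificity 95%):

```python
# Positive predictive value via Bayes' theorem, with hypothetical numbers.

prevalence  = 0.01   # P(disease)
sensitivity = 0.90   # P(positive test | disease)
specificity = 0.95   # P(negative test | no disease)

# Bayes' theorem:
#   P(disease | positive) = P(positive | disease) * P(disease) / P(positive)
p_positive = sensitivity * prevalence + (1 - specificity) * (1 - prevalence)
ppv = sensitivity * prevalence / p_positive

print(f"P(disease | positive test) = {ppv:.1%}")
# About 15%: even a 'good' test mostly yields false positives
# when the condition is rare.
```

That one calculation is basically the quantitative version of the hoofbeats aphorism.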
One thing I like about this particular aphorism is that there's a meta lesson. "Think Horses, not Zebras... unless you happen to be in a location where Zebras are more common." If you happen to be in Eastern Africa, for instance.
Because I think physicians are pretty good at applying the basic "it's usually the simplest, most common explanation, don't overthink it" logic, but then they apply it to everything and outright dismiss explanations that are more esoteric.
There's also the risk-aversion that comes when you can be sued for malpractice if you do anything other than give the most common and accepted advice.
Another reason to teach explicit Bayesianism, because that takeaway comes with it!
Plenty of tests reveal the equivalent of a news reel going "authorities report an escape of 22 zebras from the local zoo after the paddock was left open", or your neighbor swearing some of those ill-tempered horses had stripes on them.
Malpractice claims are still thankfully a rarity in India, but I suppose you can mitigate most of the risk by providing both the "recommended" advice and your particular suggestions, and leaving it to the patient to choose, assuming you document this well. There's not much stopping a pissed-off patient from litigating against you really, not if they want to.
I do suspect that most US doctors are more risk-averse than necessary, but teaching them Bayesianism would help them figure out the optimal course of action for their particular risk tolerance.
Exactly.
I hear a lot of accounts on Twitter of people who WANT their doctors to start giving them some of the more out-there suggestions for therapies or drugs or procedures that could fix [problem] but get frustrated because they have to navigate the standard process first and most doctors won't deviate from the script much, even when asked nicely. Some people resort to homebrews out of frustration, even.
It shouldn't be difficult for intelligent risk-seekers to hook up with intelligent doctors who understand risk and to mutually agree to try out more radical options, with some safety precautions in place.
The FDA is at least part of the problem, granted.
One of the under-appreciated perks of being a doctor is that, when you go see a doctor, they're far more likely to indulge such concerns.
For example, UK guidelines for contraception, which are also used in India, mildly frown on using IUDs in nulliparous women who want a family down the line. Yet when my girl and I went to see a gyno, we were able to convince her to approve and insert one, since I could convincingly argue that despite it being UKMEC 2 (meaning it works, doesn't do any harm, but is ~overall held to not be the best choice for that demographic, which would be UKMEC 1 like OCPs or implants), we knew what we were getting into. The same goes for various psychiatric consults I've had to do myself.
I'm sure the same is true for lawyers consulting lawyers in other niches, or mechanics seeing mechanics and so on. You get a sense of palpable relief from knowing that you don't have to rehash the basics.
Sadly short of having a medical degree or experience in an allied field, there are few signals, costly or otherwise, that declare the same thing to a doctor who has to also consider the deficit in both knowledge and common sense in the average patient. I certainly wish it were otherwise, or that there was something like a short questionnaire or form you could fill out to declare yourself the equivalent of a sophisticated investor in medicine, who is willing to step outside the norm without crying about it later if it fails.
I think epistemics isn't really that useful in day to day life. For the vast majority of people in the vast majority of situations, just going off intuition is more effective. I think an elective of epistemics would definitely be good, and maybe it'd be better than one of the lower priority courses like science, but it's much lower priority than literacy or physical fitness.
Explicit reasoning, especially on Bayesian lines, isn't necessary 99% of the time, in the sense that crunching the numbers won't provide an increase in value greater than the time/opportunity cost of doing it. This is because humans are probably fundamentally Bayesian in their reasoning, according to the predictive processing theory of cognition.
This ceases to be true in certain important edge cases, where people devote far too little thought or rigor to important decisions like, say, buying or renting a house, or what insurance to get, and so on. Hard numbers and research will almost certainly beat going off vibes there, at least in terms of dollars per hour of effort.
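As a toy illustration of what "hard numbers" looks like for the housing case, with entirely hypothetical figures and deliberately ignoring taxes, maintenance, and appreciation:

```python
# A toy rent-vs-buy comparison with made-up inputs, just to show what
# 'crunching the numbers' means next to going off vibes. A real decision
# would also account for taxes, maintenance, appreciation, and equity.

years        = 10
monthly_rent = 1_500
house_price  = 300_000
down_payment = 60_000
annual_rate  = 0.05                      # mortgage interest rate
loan         = house_price - down_payment

# Standard amortized-payment formula for a 30-year loan.
r = annual_rate / 12
n = 30 * 12
monthly_payment = loan * r * (1 + r) ** n / ((1 + r) ** n - 1)

rent_cost = monthly_rent * 12 * years
buy_cost  = down_payment + monthly_payment * 12 * years

print(f"renting for {years} years: ${rent_cost:,.0f}")
print(f"buying (payments so far): ${buy_cost:,.0f}")
# The point isn't which number wins with these made-up inputs; it's that an
# hour with a spreadsheet surfaces the actual trade-offs (equity built,
# opportunity cost of the down payment, etc.) in a way vibes never will.
```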
I think this absolutely ceased to be true (if it ever was) in the current era, where the average person is exposed to dozens of possible scams every day, many of which were devised by extremely sophisticated actors and have been refined over years or decades to be almost indistinguishable from honest business opportunities.
Maybe if you still live in a small town where you know your neighbors (i.e. you're unlikely to exceed Dunbar's number) and you've got friends and family nearby looking out for you your intuitions will still suffice to navigate life without too many pitfalls.
But even merely browsing the internet can have you stumbling into places that are designed to suck you in and extract resources or spur you to action or instill some false but alluring belief which will in turn be exploited by other malicious actors.
Most of Crypto-space, for example. Blows my mind how billions of dollars have been thrown at rugpull scams and pyramid schemes just by making people believe it was possible to 10x their money for zero actual effort. Get-rich-quick schemes have existed forever, but their sheer ubiquity is what makes things riskier for the average person these days.
Bryan Caplan has a book on education; one of the things he found going through the research is that "transfer of learning" is very low. For example, learning Latin does not help most students learn a Latin-derived language.
Students tend to learn the thing for the test then quickly forget it.
The idea of teaching epistemics sounds nice, but I don't think it would have much practical effect.
I think just awareness of the existence of epistemics might be helpful.
It's like math. Most people can't remember half of what they learned in high school math classes. Many can't even do basic algebra, but they're at least aware that algebra can and probably should be done to explain why an answer is correct.
Sometimes I'll be in conversation with a person and they'll make an assertion. I'll ask them why they think that. How do they know? Suppose I didn't agree. How would they convince me? Then their eyes narrow and their lips curl. I can see the gears turning as they mentally brand me enemy and then they just assert the thing again, but louder and with edge to their voice.
My guess is that person will remain unchanged by an epistemics class.
And the people that might benefit from an epistemics class will hate the subject and think it is dumb, because the way it will end up being taught will be dumb. They won't be learning how to think. They'll learn the major philosophers in epistemics, they'll memorize some vocab words, and they'll do a few story puzzles.
This is my fundamental complaint with K-12 education. You sit in a chair for hours a day filling out worksheets. It isn't clear to me that actually teaches most people very much.
I spent years filling out Spanish language vocabulary and verb conjugation worksheets. At no point did I learn to speak Spanish nor was I on a path to learning to speak Spanish. I suppose someone learning Spanish could benefit from some of those worksheets, as a very small part of a larger effort.
I had a good language teacher in high school. She taught German with full immersion. People would come from all over the state to observe this amazing teaching method. I'm terrible at learning languages, but after 4 years I could kind of have basic conversations in German. My thinking at the time was "yeah, of course this is the only way to actually learn the language".
What I hadn't spent those four years of German class doing was learning how to pass a test that verified my German speaking skills. So when I got to college and tried to test into a higher level German, I couldn't get past the entry level requirements. I would have had to entirely start over. My frustration at that led me to never take a language class in college.