This weekly roundup thread is intended for all culture war posts. 'Culture war' is vaguely defined, but it basically means controversial issues that fall along set tribal lines. Arguments over culture war issues generate a lot of heat and little light, and few deeply entrenched people ever change their minds. This thread is for voicing opinions and analyzing the state of the discussion while trying to optimize for light over heat.
Optimistically, we think that engaging with people you disagree with is worth your time, and so is being nice! Pessimistically, there are many dynamics that can lead discussions on Culture War topics to become unproductive. There's a human tendency to divide along tribal lines, praising your ingroup and vilifying your outgroup - and if you think you find it easy to criticize your ingroup, then it may be that your outgroup is not who you think it is. Extremists with opposing positions can feed off each other, highlighting each other's worst points to justify their own angry rhetoric, which becomes in turn a new example of bad behavior for the other side to highlight.
We would like to avoid these negative dynamics. Accordingly, we ask that you do not use this thread for waging the Culture War. Examples of waging the Culture War:
- Shaming.
- Attempting to 'build consensus' or enforce ideological conformity.
- Making sweeping generalizations to vilify a group you dislike.
- Recruiting for a cause.
- Posting links that could be summarized as 'Boo outgroup!' Basically, if your content is 'Can you believe what Those People did this week?' then you should either refrain from posting, or do some very patient work to contextualize and/or steel-man the relevant viewpoint.
In general, you should argue to understand, not to win. This thread is not territory to be claimed by one group or another; indeed, the aim is to have many different viewpoints represented here. Thus, we also ask that you follow some guidelines:
- Speak plainly. Avoid sarcasm and mockery. When disagreeing with someone, state your objections explicitly.
- Be as precise and charitable as you can. Don't paraphrase unflatteringly.
- Don't imply that someone said something they did not say, even if you think it follows from what they said.
- Write like everyone is reading and you want them to be included in the discussion.
On an ad hoc basis, the mods will try to compile a list of the best posts/comments from the previous week, posted in Quality Contribution threads and archived at /r/TheThread. You may nominate a comment for this list by clicking on 'report' at the bottom of the post and typing 'Actually a quality contribution' as the report reason.
Maybe this is just my American education, but I was required to take both History and English classes through high school. It's not clear to me that events having actually happened necessarily gives them more value. The freedom of fiction seems to give more opportunity to explore particular issues and themes with more precision than real-world events commonly allow.
I'm wondering here if there's some specific class named "Science" that students are required to take? My recollection of high school is that we had to take one "Science" class per year but the classes were all themselves themed around specific sciences (Physics, Chemistry, Biology, etc) so you had freedom in choosing what you were interested in.
What would be the content of a grade 10 computer science course that would be useful? Maybe it's because I have a CS degree but I struggle to think of what I could teach someone about computer science in a single year that would be useful for them in general life, unless it was some kind of tech-support-esque class.
I think this could be a good idea but only if Gym class is significantly reconstituted. Maybe it was just my experience but my own Gym class did not do a good job impressing on us the importance of aerobic exercise as a habit. It was just this annoying class we had to take. I think a gym class reconstituted around the idea of healthy habit formation, the importance of exercise as a habit, nutrition, and so on would be much more effective.
I had this as a mandatory class called Wellness in high school, in Tennessee nearly 20 years ago. The subject matter was essentially exactly what you describe.
However, the teacher was quite demotivated, and the class consisted mainly of worksheets and reading chapters in the textbook. It's a similar story with the complaint I often see posted on Reddit: "They don't teach personal finance in the schools!" At the same (ordinary, public) school, I did have a personal finance class in 10th grade, in 2005, and I thought it was very helpful. But then, I was paying attention and interested in the subject, which is apparently not common.
I'm of the opinion that exploring issues and themes in fiction was basically entirely useless to me. Whereas learning about the history of single-payer healthcare, or the lead-up to WW1, or any number of topics in history, was at least very slightly useful because it provides context to modern politics.
For grade 9 and 10, we had general science classes that taught a bit of each, then grade 11 and 12 we were given our choice of all 3.
I think learning to think logically and understanding a bit about how computers work would be valuable, at least as much as most high school classes. I might just be overvaluing it because it was one of my favorite classes, though.
I find this fascinating, since my experience was quite the opposite. Fiction could make issues clear-cut in a way non-fiction almost never could.
What do you mean by how they work? I think a lot of the practical operation of computers (opening programs, navigating file systems) is easily integrated into other classes. If you mean more literally how they work (binary, memory, CPU clocks, adders, etc.), then that seems more esoteric to me than a lot of the other stuff you describe as wanting to be optional.
I was thinking a standard Python 101 class. At the end of it, they should be able to do the easiest problems on LeetCode. I think having a basic idea of how websites and software work one level of abstraction down would be good for people.
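To give a concrete sense of the bar I have in mind, the easiest LeetCode-style problems look roughly like the classic "two sum" exercise below. This is just an illustrative sketch; the function name and sample values are my own, not anything prescribed by a curriculum.

```python
def two_sum(nums, target):
    """Return indices of two numbers in nums that add up to target."""
    seen = {}  # value -> index where we first saw it
    for i, n in enumerate(nums):
        if target - n in seen:          # have we already seen the complement?
            return [seen[target - n], i]
        seen[n] = i
    return []  # no pair found

print(two_sum([2, 7, 11, 15], 9))  # [0, 1]
```

If a tenth grader can write and explain something at this level, they've picked up variables, loops, dictionaries, and the habit of tracing what the program knows at each step, which is most of what I'd want the class to deliver.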
Have you considered that perhaps that's a good reason not to use fiction to think about issues? There's a reason issues often aren't clear cut in non-fiction. The world is rarely that neat and simple. Perhaps fiction encourages us to think unrealistically about issues. Perhaps the author has biases or blind spots that mislead/manipulate the reader into thinking one thing or another. While that often happens with non-fiction too, at least the events in question happened and the author's take can in principle be refuted.
I think it's a good reason not to use only fiction. An important part of being able to reason about complex situations is being able to reason about simple ones. There's a reason logic classes start off with simple syllogisms. One should, of course, always keep in mind that the author has their own views on the topic, and the work itself should be examined through that lens. I actually think this last part is an important part of media criticism that I see less often than I would like. Instead of asking whether a work is "good" in the sense that I enjoyed reading it or that I endorse the message it conveys, one should think about what message the author is trying to send and whether the work does so in an enjoyable or engaging way. Reading fiction critically is an opportunity to consider how others, or you yourself, might act (or ought to act) in ways that are analogous or dis-analogous to actual situations one may find oneself in.
I find this a little confusing. What do you take it to mean to refute an author's take? If you mean an author's description of events that have actually occurred, then no one should be reading fiction for that anyway. If you mean refuting an author's take on what ways it would or would not be appropriate to act in some circumstance, then it seems to me a fiction author's takes are as open to refutation as a non-fiction author's.
Except fiction can create scenarios that are extremely unrealistic, including in ways that might not be obvious to a young person. For example, a work of fiction that sanitizes violence and its true brutality might lead someone to be more likely to endorse violence in general. Or, conversely, fiction that depicts bad guys being effortlessly incapacitated might lead people to be less likely to endorse lethal violence when it's actually called for. I think, for example, that Hollywood's aversion to depicting gruesome violence (yes, you read that right) contributes to people having terrible intuitions about police use of force. They see movie heroes shooting people in the leg and think that's something police should be doing instead.
The author of a work of non-fiction (say, a textbook) might selectively omit certain other historical facts that would have changed how the reader thinks about a particular fact of history, or they might claim certain information is factual when there's actually some dispute about it among experts, or they might make normative claims that are debatable or use language in clever ways to try to sway the reader to the author's point of view.
We had the phys/chem/bio split, too, but it wasn’t elective. Not until the AP classes!
I want to say students would benefit from using a REPL. Anything that has a state to manipulate plus a limited set of commands. I’m not sure it would ever be directly relevant, because the only command line a layman might encounter is the Windows one. A 10th grade understanding of .bat scripting isn’t going to help anyone.
What I’d try to convey is the core concept of computers setting up stuff and then looking back at that stuff later. Get them to ask “what steps will the computer take next?” Or “what have I already given this computer?” I think the average person could be better at predicting what a system is or isn’t capable of, rather than treating it like a person who just won’t help.
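For what it's worth, even a plain Python REPL gets at that "set stuff up, then look back at it later" loop. A session I could imagine for a 10th grader might look like this (the names and values are purely invented for illustration):

```python
>>> scores = {}                        # set up some state
>>> scores["alice"] = 3
>>> scores["bob"] = scores["alice"] + 2
>>> scores                             # "what have I already given this computer?"
{'alice': 3, 'bob': 5}
>>> scores["carol"]                    # asking for something never set up fails loudly
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
KeyError: 'carol'
```

The error at the end is arguably the most instructive part: the computer isn't refusing to help, it simply never got that piece of state.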
I wonder what fraction of high school age people have played video games. The way you describe what you want people to do and how you want them to reason are things anyone who has spent any amount of time playing video games has had to do. They are very literally state machines with limited inputs that the player must be able to reason about. The player must be able to reason about what the game will do with further inputs and the context of previously given inputs. Most people playing games don't think about it at this level of abstraction but it seems clear to me that's what's happening.
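To make the analogy explicit, a game loop really can be written down as a tiny state machine: a fixed set of inputs, and a next state that depends only on the current state plus the input. The sketch below is a toy of my own invention, not a claim about how any real game is implemented:

```python
# A toy "game" as an explicit state machine.
TRANSITIONS = {
    ("title", "start"): "playing",
    ("playing", "pause"): "paused",
    ("paused", "pause"): "playing",
    ("playing", "die"): "game_over",
    ("game_over", "start"): "playing",
}

def step(state, button):
    """Return the next state; ignore inputs that do nothing in this state."""
    return TRANSITIONS.get((state, button), state)

state = "title"
for button in ["start", "pause", "pause", "die", "start"]:
    state = step(state, button)
    print(button, "->", state)
```

Anyone who has learned "pressing pause here does nothing, you have to be in-game first" has internalized exactly this kind of table, just without the notation.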
When you get more fluent with a system, it fits that reasoning process into existing patterns. Learn that moving the mouse corresponds to moving that little cursor, and you can now rely on the same kinesthetic sense that you use for any manipulation. You can skip the slow reasoning step. You will skip it, and you’ll use that cognitive power for the underlying task.
A system which is really clever about this, and makes the intuition easy and painless? We call that good UX.
I think it’s very possible to learn a game and skip right over the explicit reasoning. Especially if
All of which apply to, say, the Xbox Live or (vanilla) Minecraft server modes of play. A barrier to entry is not usually a design goal. When it is, you get a niche game like NetHack or Aurora 4X… and yeah, I'd expect those players to make good programmers. But hey, correlation != causation.