This weekly roundup thread is intended for all culture war posts. 'Culture war' is vaguely defined, but it basically means controversial issues that fall along set tribal lines. Arguments over culture war issues generate a lot of heat and little light, and few deeply entrenched people ever change their minds. This thread is for voicing opinions and analyzing the state of the discussion while trying to optimize for light over heat.
Optimistically, we think that engaging with people you disagree with is worth your time, and so is being nice! Pessimistically, there are many dynamics that can lead discussions on Culture War topics to become unproductive. There's a human tendency to divide along tribal lines, praising your ingroup and vilifying your outgroup - and if you think you find it easy to criticize your ingroup, then it may be that your outgroup is not who you think it is. Extremists with opposing positions can feed off each other, highlighting each other's worst points to justify their own angry rhetoric, which becomes in turn a new example of bad behavior for the other side to highlight.
We would like to avoid these negative dynamics. Accordingly, we ask that you do not use this thread for waging the Culture War. Examples of waging the Culture War:
- Shaming.
- Attempting to 'build consensus' or enforce ideological conformity.
- Making sweeping generalizations to vilify a group you dislike.
- Recruiting for a cause.
- Posting links that could be summarized as 'Boo outgroup!' Basically, if your content is 'Can you believe what Those People did this week?' then you should either refrain from posting, or do some very patient work to contextualize and/or steel-man the relevant viewpoint.
In general, you should argue to understand, not to win. This thread is not territory to be claimed by one group or another; indeed, the aim is to have many different viewpoints represented here. Thus, we also ask that you follow some guidelines:
- Speak plainly. Avoid sarcasm and mockery. When disagreeing with someone, state your objections explicitly.
- Be as precise and charitable as you can. Don't paraphrase unflatteringly.
- Don't imply that someone said something they did not say, even if you think it follows from what they said.
- Write like everyone is reading and you want them to be included in the discussion.
On an ad hoc basis, the mods will try to compile a list of the best posts/comments from the previous week, posted in Quality Contribution threads and archived at /r/TheThread. You may nominate a comment for this list by clicking on 'report' at the bottom of the post and typing 'Actually a quality contribution' as the report reason.
While your idea has some validity, it either misses the point of the question that a Bayesian probability answers, or it ignores that this consideration is already part of standard Bayesian reasoning. In other words, a good Bayesian would say that your idea is trivial and irrelevant unless there is further information acquisition. It is not a "valid and important question to ask" except in some contexts.
In your example, if you can only take the bet once, optimally choosing whether to take it amounts to calculating the expected gain using the correct Bayesian probability. Any other information is irrelevant. In a slightly richer example, you can formulate this as a problem with an option to continue. In that case, there is an instantaneous (also called flow) payoff and a continuation value (the value of being able to take the bet again). The continuation value depends on the posterior probability, which, as you correctly mention, depends on other things. However, the continuation value only matters for the decision if it is affected by the decision: if the shady guy will toss the coin regardless, then how the posterior probability will change is irrelevant to you.
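To make the one-shot case concrete, here is a minimal sketch with hypothetical numbers: the optimal rule depends only on the posterior probability and the payoffs, and it would not change no matter where that posterior came from.

```python
# One-shot bet: the optimal decision uses only the posterior
# probability and the payoffs; how the posterior would update
# after the toss plays no role. All numbers are hypothetical.
p_heads = 0.7                            # posterior P(heads)
payoff_heads, payoff_tails = 1.0, -1.0   # stakes of the bet

expected_gain = p_heads * payoff_heads + (1 - p_heads) * payoff_tails
take_bet = expected_gain > 0
print(f"expected gain {expected_gain:+.2f} -> take bet: {take_bet}")
```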
More generally, dynamic problems with new information pose no difficulty for Bayesians. Specifying the informational context of a problem requires a proper prior, which is a joint distribution over all variables. These variables can be the decision-relevant ones (the particulars of the coin) or informational ones (the history of coin tosses, or extra information about how the coin was obtained). Bayes' theorem has us update this prior in the usual way. While there are examples where this extra information can be neatly summarized by a simple sufficient statistic (e.g., the number of tosses and the number of heads, for coins with a given probability of landing heads and outcomes independently distributed given the coin), such examples are the exception.
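To illustrate the sufficient-statistic case with a small sketch (prior and data hypothetical): for conditionally independent tosses of a coin with an unknown heads probability, a Beta prior plus the counts (tosses, heads) pins down the posterior exactly.

```python
# Beta-Binomial update: with a Beta(a, b) prior on the heads
# probability and tosses independent given the coin, the counts
# (n tosses, h heads) are a sufficient statistic for the history.
a, b = 1.0, 1.0               # hypothetical uniform prior
history = [1, 0, 1, 1, 0, 1]  # hypothetical tosses, 1 = heads

n, h = len(history), sum(history)
a_post, b_post = a + h, b + (n - h)      # conjugate update
posterior_mean = a_post / (a_post + b_post)
print(f"posterior Beta({a_post:g}, {b_post:g}), mean {posterior_mean:.3f}")
```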
To recap, Bayesians are not "making incorrect technical arguments insinuating that the estimated probability alone should be enough for everyone." They are making correct arguments that fail only in a very small subset of problems: those where information acquisition is affected by the decisions. In this sense, it is not "a valid and important question to ask." Furthermore, it is not clear that "Bayesians ought to get a habit of volunteering this information unprompted," because this information, besides being irrelevant to most decisions, is not easy to communicate succinctly.
I disagree that this is a very small subset of problems; the majority of real-life problems let you decide whether to wait and collect more information, or how many resources you're willing to bet. See the examples at https://en.wikipedia.org/wiki/Multi-armed_bandit
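As a concrete sketch of why the posterior mean alone is not enough in such problems (all numbers hypothetical), here is a minimal Thompson-sampling loop for a two-armed Bernoulli bandit: the policy samples from each arm's full posterior, so posterior width, not just the mean, drives how much an arm gets explored.

```python
import random

# Two-armed Bernoulli bandit with Thompson sampling; each arm's
# belief is a Beta(a, b) posterior. Hypothetical setup throughout.
true_p = [0.6, 0.7]                  # unknown to the agent
beliefs = [[1.0, 1.0], [1.0, 1.0]]   # Beta(a, b) params per arm

for _ in range(1000):
    # Draw a plausible heads-probability from each arm's posterior;
    # wider posteriors produce more varied draws, hence exploration.
    draws = [random.betavariate(a, b) for a, b in beliefs]
    arm = draws.index(max(draws))
    reward = 1 if random.random() < true_p[arm] else 0
    beliefs[arm][0] += reward        # a += successes
    beliefs[arm][1] += 1 - reward    # b += failures

print("posterior (a, b) per arm:", beliefs)
```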
For example, I think I first noticed this problem many years ago in one of Scott's linkdumps, where he disapprovingly linked to Obama saying that the CIA had told him such-and-such a thing had a 70% probability, but that really they had no good information, so it was a coinflip. And Scott was indignant: 70% is 70%, what more do you need to know before you authorize some military operation, even the President doesn't understand probability, smdh. In my opinion Obama was right, if technically imprecise, while Scott was wrong, which demonstrates the danger of having a little knowledge and also the need for more awareness.
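A minimal sketch of the distinction (numbers hypothetical): two Beta posteriors can both have mean 0.7, yet the one backed by lots of evidence barely moves on new data, while the "coinflip-grade" one swings wildly.

```python
# Same posterior mean 0.7, very different evidence behind it;
# both then observe 5 new failures. Hypothetical numbers.
weak = (7.0, 3.0)        # mean 0.7 from ~10 pseudo-observations
strong = (700.0, 300.0)  # mean 0.7 from ~1000 pseudo-observations

for name, (a, b) in [("weak", weak), ("strong", strong)]:
    a2, b2 = a, b + 5    # conjugate Beta update on 5 failures
    print(f"{name}: mean {a / (a + b):.3f} -> {a2 / (a2 + b2):.3f}")
```

The weak posterior drops from 0.700 to about 0.467; the strong one barely moves, to about 0.697.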
You say this as if it's not Bayesians' fault that they have not developed (or gotten into the habit of using) a succinct way of conveying how much of an estimate comes from the prior and how much from subsequent updates. I would understand if this were an acknowledged hard problem in need of the community's attention, but the Yudkowsky Sequences, for example, don't mention it at all.
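One candidate for such a summary, sketched here under a conjugate Beta-prior assumption (my illustration, not anything from the Sequences): report the posterior mean together with its equivalent sample size, so that "probably 0.7" comes tagged with how much evidence stands behind it.

```python
# Hypothetical succinct report: posterior mean plus the equivalent
# sample size (pseudo-observation count) of a Beta(a, b) posterior.
def summarize(a: float, b: float) -> str:
    mean = a / (a + b)
    n_eff = a + b  # pseudo-observations: prior weight plus data
    return f"p = {mean:.2f} (n_eff = {n_eff:g} observations)"

print(summarize(7, 3))      # p = 0.70 (n_eff = 10 observations)
print(summarize(700, 300))  # same mean, 100x the evidence
```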