
Small-Scale Question Sunday for January 12, 2025

Do you have a dumb question that you're kind of embarrassed to ask in the main thread? Is there something you're just not sure about?

This is your opportunity to ask questions. No question too simple or too silly.

Culture war topics are accepted, and proposals for a better intro post are appreciated.


RE: infinite reincarnation, it's just plainly, mathematically untrue that infinite time guarantees that infinitesimally small probabilities must occur, much less that they must occur infinitely many times. You can demonstrate this with a simple mathematical model (see "AI slop" below), but it should also be a natural conclusion from the increase in entropy over time. Our current configuration has a certain value of entropy, and claiming that we should see that value again (followed by the same characteristic increase we observe today) violates thermodynamics. And frankly, I don't find "when you're dead you're dead" to be a big hit against pure materialism. If anything, it's a bit motivating.

###Claude chat###

User: Consider two objects that each have a position on the integer number line. At t=0, both objects are at position=0. At each discrete time step, object 1 has a 90% chance to increase its position by 1, and 10% chance to decrease its position by 1. Conversely, object 2 has a 90% chance to decrease its position by 1 and a 10% chance to increase its position by 1. Give a function of discrete time, t>0, that describes the probability that object 1 and object 2 have the same position.

Claude: Let me solve this step by step.

  1. First, let's understand what needs to happen for the objects to be at the same position:

    • Given their movement probabilities, Object 1 tends to move right while Object 2 tends to move left
    • To be at the same position at time t, they must have made the same net movement
  2. For any time t, let's consider:

    • Let X₁ be the net movement of Object 1 (positive = right)
    • Let X₂ be the net movement of Object 2 (positive = right)
    • We want P(X₁ = X₂)
  3. For them to have the same position:

    • The number of right moves minus left moves must be equal for both objects
    • Given that they start at the same position (0)
  4. Key insights:

    • At each step, Object 1 has p₁ = 0.9 for +1 and q₁ = 0.1 for -1
    • Object 2 has p₂ = 0.1 for +1 and q₂ = 0.9 for -1
    • This means each of Object 2's steps is distributed as the negative of Object 1's steps
    • Therefore, X₂ has the same distribution as −X₁
  5. For time t:

    • X₁ follows a binomial distribution with parameter p₁ = 0.9, scaled to ±1 movements
    • X₂ follows a binomial distribution with parameter p₂ = 0.1, scaled to ±1 movements
    • The probability we want is P(X₁ = X₂)
  6. For the objects to be at the same position:

    • The total number of rightward moves (k) for both objects must be equal
    • Given time t, k must satisfy: 0 ≤ k ≤ t

Therefore, the probability at time t is:

P(same position at t) = Σₖ P(Object 1 makes k right moves) × P(Object 2 makes k right moves)

This can be written as:

P(t) = Σₖ₌₀ᵗ (t choose k)(0.9)ᵏ(0.1)ᵗ⁻ᵏ × (t choose k)(0.1)ᵏ(0.9)ᵗ⁻ᵏ

Simplifying:

P(t) = Σₖ₌₀ᵗ (t choose k)² (0.09)ᵏ(0.09)ᵗ⁻ᵏ = (0.09)ᵗ × Σₖ₌₀ᵗ (t choose k)² = (0.09)ᵗ × (2t choose t)

where the last step uses the Vandermonde identity Σₖ₌₀ᵗ (t choose k)² = (2t choose t). This is the exact formula for the probability that the objects are at the same position at time t.

As t increases, this probability approaches 0, which makes intuitive sense as Object 1 tends to move right while Object 2 tends to move left, making their meeting increasingly unlikely.

User: What is the mean number of times object 1 and object 2 will have the same position integrated from t=1 to infinity?

Claude: Let me solve this step by step.

  1. This is asking us to find E[number of meetings] = Σₜ₌₁^∞ P(meeting at time t)

  2. From our previous derivation: P(meeting at t) = (0.09)ᵗ × Σₖ₌₀ᵗ (t choose k)²

  3. A key insight: by linearity of expectation, summing the meeting probability over all t ≥ 1 gives the expected total number of times the objects will ever meet.

  4. From step 2 and the Vandermonde identity Σₖ₌₀ᵗ (t choose k)² = (2t choose t): P(meeting at t) = (0.09)ᵗ × (2t choose t)

  5. The generating function of the central binomial coefficients gives a closed form: Σₜ₌₀^∞ (2t choose t) xᵗ = 1/√(1 − 4x), valid for |x| < 1/4

  6. Substituting x = 0.09: Σₜ₌₀^∞ (2t choose t)(0.09)ᵗ = 1/√(1 − 0.36) = 1/√0.64 = 1/0.8 = 1.25

  7. Subtracting the t = 0 term (which equals 1): E[meetings] = 1.25 − 1 = 0.25

Therefore, the mean number of times the objects will meet after t=0 is exactly 0.25, or 1/4.

This makes intuitive sense because:

  • The objects start together at t=0 (not counted in our sum)
  • They have strong tendencies to move apart (0.9 vs 0.1 probabilities)
  • Once they separate by a few positions, the probability of meeting again becomes very small
  • The geometric decay of meeting probabilities ensures the infinite sum converges

###End chat###
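The closed-form claims are easy to check numerically. A minimal Monte Carlo sketch (the helper names are mine, not from the chat); note the exact per-step probability is (0.09)ᵗ × (2t choose t), and the infinite sum evaluates to 1/√(1 − 0.36) − 1 = 0.25:

```python
import math
import random

def meeting_prob_exact(t, p=0.9):
    # P(objects share a position at step t) = (p*q)^t * C(2t, t), with q = 1 - p
    q = 1 - p
    return (p * q) ** t * math.comb(2 * t, t)

def simulate(trials=100_000, horizon=30, p=0.9, seed=0):
    # Empirical per-step meeting probabilities, plus the mean total meetings.
    rng = random.Random(seed)
    meets = [0] * (horizon + 1)
    total = 0
    for _ in range(trials):
        x1 = x2 = 0
        for t in range(1, horizon + 1):
            x1 += 1 if rng.random() < p else -1      # object 1 drifts right
            x2 += 1 if rng.random() < 1 - p else -1  # object 2 drifts left
            if x1 == x2:
                meets[t] += 1
                total += 1
    return [m / trials for m in meets], total / trials

probs, mean_meetings = simulate()
print(probs[1], meeting_prob_exact(1))  # both ≈ 0.18
print(mean_meetings)                    # ≈ 0.25
```

The horizon is truncated at 30 steps, but since the meeting probability decays roughly like (0.36)ᵗ, the omitted tail is negligible.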

I find Claude's answer pretty satisfactory. I know there's a lot of criticism here about the use of LLMs for checking your intuition about things, but my feeling is that it does quite well as long as you want reasoning rather than pure factual recall and you know how to prompt it well.

A simple example:

A random walk in 1D or 2D space is recurrent: the probability of returning to the origin approaches 1 as time goes to infinity.

On the other hand, the return probability of a random walk in 3D space converges to a finite value below 1 (about 0.34). The space of possible positions expands faster than the walk can explore it.

"A drunk man will find his way home, but a drunk bird may get lost forever."

I don't, however, think this is a rebuttal of a Boltzmann Brain. For any given volume of space-time, there is a finite set of distinct structures it can take, and over an infinite amount of time the probability of instantiating any particular one approaches 1. The key difference is that we are concerned with finite volumes holding a finite number of potential arrangements over infinite time, as opposed to the infinite volume of a random walk in three dimensions.
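The distinction can be made concrete: with a fixed per-trial probability p > 0 (the finite-volume case), the chance of never seeing the event decays to zero, whereas with a per-step probability that itself decays geometrically (like the meeting probability of the two drifting objects), the expected number of occurrences stays finite. A small illustrative sketch (the decay ratio 0.36 is arbitrary, chosen only to echo the chat's example):

```python
def prob_never(p, n):
    # Fixed per-trial probability p: chance the event never occurs in n trials.
    return (1 - p) ** n

def expected_hits(r, n):
    # Per-step probability r**t (geometric decay): expected total occurrences.
    return sum(r ** t for t in range(1, n + 1))

# Even a tiny fixed probability is eventually realized...
print(prob_never(1e-6, 10**8))    # ~ e^-100, effectively zero
# ...but a geometrically decaying one yields finitely many hits in expectation.
print(expected_hits(0.36, 1000))  # converges to 0.36/0.64 = 0.5625
```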

( @faul_sname , if you're still around, care to comment?)

A random walk in 1D or 2D space is recurrent: the probability of returning to the origin approaches 1 as time goes to infinity.

An unbiased random walk (where each direction is equally likely) in 1D and 2D space is recurrent.
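The qualifier matters for the original question: the biased walks in the chat are transient. For a 1D walk with P(+1) = p > 1/2, a standard result gives the probability of ever returning to the origin as 2(1 − p). A quick check at p = 0.9 (the function name is mine):

```python
import random

def biased_return_fraction(p=0.9, steps=200, trials=20_000, seed=2):
    # Fraction of biased 1D walks (P(+1) = p) that ever revisit the origin.
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        x = 0
        for _ in range(steps):
            x += 1 if rng.random() < p else -1
            if x == 0:
                hits += 1
                break
    return hits / trials

print(biased_return_fraction())  # theory: 2 * (1 - 0.9) = 0.2
```

Truncating at 200 steps barely matters here: the drift pushes the walk away so fast that late returns contribute almost nothing.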