RE: infinite reincarnation, it's just plainly, mathematically untrue that infinite time guarantees that vanishingly small probabilities must occur, much less that they must occur infinitely many times. You can demonstrate this with a simple mathematical model (see "AI slop" below), but it should also be a natural conclusion given the increase in entropy over time. Our current configuration has a certain value of entropy, and saying that we should see that value of entropy again (followed by the same characteristic increase we observe today) violates thermodynamics. And frankly, I don't find "when you're dead you're dead" to be a big hit against pure materialism. If anything, it's a bit motivating.
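(To make that concrete with a standard argument: if the probability of the event at step t shrinks geometrically, say P(Eₜ) ≤ c·rᵗ with 0 < r < 1, then the expected number of occurrences over all time is Σₜ₌₁^∞ P(Eₜ) ≤ c·r/(1 - r), which is finite, and the first Borel–Cantelli lemma then says the event almost surely happens only finitely many times, no matter how much time elapses. The geometric decay is an assumption on my part, but it's exactly what the toy model below exhibits.)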
###Claude chat###
User: Consider two objects that each have a position on the integer number line. At t=0, both objects are at position=0. At each discrete time step, object 1 has a 90% chance to increase its position by 1, and 10% chance to decrease its position by 1. Conversely, object 2 has a 90% chance to decrease its position by 1 and a 10% chance to increase its position by 1. Give a function of discrete time, t>0, that describes the probability that object 1 and object 2 have the same position.
Claude: Let me solve this step by step.
First, let's understand what needs to happen for the objects to be at the same position:
Given their movement probabilities, Object 1 tends to move right while Object 2 tends to move left
To be at the same position at time t, they must have made the same net movement
For any time t, let's consider:
Let X₁ be the net movement of Object 1 (positive = right)
Let X₂ be the net movement of Object 2 (positive = right)
We want P(X₁ = X₂)
For them to have the same position:
The number of right moves minus left moves must be equal for both objects
Given that they start at the same position (0)
Key insights:
At each step, Object 1 has p₁ = 0.9 for +1 and q₁ = 0.1 for -1
Object 2 has p₂ = 0.1 for +1 and q₂ = 0.9 for -1
This means Object 2's movement is the negative of Object 1's distribution
Therefore, X₂ has the same distribution as -X₁
For time t:
X₁ follows a binomial distribution with parameter p₁ = 0.9, scaled to ±1 movements
X₂ follows a binomial distribution with parameter p₂ = 0.1, scaled to ±1 movements
The probability we want is P(X₁ = X₂)
For the objects to be at the same position:
The total number of rightward moves (k) for both objects must be equal
Given time t, k must satisfy: 0 ≤ k ≤ t
Therefore, the probability at time t is:
P(same position at t) = Σₖ P(Object 1 makes k right moves) × P(Object 2 makes k right moves)
This can be written as:
P(t) = Σₖ₌₀ᵗ (t choose k)(0.9)ᵏ(0.1)ᵗ⁻ᵏ × (t choose k)(0.1)ᵏ(0.9)ᵗ⁻ᵏ
Simplifying:
P(t) = Σₖ₌₀ᵗ (t choose k)² (0.09)ᵏ(0.09)ᵗ⁻ᵏ = (0.09)ᵗ × Σₖ₌₀ᵗ (t choose k)²
This is the exact formula for the probability that the objects are at the same position at time t.
As t increases, this probability approaches 0, which makes intuitive sense as Object 1 tends to move right while Object 2 tends to move left, making their meeting increasingly unlikely.
User: What is the mean number of times object 1 and object 2 will have the same position integrated from t=1 to infinity?
Claude: Let me solve this step by step.
This is asking us to find E[number of meetings] = Σₜ₌₁^∞ P(meeting at time t)
From our previous derivation:
P(meeting at t) = (0.09)ᵗ × Σₖ₌₀ᵗ (t choose k)²
A key insight: this sum is equivalent to counting the expected total number of times the objects will ever meet.
This is related to a classic probability problem. When two random walks with different drift eventually separate (one going mostly right, one going mostly left), the expected number of meetings has a closed form solution.
For simple random walks with transition probabilities p and 1-p, the expected number of meetings is:
E[meetings] = p(1-p)/(p² + (1-p)² - 2p(1-p)) = p(1-p)/(2p² - 2p + 1)
In our case: p = 0.9, 1 - p = 0.1
Plugging into the formula: E[meetings] = 0.9 × 0.1 / (2 × 0.9² - 2 × 0.9 + 1) = 0.09 / (1.62 - 1.8 + 1) = 0.09 / 0.82 = 0.1097... ≈ 1/9
Therefore, the mean number of times the objects will meet after t=0 is approximately 0.1097, or roughly 1/9.
This makes intuitive sense because:
The objects start together at t=0 (not counted in our sum)
They have strong tendencies to move apart (0.9 vs 0.1 probabilities)
Once they separate by a few positions, the probability of meeting again becomes very small
The geometric decay of meeting probabilities ensures the infinite sum converges
###End chat###
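As a cross-check on that last sum (unless I'm misapplying a standard identity): Σₖ₌₀ᵗ (t choose k)² = (2t choose t), and the generating function Σₜ₌₀^∞ (2t choose t) xᵗ = 1/√(1 - 4x) holds for |x| < 1/4, so the expected number of meetings after t=0 comes out exactly to Σₜ₌₁^∞ (0.09)ᵗ (2t choose t) = 1/√(1 - 0.36) - 1 = 1/0.8 - 1 = 0.25, a bit larger than the 1/9 from the closed form above. The exact constant isn't really the point, though; what matters is that it's finite and small, so the walks almost surely meet only finitely many times.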
I find Claude's answer pretty satisfactory. I know there's a lot of criticism here about the use of LLMs for checking your intuition about things, but my feeling is that it does quite well as long as you want reasoning rather than pure factual recall and you know how to prompt it well.
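If you'd rather check the model numerically than take anyone's algebra on faith (mine or the LLM's), here's a minimal Python sketch. It's my own quick check rather than anything from the chat, and the horizon and trial count are arbitrary choices; it just simulates the two walks and separately sums the per-step meeting probability derived above.

```python
# Minimal sanity check of the two-walk model above; my own sketch, not from the chat.
# It (1) Monte-Carlo-simulates the two biased walks and counts how often they share
# a position after t=0, and (2) sums the chat's exact per-step meeting probability
# P(t) = (0.09)^t * C(2t, t). Both should land on the same small, finite number.

import random
from math import comb

P_RIGHT_1 = 0.9    # object 1: probability of stepping +1 (else -1)
P_RIGHT_2 = 0.1    # object 2: probability of stepping +1 (else -1)
HORIZON = 100      # meeting probability is astronomically small long before this
TRIALS = 50_000    # arbitrary; more trials just tightens the estimate

def count_meetings() -> int:
    """Run one trial and count positions shared at t = 1..HORIZON."""
    pos1 = pos2 = 0
    meetings = 0
    for _ in range(HORIZON):
        pos1 += 1 if random.random() < P_RIGHT_1 else -1
        pos2 += 1 if random.random() < P_RIGHT_2 else -1
        if pos1 == pos2:
            meetings += 1
    return meetings

# Monte Carlo estimate of E[number of meetings after t=0].
mc_estimate = sum(count_meetings() for _ in range(TRIALS)) / TRIALS

# Direct summation of the chat's formula; terms decay geometrically, so
# truncating at HORIZON changes nothing visible.
analytic = sum(0.09**t * comb(2 * t, t) for t in range(1, HORIZON + 1))

print(f"Monte Carlo estimate of E[meetings]: {mc_estimate:.3f}")
print(f"Sum of (0.09)^t * C(2t, t):          {analytic:.3f}")
# Both values should roughly agree and sit well below 1: the walks typically
# never meet again after t=0, and raising HORIZON barely moves the numbers.
```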