For the probabilists in the audience
I'd like your thoughts on a problem. (Here, all "randomly chosen" numbers are taken from the same continuous real distribution.) Suppose you have a set S of n randomly chosen numbers, and a set D, initially empty. Begin by removing the largest number from S, placing it in D, and adding another random number to S. Now take the largest number in S that is smaller than the last number you chose, place it in D, and add another random number to S to replace it. Repeat until the number you take is the smallest one in S.
- What is the distribution of numbers in D?
- What is the expected value for the size of D?
- Bonus question (I know the answer to this one!): Why is Brian thinking about this problem?
I'm about to start in on a Monte Carlo estimator of the problem, but I'd like some theoretical input.
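In case anyone wants to play along at home, here's the rough sketch I have in mind for that estimator (Python; the Uniform(0, 1) draws and the names run_trial and estimate_mean_size are just placeholders I picked, not anything sacred):

```python
import random

def run_trial(n):
    """One run of the process; returns the contents of D.

    Draws are Uniform(0, 1) -- an assumption, but for the *size* of D
    only the relative order of the draws should matter.
    """
    s = [random.random() for _ in range(n)]
    d = []
    threshold = float("inf")  # the first pick is simply the max of S
    while True:
        # Largest element of S below the last number chosen.
        pick = max(x for x in s if x < threshold)
        # If the pick is the smallest element of S, this is the last step.
        was_smallest = (pick == min(s))
        d.append(pick)
        s.remove(pick)
        if was_smallest:
            return d
        s.append(random.random())  # replace the number we took
        threshold = pick

def estimate_mean_size(n, trials=10_000):
    """Monte Carlo estimate of E[|D|] for a given n."""
    return sum(len(run_trial(n)) for _ in range(trials)) / trials

if __name__ == "__main__":
    for n in (2, 4, 8, 16, 32):
        print(n, estimate_mean_size(n))
```

For the distribution question, histogramming the values that come out of run_trial over many trials should at least give something to compare against whatever theory turns up.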
no subject
Oooh, I know! It's 'cuz he's a nerd, right? =o]
More seriously, my gut tells me that the expected value for the size of D is probably infinite. I was never all that good at this sort of math, and I'm also ridiculously out of practice. I'm still going with "infinite", though.
no subject
I don't believe it's infinite. The number you pick is guaranteed to be smaller with each iteration, and once you take the smallest one you're done.
no subject
Yes, but you keep adding numbers to the set, and it's a continuous real distribution, so infinite isn't *that* unreasonable.
I never said I was right, or even that I'd bet on it; that was just my first reaction. I'm sure the hordes of actual mathematicians will pounce on me with their mockery masked by logic any minute now. =o]