Wikipedia:Reference desk/Archives/Mathematics/2012 March 7

Mathematics desk
Welcome to the Wikipedia Mathematics Reference Desk Archives
The page you are currently viewing is an archive page. While you can leave answers for any questions shown below, please ask new questions on one of the current reference desk pages.


March 7

weighted random walk

Recall my weighted random walk: starting from a value j, at each step I win 6.375 with probability 0.8 or lose 25 with probability 0.2 (an average gain of +0.1 per step).

Now here are some generalized questions:

1.
a. For a positive integer k, what is the probability p that I will touch or go below j-k at least once in the next n steps of my walk?
b. What is the expected current value after n steps, starting at j?
c. What is the standard deviation of this value? Does it roughly follow a bell curve (i.e. is it approximately normally distributed) or, if not, how is it distributed?

2.
a. For a positive integer k, what number of steps m gives a probability p = 0.5 of touching or going below j-k at least once during those m steps, starting at value j? And what about an arbitrary probability p?
b. If m is the number of steps it takes to first touch or go below j-k, what are the average value and standard deviation of m? (Sorry if this is in fact the same question as part (a): that part is about a probability of 0.5, while this one is about the average value of m before the walk actually touches. I guess these might be the same?)
c. What is the standard deviation of the two quantities in part (b)? Do they roughly follow a bell curve (are they approximately normally distributed) or, if not, how are they distributed?

3. For a given positive integer k, if I would like to take q steps with probability p of never touching or going below j-k, what should j be?


If it is much easier to answer the above questions when j = k (so that touching or going below j-k means touching or going below zero), then you can go ahead and assume that.

I have very little background in statistics. I'm trying to find the above answers by Monte Carlo simulation, but it's just taking too long. I'm hoping someone who knows statistics can give me formulas. Thanks!
--80.99.254.208 (talk) 06:31, 7 March 2012 (UTC)
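
For reference, here is a minimal Monte Carlo sketch of questions 1a-1c, assuming the walk parameters that come up later in the thread (win 6.375 with probability 0.8, lose 25 otherwise, start at j, barrier at j-k); the function name and the trial counts below are placeholders only:

    import random, statistics

    def simulate(n_steps, k, trials=100_000):
        """Estimate P(touch j-k within n_steps) and the mean/stdev of the final value, relative to j."""
        finals, hits = [], 0
        for _ in range(trials):
            value, hit = 0.0, False          # track the walk relative to j; the barrier sits at -k
            for _ in range(n_steps):
                value += 6.375 if random.random() < 0.8 else -25.0
                if value <= -k:
                    hit = True               # touched or went below j-k at least once
            finals.append(value)
            hits += hit
        return hits / trials, statistics.mean(finals), statistics.stdev(finals)

    print(simulate(n_steps=1000, k=1500))

Ignoring the barrier, the central limit theorem says the final value is approximately normal for large n, with mean about 0.1*n above j, which partly addresses 1b and 1c.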

See Gambler's ruin#Unfair coin flipping and just let the second player's number of coins be infinite to get the chance of ever going to zero. The boundary case of random walks is your general problem, but that article is a bit harder. Also, as I envisage the drunkard's walk with a canal on one side, the real question is whether they will fall in before they get to the end of the path, not how many times they fall in. ;-) Dmcq (talk) 09:53, 7 March 2012 (UTC)


I can't give you the formulas (I'm not sure there are simple formulas), but instead of using the Monte Carlo method you can calculate all the probabilities exactly, starting with one throw, then two, then three, etc.; writing a program for this shouldn't be too hard. Let's say the chance of winning p times out of N throws is (p,N). You know (1,1) = 0.8 and (0,1) = 0.2; then you calculate (0,2), (1,2) and (2,2), and so on. In general, (p,N) = 0.2 * (p,N-1) + 0.8 * (p-1,N-1). Start off at $0 to keep it simple; then you only need to know how much money is won or lost for every case (p,N), namely 6.375*p - 25*(N-p). Calculate up to a reasonable value of N and use that array to get the numbers you want. You'll have to do some extra work to get the chance of going below a value at least once, because you'll have to exclude the cases (p,N) that have already reached that limit, so that you don't use them when calculating (p,N+1) and (p+1,N+1): winning after you went bust doesn't count. The average value after N throws will be N * 0.1 (I think); the standard deviation will be almost the same as for the case of +6.25 and -25 (which has mean value 0). The difference from a fair game (average = 0) is so small (the difference between winning 50 times and 51 times) that you'll have to go to a high number of throws before you'll notice much difference. Since a player in a fair game with limited wealth will always go broke against a bank with unlimited capital, it's no surprise that your simulation went bust so many times. 84.197.178.75 (talk) 14:08, 7 March 2012 (UTC)
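
A rough sketch of the tabulation described above (not the poster's actual code), using the same 0.8/0.2 and +6.375/-25 numbers, working relative to a start of 0 with the barrier at -k; the names are illustrative only:

    def survival_table(n_steps, k):
        """prob[w] = P(w wins after n_steps AND the barrier -k was never touched); hit = P(touched)."""
        WIN, LOSS, P_WIN = 6.375, 25.0, 0.8
        prob = {0: 1.0}                      # zero throws so far: value 0, no wins, not yet bust
        hit = 0.0                            # accumulated probability of ever touching -k
        for n in range(1, n_steps + 1):
            new = {}
            for w, pr in prob.items():
                for dw, p_branch in ((1, P_WIN), (0, 1.0 - P_WIN)):
                    w2 = w + dw
                    value = WIN * w2 - LOSS * (n - w2)
                    if value <= -k:
                        hit += pr * p_branch     # absorbed: a bust state is never extended further
                    else:
                        new[w2] = new.get(w2, 0.0) + pr * p_branch
            prob = new
        return prob, hit

Here hit answers question 1a for n_steps throws, and the surviving table gives the distribution of the value among walks that never went bust (its probabilities sum to 1 minus hit).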

I'm not sure I understand your answer completely, but let's make sure we don't misunderstand each other: you say "Since a player in a fair game with limited wealth will always go broke against a bank with unlimited capital, it's no surprise that your simulation went bust so many times", and you think that applies equally when the player has a 10% average advantage against the house, wins four times out of five, and when he doesn't win loses only about 3.9 times as much as he usually (four times out of five) wins? By the way, in my simulation it seems that if you have a bankroll of 7000 you can go up to a BILLION and not bust -- due to your 10 percent advantage -- but having a bankroll of 1500 will leave you busting before you ever reach ten thousand. Can you explain this discrepancy using math? --80.99.254.208 (talk) 19:17, 7 March 2012 (UTC)
If you'll look at the link I gave, you'll see that the certain-ruin result only applies for fair odds. If a player wins 2/3 of the time and has 5 coins to start with, then they'll go bust with probability (1/2)^5, i.e. only 1/32 of the time. The 1/2 = (1-2/3)/(2/3). The chance of going bust if you start with 1500 and have a 10% advantage is extremely tiny indeed; I think your simulation is going wrong somewhere. Dmcq (talk) 20:50, 7 March 2012 (UTC)
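
For a unit-step walk (win or lose one coin per bet) against an infinitely rich opponent, the ruin probability quoted above is ((1-p)/p)^k when the win probability p exceeds 1/2, where k is the starting bankroll. A quick check of the figures given, as a sketch; note the formula assumes equal step sizes, so it is only a rough guide for the +6.375/-25 walk in the question:

    def ruin_probability(p_win, bankroll):
        """Ruin probability for a +/-1-step walk with win probability p_win, starting with `bankroll` coins."""
        if p_win <= 0.5:
            return 1.0                       # with no edge (or a negative one), eventual ruin is certain
        return ((1.0 - p_win) / p_win) ** bankroll

    print(ruin_probability(2/3, 5))          # 0.03125 = 1/32, matching the figure above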


Oops, my comment was for the 0.1 advantage, not the 1.1 you used in your simulation. Your pseudo-random generator may be at fault here, as Dmcq says: a linear congruential generator, as used in many programming languages, is not suitable for Monte Carlo simulations. 84.197.178.75 (talk) 03:05, 8 March 2012 (UTC)
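
If the simulation is in Python, one way to rule out the generator (a sketch, not a diagnosis): numpy's default generator is PCG64 and the standard random module uses the Mersenne Twister, neither of which is a linear congruential generator.

    import numpy as np

    rng = np.random.default_rng(12345)        # PCG64 by default, adequate for Monte Carlo work
    steps = rng.choice([6.375, -25.0], size=1_000_000, p=[0.8, 0.2])
    print(steps.mean())                       # should come out close to the +0.1 drift per throw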
Could you clarify? The correct advantage (I'm the OP) is +0.1, which is... oh, I guess that's not a ten percent advantage against the house at all! Let me ask for clarification... (new section) --80.99.254.208 (talk) 10:42, 8 March 2012 (UTC)