
## Comment Solution (Score 1)

Let r, p and s be the probabilities we play rock, paper and scissors on a given game respectively. It follows that,

r + p + s = 1
where r, p and s are elements of the interval [0,1]

For our opponent we will use a similar notation, only with uppercase letters,

R + P + S = 1
where R is an element of [1/2, 1] and P and S are elements of [0, 1/2]

Rearranging, we find that,

s = 1 - r - p
S = 1 - R - P

Now the expected gain, G, from any one game is given by,

G = AW + 0D + (-A)L = A(W - L)
where A is the amount exchanged between players and W, D and L are the probabilities of winning, drawing and losing respectively.

The probability of winning a single game, W, is given by the sum of the probabilities of each of the winning configurations occurring,

W = rS + pR + sP

Similarly, the probability of losing a game is

L = Rs + Pr + Sp

and so the expected gains is given by

G = A( r - R + P - p + 3(pR - Pr) )
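This closed form is easy to sanity-check numerically against the definition G = A(W - L). A minimal Python sketch (the function names are mine, and A is taken as 1 without loss of generality):

```python
import random

def random_strategy():
    # two uniform cuts of [0, 1] give a random point on the probability simplex
    a, b = sorted(random.random() for _ in range(2))
    return a, b - a, 1 - b  # (rock, paper, scissors) probabilities

def check_gain_formula(trials=10_000):
    for _ in range(trials):
        r, p, s = random_strategy()
        R, P, S = random_strategy()
        W = r*S + p*R + s*P                      # win probability
        L = R*s + P*r + S*p                      # loss probability
        direct = W - L                           # G with A = 1
        closed = r - R + P - p + 3*(p*R - P*r)   # the closed form above
        assert abs(direct - closed) < 1e-9
    return True
```

Expanding W - L by hand gives the same cancellation the check confirms.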

Now note that our opponent must play rock at least half the time, so whenever we play scissors, at minimum we will lose half of the time and so it will never improve our gains. Therefore we should never play scissors. In the worst case scenario, our opponent is aware that we will never play scissors and that playing rock more than he has to will never improve his gains. This means

R = 1/2
G = A( r + (p-1)/2 + P(1-3r) )

Assuming the opponent is aware of this, he will do what he can to minimize G. This would mean that if (1-3r) is negative then he would maximize P (the only variable he controls). Similarly if (1-3r) is positive, he will minimize P. Finally if (1-3r) is 0, he cannot affect G.

In all of these scenarios, if we set r = 1/3 (or at least as close as possible to it) we will maximize G. This means that we should play rock a third of the time and paper the rest of the time. Our expected gains per game is G = A/6 so if A = \$100 we should only be willing to pay \$16.66 or less if we wish to make any gains per game.
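A quick Monte Carlo run backs this up: playing rock a third of the time and paper the rest, against an opponent who throws rock half the time, yields roughly A/6 per game no matter how the opponent splits the remainder. A sketch (names and parameters are mine; the stake is normalised to A = 1):

```python
import random

def play(our_rock=1/3, opp_paper=0.25, games=200_000, amount=1.0):
    """Estimate average gain per game for the rock-1/3 / paper-2/3 strategy
    against an opponent who plays rock half the time and paper with overall
    probability opp_paper (which must be at most 1/2)."""
    gain = 0.0
    for _ in range(games):
        us = 'R' if random.random() < our_rock else 'P'
        u = random.random()
        if u < 0.5:
            them = 'R'
        elif u < 0.5 + opp_paper:
            them = 'P'
        else:
            them = 'S'
        if us == them:
            continue  # draw: no money changes hands
        wins = (us == 'P' and them == 'R') or (us == 'R' and them == 'S')
        gain += amount if wins else -amount
    return gain / games
```

Varying `opp_paper` anywhere in [0, 1/2] leaves the estimate near 1/6 ≈ 0.167, as the analysis predicts.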

## Comment Re:Full analytical solution (Score 2)

Oh my, another correction and this one is bigger! I assumed that player 2 would always play the rock half the time when in fact he is still able to choose rock even when the coin-toss doesn't force it. This adds another variable for which I didn't account.

Still, without going into expanding the probabilities, my intuition tells me that it doesn't improve his strategy to increase the rate of playing rock, as this would only make him more exploitable. Whether my intuition is correct is something I'll test at a more opportune time.

## Comment Re:Full analytical solution (Score 2)

Another correction (this time to the second to last line): "Finally it's nice to observe that when player 1 sets x = 1/3, the second player's choice of z has no effect on the gains function whatsoever."

I really should have proofread this post.

## Comment Re:Full analytical solution (Score 1)

Correction for seventh line from bottom: "and player 1 would maximize this by setting x = 1/3 (as was assumed for this case) and y = 2/3 which gives F(1/3, 2/3, 0) = 2/3 - 1/2 = 1/6."

## Comment Full analytical solution (Score 2)

As some have said, the optimum is a mixed strategy of playing paper 2/3 of the time and rock the rest of the time.

Let's say we're player 1 and they're player 2. The way to calculate this exactly is to first observe that the general form for each player's mixed strategy is:

P(r1) + P(p1) + P(s1) = 1
1/2 + P(p2) + P(s2) = 1

where, for example, P(r1) is the probability of a rock being played by player 1.

This means that for a given P(r1), P(p1) and P(p2),

P(s1) = 1 - P(r1) - P(p1) [1]
P(s2) = 1/2 - P(p2) [2]

Now observe that the probability of player 1 winning, P(W), is the sum of the probabilities of each of the winning configurations occurring,

P(W) = P((r1 and s2) or (p1 and r2) or (s1 and p2))
= P(r1 and s2) + P(p1 and r2) + P(s1 and p2)
= P(r1)*P(s2) + P(p1)*P(r2) + P(s1)*P(p2)

After substituting from [1] and [2] we get the following,

P(W) = P(r1)*(1/2 - P(p2)) + P(p1)*(1/2) + (1 - P(r1) - P(p1))*P(p2)

Let's make this easier to read:

x = P(r1)
y = P(p1)
z = P(p2)

So then we have:

P(W) = x*(1/2 - z) + y/2 + (1 - x - y)*z

Similarly, we find that the probability of losing, P(L), is:

P(L) = x*z + y*(1/2 - z) + (1 - x - y)/2

Now if we gain 1 for winning, gain -1 for losing (i.e. make a loss of 1) and nothing happens when we draw, then the first player's expected gain, F, is given by:

F(x, y, z) = (1)*P(W) + (0)*P(D) + (-1)*P(L) = x + y/2 - 1/2 + z*(1 - 3*x)

[Note that we didn't bother finding the probability of drawing because it has no effect on the value of the expected gains]
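The simplification from P(W) - P(L) to the closed form above can be verified numerically over random admissible (x, y, z). A hedged Python sketch (function names are mine):

```python
import random

def F_direct(x, y, z):
    # expand P(W) and P(L) from the winning / losing configurations
    pw = x*(0.5 - z) + y*0.5 + (1 - x - y)*z
    pl = x*z + y*(0.5 - z) + (1 - x - y)*0.5
    return pw - pl

def F_closed(x, y, z):
    return x + y/2 - 1/2 + z*(1 - 3*x)

def check(trials=10_000):
    for _ in range(trials):
        y = random.random()
        x = random.random() * (1 - y)   # keep x + y <= 1
        z = random.random() * 0.5       # player 2's paper rate is at most 1/2
        assert abs(F_direct(x, y, z) - F_closed(x, y, z)) < 1e-9
    return True
```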

Now, recall that player 1 controls x and y (the probability of player 1 playing rock and paper respectively) and player 2 controls z (the probability of player 2 playing paper). Be aware that x and y can at least be zero and at most be 1 minus the other. Also note that z can only range from 0 to 1/2 (because the other half of the time player 2 must throw a rock). Finally, you should understand that player 1 wants to maximize F and player 2 wants to minimize F.

From this equation it is clear that when x is less than or equal to 1/3, the coefficient (1 - 3*x) is non-negative, so player 2 should minimize z in order to minimize F. So in this case,

F(x, y, z) = F(x, y, 0) = x + y/2 - 1/2

and player 1 would maximize this by setting x = 1/3 (as was assumed for this case) and y = 2/3, which gives F(1/3, 2/3, 0) = 2/3 - 1/2 = 1/6.

If on the other hand x were set to a probability greater than or equal to 1/3, player 2 would want to maximize z. This would give,

F(x, y, z) = F(x, y, 1/2) = x + y/2 - 1/2 + (1/2)*(1 - 3*x) = (y - x)/2

Player 1 would maximize this by minimizing x (x = 1/3) and maximizing y (y = 2/3 as x + y = 1),

F(x, y, z) = F(1/3, 2/3, 1/2) = ((2/3) - (1/3))/2 = 1/6.

Finally it's nice to observe that when player 1 sets x = 1/3, the second player's choice of z has no effect on the gains function whatsoever.

So, in conclusion, the optimal strategy for player 1 is x = 1/3 and y = 2/3 where x is the probability of player 1 playing rock and y is the probability of player 1 playing paper.
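Since F is linear in z, player 2's best reply always sits at an endpoint (z = 0 or z = 1/2), so a simple grid search over player 1's strategies recovers this maximin. A Python sketch (names are mine; it assumes, per the argument above, that player 1 never plays scissors, so y = 1 - x):

```python
def F(x, y, z):
    return x + y/2 - 1/2 + z*(1 - 3*x)

def maximin(steps=300):
    """Grid-search player 1's rock probability x; F is linear in z,
    so player 2's best reply is at z = 0 or z = 1/2."""
    best_val, best_xy = -1.0, None
    for i in range(steps + 1):
        x = i / steps
        y = 1 - x  # scissors never helps player 1, so play paper the rest
        worst = min(F(x, y, 0.0), F(x, y, 0.5))
        if worst > best_val:
            best_val, best_xy = worst, (x, y)
    return best_val, best_xy
```

The search lands on x = 1/3, y = 2/3 with value 1/6, matching the analysis.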

## Comment Re: I wonder.... (Score 0)

If you think you're in any way living in a free country, no matter where you live, then you must be high on something.

If you're high on something there's a good chance you're living in a free country, so I would mostly agree with you.

## Comment It wasn't going to stay censored forever... (Score 1)

Surely there comes a time when such atrocities are so far behind you that they no longer feel like 'our fault'. When this time comes, they can serve as a useful indication of how much better things are today (particularly to remind those who would wish a return to those days).

## Comment Re:'Kill shot' cameras (Score 1)

I appreciate your response and can also see your point of view. I understand that, for you, a hunter is more than a person who hunts: he must also hunt for the right reasons.

I can imagine that a lot, perhaps most, of the people who hunt to live do so without taking any pleasure in the actual killing. I can't fault those people. The moral grey area comes with those who hunt for sport, since the aim is already enjoyment, and taking satisfaction in the creature's death could easily be part of it.

To be clear, I'm not talking about the satisfaction of a job well done: I'm talking about taking pleasure in extinguishing life for its own sake. This is what disgusts me personally, and I think it does you too.

So if that's how you feel, I'm glad we agree. Just don't assume that everyone who hunts does so for the same reasons as you. And don't assume that everyone's definition of 'hunter' is the same as yours.

## Comment Re:'Kill shot' cameras (Score 1)

I'm pretty sure it's bloodlust such as in the parent post that DogDude was talking about, and it doesn't take great leaps for a person to follow this bloodlust into hunting, so I'm confident holmedog's generalisation is wrong. Looking at the moderation, though, it would seem Slashdotters would prefer what he said to be true and so have modded him informative.

I personally differ with people who want to turn creatures into clouds of blood, and share DogDude's sentiment. If you consider this an irrational stance to take, I would agree that it is, along with caring about other human beings and the like.

## Comment Thank you. (Score 1)

Thank you, Slashdot, for giving me one more way in which I'm more like a woman than I am like a man. :(

## Comment Re:Bad metric (Or, I have a better solution) (Score 1)

Given that the driver ran the red light, there's zero chance you will have predicted it as opposed to MIT's nonzero chance.

That isn't to say your method doesn't have its advantages, but so does assuming everyone will run the red light. What matters more is how you use that information and whether the reactionary costs are outweighed by the potential costs of not acting.

## Comment Re:I'll be first to say WTF (Score 1)

To say that 0.999... = 1 is undeniably the case is just not true.

It's quite possible to define a number system that has numbers separated by an infinitely small gap just as much as you can have numbers so large they can only be expressed as bigger than all numbers of a set. It's still maths and it still produces true statements.

Maths will only ever churn out true statements based on the assumptions you make; it's your philosophy that will tell you which assumptions are meaningful.
