Actually, flink means nice in Danish.
“After a fairly shaky start to the day, Arthur's mind was beginning to reassemble itself from the shell-shocked fragments the previous day had left him with.
He had found a Nutri-Matic machine which had provided him with a plastic cup filled with a liquid that was almost, but not quite, entirely unlike tea.
The way it functioned was very interesting. When the Drink button was pressed it made an instant but highly detailed examination of the subject's taste buds, a spectroscopic analysis of the subject's metabolism and then sent tiny experimental signals down the neural pathways to the taste centers of the subject's brain to see what was likely to go down well. However, no one knew quite why it did this because it invariably delivered a cupful of liquid that was almost, but not quite, entirely unlike tea.”
Good news: after running extensive realistic experiments using the well-known disease simulator Plague Inc., I can conclude that the probability that Ebola will successfully annihilate humanity is quite low (it didn't spread first and then ramp up the mortality, which seems to be one of the only really winning strategies)! The bad news is that now might be a good time to move to Greenland, since it could still destroy everything else.
Actually, you will generally know an upper bound on the length of whatever is encrypted, given the encrypted text. That is something, so it is not quite perfect (in theory, anyway).
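As a toy illustration (a one-time-pad style XOR stream I made up for this post, not a real cipher), the ciphertext is exactly as long as the plaintext, so an eavesdropper learns the length even though the contents are hidden:

```python
import os

def xor_encrypt(plaintext: bytes, key: bytes) -> bytes:
    """Toy one-time pad: XOR each byte with a key byte. Not for real use."""
    assert len(key) >= len(plaintext)
    return bytes(p ^ k for p, k in zip(plaintext, key))

msg = b"attack at dawn"
key = os.urandom(len(msg))
ct = xor_encrypt(msg, key)
print(len(ct) == len(msg))  # True: the ciphertext leaks the message length
```

Real ciphers pad to block or record boundaries, which is why in practice you only get an upper bound rather than the exact length.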
I think your point counts against you.
I personally do not mind yards and feet too much, but I dislike miles, since the mile depends on the country.
The question isn't how much you bet against his bet, but how much you are willing to pay him to play a given game with fixed values for the outcomes (i.e. winning, losing and tying). Note this is different from a bet in exactly the way you mention with the 3rd player, except that you pay your opponent and not someone else, i.e. you lose your payment even if you tie.
If you win 75% of the time and he wins the rest, and you get 1 for a win, 0 for a tie and -1 for a loss, then you gain 1 with 75% chance and lose 1 with 25% chance (0.5 on average) and should then be willing to give him up to 0.5 before playing the game, because you then come out even (equivalently, the game is fair if you get 0.5 for winning (1 minus the 0.5 you paid upfront), -0.5 for tying and -1.5 for losing). Similarly, if you pay him 0.5 and you win 1 with 50% chance and tie and get 0 with the remaining 50%, you are still even.
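To make the arithmetic concrete, here is a small sketch (the helper name `expected_value` is mine, just for illustration) that computes the fair upfront payment as the expected value of one game:

```python
from fractions import Fraction as F

def expected_value(p_win, p_tie, payoff_win=1, payoff_tie=0, payoff_lose=-1):
    """Expected payoff of one game given win/tie probabilities."""
    p_lose = 1 - p_win - p_tie
    return p_win * payoff_win + p_tie * payoff_tie + p_lose * payoff_lose

# 75% win, no ties: worth 0.5, so paying 0.5 upfront makes the game fair.
print(expected_value(F(3, 4), F(0)))      # 1/2

# 50% win, 50% tie: also worth 0.5 upfront.
print(expected_value(F(1, 2), F(1, 2)))   # 1/2
```

Using Fraction keeps the arithmetic exact instead of relying on floating point.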
Game-theory theories are mathematical theories and not physics theories. Building on some given assumptions (i.e. that what we measure is what you are willing to pay upfront), the theories are correct (if we look away from possible errors in the proofs; that said, these are von Neumann's normal-form games we are looking at right now, and the proofs are correct under the assumptions used).
Say you increase to paper 2/3 - x and rock 1/3 + x. Then against his rock 1/2 + paper 1/2 you win 1/3 - x/2 of the time and he wins 1/6 + x/2 of the time. Hence you get 1/6 - x on avg. On the other hand, against his rock 1/2 + scissors 1/2 you win 1/2 of the time (whichever you picked, one of his two options loses to it, and he plays each with 50%) and he wins 1/3 - x/2 of the time, for 1/6 + x/2 on avg. Finding the x that maximizes the smaller of the two is easy: it is x = 0. Note that you get 1/6 against the strategy I mentioned for the other guy because you do not do anything really stupid (= scissors). Also, note that maximizing the smallest payoff is the most important thing if he is smart enough to figure out your strategy (since he will answer with the strategy leading to the smallest number).
This is right, yes.
His Nash strategy (minimax strategy) is to play rock 1/2, scissors 1/3 and paper 1/6 (it is true that rock 1/2 and scissors 1/2 is a best response to the Nash strategy, but that does not mean it is optimal against arbitrary strategies). I have a long earlier post showing the strategies and so on.
Note, I say I can argue that my strategy gives 1/6 on avg. in the sense that I did argue that in a looong post above
OK, let's see: you play rock 2/3 (I reduced the fraction, I hope that is OK), paper 1/6 and scissors 1/6.
First, a random round in which you play rock: in those you get 1/3 against his scissors and lose 1/6 against his paper, i.e. you gain 1/6.
Next, a random round in which you play paper: in those you get 1/2 against his rock and lose 1/3 against his scissors, i.e. again you gain 1/6.
Next, a random round in which you play scissors: in those you get 1/6 against his paper and lose 1/2 against his rock, i.e. you LOSE 1/3.
On avg you play rock 2/3 of the time and gain 1/6 in those rounds, scissors 1/6 of the time and LOSE 1/3, and paper 1/6 of the time and gain 1/6. Thus, on avg. 2/3*1/6 + 1/6*(-1/3) + 1/6*1/6 = 1/12. This is below your claimed lower bound, so there is something wrong with it (the reason is that you lose a bit on avg. whenever you play scissors).
My strategies, played against each other, give 1/6. Thus, you cannot say that yours is always better. I can argue that against ANY strategy mine gets at least 1/6. Yours cannot guarantee better than 1/12 (because that is what it gets against my strategy for the other player). Thus, yours cannot be optimal, sorry.
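For anyone who wants to check the numbers, here is a small sketch (the matrix orientation and names are mine; `other` is the proposed strategy as I read it from the thread above, rock 2/3, paper 1/6, scissors 1/6) evaluating both strategies against my strategy for the other player:

```python
from fractions import Fraction as F

# Payoff to "you" when you play row i and he plays column j
# (order: rock, paper, scissors); +1 win, 0 tie, -1 loss.
M = [[0, -1, 1],    # you play rock
     [1, 0, -1],    # you play paper
     [-1, 1, 0]]    # you play scissors

def value(you, him):
    """Expected payoff to you for mixed strategies over (R, P, S)."""
    return sum(you[i] * him[j] * M[i][j] for i in range(3) for j in range(3))

mine = (F(1, 3), F(2, 3), F(0))      # rock 1/3, paper 2/3, scissors 0
his = (F(1, 2), F(1, 6), F(1, 3))    # his minimax mix
other = (F(2, 3), F(1, 6), F(1, 6))  # the proposed strategy, as I read it

print(value(mine, his))   # 1/6
print(value(other, his))  # 1/12
```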
"you" are not the guy playing rock 50% of the time. "you" are the guy beating on the poor guy playing rock 50% of the time. The optimal choice is to play rock 1/3 and paper 2/3 (his is to play rock 1/2 and paper 1/6 and scissors 1/3).
Since I already explained the optimal solution to the basic question mentioned in the summary, let's solve the bonus question too. (My solution also matches the solution given in the comments on the article's site, so it should be good (and it is said to be correct by the author). Note that currently no answer with a high score is correct; mine has a score of 1.)
The bonus question is that you play two rounds, and your opponent must play rock at least once. So, if he plays something other than rock in the first round, he must play rock in the second and lose (you just play paper). If he plays rock in the first, he can play each option with probability 1/3 in the second (which leads to a draw like normal). Thus, if he plays rock first it is like normal RPS (because he gets 0 in the next round). Otherwise you get one free win (in the second round).
Thus, we can model the first round of the bonus question as follows (where the numbers are the net number of rounds he wins on avg., his wins minus yours, given the choices in round 1):
       R    P    S
  R    0   -1    1
  P    0   -1   -2
  S   -2    0   -1
Here you pick a column and he picks a row. We see that rock dominates paper for the row player (his payoff in the rock row is at least as large in every column). We get
       R    P    S
  R    0   -1    1
  S   -2    0   -1
For the column player, the choice of rock now dominates scissors. We get
       R    P
  R    0   -1
  S   -2    0
Playing rock 1/3 and paper 2/3 for the column player holds him to -2/3 net wins on avg. against either row. Similarly, the row player can secure -2/3 net wins on avg. by playing rock 2/3 and scissors 1/3, so -2/3 is the value of the game.
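To double-check, here is a small sketch (the dictionary encoding of the reduced 2x2 game is mine) that evaluates both mixes against every pure reply:

```python
from fractions import Fraction as F

# His net wins on avg. given round-1 choices, after dominated options
# are removed: he picks the row (R or S), you pick the column (R or P).
G = {('R', 'R'): F(0), ('R', 'P'): F(-1),
     ('S', 'R'): F(-2), ('S', 'P'): F(0)}

you = {'R': F(1, 3), 'P': F(2, 3)}   # your mix: rock 1/3, paper 2/3
him = {'R': F(2, 3), 'S': F(1, 3)}   # his mix: rock 2/3, scissors 1/3

# Your mix holds him to -2/3 against either of his pure choices,
# and his mix secures -2/3 against either of yours.
for r in ('R', 'S'):
    print(sum(you[c] * G[r, c] for c in ('R', 'P')))  # -2/3
for c in ('R', 'P'):
    print(sum(him[r] * G[r, c] for r in ('R', 'S')))  # -2/3
```

Since both players can force exactly -2/3, that is the value of the game.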
There is a flaw in your reasoning. You do not know that your opponent flipped, so you cannot condition on it like you do here (you cannot play paper all the time when he "flips" rock, because you do not know his coin flip). If you think about it, you should NEVER play scissors: in the best case for you he plays rock 50% and paper 50%, and then you get 0 in expectation, and clearly you have an advantage, so 0 is not good.
The optimal strategy is to play rock 1/3 and paper 2/3. It gets at least 1/6 against anything he could play. He can similarly ensure that you cannot get more than 1/6 per game by playing rock with probability 1/2, paper with probability 1/6 and scissors with probability 1/3. Your strategy gets less than 1/6 against that (more precisely, you get 1/6 if you play either rock or paper and you lose 1/3 if you play scissors; therefore you get 1/6*5/6 - 1/3*1/6 = 1/12, which is less than the 1/6 you get for playing rock 1/3 and paper 2/3).
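And a sketch of the guarantee itself (the grid of splits is arbitrary, chosen by me): he must play rock half the time, and we sweep over how he splits the other half between paper and scissors. The payoff of rock 1/3 + paper 2/3 is in fact constant at 1/6 for every split:

```python
from fractions import Fraction as F

# Payoff to you (rows: your R, P, S; columns: his R, P, S).
M = [[0, -1, 1],
     [1, 0, -1],
     [-1, 1, 0]]

def payoff(you, him):
    return sum(you[i] * him[j] * M[i][j] for i in range(3) for j in range(3))

mine = (F(1, 3), F(2, 3), F(0))

# He plays rock 1/2; sweep his paper weight p over a grid of [0, 1/2],
# with scissors taking the rest.
worst = min(payoff(mine, (F(1, 2), F(k, 24), F(1, 2) - F(k, 24)))
            for k in range(13))
print(worst)  # 1/6
```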
See my post above for an in-depth analysis.