An evil salesman has offered you a magic stone. You don't know the value of this stone yet; all you know is that it is worth an integer number of coins between 1 and 10 inclusive, and that the probability it is worth x coins is x/55 (these probabilities sum to 1, since 1 + 2 + ... + 10 = 55).
You have to pay the salesman an integer number of coins between 1 and 10, after which the salesman tells you the actual value of the stone. If the number of coins you paid is greater than or equal to the value of the stone, you get the stone; otherwise you don't. In either case, no change is given.
For example, if you pay the salesman 9 coins and the stone is worth 7 coins, you will receive the stone and have a net loss of 2 coins. However, if you pay the salesman only 5 coins for that same stone, you will not receive it and your net loss is the full 5 coins.
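For concreteness, here is a minimal sketch of that payoff rule (Python is assumed, and the function name net_loss is just an illustrative choice), checked against the example above:

```python
def net_loss(payment: int, value: int) -> int:
    """Coins lost on the deal: you always hand over `payment` coins, and you
    gain the stone's value only when payment >= value. No change is given."""
    return payment - (value if payment >= value else 0)

# The worked example from the question: the stone is worth 7 coins.
assert net_loss(9, 7) == 2   # paid 9, won the stone, net loss of 2 coins
assert net_loss(5, 7) == 5   # paid 5, no stone, net loss of all 5 coins
```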
You know that paying only one coin ensures the least expected loss, but how many coins would you have to pay to incur the greatest expected loss?
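Under these rules, the expected loss when paying c coins is c minus the value you expect to recover, i.e. c − Σ_{x=1}^{c} x·(x/55). Below is a minimal sketch (again assuming Python, with exact arithmetic via fractions) that tabulates this for every allowed payment; it confirms the stated claim that paying one coin minimizes the expected loss, and the payment with the largest value in the table answers the question.

```python
from fractions import Fraction

def expected_loss(c: int) -> Fraction:
    """Expected net loss when paying c coins, with P(value = x) = x/55:
    you always lose c, and recover the stone's value x only when x <= c."""
    return Fraction(c) - sum(Fraction(x * x, 55) for x in range(1, c + 1))

# Sanity check of the claim above: paying 1 coin gives the least expected loss.
assert min(range(1, 11), key=expected_loss) == 1

# Tabulate the expected loss for every possible payment from 1 to 10 coins.
for c in range(1, 11):
    print(c, float(expected_loss(c)))
```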