Here’s a philosophical question about the “White Elephant” game: is it more likely that everyone will leave with a gift they like or that at least a few people will be miserable?
If you don’t know the game, the basic rules are these: every person brings one present to the game. An order is determined for players to take turns. On each turn, you may either open a present or steal a present from someone else.
This question came up over the weekend because my wife and I had a dispute over run-off rules: what happens to you after your present is stolen by someone else? My suggested rule was basically recursion: you get to open or steal, but you can only steal things that haven’t yet been stolen this round. (A round begins when someone who hasn’t played yet chooses to open a present or steal one.) Her suggested rule was that you open a new present, full stop.
I did some math and realized that run-off rule preference is probably biased by whether you think it’s more likely that everybody is trying to get to their preferred gift and there isn’t much competition for that gift, or whether you think it’s likely that someone brought a highly-desirable gift that everybody is after. Is everyone in your party a beautiful and unique snowflake, or is somebody Michael Scott?
Let’s say you have N people participating (let’s call them person A, person B, …), and each person brings a gift (let’s call them gift a, gift b, …). Let’s model preferences as simply as possible: each person has a single gift which is their preferred gift. The possible preferences for the case of two people are very simple: person A could prefer gift a or gift b, and person B could prefer gift a or gift b. Let’s write person A’s preference next to person B’s preference, so ab represents person A preferring gift a (their own) and person B preferring gift b, also their own. There are clearly four possibilities here: aa, ab, ba, bb. In two cases, everybody can leave happy (ab and ba), but physics prevents satisfaction in both the aa and bb cases.

It should already be clear that this looks like two binary bits (a=0, b=1; 00, 01, 10, 11), so it’s starting to look like we have an obvious encoding for this problem. The 3-person scenario is very similar: 000, 001, 002, 010, 011, 012, …, 222. The cases where everyone is happy are 012, 021, 102, 120, 201, 210. So we have N^N person-desire possibilities, against N! ideal outcomes. If you plot these against each other, you will see that total outcomes quickly exceed ideal outcomes. In our particular case we had 12 participants; the likelihood of the ideal outcome is 12!/12^12, or about 0.00537%. This means someone’s going home unhappy 99.995% of the time.
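That ratio is easy to check numerically. A quick sketch under the single-preference model above (nothing assumed beyond uniform, independent preferences):

```python
from math import factorial

def p_everyone_happy(n):
    """Probability that n independent, uniformly random single
    preferences form a permutation of the gifts, i.e. that an
    outcome exists where everyone gets their preferred gift."""
    return factorial(n) / n ** n

# The odds collapse quickly as the party grows:
for n in (2, 3, 6, 12):
    print(n, p_everyone_happy(n))
```

For n = 2 this gives the 2-in-4 result from the enumeration above, and for n = 12 it gives the roughly 0.00537% figure.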
You could argue that this is melodramatic. For one thing, people probably have a full preference list rather than a single favorite; maybe they’re getting their second choice. Most white elephant games have post-game exchanges. In the game we actually played, I know with some confidence that one person left unhappy. On the other hand, I don’t know what could have been done about that; some people are just built for misery.
I’m not sure how to analytically tackle the question of whose algorithm results in more happiness. A stronger model may be necessary: maybe we should model full preference lists and assign an integer happiness score, instead of the boolean flavor of a single preferred gift and a “did I get it?” outcome. Or maybe this model is sufficient; we could always simulate and see. One thing that bothers me about Liz’s rule is that player one never gets a chance to make a choice, even if they are stolen from. Liz’s retort was that if they are not stolen from, it doesn’t matter which ruleset is used; and if there’s always an iPod, it’s just going to keep getting stolen, and lots of people are going to be miserable anyway, so who cares.
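Since simulation is on the table, here’s a minimal sketch of both run-off rules under the single-preference model. The greedy policy (steal your preferred gift whenever someone holds it and it’s stealable, otherwise open at random), the turn structure, and the scoring are all my assumptions for the sketch, not rules from our actual game:

```python
import random

def play(n, rule, rng):
    """One game with n players; returns how many players end up
    holding their preferred gift.

    rule = "liz":       a stolen-from player opens a new present, full stop.
    rule = "recursion": a stolen-from player may steal anything not yet
                        stolen this round, or open.
    Assumed greedy policy: steal your preferred gift if it's visible and
    stealable; otherwise open a random present.
    """
    prefs = [rng.randrange(n) for _ in range(n)]  # player -> preferred gift
    unopened = list(range(n))
    rng.shuffle(unopened)
    holder = {}  # gift -> player currently holding it
    held = {}    # player -> gift currently held

    def open_gift(p):
        g = unopened.pop()
        holder[g], held[p] = p, g

    for p in range(n):           # each player starts one round, in order
        stolen_this_round = set()
        cur = p
        while True:
            g = prefs[cur]
            if g in holder and g not in stolen_this_round:
                victim = holder[g]
                del held[victim]
                holder[g], held[cur] = cur, g
                stolen_this_round.add(g)
                if rule == "liz":
                    open_gift(victim)
                    break
                cur = victim     # recursion rule: the victim plays next
            else:
                open_gift(cur)   # every round ends with exactly one open
                break
    return sum(held.get(q) == prefs[q] for q in range(n))

rng = random.Random(0)
trials = 2000
for rule in ("liz", "recursion"):
    avg = sum(play(12, rule, rng) for _ in range(trials)) / trials
    print(rule, round(avg, 2))
```

Which rule scores higher isn’t obvious from the sketch, and the answer presumably depends on the policy assumptions, which is exactly the snowflakes-versus-iPod question.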
Another question I’d like to analyze, which I think can be done with combinatorics: what is the likelihood of M ≤ N people being unsatisfied? The work above covers one or more; what’s the exact curve for exactly 1, 2, 3, etc. people being dissatisfied?
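One way to make that precise under the single-preference model: if d distinct gifts are preferred across the party, at most d people can be satisfied, so even the best possible assignment leaves exactly N − d people unsatisfied. Counting preference vectors with exactly d distinct values is a standard surjection count. This best-case framing is my assumption; the game’s actual outcome could be worse:

```python
from math import comb

def unsat_curve(n):
    """dist[m] = probability that, with n uniform random single
    preferences, even the best possible gift assignment leaves
    exactly m people without their preferred gift."""
    dist = {}
    for d in range(1, n + 1):  # d = number of distinct gifts preferred
        # surjections from n people onto a fixed set of d gifts
        surj = sum((-1) ** j * comb(d, j) * (d - j) ** n
                   for j in range(d + 1))
        dist[n - d] = comb(n, d) * surj / n ** n
    return dist
```

For n = 2 this recovers the aa/ab/ba/bb analysis (half the time nobody need be unhappy, half the time exactly one person must be), and dist[0] for n = 12 is the 12!/12^12 figure from before.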