# Is Using Someone's Reason Against Them Fraud?

## Recommended Posts

I don't want to wait until the next round for the game to get re-evaluated. I want you to give me a value for the next round before it is played. If you don't, you're ignoring the next round in your calculation. As I said before, the value of a play is:

-(2n - 1) + p * (20 - (2n + 1)) + (1 - p) * something.

Until you supply the "something", you're ignoring what happens if the opponent doesn't drop out. There's a probability p that the opponent drops out and a probability 1 - p that he doesn't. If he doesn't, I need to know what will happen.

You can't just ignore what happens and therefore conclude that the value of the game is positive. You can't just assume "something" is equal to zero. What if "something" is a large negative number? Then, the value of the game is negative and the player should drop out.

You can't just say I'm assuming Bob knows things he doesn't know. Even if he doesn't know them, the analysis can't ignore what he doesn't know. If you can't assign a value to "something" then you can't conclude the payoff is positive or even zero. You can't prove it. Your equation is incomplete.
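To make the point concrete, here is a small sketch (the function name and the sample numbers are purely illustrative, not from the thread): the sign of the expression depends entirely on the unknown continuation term.

```python
def play_value(n, p, continuation):
    """Value of the nth play as written above: you stake (2n - 1); with
    probability p the opponent drops out and you net 20 - (2n + 1);
    otherwise the game continues, worth `continuation` (the "something")."""
    return -(2 * n - 1) + p * (20 - (2 * n + 1)) + (1 - p) * continuation

# Assuming "something" is zero, the play looks profitable...
print(play_value(1, 0.5, 0))     # -1 + 0.5 * 17 = 7.5
# ...but a sufficiently negative continuation flips the sign:
print(play_value(1, 0.5, -100))  # -1 + 8.5 - 50 = -42.5
```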

We're both wrong. It seems that the game can't be evaluated at all in terms of utilities, because evaluations at later stages of the game determine the evaluations at earlier stages. This is, of course, an absurdity. Nonetheless, I think you're right that we can't just assume that the "something" is zero, but then we can't assume that it is anything else either.

Exactly! Well, I did say earlier that there might not be any way to rationally assign probabilities to the various possibilities, but I played along with your formulation in order to try to prove that it didn't make sense.

There is a lesson here. If your mathematical formulation doesn't agree with common sense, there is probably something wrong with it.

Part of the problem is that the game tree is infinite. So, it is impossible to assign a value to the infinity of possibilities. But no real game is infinite. If the two players kept bidding forever, the game would never end, Alice and Bob would never lose anything and Carl would never win.

EXACTLY! That's why they would want to bid forever. In this paper the authors introduce a method for dealing with infinite games. I'm still working through the formalism, but what they're essentially saying is that bidding forever in an infinite game is rational because it is a Nash equilibrium. Deviating from that strategy on the nth move always results in an immediate loss of n dollars for the deviating player. Sure, the players are digging themselves into a deeper and deeper hole, but the catch is that no one cares about losing a large sum of money infinitely far in the future.

The problem is that in real life, games don't last forever, so that isn't a solution. I'll take a look at the paper because it looks sort of interesting, but infinite games are a purely academic exercise.

So, you could make some reasonable assumption. Assume that each player has \$100 and reevaluate the game. What strategy would you employ if you were one of the players?

If both players have an equal budget, then it turns out that there is a strategy that one of them can use to win the auction with a positive payoff. Not sure what it is, though.

Let's use common sense here.

If neither player has an advantage, then the expected payoff for both players has to be zero or negative. If the expected payoff for both players is negative, then neither will play and Carl's expected payoff will be zero. If the expected payoff for both players is zero, then Carl's expected payoff is also zero, since he only wins what the other players lose. That is, if the other players expect to lose nothing then Carl can expect to gain nothing.

On the other hand, the player that goes first might have an advantage. But, if that were the case, then there would be no reason for the other player to play. That is, the second player would only play if the first player gave him an opening, e.g., by making a mistake. So, in order to induce the second player to play, the first player would have to make a move that would reduce his own expected payoff to zero or less. So, even in this case, the expected payoff of both players would be zero or negative and the asymmetric game reduces to the symmetric game.

• Replies 208

Exactly! Well, I did say earlier that there might not be any way to rationally assign probabilities to the various possibilities, but I played along with your formulation in order to try to prove that it didn't make sense.

There is a lesson here. If your mathematical formulation doesn't agree with common sense, there is probably something wrong with it.

We can assign probabilities, just not utilities.

The problem is that in real life, games don't last forever, so that isn't a solution. I'll take a look at the paper because it looks sort of interesting, but infinite games are a purely academic exercise.

This is actually irrelevant, because a non-zero probability of a player continuing at each point implies an infinite game tree. What matters is not whether the game as it actually plays out is infinite; to figure out the behavior of real agents or rational agents (or both), it is necessary to consider how they think about the game. For example, it is common in strategic situations to run up against never-ending circular thinking like "I know. He knows that I know. I know that he knows that I know..."

If the players don't know each others' budgets, then there is a non-zero probability that the game will continue arbitrarily far into the future, and so it becomes necessary to consider infinite games.
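One way to see why: if each round continues with some probability q, the expected number of rounds is the geometric series 1/(1 - q), which blows up as q approaches 1. A toy sketch (truncated at a finite horizon; the names are illustrative):

```python
def expected_rounds(q, horizon=10_000):
    """Expected number of rounds when the game continues with
    probability q at each point, truncated at `horizon` rounds."""
    return sum(q ** k for k in range(horizon))

# expected_rounds(0.5) is about 2; expected_rounds(0.99) is about 100.
# As q -> 1 the sum grows without bound, so with unknown budgets the
# analysis has to contend with an effectively infinite game tree.
```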

Let's use common sense here.

If neither player has an advantage, then the expected payoff for both players has to be zero or negative. If the expected payoff for both players is negative, then neither will play and Carl's expected payoff will be zero. If the expected payoff for both players is zero, then Carl's expected payoff is also zero, since he only wins what the other players lose. That is, if the other players expect to lose nothing then Carl can expect to gain nothing.

On the other hand, the player that goes first might have an advantage. But, if that were the case, then there would be no reason for the other player to play. That is, the second player would only play if the first player gave him an opening, e.g., by making a mistake. So, in order to induce the second player to play, the first player would have to make a move that would reduce his own expected payoff to zero or less. So, even in this case, the expected payoff of both players would be zero or negative and the asymmetric game reduces to the symmetric game.

Well, sadly this is wrong. When the players know each others' budgets they know when the game will end and are therefore dealing with a finite game tree that can be solved by backwards induction.

For example, and to keep the analysis as simple and short as possible, suppose that Alice and Bob both have \$3 and Carl is auctioning off \$2. Let's also assume that a player who is indifferent between dropping out and bidding always drops out, and write payoffs as (Alice, Bob). Each losing player pays only his or her final bid.

If Alice bids \$3, Bob will drop out and the payoffs will be (-1, 0).

If Alice bids \$2, then Bob can bid \$3 or drop out. If he bids \$3, the payoffs are (-2, -1); if he drops out, (0, 0). Since Bob prefers \$0 to -\$1, he drops out immediately after Alice bids \$2, and the resulting (0, 0) is better for Alice than bidding \$3.

If Alice bids \$1, then Bob can bid \$2, bid \$3, or drop out. If he bids \$3, the payoffs are (-1, -1). If he bids \$2, then Alice can bid \$3, giving (-1, -2), or drop out, giving (-1, 0); she is indifferent, so she drops out and Bob nets \$0. If Bob drops out right away, the payoffs are (1, 0). Notice that after Alice bids \$1, Bob cannot do better than \$0 by bidding any further, and he loses nothing by dropping out right away. Therefore, if Alice bids \$1, she wins \$1 and Bob drops out immediately, losing nothing.

Finally, if Alice drops out at the beginning, Bob simply bids \$1 and wins \$1. So Alice's best opening is a bid of \$1.
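The backward induction above can be checked mechanically. This is a minimal sketch under the stated conventions (each losing player pays only his or her final bid; an indifferent player drops out); the state encoding and function names are illustrative, not from the thread:

```python
from functools import lru_cache

BUDGET, PRIZE = 3, 2  # Alice and Bob each have $3; Carl auctions $2

@lru_cache(maxsize=None)
def solve(high, mine):
    """Subgame value for the player about to move, who faces the
    opponent's standing bid `high` and whose own last bid is `mine`
    (0 if none). Returns (mover's payoff, opponent's payoff)."""
    # Dropping out: the mover forfeits his last bid; the opponent wins.
    best = (-mine, PRIZE - high if high > 0 else 0)
    # Or any strictly higher bid within the budget.
    for bid in range(high + 1, BUDGET + 1):
        opp_pay, my_pay = solve(bid, high)
        if my_pay > best[0]:  # strict: ties go to dropping out
            best = (my_pay, opp_pay)
    return best

def first_move():
    """Alice's opening choice; passing lets Bob bid $1 and win $1."""
    best_bid, best = None, (0, PRIZE - 1)
    for bid in range(1, BUDGET + 1):
        opp_pay, my_pay = solve(bid, 0)
        if my_pay > best[0]:
            best_bid, best = bid, (my_pay, opp_pay)
    return best_bid, best

# first_move() evaluates to (1, (1, 0)): Alice opens at $1 and nets $1,
# and Bob's best reply is to drop out immediately.
```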

Additionally, the outcomes of finite dollar auctions are extremely sensitive to the exact amounts of the players' budgets and the amount that Carl auctions. If b is the players' budget, s the amount auctioned, and x_i and x_j the players' current bids, then the optimal bid for player i is (b - x_j - 1) mod (s - 1) + x_j + 1 if this number is less than x_i - s, and the player should drop out otherwise.

So to get back to the example where Carl auctions \$20 and Alice and Bob both have \$100, the optimal strategy is for Alice to bid \$5, and for Bob to drop out right away. The first player always wins and poor Carl always loses money.

So you see, even in finite games with known budgets it is rational for Alice and Bob to play.

##### Share on other sites

Naomi, the point you are missing is that there is no mathematical formula that will allow you to figure out the behavior of real human beings. The range of human emotions and actions cannot be reduced to numbers in the way that you wish. That's what Darrell, and everyone else who has participated in this thread, wants you to consider.

##### Share on other sites

For cognition, Ayn Rand actually based her theory of concepts on algebra and measurements.

But she didn't use math to explain human behavior, nor equate rational thought to math.

However she came close. If you include ordinal number thinking (first, second, third, etc.) along with cardinal number thinking (one, two, three), she did use that to explain how best to make choices and structure values. (This is more important than that, i.e., this comes first and that comes second.)

For her, cognition also includes observation and identification, which includes differentiation and integration. And this last led to the algebra-like format being possible. But this leads to math, it is still not it. Peikoff even wrote an entire book on integration from a non-math perspective, DIM.

Maybe a case could be made that this actually is math, but I see it as preparatory. Like you need to observe something before you can classify it as a unit. That observation is not math, but classifying it a unit is (kinda ).

Michael

##### Share on other sites

There is most definitely a mathematical precision in Ayn Rand's use of words.

Greg

##### Share on other sites

So you see, even in finite games with known budgets it is rational for Alice and Bob to play.

It's irrational to play zero sum games because no wealth is actually being produced. Wealth is only transferred from loser to winner.

Some people choose to feed off of what is lost by others, while others choose to create value through useful production.

Each approach to life creates different kinds of people. One creates only parasites and hosts, predators and prey, clever deceitful snakes and stupid clueless mice, con men and suckers...

...while the other creates decent human beings.

Greg

##### Share on other sites

My son is dyslexic. One way he compensates is to apply musical notes to certain letters that give him difficulty. My child, literally, hears music when he reads. I have no idea how he developed this method or why it works for him, and I never will, but it's a freaking amazing thing that he accomplished.

I think math is Naomi's music, and she's working through how she can apply something that makes total sense to her to something that creates conflict within her. Indeed, it's a unique way of thinking, but kind of amazing, too.

Naomi, forgive me for talking about you as if you aren't here.

Heh. Yesterday, a friend of mine pointed out that my name is an anagram of "Alien Number God".

##### Share on other sites

Hello 'Mundane obliger'!

##### Share on other sites

Naomi, the point you are missing is that there is no mathematical formula that will allow you to figure out the behavior of real human beings. The range of human emotions and actions cannot be reduced to numbers in the way that you wish. That's what Darrell, and everyone else who has participated in this thread, wants you to consider.

If you think that you can't think about something mathematically, then you don't know enough math.

##### Share on other sites

It's irrational to play zero sum games because no wealth is actually being produced. Wealth is only transferred from loser to winner.

I meant "rational" from a game-theoretic perspective.

##### Share on other sites

I meant "rational" from a game-theoretic perspective.

Within the world of zero sum that is the only possible "rationality".

##### Share on other sites

If you think that you can't think about something mathematically, then you don't know enough math.

If you think that you should think about everything mathematically, then you don't know enough about life.

##### Share on other sites

If you think that you should think about everything mathematically, then you don't know enough about life.

Gimme an Amen on that statement sister!!

##### Share on other sites

If you think that you can't think about something mathematically, then you don't know enough math.

If you think that you should think about everything mathematically, then you don't know enough about life.

Gimme an Amen on that statement sister!!

Is this yin/yang?

--Brant

the world (I) wants to know

##### Share on other sites

If you think that you can't think about something mathematically, then you don't know enough math.

If you think that you should think about everything mathematically, then you don't know enough about life.

touché

##### Share on other sites

If you think that you should think about everything mathematically, then you don't know enough about life.

I'm sorry.

I didn't mean to attack you personally by implying that you don't know enough about math. I was just trying to say that if anything can be described and understood at all, it can be described and understood precisely, even if that sometimes turns out to be very difficult to actually do. But I think it came out wrong.

##### Share on other sites

Naomi,

I'm impressed.

Are you actually a sweet person in disguise?

(Just kidding with you. )

Every day that passes, I get more and more glad my first impression of you was so wrong.

Michael

##### Share on other sites

I'm sorry.

I didn't mean to attack you personally by implying that you don't know enough about math. I was just trying to say that if anything can be described and understood at all, it can be described and understood precisely, even if that sometimes turns out to be very difficult to actually do. But I think it came out wrong.

No offense taken. Whether you meant me specifically or not, the fact remains that I probably don't know enough about math. There's lots of things I don't know enough about. If I took offense every time this point became clear to me, I'd spend a whole lot of time offended.

What I'm suggesting is that you attempt to find a balance between the mathematical concepts that make sense to you, and the very real fact that those concepts will likely never, on their own, give you an understanding of human behavior. It's perfectly okay for numbers to be your foundation, but you're going to need to integrate some other stuff, too.

For what it's worth, this thread has provided significant insight into how you think, and I'll say again that I've been fascinated and amazed. You have a great mind. If nothing else, hopefully some of us here in the OL community will know better in the future the best way to interact and engage with you.

##### Share on other sites

I don't expect them to. Even if I had a perfect mathematical model of human behavior, it would be so complicated that it would be completely impractical to actually use it. My point is that it is possible to use math in very simplified and idealized situations so that we can gain insights for when things get messy. I'm saying this because, in your first post, you seemed to be saying that it is impossible to think about human behavior mathematically.

##### Share on other sites

My math skills are so lousy all I can do is this sort of thing: 1 human plus 1 human = 2 humans.

--Brant

A is A (?)

oh, A = A! Got it!

##### Share on other sites

Hmm... "A is A" isn't enough.

(1) A is A.

(2) If A is B, then B is A.

(3) If A is B, and B is C, then A is C.
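For what it's worth, those three properties (reflexivity, symmetry, transitivity) are exactly how equality behaves in a proof assistant. A sketch in Lean, using natural numbers just as a sample type:

```lean
-- (1) A is A: reflexivity
example (a : Nat) : a = a := rfl
-- (2) If A is B, then B is A: symmetry
example (a b : Nat) (h : a = b) : b = a := h.symm
-- (3) If A is B, and B is C, then A is C: transitivity
example (a b c : Nat) (h₁ : a = b) (h₂ : b = c) : a = c := h₁.trans h₂
```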

##### Share on other sites

Hmm... "A is A" isn't enough.

(1) A is A.

(2) If A is B, then B is A.

(3) If A is B, and B is C, then A is C.

The only rub is swallowing the Kool Aid that A is B.

##### Share on other sites

I think this sums it up pretty well ...

Darrell

##### Share on other sites

Hi Naomi,

You've made a valiant effort to solve this problem. You're very persistent. I have to give you credit for that. I don't know how much more time I want to dedicate to the problem, but let me just make a few points.

(1) Your final analysis doesn't support the thesis that you originally put forth. I hope you realize that. If it is in the interest of the first person to bid \$5 and in the interest of the second person to bid nothing, then Carl can't necessarily "use reason against them."

(2) You state that we can't assign utilities to the actions of the players, but in order to have an optimal solution, you must have an optimality criterion. Even to say that one strategy is better than another, you must have some criterion for making a distinction. If it is not utility --- if it is not expected payoff --- what is it?

(3) Your example where Alice and Bob have a budget of \$3 and the amount auctioned is \$2 is a little too simple to show some of the potential complexities that occur if those limits are raised. However, even there I would add a little to the analysis.

You stated that backward induction on the game tree could be used to solve for the expected payoff, but I don't see backward induction being used in your example.

You assumed that a player faced with the same payoff for dropping out and for bidding on always drops out, but that doesn't have to be an assumption. There could be a good reason for dropping out earlier: less risk.

If Alice bid \$1, then Bob would get \$0 if he didn't bid or if he bid \$2 and Alice didn't bid again. If Bob bid \$2, then Alice's payoff would be -\$1 if she dropped out after bidding \$1 or if she bid \$3 and won \$2. Since, in this case, Alice has nothing to lose by bidding \$3 --- Bob can't bid more because of his limited budget --- she might do it just to spite Bob for bidding \$2 and causing her to lose her dollar. So, since Bob has nothing to gain and might lose something by bidding \$2, he should not bid \$2. So, the choice to drop out rather than bid doesn't need to be an assumption. It is the rational thing to do (assuming expected payoff or something is the optimality criterion).

(4) Your general formula doesn't make sense. You stated, "the optimal bid for player i is (b - x_j - 1) mod (s - 1) + x_j + 1 if this number is less than x_i - s, and the player should drop out otherwise." That means, (100 - 0 - 1) mod (20 - 1) + 0 + 1 is Alice's optimal bid if that number is less than 0 - 20, or if 99 mod 19 + 1 < -20 or if 5 < -20. So, according to your decision rule, Alice should never bid, nor should Bob.
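The arithmetic here is easy to check by taking the stated rule literally (the function name is illustrative, not from the thread):

```python
def rule_bid(b, s, xi, xj):
    """The stated decision rule, read literally: bid
    (b - xj - 1) mod (s - 1) + xj + 1 if that number is less than
    xi - s; otherwise drop out (returned as None)."""
    candidate = (b - xj - 1) % (s - 1) + xj + 1
    return candidate if candidate < xi - s else None

# Alice's opening move with b = 100, s = 20, xi = xj = 0:
# the candidate bid is 99 % 19 + 1 = 5, but 5 < -20 is false,
# so the rule as stated says she should never bid.
```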

I don't know where your decision rule came from. If it is correct, then neither Alice nor Bob should ever bid, which is what I've been saying all along. However, I suspect the analysis is not quite right. So, let's return to a more intuitive analysis of the problem.

For Alice and Bob, the game is sort of like playing chicken on the highway. You may be too young to understand that reference, but in the early days of the automobile, it wasn't that unusual for a couple of good-old-boys to play a game of chicken on some remote stretch of two lane highway. They would drive straight at each other down the center of the road and the first person to swerve into the ditch to avoid a collision lost. Of course, sometimes neither person would swerve into the ditch and there would be a collision. Cars were tougher back then, but that was back before the days of airbags and seat belts, so guys could end up badly injured.

Losing the current game isn't quite that dramatic, but part of the goal is to dissuade the other player from bidding. So, for example, the first player could bid \$19, as I pointed out earlier, which would be very effective at dissuading the other player from bidding, but wouldn't result in a very big payoff. So, the first player would want to bid as low as possible while still dissuading the other player from bidding. That is where the risk comes in.

My guess is that the optimal first bid would be about \$10. If Alice bid \$10, then Bob could bid anywhere from \$11 to \$19 and still make money if he won. So, his maximum gain would be \$9. However, if he bid \$11, Alice, who is down \$10, could bid anywhere from \$12 to \$29 to come out ahead of where she currently was, so she would almost certainly bid again. So, if Bob wanted to dissuade Alice from bidding again, he should bid as much as possible. But, the highest bid he could make that would result in a gain for him would be \$19. But, if he bid \$19, Alice could come out ahead by bidding \$20 - \$29, which she might or might not do. She might do it to minimize her loss, but then she would have to worry that Bob could bid up to \$38 to minimize his loss. Still, a \$19 bid would be pretty risky for Bob because of the significant probability that Alice would respond with a higher bid to minimize her loss. So, he probably wouldn't do it, but he might if he thought he could psych out Alice and still make a profit.

Now, if Alice initially bid too low, Bob could respond with a higher bid. For example, if Alice bid \$5, Bob could bid anything from \$15 - \$19 and come out ahead and deter Alice from bidding at the same time. She would be unlikely to respond because of the significant risk that she would lose significantly more than \$5.

So, it appears that Alice would have an advantage bidding first. If she wanted to strongly discourage Bob from bidding, she could bid something like \$15 and take home a tidy profit while limiting her risk.

Since Carl probably wouldn't want to keep paying out profits to the first bidder, he might set up a rule that each person could only increase the bid by \$1 each round and must start at \$1. This game is more difficult to analyze. However, I think the results are likely to be similar. By the time the two players reach \$10, they can each still go \$19 higher and come out ahead relative to where they are. However, they know that their opponent is in the same situation and is no more likely to quit than they are. Therefore, they should quit earlier to avoid a head on collision with the other player.

Darrell

##### Share on other sites

I was in a pub once and witnessed two "strangers" setting up a con game. The game is played with 15 coins.

The rules of the game are: you can go first or I can go first, and you may take 1, 2 or 3 coins from the pile at a time. Then the opponent does the same. The person that is left with one coin on the table is the loser. I sat back and watched the strangers playing for money, apparently winning and losing at random, with either one or the other getting mad, doubling down, etc.

I was to be the mark.

So I joined in already knowing the "hidden" rules to the game. I feigned the same outrage. I ended up taking about 400 bucks out of the exchange.

After I collected my winnings I looked them square in the eye and said "2, 6 and 10". They knew I had conned the cons.

If you get to start first, you take 2, and that guarantees you win (2 coins removed in total). Say they take 1; you then take 3 (6 removed). They take 2; you take 2 (10 removed). They take 1; you take 3 (14 removed), leaving the last coin on the table and winning the game.
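The con works because the winning player always brings the running total of removed coins to 2, 6, 10 and finally 14, leaving the opponent facing 13, 9, 5 and at last 1 coin. A sketch of the strategy (the function names and the opponent model are illustrative):

```python
import random

def winning_take(remaining):
    """First player's move in the 15-coin game: open by taking 2, then
    always restore the pile to 1 modulo 4 (leaving 13, 9, 5, 1 coins),
    i.e. the running totals 2, 6, 10, 14 named above."""
    return (remaining - 1) % 4

def play(opponent_takes):
    """Play one game; the opponent moves second. Whoever is left
    facing the final coin loses. Returns the loser."""
    remaining = 15
    while True:
        remaining -= winning_take(remaining)   # our move
        if remaining == 1:
            return "opponent"                  # they face the last coin
        remaining -= opponent_takes(remaining)
        if remaining == 1:
            return "us"                        # we face the last coin

# Against any legal opponent, the strategy leaves them the last coin:
assert all(play(lambda r: random.randint(1, min(3, r - 1))) == "opponent"
           for _ in range(100))
```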