Settlers of Catan is a board game, and a very economically interesting one. Since its complexity is on the order of Risk or Monopoly, I'm going to skip the full rule set and explain the necessary rules as I go.
There are two parts to my analysis. The first is that commodity "prices" in the game respond to real-world economic drivers. The second is that players must, both generally and in specific situations, behave strategically in ways not unlike some of the models we've discussed in class.
To really get at the value of the commodities in the game, it's important to set up how they come to have value. First, the main goal of the game is to reach ten points. You win points for having settlements and cities, and for special achievements like holding the longest road or having played the most knight cards. To build or purchase any of these you need resource cards. There are five kinds: Brick, Wood, Sheep, Wheat, and Ore. The building costs for roads, settlements, knights, etc. vary.
There's one way to get resource cards and three ways to get the particular kind of card you need. To get a card at all, players choose locations on the board, each of which has a probability (tied to a die roll) of yielding a resource at the beginning of each player's turn. Players gain more producing locations by building more settlements. The three ways of getting the kinds of resources you need are to 1) place your locations so that you get a probabilistically steady flow of that resource, 2) trade with other players, and 3) trade in cards unilaterally at fixed ratios if you have a location on the edge of the board (a port). Option 1 is the least interesting for this initial analysis; 2 and 3 are more obviously economically interesting.
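The "probability (die roll)" mechanic can be made concrete. Assuming the standard Catan rule that a location produces when the sum of two six-sided dice matches its number token, the distribution of sums is a quick sketch:

```python
from collections import Counter
from fractions import Fraction

# All 36 equally likely outcomes of two six-sided dice; the sum decides
# which board locations produce a resource this turn.
rolls = Counter(a + b for a in range(1, 7) for b in range(1, 7))
probs = {total: Fraction(count, 36) for total, count in sorted(rolls.items())}

for total, p in probs.items():
    print(f"{total:2d}: {p}  ({float(p):.3f})")
```

Sums of 6 and 8 each come up 5/36 of the time (7, the most common sum at 6/36, triggers the robber in the standard rules rather than production), which is why locations on 6 and 8 hexes give the "probabilistically steady flow" of option 1.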
Prices. You need roads to get to new locations, to build new settlements, to convert them into cities, whew. Because of the varying costs of each of these items, certain resources are more valuable than others at different times during the game. To build a road you need 1 brick and 1 wood. Players have a higher need for roads early in the game, but only so much of the necessary resources are in play at that point, so players trade brick and wood at a higher value than, say, ore or wheat. Often players are willing to trade two cards for one because of this need. Toward the end of the game, ore and wheat become more important as people try to convert settlements into cities. The value of wood and brick drops significantly because roads don't earn points and cities do.
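The shifting demand described above can be sketched from the building costs. The post only states the road cost explicitly; the settlement and city costs below are the standard base-game values and should be treated as an assumption:

```python
# Building costs (road cost from the post; settlement and city costs
# assumed from the standard base game).
COSTS = {
    "road":       {"brick": 1, "wood": 1},
    "settlement": {"brick": 1, "wood": 1, "sheep": 1, "wheat": 1},
    "city":       {"ore": 3, "wheat": 2},
}

def demand(builds: dict[str, int]) -> dict[str, int]:
    """Total resource cards needed for a build plan."""
    need: dict[str, int] = {}
    for item, count in builds.items():
        for resource, qty in COSTS[item].items():
            need[resource] = need.get(resource, 0) + qty * count
    return need

# Early game: roads and settlements dominate, so brick and wood are scarce.
print(demand({"road": 4, "settlement": 2}))
# Late game: upgrading to cities, so ore and wheat are scarce.
print(demand({"city": 3}))
```

The early-game plan needs six brick and six wood against only two each of sheep and wheat, while the late-game plan needs only ore and wheat, which is exactly the price shift the paragraph describes.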
OK, so far I've described, without naming them, how the game includes scarcity, comparative advantage, and the benefits of trade. (Too bad game theory isn't one of Mankiw's seven laws.)
The more relevant piece of the game for this class is the game-theoretic aspect. I had ambitions to model more aspects of the game, but this post is getting too long.
The paradox we've talked about in class - Pop quiz/Franchise? - plays out to a certain degree. One significant difference is that players don't know when the game will end, but they do know when it's getting close. They can look around the board and see how many settlements and cities people have, who has the longest road, and who has played the most knight cards, but there are point cards that stay hidden until the winner announces victory. The effect on behavior is that players become less likely to trade with whoever is closest to winning. A player doesn't want to "trade away the game" by handing over exactly the resources the other player needs to win that turn, in exchange for cards that won't secure immediate victory for himself. Theoretically, I suppose a player could use backwards induction to reason that he should never trade: the game does have an ending, even if it's indeterminate when. Some round r will be the last round; in that round it will not be in my interest to trade, and unraveling from there you get roughly the same result as the paradox.
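The backwards-induction argument can be sketched as a toy model. Treat each round's trade decision as a two-player stage game with hypothetical payoffs (the numbers below are illustrative, not from the game): refusing to trade is the dominant action in any single round, so folding back from a known last round, refusal propagates to every round:

```python
# Hypothetical stage-game payoffs for one trade round, prisoner's-dilemma
# shaped: mutual trade beats mutual refusal, but refusing while the other
# player trades is best of all for the refuser.
PAYOFF = {  # (my action, their action) -> my payoff
    ("trade", "trade"): 3,
    ("trade", "refuse"): 0,
    ("refuse", "trade"): 5,
    ("refuse", "refuse"): 1,
}
ACTIONS = ("trade", "refuse")

def dominant_action() -> str:
    """Return an action that is a best response to everything the
    opponent might do, if one exists."""
    for a in ACTIONS:
        if all(PAYOFF[(a, b)] >= PAYOFF[(other, b)]
               for other in ACTIONS for b in ACTIONS):
            return a
    raise ValueError("no dominant action")

def backward_induction(rounds: int) -> list[str]:
    """Fold back from the known last round: the continuation payoff does
    not depend on today's action, so the dominant stage action is played
    in every round -- cooperation unravels all the way to round one."""
    return [dominant_action() for _ in range(rounds)]

print(backward_induction(5))  # refusal in every round
```

With the game's actually indeterminate horizon the unraveling is much weaker, which matches the observation that players do trade early and only stop near the end.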
Initially, people trade much more freely (and in my experience playing, this is true). We can think of trade as cooperative behavior because any trade in the game is voluntary and mutually beneficial. Toward the end of the game people are less likely to trade, as I said; there is an incentive to act non-cooperatively to come out ahead. If players know that in the end they will not behave cooperatively, they ought to defect before the other players do, which unravels us back to round one. However, forward-induction incentives push players to trade and break the deadlock: a player who refuses to trade will have a much harder time getting the resources he needs in the critical first rounds and will fall far behind those who do trade. Players cannot possibly start with all the resource-producing locations, or even ones with good probabilities, so they need to trade to secure the resources they lack. If some players hold out to hurt the others but at least two are willing to trade, those two will be better off than those who are not. This pushes players to trade. It's interesting that a game of only marginally more complexity than the ones we've studied in class (and far less than modeling all the distal and proximate factors in even a medium-sized market) can add so many more motivations and drivers of behavior.
I've played this game several times. The object of the game is to achieve a certain level of economic growth before any of your opponents do. It's no coincidence that the primary strategy for winning the game is exactly what the FTC polices: monopoly of a market. In this case, resources. I wonder how the game would be played if there were anti-trust rules involved. It probably wouldn't be nearly as much fun.