Formal Approaches to Social Procedures

First published Mon Sep 8, 2014

Social procedures that have algorithmic aspects can often be improved by redesign. This holds for voting and other peaceful decision making procedures, for match-making, for auctioning, for fair division of estates, and for many procedures of distributive justice. The algorithmic aspects can be analyzed with formal methods. The term “social software” was coined by Rohit Parikh (2002) for the emerging interdisciplinary enterprise that is concerned with the design and analysis of algorithms that regulate social processes. Such analysis and (re-)design uses methods from logic, game theory and theoretical computer science.[1] The goals of research in formal approaches to social procedures are modeling social situations, developing theories of correctness, and (re-)designing social procedures.

Logic, game theory and computer science are not the only disciplines that have something to say about social mechanisms. Such mechanisms are also an object of study in voting theory, auction theory, social choice theory, mechanism design theory, and algorithmic game theory, while multi-agent interaction at a more abstract level is studied in artificial intelligence and distributed computing. All of these disciplines contribute to the formal analysis of social interaction.

1. Social Procedures as Algorithms

Social software cannot be seen as a clearly defined research field on its own, but rather as an umbrella for certain types of research in computer science, logic, and game theory. Nevertheless, the social software perspective on social procedures and intelligent interaction, emphasizing algorithms and information, has already produced a wide variety of important insights. In this article, a number of examples will be discussed and pointers will be given to related discussions in philosophy.

The prototypical example of an algorithm in mathematics (see also entry on computability and complexity) is Euclid's recipe for finding the greatest common divisor (GCD) of two positive whole numbers $A$ and $B$. The GCD of two numbers is the greatest number that divides both numbers without residue.

If $A$ is larger than $B$, then replace $A$ by $A-B$, otherwise, if $B$ is larger than $A$, replace $B$ by $B - A$. Continue like this until $A$ equals $B$.

The final $A = B$ yields the greatest common divisor of the numbers $A$ and $B$ that you started with. For instance, suppose you start the algorithm with $A = 20$, $B = 12$. Then in the first step, $A$ gets replaced by $20 - 12 = 8$. In the second step, $B$ gets replaced by $12 - 8 = 4$. In the third step, $A$ gets replaced by $8 - 4 = 4$, and the two numbers have become equal. The algorithm yields $4$ as the GCD of $20$ and $12$.
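The recipe can be written out as a short program; here is a minimal sketch in Python (the function name is ours):

```python
def gcd_subtract(a: int, b: int) -> int:
    """Euclid's subtraction recipe for the greatest common divisor."""
    assert a > 0 and b > 0
    while a != b:
        if a > b:
            a = a - b  # replace the larger number by the difference
        else:
            b = b - a
    return a  # now a == b, the GCD of the original pair

# The worked example from the text:
print(gcd_subtract(20, 12))  # prints 4
```

The loop terminates because the sum $A + B$ strictly decreases in every step while both numbers stay positive.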

Euclid's recipe is formal, and we can analyze it with formal means. The correctness of Euclid's algorithm follows from the insight that if you have two numbers $A$ and $B$ with $A$ larger than $B$, and you replace $A$ by $A - B$, then the set of common divisors of the number pair does not change.

In exactly the same manner, social procedures can be analyzed with the formal tools of logic and theoretical computer science.

2. Fairness

Formal methods by themselves do not solve philosophical issues, as the following tale from Padma (2007) illustrates.

Two farmers, Ram and Shyam were eating chapatis. Ram had 3 pieces of the flat, round bread and Shyam had 5. A traveller who looked hungry and tired rode up to the two men. Ram and Shyam decided to share their chapatis with him. The 3 men stacked the 8 chapatis (like pancakes) and cut the stack into 3 equal parts. They shared the pieces equally and ate until nothing was left. The traveller, who was a nobleman, was so grateful that he gave the two farmers 8 gold coins for his share of the food.

After the traveller left, Ram and Shyam wondered how they should share the 8 gold coins. Ram said that there were 8 coins and only 2 people, so each person should get an equal share of 4 coins. “But that's not fair,” said Shyam, “since I had 5 chapatis to begin with.” Ram could see his point, but he didn't really want to give 5 of the coins to Shyam. So he suggested they go see Maulvi, who was very wise. Shyam agreed.

Ram and Shyam told the whole story to Maulvi. After thinking for a long time, he said that the fairest way to share the coins was to give Shyam 7 coins and Ram only 1 coin. Both men were surprised. But when they asked Maulvi to explain his reasoning, they were satisfied that it was a fair division of the 8 coins.

Here are reasons the participants could have given to explain each mentioned division as being fair:

  1. Ram: “If the traveller hadn't arrived, we would have shared the chapatis equally. So it is only fair if we now share the eight coins equally as well.”

  2. Shyam: “If the traveller hadn't arrived, you would have bought one chapati from me at the going rate for chapatis. Now that the traveller was so generous, the going rate suddenly went up to one gold coin for a chapati. So your chapatis turned out to be worth three gold coins and mine five gold coins.”

  3. Maulvi: “The traveller has eaten one third of eight chapatis. Ram had only three chapatis to start with, and therefore he has eaten $1/3$ chapati from Ram and $7/3$ chapatis from Shyam. So it is only fair if Ram gets one coin and Shyam gets seven.”
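Maulvi's arithmetic can be checked mechanically with exact fractions; a small sketch in Python:

```python
from fractions import Fraction

share = Fraction(8, 3)       # each of the three men ate 8/3 chapatis

ram_gave = 3 - share         # 3 - 8/3 = 1/3 chapati went to the traveller
shyam_gave = 5 - share       # 5 - 8/3 = 7/3 chapatis went to the traveller

# The 8 coins are split in proportion to what each farmer contributed
# to the traveller's share of 8/3 chapatis.
total_given = ram_gave + shyam_gave           # 8/3
ram_coins = 8 * ram_gave / total_given        # 1 coin
shyam_coins = 8 * shyam_gave / total_given    # 7 coins
print(ram_coins, shyam_coins)                 # prints: 1 7
```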

A moral of this tale could be that there is no single obviously correct notion of fairness, neither in this case nor in many others. Formal analysis always starts from an intuition and can help to turn that intuition into a more precise definition. Then one can check whether a given procedure fits the definition; however, a fit does not show that the definition is right.

Many definitions of fairness have their grounding in something like John Rawls' veil of ignorance (Rawls 1971, see also entry on original position). A procedure for settling a dispute between A and B is fair if someone who does not yet know whether she is playing the role of A or B would agree to it. This is not unlike the foundations of probability theory. The expected value of an event involving chance is what you would have to stake in a fair game, or wager, about that event. A wager is fair if both players stake the same amount and the positions are symmetric. Suppose someone hides 3 dollars in one of his hands and 7 dollars in the other hand, and offers me the choice. How much is this offer worth? Exactly 5 dollars, for if I have 5 dollars for sure, the following game is fair. I can stake 5 dollars against an opponent who stakes the same amount, with the rule that the winner gets the stake, and has to give 3 dollars to the loser. This is fair in the sense that the two roles in the game are symmetric, and both parties have equal chances of ending up with 3 dollars (if they lose) or $10 - 3 = 7$ dollars (if they win). This explanation of expected value in terms of playing a fair game appeared for the first time in Huygens (1657); see also the entry on interpretations of probability.
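Huygens' explanation can be checked with a line or two of expected-value arithmetic (a sketch of the example above):

```python
# The wager from the example: both players stake 5 dollars, the winner
# takes the pot of 10 and pays 3 to the loser. Each outcome is equally
# likely, since the two roles are symmetric.
stake = 5
pot = 2 * stake
outcomes = [pot - 3, 3]                  # win: 10 - 3 = 7; lose: 3
expected = sum(outcomes) / len(outcomes)
print(expected)                          # prints 5.0, the value of the offer
```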

2.1 Fair Division

Social procedures are as old as the world. Divide and choose (also known as “I cut, you choose”) is a procedure for two-person fair division of some desirable or undesirable heterogeneous good. One person divides the good in what she believes are equal shares, and the other person chooses. If the two participants have different value judgements on parts of the goods, it is possible that both participants feel they received more than 50 percent of the goods. Indeed, let $X$ be a set representing the good to be divided. A valuation function $V$ for $X$ is a function from ${{\cal P}}(X)$ to $[0,1]$ with the properties that $V(\emptyset) = 0$, $V(X) = 1$, and $A \subseteq B \subseteq X$ implies $V(A) \leq V(B)$. Suppose $V_m$ and $V_y$ are functions for my and your valuation of the contents of $X$. If $V_m$ and $V_y$ are different, this means that you and I value some items in $X$ differently. It follows, as was already observed by Steinhaus in 1948, that there exists a division that gives both parties more than their due part; “this fact disproves the common opinion that differences in estimation make fair division difficult” (Steinhaus 1948).
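Steinhaus' observation is easy to illustrate with a toy example. In the following Python sketch the items and the additive valuations are invented for illustration; any pair of differing valuations would do:

```python
# Toy illustration of Steinhaus' observation: both parties can receive
# strictly more than half by their own valuation. The item names and the
# (additive) valuation functions below are invented for this example.
X = {"cherries", "chocolate", "marzipan", "nuts"}

V_m = {"cherries": 0.4, "chocolate": 0.4, "marzipan": 0.1, "nuts": 0.1}  # mine
V_y = {"cherries": 0.1, "chocolate": 0.1, "marzipan": 0.4, "nuts": 0.4}  # yours

A = {"cherries", "chocolate"}    # my piece
B = X - A                        # your piece

# Because we value the items differently, each side values its own
# piece at 0.8, well above the "due part" of 0.5.
print(sum(V_m[x] for x in A))
print(sum(V_y[x] for x in B))
```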

It matters whether the valuations are known to the other party. Such knowledge can be used to advantage by the one who cuts. First consider the case that your valuation is unknown to me, and vice versa. Then if I cut, the best I can do is pick sets $A, B \subseteq X$ with $A \cap B = \emptyset$, $A \cup B = X$, and $V_m(A) = V_m(B)$. If you choose, you will use $V_y$ to pick the maximum of $\{ V_y(A), V_y(B) \}$. It follows immediately that cutting guarantees a fair share, but no more than that, while choosing holds a promise for a better deal. So if you ever get the choice between cutting and choosing in a situation where both parties only know their own valuation, then it is to your advantage to leave the cutting to the other person.

However, if the valuations are common knowledge (see entry on common knowledge), the situation is reversed, for then it is more advantageous to take the role of cutter. As cutter, you can attempt to make a division in $A$ and $B$ with $A$ slightly more valuable than $B$ according to the valuation of the other party, while $B$ is much more valuable than $A$ according to your own valuation.

The example shows that issues of knowledge and ignorance are crucial for the analysis of fair division protocols. Epistemic logic (see entry on epistemic logic) can shed much light on many subtle aspects of knowledge and ignorance in social interactions, and in particular in fair division problems. Still, in traditional studies of fair division the role of knowledge is not taken into account, as is witnessed by the comprehensive study of “cake cutting algorithms” in Robertson & Webb (1998).

2.2 Cutting a Cake Among More than Two Participants

In the social choice literature (Brams 2005; Brams & Taylor 1996) it is common practice to use cake cutting as a metaphor for the division of a single heterogeneous good. Dividing a piece of land at inheritance would be an example. The cake has different toppings that cannot all be cut into pieces with the same composition: it may have candied cherries on top that one person likes but another person abhors, and so on. A cake division is simply fair if each of $n$ players feels she received at least $1/n$ of the cake, according to her individual valuation of its parts. A procedure may be simply fair without ruling out the possibility of hard feelings. A cake division is called envy-free if each person feels that nobody else received a larger piece. A sure sign of a division being envy-free is that nobody wishes to trade pieces with anyone else. It turns out to be very hard to design cake cutting procedures for many players that are both fair and envy-free. The I cut, you choose procedure is fair, and it is envy-free as well: the cutter values both pieces equally, and the chooser takes the piece she prefers, so neither of them wishes to trade. See the entry on economics and economic justice for philosophical discussions of envy-freeness.

R. Parikh (2002) analyzes the so-called Banach-Knaster algorithm for cake cutting when the cake has to be divided fairly among at least three persons, which goes like this:

I cut a piece intended for myself. All others consider it. If nobody objects, I get my piece. If someone raises an objection, she has the right to cut off a slice and put that back with the rest of the cake. She then asks if she can have the reduced piece. If nobody objects, she gets it, otherwise someone else takes the knife and reduces the piece a bit further, and so on, until someone gets the trimmed piece. Then on to the next round, with $n-1$ players. When two players are left, they use the Divide and choose algorithm.

Parikh's discussion shows how the methods of theoretical computer science can be used to argue that the procedure is fair. The key ingredient of the procedure is a loop operation:

Continue to trim the piece until there are no further objections about the size.

Suppose $r$ stands for the action of trimming a piece of cake and putting it back with the main part of the cake, according to the Banach-Knaster algorithm, and suppose $F(m,k)$ is the proposition that the main part of the cake is large enough for $k$ people. Then surely $F(m,n)$ holds at the beginning: the whole cake is large enough for the whole group to begin with. Moreover, Parikh (1983, 2002) uses his game logic to prove that $F(m,k)$ is invariant under the action $r$: If $F(m,k)$ is true before $r$, then it will still be true after $r$ has occurred. Clearly, if one can show that $F(m,k)$ continues to hold through the algorithm, for $k$ running through $n, \ldots, 1$, then this establishes that the division is fair.[2]
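The procedure and its fairness guarantee can also be simulated. The following Python sketch models the cake as the interval $[0,1]$ and gives each player an invented valuation function (the piece $[a,b]$ is worth $b^p - a^p$ for a personal exponent $p$); it then checks that every player ends up with a piece worth at least $1/n$ by her own valuation:

```python
# A simulation sketch of the Banach-Knaster ("last diminisher") procedure.
# The valuation functions are invented for illustration; any increasing
# function with V([0,1]) = 1 would do. Each player trims the piece down to
# exactly 1/n by her own valuation whenever she considers it too large.

def run_banach_knaster(exponents):
    n = len(exponents)
    value = lambda p, a, b: b ** p - a ** p
    players = list(range(n))
    left, pieces = 0.0, {}
    while len(players) > 1:
        # The first remaining player cuts a piece worth exactly 1/n to her;
        # every later player who considers it too large trims it further.
        # The last one to touch the piece receives it.
        holder, right = None, 1.0
        for i in players:
            p = exponents[i]
            fair_right = (1.0 / n + left ** p) ** (1.0 / p)  # worth 1/n to i
            if fair_right < right:       # i thinks the piece is too large
                holder, right = i, fair_right
        if holder is None:               # nobody objected: the cutter takes it
            holder = players[0]
        pieces[holder] = (left, right)
        players.remove(holder)
        left = right
    pieces[players[0]] = (left, 1.0)     # the last player takes the rest
    # Fairness check: everyone values her own piece at 1/n or more.
    return all(value(exponents[i], *pieces[i]) >= 1.0 / n - 1e-9
               for i in pieces)

print(run_banach_knaster([1.0, 2.0, 0.5]))  # prints True
```

The invariant from the text is visible in the simulation: whoever receives a piece values it at exactly $1/n$, and everyone else judged that piece to be at most $1/n$, so the remaining cake stays large enough for the remaining players.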

2.3 Solomon's Judgement

A variation on Divide and choose was played by King Solomon in the famous Judgement of Solomon, in a dispute between two women who both claimed to be the mother of a child. The full story is in 1 Kings 3:16–28. Two women who lived in the same house both had an infant son. One of the women claimed that the other woman had stolen her child, after accidentally smothering her own son during sleep. The other woman denied this and reversed the charge. After hearing their stories, King Solomon called for a sword and declared that there is only one fair solution: cut the living child in two and give both women half of it. Upon hearing this, the true mother cried out that she was willing to give the child up if it could be spared, while the fake mother agreed with the judgement. This behaviour revealed to Solomon who was the real mother, and her child was given back to her.

This procedure is not repeatable. As the Bible story puts it:

And all Israel heard of the judgment which the king had judged; and they feared the king; for they saw that the wisdom of God was in him, to do justice.

Obviously, in a second similar dispute, both women would exclaim “Give it to her, but let it live!”

Solomon's handling of the situation can be turned into a social procedure that is repeatable, as follows. Solomon does not call for a sword, but instead explains to the two women the following procedure. First he is going to ask the first woman if she is willing to give up the child. If the answer is “yes”, the dispute is resolved, with no further questions asked. Otherwise he will ask the other woman if she is willing to give up the child. Again, if the answer is “yes” the dispute is resolved. If they both refuse, then the child is his, and then he will allow one of the women to buy it back, at a price that is to be determined as follows. They will both write an amount of money on a sheet of paper, without their names. If the two bids are $A$ and $B$, then the price of the child is set at $\frac{A + B}{2}$, and fate will determine which woman is going to get the child at that price, where the other woman has to pay a small fine. If the two women are rational, one of them will give up the child when first asked (see Moore 1992 and Pauly 2005; for philosophical discussions of rationality, see the entries on practical reason & game theory and ethics).
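The decision stage of this repeatable procedure can be sketched as a function. All names below, as well as the size of the fine, are illustrative assumptions; the answers and bids are simply taken as given, and fate is played by a random draw:

```python
import random

def solomon_procedure(gives_up_1, gives_up_2, bid_1, bid_2, fine=1):
    """Sketch of the repeatable variant of Solomon's judgement.
    Returns (who gets the child, what she pays, what the other pays)."""
    if gives_up_1:                  # the first woman yields: woman 2 gets it
        return (2, 0, 0)
    if gives_up_2:                  # the second woman yields: woman 1 gets it
        return (1, 0, 0)
    price = (bid_1 + bid_2) / 2     # both refused: price is the average bid
    winner = random.choice([1, 2])  # fate decides who may buy the child back
    return (winner, price, fine)

# The true mother bids much higher than the impostor, which is why, for
# rational players, the dispute is resolved at the first question.
```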

Both Moore (1992) and Pauly (2005) discuss the importance of reasoning about common knowledge and ignorance in the King Solomon cases. For example, King Solomon is ignorant of who is the real mother, but both women commonly know from the start who is the true mother, and that the true mother will therefore bid much higher than the other one. This makes the procedure safe. Again, epistemic logic and in particular common knowledge help to shed light on a tricky social procedure. For a more traditionally philosophical introduction to the fair division problem, including more extensive explanations of fairness, manipulability and envy-freeness, see entry on economics and economic justice.

The next section shows that the perspective of social software can also shed light on social matching problems. These range from marriages to the assignment of students to hospitals, college admission procedures, and the assignment of students to housing.

3. The Stable Marriage Problem

Suppose two equally large sets of men and women are given, all of them seeking to marry someone of the opposite gender. Each man has listed his preferences over the women by means of a strict ordering, and similarly for each woman. A stable marriage match is a one-to-one mapping between the men and women with the property that if a man prefers another woman over his own wife, then that woman does not prefer him to her own husband, and if a woman prefers another man over her own husband, then that man does not prefer her to his own wife.

3.1 The Gale-Shapley Algorithm

The mathematicians Gale and Shapley proved that stable matchings always exist, and gave an algorithm for finding such matchings, the so-called Gale-Shapley algorithm (Gale & Shapley 1962):

Initially, all men and all women are free (unengaged).

Next, in a number of rounds, each free man proposes to the most-preferred woman to whom he has not yet proposed and ticks her off from his list. If the woman is free, she accepts, and they become engaged. If the woman is not free, she compares the proposer to her current fiancé. If she likes the proposer better, she dumps the fiancé, who becomes free again, and she and the proposer become engaged.

This goes on until all men and women are engaged.

As an example, suppose that there are three men $a, b, c$ and three women $d, e, f$, and the lists of preferences are as follows (with the most preferred one first in the list): $a: {\it edf}$, $b: {\it fed}$, $c: {\it dfe}$, $d: {\it abc}$, $e: {\it cba}$, $f: {\it acb}$. So $a: {\it edf}$ means that $a$ prefers $e$ to $d$ and $d$ to $f$. It is assumed that preferences are transitive, so $a$ also prefers $e$ to $f$.

An example of a stable match for this situation is represented as three pairs $(a,e)$, $(b,f)$, $(c,d)$. Note that woman $d$ ends up with the man that is at the bottom of her list. But this match is still stable, for although $d$ is willing to swap her husband for either of the other two men, these two candidates will not agree, for they both happen to be married to the woman who is on the top of their own list.
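A minimal Python sketch of the men-proposing algorithm (function and variable names are ours) reproduces this match when run on the example, with women's lists $d: abc$, $e: cba$, $f: acb$:

```python
def gale_shapley(men_prefs, women_prefs):
    """Men-proposing Gale-Shapley: returns an engagement map woman -> man."""
    # rank[w][m] = position of man m on woman w's list (lower is better)
    rank = {w: {m: i for i, m in enumerate(prefs)}
            for w, prefs in women_prefs.items()}
    next_choice = {m: 0 for m in men_prefs}    # index of next woman to try
    fiance = {}                                # woman -> man
    free_men = list(men_prefs)
    while free_men:
        m = free_men.pop(0)
        w = men_prefs[m][next_choice[m]]       # best woman not yet proposed to
        next_choice[m] += 1
        if w not in fiance:                    # w is free: she accepts
            fiance[w] = m
        elif rank[w][m] < rank[w][fiance[w]]:  # w prefers m: she swaps
            free_men.append(fiance[w])
            fiance[w] = m
        else:                                  # w rejects m, who stays free
            free_men.append(m)
    return fiance

# The example from the text (preference lists as strings, best first):
men = {"a": "edf", "b": "fed", "c": "dfe"}
women = {"d": "abc", "e": "cba", "f": "acb"}
print(gale_shapley(men, women))   # {'e': 'a', 'f': 'b', 'd': 'c'}
```

On this instance each man's top choice is a different woman, so the algorithm terminates after a single round of proposals.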

To check that the Gale-Shapley algorithm always produces stable matchings, we can proceed as follows. Obviously, the situation where no-one is engaged is stable.

What does it mean for $E$, an “engagement” mapping, to be stable on the set of women $W$ and the set of men $M$? Let us use $m >_w m'$ for “$w$ prefers $m$ over $m'$” (so bigger is better).

  • (1) For all $(m,w) \in E$: if there is $w'$ with $w' >_m w$ then there is no $m'$ with $(m',w') \in E$ and $m >_{w'} m'$;
  • (2) For all $(m,w) \in E$: if there is $m'$ with $m' >_w m$ then there is no $w'$ with $(m',w') \in E$ and $w >_{m'} w'$.
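Conditions (1) and (2) amount to the absence of a “blocking pair”: a man and a woman who prefer each other to their assigned partners. This translates directly into a checking function; a sketch in Python (names are ours), reusing the preference lists of the example with the engagement given as a set of (man, woman) pairs:

```python
def is_stable(E, men_prefs, women_prefs):
    """Check conditions (1) and (2): no man and woman would both rather
    be with each other than with their assigned partners."""
    prefers = lambda prefs, x, y: prefs.index(x) < prefs.index(y)
    for m, w in E:
        for m2, w2 in E:
            if (m, w) == (m2, w2):
                continue
            # m prefers w2 to his wife, and w2 prefers m to her husband:
            if prefers(men_prefs[m], w2, w) and prefers(women_prefs[w2], m, m2):
                return False               # (m, w2) is a blocking pair
    return True

men = {"a": "edf", "b": "fed", "c": "dfe"}
women = {"d": "abc", "e": "cba", "f": "acb"}
E = {("a", "e"), ("b", "f"), ("c", "d")}
print(is_stable(E, men, women))   # prints True
```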

What does it mean for a man to be free?

  • (3) The set of free men should equal the set of all men minus the men that are engaged.

Next, inspect what happens in a single step in the algorithm. The precondition for the step is that there is at least one free man $m$ left. Such a free man $m$ proposes to the highest-ranked woman $w$ on his list to whom he has not yet proposed.

There are two cases. If $w$ is free, $w$ accepts the proposal, and they become engaged. Is the new set of engaged pairs stable? We only have to check for the new pair $(m,w)$.

  • Suppose that there is a free $w'$ with $w' >_m w$. This cannot be: any woman above $w$ on $m$'s list has already received a proposal from $m$, and a woman who has ever received a proposal is engaged, hence not free.

  • Suppose there is $m'$ with $m' >_w m$. Then if $m'$ is engaged, let us say to $w'$, this must mean that not $w >_{m'} w'$. For otherwise $m'$ would have proposed to $w$ before proposing to $w'$, and $w$, who was still free (a woman, once engaged, never becomes free again), would have accepted him.

  • The new list of free men equals the old list, minus $m$. This is correct, for $m$ just got engaged.

Now the other case: suppose that $w$ is already engaged. There are two subcases. In case $w$ prefers her own current fiancé, nothing happens. The resulting list of engaged pairs is still stable. The list of free men remains the same, for $m$ proposed and got rejected.

In case $w$ prefers $m$ to her own fiancé $m'$, she swaps: $(m,w)$ replaces $(m',w)$ in the set of engaged pairs. Again, it is easy to see that the resulting list of engaged pairs is stable. Man $m$ gets replaced by $m'$ in the set of free men. This is also correct.

Note that the Gale-Shapley matching algorithm is hugely favourable to the party that is doing the proposing. The proposing party gets a chance to make proposals to any candidate, in order of preference, while a member of the receiving party who is free has to say “yes” to any proposal! The result of swapping the roles of the men and the women in the algorithm will also compute a stable match, but one that is more advantageous to the women.

The Gale-Shapley algorithm has many important applications, also outside of the area of marriage brokering; Gale and Shapley themselves discuss college admission procedures (1962). The next subsection presents another application.

3.2 A University Housing Assignment Procedure

Using the perspective of social software, Parikh and Pauly (2012) investigate a variant of the Gale-Shapley algorithm that is used in the Stanford Housing Draw to assign students to rooms. The situation is more complex than in the marriage setting, because students do not give a complete order on all houses, but only on 8 of them; moreover, they may choose to enter the Draw in groups. In the housing context, the students turn out to have an incentive to honestly submit their true preferences: the draw is non-manipulable for them. However, in theory they could still strategize about choosing the subset of 8 houses on which they submit their preferences.

The issue of knowledge is interesting in this case. Even though the algorithm can be found on the Stanford webpages, most students and administrators do not fully understand how it works. Therefore, the workings of the Stanford Housing Draw cannot be assumed to be common knowledge among the students. An interesting phenomenon seems to occur: even while admitting that they do not understand the algorithm, most students say that they believe it to be fair (Parikh & Pauly 2012).

4. The Logic of Communication

Communication protocols are important in distributed computing: computing with distributed systems, where a distributed system is a set of computers that are connected through a communication network. Communication protocols are also interesting from a philosophical perspective, especially in the context of discussions of the value of privacy (see entries on privacy and computer and information ethics). The formal approach can help in answering philosophical questions such as “Does more security automatically lead to less privacy?”.

4.1 Communication and Distributed Computing

In the following example, the inspiration does not only flow from social problems to formal solutions, but also the other way, from successful social practices to formal procedures. Many algorithms for distributed computing are related to social protocols for communication in everyday life. An example is the use of a “talking stick” to regulate discussion and decision making in a group of peers, with the rules that the talking stick gets passed around and only the person who is holding the stick is allowed to talk (Nerburn 1999).

A computer communication protocol based on this social procedure is the token ring protocol. A token ring in distributed computing is a network where each computer is connected to exactly two other computers in such a way that each computer is reachable in the network, and where a single “token” circulates around the ring-shaped network. Communication can only be initiated by the current owner of the token.

Sometimes the token gets lost through computer or network failure. In such cases the token has to be regenerated, with a guarantee that only one computer has the token. This problem of regenerating the token in a token ring is called the leader election problem. Here is an algorithm for it:

Assume communication takes place clockwise, and each computer can distinguish its clockwise neighbour from its counterclockwise neighbour. Assume all computers have different identifiers (positive whole numbers) and each computer knows its identifier.

Each computer sends its identifier around the ring. When a computer $c$ receives an identifier, $c$ compares it to its own. If the identifier is greater than its own, $c$ passes it on. If it is less than its own, $c$ discards it. If it is equal to its own, $c$ declares itself to be the leader.

It is not hard to see that this guarantees that the computer with the highest identifier $i_{\text{max}}$ becomes the leader (see Lynch 1996). No assumptions need to be made about the number of computers in the ring, nor about any computer knowing anything about the size of the ring or the identifiers of the other computers. A next stage of the protocol could be for the leader to send around a request for the others to register as non-leaders and halt.
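The election can be simulated in synchronous rounds; the following Python sketch (names are ours; a real distributed system would pass messages asynchronously) delivers each identifier to the clockwise neighbour and applies the pass/discard/declare rule:

```python
def elect_leader(ids):
    """Simulate leader election on a unidirectional ring.
    ids[i] is the identifier of computer i; messages travel clockwise,
    i.e., from position i to position (i + 1) % n."""
    n = len(ids)
    # Initially every computer sends its own identifier to its neighbour.
    in_transit = [[] for _ in range(n)]   # in_transit[i]: ids arriving at i
    for i in range(n):
        in_transit[(i + 1) % n].append(ids[i])
    leader = None
    while leader is None:
        next_round = [[] for _ in range(n)]
        for i in range(n):
            for ident in in_transit[i]:
                if ident > ids[i]:
                    next_round[(i + 1) % n].append(ident)  # pass it on
                elif ident == ids[i]:
                    leader = ids[i]       # my own id came back: I am leader
                # identifiers smaller than my own are discarded
        in_transit = next_round
    return leader

print(elect_leader([3, 17, 5, 9, 11]))   # prints 17
```

Only the highest identifier survives a full trip around the ring, so exactly one computer declares itself leader.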

One further level of abstraction is from distributed computers or processes to interacting intelligent agents, or multi-agent systems. These agents can be computers, robots, humans, teams of humans, or some mix of these. It is commonly assumed that the agents have a degree of autonomy, that the agents have a restricted local view of the system as a whole, and that there is no designated controller of the whole system (see Wooldridge 2002, second edition).

4.2 Common Knowledge and Social Procedures

Many social procedures are designed to create common knowledge (Lewis 1969 and entry on common knowledge). The old-fashioned ritual that takes place when you withdraw a large amount of money from your bank account and have it paid out to you in cash by the cashier is an example.

How and whether common knowledge can be achieved depends on available communication facilities. Public announcement or publicly observable ritual (the cashier ritual mentioned above) can create common knowledge. But, as Halpern and Moses (1984) proved, message exchange in a distributed environment, where there is no guarantee that messages get delivered, cannot. Halpern and Moses use the example of two generals who are planning a coordinated attack on a city. The generals are on two hills on opposite sides of the city, each with their own army, and they know they can only succeed in capturing the city if their two armies attack at the same time. But the valley that separates the two hills is in enemy hands, and any messengers that are sent from one army base to the other run a severe risk of getting captured. The generals have agreed on a joint attack, but still have to settle the time, so they start sending messengers. But they cannot be sure that the messengers succeed in delivering their message. And if they get through, there is no guarantee that the message of acknowledgement will get delivered. And so on.

Even if common knowledge is sometimes hard to achieve in practice, it serves as a necessary presumption in regulating society. Roman lawgivers found out long ago that if citizens within their jurisdiction could plead innocence because of being unaware of the law, no offender could ever get convicted. So they invented principles like Ignorantia legis neminem excusat, “ignorance of the law excuses no one”. Societies that abide by the rule of law have to be organized in such a way that citizens are in principle in a position to know the law: the laws have to be properly published and distributed, e.g., by being printed in a government gazette that every citizen has access to.

In his book “Rational Ritual” (2001), Michael Suk-Young Chwe points out the importance of the size of the groups in which common knowledge gets established. A brand name that is common knowledge in a large group is worth a lot of money. Chwe analyzes the example of advertisements broadcast during the American football Super Bowl, and compares the enormous cost of making something common knowledge by means of such advertisements to the obvious benefits. Part of the benefit lies in the fact that the advertisements create common knowledge: an important consideration when deciding to buy a smartphone of a particular brand, for example, is the knowledge that others are going to buy the same model too.

5. Strategic Reasoning and Cooperation

The large field of game theory is extensively covered in other entries in this encyclopedia (see entry on game theory). This field of research has been very active since the appearance of the seminal book by Von Neumann and Morgenstern (1944). Similarly, social choice theory and in particular voting theory (see entries on social choice theory and voting methods) were already thriving fields of research long before the term social software came along.

It is useful to investigate how formal methods and an algorithmic perspective can help solve societal problems. For example, in the case of the famous prisoner's dilemma (see entry on prisoner's dilemma) it is interesting to try to design policies that make cheating the other agent less profitable by penalizing it. Notice that this “social software engineering” takes place at the meta-level, on top of the level of the prisoners choosing their strategies (van Eijck forthcoming).

A related recent trend in game theory that is relevant for social software, is to step away from solution concepts such as the Nash equilibrium and instead to focus on the process of rational deliberation. This type of research delineates both the normative principles guiding the players' strategic reasoning, and the psychological phenomena explaining the differences between predicted and observed behavior in games (Pacuit forthcoming; Camerer 2003; Meijering et al. 2012).

In Section 4.2 we briefly discussed the role of the study of knowledge when analyzing social procedures. It turns out that in voting theory in particular, it is useful to design a logic to explicitly model the knowledge that agents bring to bear when they are voting. It is especially interesting to model what happens in a group when agents vote strategically by misrepresenting their own preferences in order to manipulate the outcome (van Eijck forthcoming; van Ditmarsch et al. 2012).

In recent years in the research area of multi-agent systems, formal approaches to social procedures have also been used to help design actual software, for example for cooperative problem solving in teams, coalition formation, knowledge fusion, auctions, and negotiations among software agents (Dunin-Keplicz and Verbrugge 2010; Pauly 2002; Vazirani et al. 2007; Shoham and Leyton-Brown 2009; Chalkiadakis et al. 2011). This literature is mostly normative in nature.

In contrast, another fascinating area of research, evolutionary game theory (see entry on evolutionary game theory), investigates how features like altruism, social norms, moral behavior and cooperation could actually have evolved. This area combines both normative and descriptive work (Axelrod 1984; Bowles and Gintis 2011; Sigmund 2010). As a particular social software contribution to this area, Gärdenfors (2012) characterized how much cognition and communication are required for several kinds of cooperation, from simple flocking behavior, through reciprocal altruism (“I'll scratch your back if you scratch mine”), up to fully fledged teamwork.

6. Conclusion

In conclusion, the formal perspective on social procedures and intelligent interaction, which emphasizes algorithms and information, has produced a wide variety of important insights. It has also led to interesting philosophical discussions.

Bibliography

General References

  • Eijck, J. van and R. Verbrugge (eds.), 2009, Discourses on Social Software (Texts in Logic and Games: Volume 5), Amsterdam: Amsterdam University Press.
  • ––– (eds.), 2012, Games, Actions and Social Software: Multidisciplinary Aspects (Lecture Notes in Computer Science: Volume 7010), Berlin: Springer.
  • Pacuit, E., 2005, Topics in Social Software: Information in Strategic Situations, Ph.D. thesis, New York: City University of New York.
  • Parikh, R., 2002, “Social Software”, Synthese, 132: 187–211.
  • Pauly, M., 2001, Logic for Social Software, Ph.D. thesis, Amsterdam: ILLC.

Fairness

  • Brams, S., 2005, “Fair Division”, in Oxford Handbook of Political Economy, B.R. Weingast and D. Wittman (eds.), Oxford: Oxford University Press, pp. 425–437.
  • Brams, S. and A. Taylor, 1996, Fair Division: From Cake-Cutting to Dispute Resolution, Cambridge: Cambridge University Press.
  • Huygens, C., 1657, Van Rekeningh in Spelen van Geluck (Calculation in Hazard Games), Amsterdam. Translation into modern Dutch by W. Kleijne, Utrecht: Epsilon (1998).
  • Moore, J., 1992, “Implementation, Contracts, and Renegotiation in Environments with Complete Information”, in Advances in Economic Theory — 6th World Congress, volume 1, J.-J. Laffont (ed.), Cambridge: Cambridge University Press.
  • Padma, T., 2007, Mathematwist: Number Tales from Around the World, Chennai: Tulika Publishers.
  • Parikh, R., 1983, “Propositional Game Logic”, in 24th Annual Symposium on Foundations of Computer Science, Washington, DC: IEEE Computer Society, pp. 195–200.
  • Pauly, M., 2005, “Changing the Rules of Play”, Topoi, 24(2): 209–222.
  • Rawls, J., 1971, A Theory of Justice, Cambridge, MA: Harvard University Press.
  • Robertson, J. and W. Webb, 1998, Cake-Cutting Algorithms: Be Fair If You Can, Boca Raton, FL: A.K. Peters.
  • Steinhaus, H., 1948, “The Problem of Fair Division”, Econometrica, 16: 101–104.

The Stable Marriage Problem

  • Gale, D. and L. Shapley, 1962, “College Admissions and the Stability of Marriage”, American Mathematical Monthly, 69: 9–15.
  • Parikh, R. and M. Pauly, 2012, “What is Social Software?”, in Eijck and Verbrugge 2012: 3–14.

The Logic of Communication

  • Chwe, M. S.-Y., 2001, Rational Ritual, Princeton and Oxford: Princeton University Press.
  • Halpern, J. and Y. Moses, 1984, “Knowledge and Common Knowledge in a Distributed Environment”, in Proceedings of the 3rd ACM Symposium on Principles of Distributed Computing (PODC), pp. 50–61. A newer version appeared in Journal of the ACM, 37(3) (1990), 549–587.
  • Lewis, D., 1969, Convention: A Philosophical Study, Cambridge, MA: Harvard University Press.
  • Lynch, N., 1996, Distributed Algorithms, San Mateo, CA: Morgan Kaufmann.
  • Nerburn, K., 1999, The Wisdom of the Native Americans, Novato, CA: New World Library.
  • Wooldridge, M., 2009 (second edition; first edition: 2002), An Introduction to Multi-Agent Systems, Chichester: John Wiley and Sons.

Strategic Reasoning and Cooperation

  • Axelrod, R., 1984, The Evolution of Cooperation, New York: Basic Books.
  • Benthem, J. van, S. Ghosh, and R. Verbrugge (eds.), forthcoming, Modeling Strategic Reasoning (Texts in Logic and Games, FoLLI-LNCS), Berlin: Springer.
  • Bowles, S. and H. Gintis, 2011, A Cooperative Species: Human Reciprocity and its Evolution, Princeton and Oxford: Princeton University Press.
  • Camerer, C.F., 2003, Behavioral Game Theory: Experiments on Strategic Interaction, Princeton: Princeton University Press.
  • Chalkiadakis, G., E. Elkind, and M. Wooldridge, 2011, Computational Aspects of Cooperative Game Theory, (volume 5 of Synthesis Lectures on Artificial Intelligence and Machine Learning), San Rafael, CA: Morgan and Claypool Publishers.
  • Ditmarsch, H. van, J. Lang and A. Saffidine, 2012, “Strategic Voting and the Logic of Knowledge”, in Proceedings of the 11th International Conference on Autonomous Agents and Multiagent Systems (AAMAS'12), vol. 3, pp. 1247–1248.
  • Dunin-Keplicz, B. and R. Verbrugge, 2010, Teamwork in Multi-Agent Systems: A Formal Approach, Chichester: Wiley.
  • Eijck, J. van, forthcoming, “Strategic Reasoning in Social Software”, to appear in van Benthem, Ghosh, & Verbrugge forthcoming.
  • Gärdenfors, P., 2012, “The Cognitive and Communicative Demands of Cooperation”, in Eijck and Verbrugge 2012: 164–183.
  • Meijering, B., H. van Rijn, N. Taatgen and R. Verbrugge, 2012, “What Eye Movements Can Tell about Theory of Mind in a Strategic Game”, PLoS ONE, 7(9), e45961, doi:10.1371/journal.pone.0045961
  • Pacuit, E., forthcoming, “Strategic Reasoning in Games”, to appear in van Benthem, Ghosh, & Verbrugge forthcoming.
  • Pauly, M., 2002, “A Modal Logic for Coalitional Power in Games”, Journal of Logic and Computation, 12: 149–166.
  • Shoham, Y. and K. Leyton-Brown, 2009, Multiagent Systems: Algorithmic, Game-Theoretic and Logical Foundations, Cambridge: Cambridge University Press.
  • Sigmund, K., 2010, The Calculus of Selfishness, (Princeton Series in Theoretical and Computational Biology), Princeton and Oxford: Princeton University Press.
  • Vazirani, V.V., N. Nisan, T. Roughgarden, and E. Tardos (eds.), 2007, Algorithmic Game Theory, Cambridge: Cambridge University Press.
  • Von Neumann, J. and O. Morgenstern, 1944, Theory of Games and Economic Behavior, Princeton: Princeton University Press.

Copyright © 2014 by
Jan van Eijck <jve@cwi.nl>
Rineke (L.C.) Verbrugge <rineke@ai.rug.nl>
