Evidential Decision Theory is a radical theory of rational decision-making. It recommends that instead of thinking about what your decisions *cause*, you should think about what they *reveal*. This Element explains in simple terms why thinking in this way makes a big difference, and argues that doing so makes for *better* decisions. An appendix gives an intuitive explanation of the measure-theoretic foundations of Evidential Decision Theory.
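The contrast between what a decision causes and what it reveals can be made concrete with Evidential Decision Theory's valuation rule, which weights outcomes by the probability of each state *conditional on* the act. Below is a minimal sketch in Python, using hypothetical numbers for a Newcomb-style predictor problem (the predictor's 0.99 reliability and the monetary payoffs are illustrative assumptions, not figures from the Element).

```python
# A sketch of EDT's valuation rule: V(act) = sum over states s of
# P(s | act) * U(act, s). The conditional probabilities encode what
# the act *reveals* about the state, not what it causes.

def evidential_value(conditional_probs, utilities):
    """Evidential expected utility of an act, given P(state | act)."""
    return sum(conditional_probs[s] * utilities[s] for s in conditional_probs)

# Hypothetical Newcomb-style setup: a predictor of 0.99 reliability.
# States: the predictor foresaw "one box" or "two boxes".
one_box = evidential_value(
    {"predicted_one": 0.99, "predicted_two": 0.01},
    {"predicted_one": 1_000_000, "predicted_two": 0},
)
two_box = evidential_value(
    {"predicted_one": 0.01, "predicted_two": 0.99},
    {"predicted_one": 1_001_000, "predicted_two": 1_000},
)

# one_box comes to roughly 990,000 and two_box to roughly 11,000,
# so conditioning on the act favours one-boxing.
```

Because one-boxing is strong evidence that the million is there, its evidential value dwarfs that of two-boxing, even though two-boxing causally dominates.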
Our beliefs come in degrees. I'm 70% confident it will rain tomorrow, and 0.001% confident my lottery ticket will win. What's more, we think these degrees of belief should abide by certain principles if they are to be rational. For instance, you shouldn't believe that a person's taller than 6ft more strongly than you believe that they're taller than 5ft, since the former entails the latter. In Dutch Book arguments, we try to establish the principles of rationality for degrees of belief by appealing to their role in guiding decisions. In particular, we show that degrees of belief that don't satisfy the principles will always guide action badly. In this Element, we present Dutch Book arguments for the principles of Probabilism, Conditionalization, and the Reflection Principle, among others, and we formulate and consider the most serious objections to them.
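The height example can be turned into an explicit Dutch Book. Below is a minimal sketch, with hypothetical stakes: an agent whose credence in "taller than 6ft" (0.6) exceeds their credence in the weaker "taller than 5ft" (0.5) will accept two bets, each fair or better by their own lights, that jointly guarantee a loss.

```python
# A bookie sells the agent a $1 bet on "taller than 6ft" at $0.60
# (the agent's credence makes this look fair) and buys from the agent
# a $1 bet on "taller than 5ft" at $0.50 (likewise fair by the
# agent's lights). The stakes here are illustrative assumptions.

def agent_net_payoff(height_ft):
    taller_than_6 = height_ft > 6
    taller_than_5 = height_ft > 5
    net = -0.60 + 0.50                    # pay for one bet, receive for the other
    net += 1.0 if taller_than_6 else 0.0  # agent collects on the bet bought
    net -= 1.0 if taller_than_5 else 0.0  # agent pays out on the bet sold
    return net

# Whatever the person's height, the agent is guaranteed a loss:
for h in (4.5, 5.5, 6.5):
    assert agent_net_payoff(h) < 0
```

The sure loss arises precisely because the stronger proposition was believed more strongly than the weaker one it entails; credences satisfying monotonicity cannot be booked this way.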
The main aim of this Element is to introduce the topic of limited awareness, and changes in awareness, to those interested in the philosophy of decision-making and uncertain reasoning. While the topic has long been of interest to economists and computer scientists, it has only recently been subject to philosophical investigation. Indeed, at first sight limited awareness seems to evade any systematic treatment: it lies beyond the sort of uncertainty that can be managed. On the one hand, an agent has no control over what contingencies she is and is not aware of at a given time, and any awareness growth takes her by surprise. On the other hand, agents apparently learn to identify the situations in which they are more and less likely to experience limited awareness and subsequent awareness growth. How can these two sides be reconciled? That is the puzzle we confront in this Element.
Suppose that you prefer A to B, B to C, and C to A. Your preferences violate Expected Utility Theory by being cyclic. Money-pump arguments offer a way to show that such violations are irrational. Suppose that you start with A. Then you should be willing to trade A for C and then C for B. But then, once you have B, you are offered a trade back to A for a small cost. Since you prefer A to B, you pay the small sum to trade from B to A. But now you have been turned into a money pump. You are back to the alternative you started with but with less money. This Element shows how each of the axioms of Expected Utility Theory can be defended by money-pump arguments of this kind. The Element also defends money-pump arguments from the standard objections to this kind of approach. This title is also available as Open Access on Cambridge Core.
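The trade sequence above can be simulated directly. Below is a minimal sketch, with a hypothetical $1 fee and $10 starting balance: an agent with cyclic preferences accepts each offered swap their preferences endorse, and ends up holding the original alternative with less money.

```python
# Cyclic preferences: A over B, B over C, and C over A, so every
# trade below is one the agent's own preferences endorse.
prefers = {("A", "B"), ("B", "C"), ("C", "A")}  # (x, y): x preferred to y

def trade(holding, money, offered, fee=0.0):
    """Accept the offered swap iff the offer is preferred to the holding."""
    if (offered, holding) in prefers:
        return offered, money - fee
    return holding, money

holding, money = "A", 10.0
holding, money = trade(holding, money, "C")           # C preferred to A: accept
holding, money = trade(holding, money, "B")           # B preferred to C: accept
holding, money = trade(holding, money, "A", fee=1.0)  # A preferred to B: pay $1

# Back where you started, but poorer:
assert holding == "A" and money < 10.0
```

Nothing stops the bookie from running the cycle again, so the agent can be drained indefinitely; that is what makes cyclicity, and not just this one loss, the target of the argument.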
An agent often does not have precise probabilities or utilities to guide the resolution of a decision problem. I advance a principle of rationality for making decisions in such cases. To begin, I represent the doxastic and conative state of an agent with a set of pairs, each consisting of a probability assignment and a utility assignment. Then I support a decision principle that allows any act that maximizes expected utility according to some pair of assignments in the set. Assuming that computation of an option's expected utility uses comprehensive possible outcomes that include the option's risk, no consideration supports a stricter requirement.
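The permissive principle just described can be sketched in a few lines. Below is a minimal illustration, with hypothetical acts, states, and numbers of my own: an act is permissible iff it maximizes expected utility according to at least one probability–utility pair in the agent's set.

```python
# Sketch of the decision principle: the agent's imprecise state is a
# set of (probability, utility) pairs; an act is allowed iff it
# maximizes expected utility under some pair in the set.

def expected_utility(act, prob, util):
    return sum(prob[s] * util[(act, s)] for s in prob)

def permissible(acts, state_set):
    """Return the acts that maximize EU under some (prob, util) pair."""
    allowed = set()
    for prob, util in state_set:
        best = max(expected_utility(a, prob, util) for a in acts)
        allowed |= {a for a in acts
                    if expected_utility(a, prob, util) == best}
    return allowed

acts = ["risky", "safe"]
# Hypothetical example: the pairs agree on utilities but not probabilities.
util = {("risky", "win"): 10, ("risky", "lose"): -5,
        ("safe", "win"): 1, ("safe", "lose"): 1}
state_set = [({"win": 0.8, "lose": 0.2}, util),
             ({"win": 0.2, "lose": 0.8}, util)]

# EU(risky) is 7 under the first pair and -2 under the second, while
# EU(safe) is 1 under both -- so each act is permissible for some pair.
assert permissible(acts, state_set) == {"risky", "safe"}
```

With a precise agent the set is a singleton and the principle collapses into ordinary expected-utility maximization; the permissiveness appears only when the admissible pairs disagree about which act is best.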