# It Had To Be That Way

This is the first of at least two posts responding to the comments to the post about *Probably?*.

AEW wrote that he did not understand how the distinction between risk and uncertainty relates to subjectivist and frequentist interpretations of probability. He gave this very good example:

Say I have two dice, one with dollar signs on 4 sides, and one with dollar signs on two sides. I put one in each hand (behind my back) and let you choose a hand. I take that die and say ‘I’m going to roll this, and if a dollar sign comes up, you win a dollar.’ What are the odds that you win a dollar? You could say that the probability is either 1/3 or 2/3, you don’t know. Or you could say the probability is 0.5. Does this count as uncertainty? It’s true that to say that the probability is 0.5 is subjective in the sense that someone who peeked behind my back would have a different belief of the probability. But it’s fully consistent with a frequentist interpretation of probability in that if we repeated the process ad infinitum, the frequency would approach 50%.

A couple of things. First, before the whole process starts, we are dealing with risk. If the hand is truly selected randomly, there is a 50% chance that there will be a 1/3 chance of a dollar sign, and a 50% chance that there will be a 2/3 chance of a dollar sign, so there is a 3/6 chance, or 50% chance, that I will roll a dollar sign. And true to a frequentist interpretation, if we repeat the whole thing (picking the hand, then rolling the die that was in that hand), the frequency of dollar signs will approach 50%.

But *after* we pick the hand, but before we roll the die that was in that hand, what are we dealing with? If we roll *that die* over and over, the frequency will not approach 50%. It will approach either 1/3 or 2/3. It sounds like we are operating under risk if we rephrase the game, “I am holding one die. There is a 50% chance that it has two dollar signs, and a 50% chance that it has four dollar signs.” But we are not just talking about an event in the future–the die actually *already has* either two dollar signs or four dollar signs. So if we looked at the die, we could get more information and get a “better” (i.e., more accurate) probability.
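A quick simulation makes the two frequencies concrete. This is just an illustrative sketch of AEW's game (the die labels and the trial count are my own assumptions): repeating the whole process converges to 50%, while repeating only the roll of one already-chosen die converges to 2/3 or 1/3.

```python
import random

# Die "A" has dollar signs on 4 of 6 sides; die "B" on 2 of 6 sides.
DOLLAR_SIDES = {"A": 4, "B": 2}

def roll(die):
    """Return True if a dollar sign comes up on one roll of `die`."""
    return random.randrange(6) < DOLLAR_SIDES[die]

random.seed(0)
trials = 100_000

# Repeating the *whole* process (pick a hand, then roll): frequency -> 1/2.
whole = sum(roll(random.choice("AB")) for _ in range(trials)) / trials

# Repeating only the roll of a single, already-chosen die:
# frequency -> 2/3 (die A) or 1/3 (die B), never 1/2.
fixed_a = sum(roll("A") for _ in range(trials)) / trials
fixed_b = sum(roll("B") for _ in range(trials)) / trials

print(whole, fixed_a, fixed_b)
```

The point of the sketch is that "the" frequency depends entirely on which experiment you imagine repeating.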

So, risk or uncertainty? I really struggle with this. Indeed, maybe it doesn’t even matter whether the die already has two or four dollar signs; maybe the distinction between the present and the future isn’t even relevant. Some people (me? haven’t decided yet) are complete determinists, and believe that if we had all information about everything, we would know exactly what would happen in the future. So in some sense, if we can always get more information and thereby better probabilities, we never know the real probability of anything at all. (Indeed, to a pure determinist, the real probability of every event, given full information, would always be either 0% or 100%.)

But, even if everything is in some sense uncertain, the distinction is still useful. (I am allowed to say “useful” because I am not a philosopher; rather, I am interested in how we can use these ideas as we think about creating and shaping law.) Some things operate more like known probabilities, and some things operate more like unknown probabilities–that is, some things are more risky (e.g., coin flipping), and some things are more uncertain (e.g., presidential elections). And tax probability statements fall, I think, into the latter group.

(Side note: there’s a great example of determinism in action in the book *The Eudaemonic Pie*, which chronicles the successful attempt of some nerds to beat a roulette wheel. Basically, they realized that roulette balls fall where they do because of physical forces, so they studied roulette wheels and built a computer they could strap to their body that would allow them to input the relevant physical facts and predict where the ball would fall. They were successful. The most amazing thing is that this all happened in the 1970s, and they had to build all the computer equipment themselves, down to designing, drawing, and using acid to etch the circuits onto the PC boards. I cannot recommend this book highly enough. There is also a sequel of sorts, *The Predictors*, which details what these guys did next: started a wildly successful stock-picking business, of course! This is all nonfiction.)

To steal liberally:

What’s A Frequentist, Sarah?

Two different dice, with dollar signs

Your pick of one is in the past

Risky? Certain? Fate defines

The meaning when the die is cast.

I’m glad to see that my dice story reflected more than a trivial misunderstanding of your line of thought. To my mind, the distinction between present and future in this context isn’t logically relevant. (It seems that people can differ on their views of oatmeal cookies and still find common ground when it comes to the determinacy of the universe.) And I’d call myself indifferent between the dice game and a coin flip. But it seems that people do react to that sort of difference, and aren’t you, in part, addressing the psychological aversion to stochastic outcomes (what I used to call “risk” or “uncertainty” interchangeably) and its relationship to underlying philosophical considerations?

It seems to me there are two related potential factors in a person’s aversion to stochastic outcomes. One is an aversion to complexity–preferring a coin flip to my dice game, say. Decision theory assumes that people are indifferent between compound lotteries (my dice game) and simple lotteries (coin flips) if they have the same resulting probability distributions over outcomes. Laboratory experiments show that this is not the case.

The other factor would be an aversion to outcomes that by their nature are not repeatable, and thus not amenable to frequentist interpretation.

Instead of doing the dice thing, I tell you that in the Republic of Freedonia there is an election between the PFJ candidate and the JPF candidate. One will win, and if it’s the PFJ, I’ll give you a dollar. What’s the probability that you win a dollar? You could say you have no idea; it could be very probable or almost impossible. On the other hand, you could say that the probability is 0.5. What I’m trying to do here is repeat the dice example with the difference of using a non-repeatable event.

It seems to me that in many cases of “uncertainty,” both factors are at play. With your McCain-to-win example, a frequentist would be shy of talking about probability at all because it’s non-repeatable. But on the other hand, McCain’s likelihood of winning depends on all sorts of things, for-want-of-a-nail things in Iraq maybe–lots of chance events. So, is it complexity or non-repeatability that makes people feel uncertain about the presidential election?

In tax law, uncertainty could maybe come from the use of standards rather than bright-line rules, say, thus making outcomes seem less like repeatable experiments. Or it could come from inconsistencies between judges–something like a compound lottery. When you say “uncertainty,” are you referring to one or the other or both? Or am I completely off track?

“So, risk or uncertainty? I really struggle with this.”

One thought, Sarah. After we pick the die, but before we roll, there is both an element of present uncertainty (which die is it?) and an element of future risk (which side of the die will come up?). Trying to separate out the risk from the uncertainty seems hard, although I wonder if we could do it. Does this seem right to you?

AEW: You are completely on track. By uncertainty I am primarily getting at non-repeatability. But it’s not unrelated to complexity, because what makes an event “repeatable” is that the factors that affect the outcome in a way that interests us are relatively few. (E.g., no particular coin flip is truly repeatable; each is unique–but not in a relevant way.) Knight puts it this way: “The practical difference between the two categories, risk and uncertainty, is that in the former the distribution of the outcome in a group of instances is known…while in the case of uncertainty this is not true, the reason being in general that it is impossible to form a group of instances, because the situation dealt with is in a high degree unique.” (*Risk, Uncertainty, and Profit*, Chapter 8.)

Lars: I think I’ve also heard from your cousin, Mr. O’Roon. I’m a fan of your family.

It seems to me that this can best be discussed in the context of Bayesian probability where all probability models start with some prior belief. Bayesian probability can be interpreted from both an objectivist point of view and a subjectivist point of view. An objectivist would view it as consistent rational reasoning with limited knowledge. A subjectivist would view Bayesian probability as incorporating prior belief.

You can also talk about the full distribution over outcomes instead of just the most probable one. From the full distribution you can read off the most probable outcome, the average outcome, and the probability of each individual outcome.
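A minimal sketch of that Bayesian framing, applied back to the dice game (the 50/50 prior and the observation sequence are assumptions for illustration): start with a prior belief over which die was picked, update after each roll, and read off a predictive probability for the next roll.

```python
from fractions import Fraction

# Likelihood of seeing a dollar sign on one roll, per hypothesis.
P_DOLLAR = {"two-sign die": Fraction(1, 3), "four-sign die": Fraction(2, 3)}

def update(prior, saw_dollar):
    """One step of Bayes' rule: posterior is proportional to prior * likelihood."""
    likelihood = {d: (p if saw_dollar else 1 - p) for d, p in P_DOLLAR.items()}
    unnorm = {d: prior[d] * likelihood[d] for d in prior}
    total = sum(unnorm.values())
    return {d: u / total for d, u in unnorm.items()}

# The objectivist's "consistent rational reasoning with limited knowledge"
# and the subjectivist's "prior belief" both start here: 50/50.
belief = {"two-sign die": Fraction(1, 2), "four-sign die": Fraction(1, 2)}
for saw_dollar in [True, True, False, True]:   # a hypothetical run of rolls
    belief = update(belief, saw_dollar)

# Predictive probability of a dollar sign on the *next* roll.
p_next = sum(belief[d] * P_DOLLAR[d] for d in belief)
print(belief, p_next)
```

Peeking behind the back is just the limiting case of this update: one decisive observation that pushes the belief to 0 or 1, after which the “real” 1/3 or 2/3 probability is all that remains.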

http://en.wikipedia.org/wiki/Bayesian_probability