"Probable", "probably", "probability", "likely", "likelihood" - we use these terms all the time. But what exactly do we mean by them?
Personally, I agree with this position:
What is probability? Is it a property of a thing (e.g., a coin), a property of an event involving a thing (e.g., a toss of the coin), or a description of the average outcome of a large number of such events (e.g., “heads” and “tails” will come up about the same number of times)? I take the third view.
What does it mean to say, for example, that there’s a probability of 0.5 (50 percent) that a tossed coin will come up “heads” (H), and a probability of 0.5 that it will come up “tails” (T)? Does such a statement have any bearing on the outcome of a single toss of a coin? No, it doesn’t. The statement is only a short way of saying that in a sufficiently large number of tosses, approximately half will come up H and half will come up T. The result of each toss, however, is a random event — it has no probability.
That is the standard, frequentist interpretation of probability, to which I subscribe.
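To make the frequentist reading concrete, here is a minimal simulation in Python (purely illustrative; the function name, seed, and toss counts are my own choices). No single toss "has" a probability in this picture, but the relative frequency of heads settles near 0.5 as the number of tosses grows:

```python
import random

def heads_frequency(n_tosses: int, seed: int = 1) -> float:
    """Toss a fair coin n_tosses times; return the relative frequency of heads."""
    rng = random.Random(seed)
    heads = sum(rng.random() < 0.5 for _ in range(n_tosses))
    return heads / n_tosses

for n in (10, 100, 10_000, 1_000_000):
    print(f"{n:>9,} tosses: frequency of heads = {heads_frequency(n):.4f}")
```

Each individual draw is just a random event; only the aggregate exhibits the stable 0.5.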
In everyday human affairs, false concepts and the facile use of probability notions enjoy considerable currency. But why? Apparently, we have evolved to be good at ignoring uncertainty - in clever and viable ways. As innumerable cases show, our species manages to come up with reasonable solutions to the challenges of life even when information is too scarce to give us certainty. We make heroic assumptions to find orientation in a world that is too complex, too detailed, and too arcane and concealed to ever be approached with complete certainty about its facts and nature. Yet we survive, and even accomplish considerable progress - by systematically faking certainty. How do we do it?
In psychology, uncertainty was made famous by the work of Daniel Kahneman and Amos Tversky. In their 1982 collection of research, “Judgment under Uncertainty: Heuristics and Biases,” the psychologists explained that when you don’t have enough information to make a clear judgment, or when you are deciding about something too complex to fully grasp, you tend to push forward with confidence rather than backing off and admitting your ignorance. The stasis of uncertainty never slows you down, because human brains come equipped with anti-uncertainty mechanisms called heuristics.
In their original research they described how, when driving in fog, it becomes difficult to judge the distance between your car and the other cars on the road. Landmarks, especially those deep in the mist, become more hazardous because they seem farther away than they actually are. This, they wrote, is because for your whole life you have noticed that things that are very far away appear blurrier than things that are near. A lifetime of dealing with distance has reinforced a simple rule in your head: the closer an object, the greater its clarity. This blurriness heuristic is almost always reliable, except underwater or in fog; and on an especially clear day it errs in the other direction, making objects that are far away seem much closer than normal.
Thus, with good grounds ...
Gerd Gigerenzer is a strong advocate of the idea that simple heuristics can make us smart. We don’t need complex models of the world to make good decisions.
The classic example is the gaze heuristic. Rather than solving a complex equation to catch a ball, which requires us to know the ball’s speed and trajectory and the effect of the wind, a catcher can simply run to keep the ball at a constant angle in the air, leading them to the point where it will land.
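A toy simulation in Python shows the rule in action (the numbers, the control gain, and the one-dimensional set-up are hypothetical choices of mine, not Gigerenzer's). The fielder never computes a trajectory; they only react to changes in the gaze angle, and still end up near the landing point:

```python
import math

DT = 0.01        # simulation step (s)
G = 9.81         # gravity (m/s^2)
MAX_SPEED = 8.0  # fielder's top running speed (m/s) -- a made-up figure

# Idealised fly ball: plain projectile motion, no wind or drag.
bx, by = 0.0, 1.5      # ball position (m)
bvx, bvy = 12.0, 18.0  # ball velocity (m/s)

fx = 40.0  # fielder's position along the ball's line of flight (m)

def gaze_angle(fx, bx, by):
    """Elevation angle at which the fielder sees the ball."""
    return math.atan2(by, fx - bx)

prev = gaze_angle(fx, bx, by)
while by > 0.0:
    # advance the ball
    bx, by, bvy = bx + bvx * DT, by + bvy * DT, bvy - G * DT
    # The heuristic: run so that the gaze angle stays constant.
    # A rising angle means the ball will carry past you: back up.
    # A falling angle means it will drop short: run in.
    rate = (gaze_angle(fx, bx, by) - prev) / DT
    speed = max(-MAX_SPEED, min(MAX_SPEED, 50.0 * rate))
    fx += speed * DT
    prev = gaze_angle(fx, bx, by)

print(f"ball lands near x = {bx:.1f} m; fielder stands at x = {fx:.1f} m")
```

The design point is that the controller uses only one perceptual quantity - the change in the angle of gaze - and no model of speed, trajectory, or wind.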
So, then we may conclude ...
That’s what a [psychological] heuristic is: a simple rule that, in the currency of mental processes, trades accuracy for speed. A heuristic can lead to a bias, and your biases, though often correct and harmless, can be dangerous when in error, resulting in a wide variety of bad outcomes, from foggy-morning car crashes to unconscious prejudices in job interviews.
Part of our disposition to bias as a way of coping with uncertainty is the so-called halo effect, whereby one transfers the (strongly felt, though potentially non-existent) authority of one data set to another data set whose stand-alone authority is doubtful or unclear - as when a very negative personal experience with a red-haired person leads one to generalise about the supposedly evil character of red-haired people.
I am interested in finding out more about the way in which this kind of heuristic colours our political perceptions and comportment. If heuristics play such a ubiquitous and important role in everyday life, they are likely to carry considerable weight in our political behaviour. These deliberations may be a first small step toward an anthropology of politics and freedom.
Concerning the role of politics in coping with unavoidable ignorance and uncertainty, see also my three-part piece on Why It Is Not True That Politics Makes Us Worse (1/3) - Thirteen Conjectures on Politics (1/3) and Rivers Working Like Politics - On Chaos, Complexity and Intermediary Conditions.