Black swan

A black swan (Cygnus atratus), existing despite previous assumptions.
If for some unlikely reason you are looking for the 2010 film of the same title, try Wikipedia.

A black swan is an unpredictable, rare, but nevertheless high-impact event. The concept itself is easily demonstrated and well known, but the name "black swan" for such events was popularised by Nassim Nicholas Taleb in his book of the same name, which The Sunday Times described as one of the 12 most influential books since the Second World War.[1]

Etymology[edit]

Taleb likens rare but high-impact events to seeing a "black swan" after declaring that "all swans are white." Sighting a black swan is rare and unpredictable, but a single sighting proves that so grandiose a statement as "all swans are white" is spectacularly wrong. A philosopher once criticised Taleb for the analogy by asking "what if you defined a swan as being white?"; Taleb told them to just shut up.[note 1] He doesn't like smartass 'philosophers' much.

The etymology also allows for a few interesting asides on the nature of evidence and confirmation bias. In the book, Taleb notes that the phrase "all swans are white" is logically equivalent to "all non-white objects are not swans" (just think about it for a second), so evidence backing up one assertion must logically back up the other. A red Mini Cooper is a non-white object that isn't a swan, so spotting one is evidence for the second assertion, and must therefore also back up the claim that all swans are white. As a friend of Taleb's exclaimed upon passing a red car: "Look, Nassim! No black swan!" This demonstrates the main problem with trying to confirm our hypotheses rather than looking for evidence to falsify them, and the role our own ignorance and lack of knowledge plays in understanding just what the frak is going on.
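For the logic nerds, the equivalence Taleb is trading on here is plain contraposition (the same move behind Hempel's famous raven paradox). Spelled out in standard predicate-logic notation rather than anything from the book:

```latex
\forall x\,\bigl(\mathrm{Swan}(x) \rightarrow \mathrm{White}(x)\bigr)
\;\equiv\;
\forall x\,\bigl(\lnot\mathrm{White}(x) \rightarrow \lnot\mathrm{Swan}(x)\bigr)
```

A red Mini Cooper satisfies the right-hand side (non-white, non-swan), so by this equivalence it "confirms" the left-hand side too, which is exactly the absurdity the anecdote is poking at.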

Properties of a black swan[edit]

A black swan event has three properties that make it so.

Rarity
The black swan is a rare event.[note 2] It lies outside the realm of common experience and nothing in our past experience points to its possibility. A black swan is that million-to-one chance that statisticians said would never happen because it was a million-to-one chance. Of course, million-to-one chances happen 9 times out of 10.
Extreme impact
When the black swan strikes, it has a massive impact. It is two of the world's tallest buildings being destroyed, it is a stock market crash that wipes out billions, it is a tsunami or earthquake hitting a major city. Not only is the qualitative nature of the black swan outside our regular experience (meaning we can't see it coming), so is its sheer size: a single event can dominate over all other factors.
Retrospective predictability
This property explains the concept of "black swan blindness" or "black swan denial": the illusion that we can actually see things coming. This is due to the narrative fallacy: our ability to construct a sensible story using only the information that turned out to be pertinent, happily discarding the information that wasn't actually useful in the end. This can only be done with the benefit of hindsight, and failing to understand that makes you more vulnerable to black swans' effects.

Thesis[edit]

The central thesis of the black swan phenomenon is that these events aren't just important but actively dominate history. Nassim Nicholas Taleb compares this to Richard Dawkins' description of evolution as the product of random chance and small increments, except that black swan theory says the increments are actually very large. As a result, we can't learn specific lessons from history because, stripped of hindsight bias, history is actually unpredictable, and we must instead learn to deal with its more general consequences.

Ignorance-based thinking[edit]

While most people are happy thinking about what they do know, Taleb takes great pains throughout The Black Swan to focus his readers on what we don't know, which is far more relevant to the black swan problem. Unpredictable events, by their very nature, lie outside our common experience and happen precisely because of this. A good appreciation of our own ignorance, and an honest appraisal of where our knowledge ends, is therefore essential in dealing with (although not necessarily avoiding) black swan events.

The turkey[edit]

It quite literally has no fucking idea what is about to happen.

One of Taleb's fables regarding black swan-type events is that of the turkey raised for Christmas dinner; the same point applies to any other suitable farm animal, or even to users of popular websites.[2] Judging from past events, the turkey can consider itself lucky. It is fed and watered every day and generally kept happy. Nothing in these past events suggests that one day it might be slaughtered for food. However, after this event (assuming the turkey lives at least to the point where it finally figures it out) it becomes "obvious" that it was being raised for slaughter: the protection and vaccinations were to keep it healthy, the excess food was to fatten it up. In short, after the event, the narrative becomes clear.

Until that point, however, the turkey would have no idea, and it would be unfair to expect it to have predicted its own demise from within its 100-day eating binge. More important still to how psychology blinds us to "the black swan problem" is that the turkey's belief that every day would be fantastic was reinforced by the fact that every day was fantastic. The accumulation of supporting information doesn't just reassure the turkey, it actively destroys its ability to think about what it doesn't know. As a result, the turkey undergoes a terminal revision of belief on the very day it has received maximum validation of the belief that its life will keep getting better. Taleb is generally referring to this phenomenon when he says apparently silly things like "newspapers make you less knowledgeable about the world".
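As a purely illustrative sketch (not from the book), here is what that reinforcement looks like if we let the turkey reason like a naive inductivist, using Laplace's rule of succession as a stand-in for its day-to-day confidence that tomorrow will also involve food. The 100-day timeline matches the binge above; the slaughter date is an assumption for the example:

```python
# Illustrative sketch: a turkey updating its confidence that tomorrow brings
# food, via Laplace's rule of succession. Timeline and slaughter date are
# assumptions for the example, not figures from The Black Swan.
DAYS_OF_FEEDING = 100          # the good times
SLAUGHTER_DAY = DAYS_OF_FEEDING + 1

for day in range(1, SLAUGHTER_DAY + 1):
    days_fed_so_far = day - 1
    # Rule of succession: P(fed tomorrow) = (successes + 1) / (trials + 2)
    confidence = (days_fed_so_far + 1) / (days_fed_so_far + 2)
    if day == SLAUGHTER_DAY:
        print(f"Day {day}: confidence {confidence:.3f} -- axe.")
    elif day in (1, 10, 50, 100):
        print(f"Day {day}: confidence that tomorrow is fine = {confidence:.3f}")
```

The turkey's confidence peaks on exactly the day it turns out to be worthless, which is the "terminal revision of belief" in miniature.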

The moral of the story: do not be a turkey. Alternatively, don't trust burgeoning stock markets or constantly rising house prices.

Mediocristan and Extremistan[edit]

The normal distribution.

Taleb makes an important criticism of the usage of "Gaussian" normal distributions[3] as the backbone of statistical modelling when applied to phenomena that do not seem to follow such a distribution and are instead skewed by rare, but massive outliers. He comes up with the concepts of "Mediocristan" — the realm of properties that are Gaussian in nature, like people's weight or height — and "Extremistan" — where properties like a person's fortune, market behaviour or success in intellectual or artistic professions are unevenly distributed, and the inclusion or exclusion of one extreme outlier can massively change the overall picture.

One can best visualise this with a quick thought experiment. Take 100, or even 1000, people and compare their heights. Even adding in Robert Wadlow (the tallest man ever recorded) and his massive 2.72-metre frame, he would account for only around 0.17% of the total height of a thousand-strong group and would barely skew the average. However, if you took 100 or 1000 or even 10,000 people, compared their wealth, and added Bill Gates's amassed fortune, he would easily dominate the group, holding well over 99% of its money. As Taleb notes when discussing this thought experiment in The Black Swan, the other thousands in the group barely amount to a rounding error in Bill Gates's personal fortune, or to its average daily change due to random economic fluctuation. It is this property that leads Taleb to conclude that history doesn't just feature black swan events, but is controlled and shaped by them almost exclusively, with small accumulative changes making little impact in the grand scheme of things. In Extremistan, a single outlier can absolutely dominate the group as a whole.
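A back-of-the-envelope version of that thought experiment follows. The specific figures (an average height of 1.70 m, a net worth of $100,000 per ordinary person, and a $100 billion Gates-sized fortune) are ballpark assumptions for illustration, not numbers from the book:

```python
# Rough sketch of the Mediocristan vs Extremistan comparison.
# All figures are ballpark assumptions chosen purely for illustration.

# Mediocristan: heights. A thousand people of roughly average height,
# plus Robert Wadlow's 2.72 m.
crowd_heights = [1.70] * 1000
wadlow = 2.72
print(f"Wadlow's share of total height: "
      f"{wadlow / (sum(crowd_heights) + wadlow):.2%}")        # ~0.16%

# Extremistan: wealth. A thousand people at $100,000 net worth each,
# plus a Gates-sized fortune of roughly $100 billion.
crowd_wealth = [100_000] * 1000
gates = 100_000_000_000
print(f"Gates's share of total wealth:   "
      f"{gates / (sum(crowd_wealth) + gates):.2%}")           # ~99.90%
```

One outlier is a rounding error in the first group and essentially the entire group in the second.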

Taleb admonishes statisticians, financial analysts, and quantitatively minded social scientists for fundamentally misunderstanding and misrepresenting phenomena from Extremistan by applying methods that only work in the context of Mediocristan; in other words, for thinking they're in Mediocristan when they're living in Extremistan.[note 3] He is especially critical of economics, a discipline which bases many of its assumptions on the idea that extreme events will rarely occur, thanks to its reliance on models built around the normal (Gaussian) distribution. Predictions for oil prices, house prices, or how a company's stock will fare foretell decades into the future, yet such predictions rapidly become redundant, as they're rarely accurate within even a few months. In fact, virtually all economic forecasts are falsified by black swans, large and small, precisely because black swans are not, and indeed cannot be, accounted for with any accuracy.[note 4] At least with weather reporting, the meteorologists behind the science openly admit they're limited to a few days at most where they can be "fairly" sure.
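To put a number on why that matters, here is a small sketch (it assumes SciPy is available; the chosen sigma levels are just illustrative) of how improbable a Gaussian model says large deviations should be:

```python
# How rare does a Gaussian model say extreme deviations are?
# Assumes SciPy is installed; the sigma levels are illustrative.
from scipy.stats import norm

for sigmas in (3, 5, 10, 20):
    tail = norm.sf(sigmas)   # one-sided probability of a deviation this large
    print(f"{sigmas:>2}-sigma event: probability ~ {tail:.1e} per observation")
```

Under a Gaussian, a 20-sigma day is effectively impossible on any timescale, yet the 1987 Black Monday crash is routinely cited as a move of roughly that magnitude relative to historical daily volatility, which is exactly the kind of mismatch Taleb is complaining about.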

Examples[edit]

History is filled with examples that satisfy the black swan criteria. Taleb deliberately categorises them only vaguely, into positive and negative black swans: ones that do good and ones that are disasters. Good black swans would include the rise of the Internet, which certainly wasn't accurately predicted in science/speculative fiction even as late as the 1980s, while the rise of Adolf Hitler and others like him, the September 11th attacks on the World Trade Center, and the Challenger space shuttle disaster are archetypal bad black swans. The distinction isn't terribly important, although it can be observed that "bad" ones tend to be quick and sudden, while "good" ones are often slow and creeping, going unnoticed until the black swan has fully formed and people barely even register it any more. The point of "black swan theory", as it has become known, is that we should maximise our exposure to potentially good black swans, and learn to deal with the bad ones as effectively as possible.

An example would be the post-9/11 security measures undertaken by the US government. The attacks succeeded precisely because they were unpredictable and no one thought them possible. Afterwards, measures were put in place to prevent the attacks being repeated, but any "repeat", by definition, would have to be equally unpredictable! So the highly publicised case of the government employing Hollywood scriptwriters to come up with scenarios wasn't just stupid, but highly counter-productive.

Critique[edit]

See Igon Value Problem and the Dunning-Kruger effect, the former named in honor of Taleb's promoter Malcolm Gladwell. Chapter 4 of The Black Swan could not have been written by anyone with a solid grasp of conditional probability. Chapter 9 exposes the author's astonishing ignorance of Bayesian statistics (in reference to a possibly biased coin, the archetypical example no less). And throughout, a distinction between "linear" and "non-linear" mathematics is proffered that is downright laughable to anyone trained in the field.

Further Reading[edit]

See also[edit]

Notes[edit]

  1. Okay, so this is highly paraphrased. See the book for the more serious version. But in the meantime, see this discussion for why arguing from semantics like that is just plain boring.
  2. Except in places where they are native like Australia, of course.
  3. The Foreign Office strongly advises against doing this, much like smoking pot in Dubai.
  4. LTCM, a particularly egregious case, has become synonymous with the words "epic fail" in financial circles.

References[edit]