Taleb's four works: Fooled by Randomness, The Black Swan, Antifragile, Skin in the Game (published in Chinese under the title "Asymmetric Risk")
All four books revolve around randomness and risk in the real world, with some overlapping material.
Fooled by Randomness
The brain of a 21st-century person is not much different from that of humans tens of thousands of years ago; it has many defects in rational thinking, such as handling randomness poorly and tending to impose causality where none exists.
The author's explanation for why humans tend to impose causality is: after you assign causal relationships to a series of events, they become easier to remember.
Luck plays a larger role in a person's success than people imagine (this sounds like a loser's excuse, but it is true). Even if the probability of success is low, in a sufficiently large group you will see many successes, and some of those successful people will tell you that the reason for their success is their belief in the Flying Spaghetti Monster. This suggests a way to judge whether someone's success is due to luck or to ability: look at how long they have been in the profession and how many people are in their cohort. If they have been in the profession for only three years and their cohort has 1,000 people, their success may well be luck.
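The intuition above is easy to check with a quick simulation (the group size, three-year horizon, and 50% per-year success rate are assumptions for illustration, not figures from the book):

```python
import random

random.seed(0)

N_PEOPLE = 1000    # size of the cohort (assumed)
N_YEARS = 3        # years in the profession (assumed)
P_GOOD_YEAR = 0.5  # chance of a successful year by pure luck (assumed)

# Count how many people compile a perfect track record by luck alone.
lucky_streaks = sum(
    all(random.random() < P_GOOD_YEAR for _ in range(N_YEARS))
    for _ in range(N_PEOPLE)
)

# On average, N_PEOPLE * P_GOOD_YEAR**N_YEARS = 125 people look like
# skilled professionals despite succeeding entirely by chance.
print(lucky_streaks)
```

With 1,000 people, roughly 125 will have succeeded three years in a row with no skill at all, which is why a short track record inside a large cohort is weak evidence of ability.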
Time can filter out noise, so you shouldn't check the stock market every day; many intraday fluctuations are just noise.
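A small simulation makes this concrete (the drift and volatility numbers are assumptions chosen to resemble a typical stock, not figures from the book): with a positive long-run drift, a daily check shows close to a coin flip, while a yearly check mostly shows gains.

```python
import random

random.seed(1)

MU_DAILY = 0.0004   # assumed daily drift (about +10% per year)
SIGMA_DAILY = 0.01  # assumed daily volatility (about 16% per year)
DAYS_PER_YEAR = 252
N_YEARS = 400       # simulate many years to get stable frequencies

up_days = 0
up_years = 0
for _ in range(N_YEARS):
    yearly_total = 0.0
    for _ in range(DAYS_PER_YEAR):
        r = random.gauss(MU_DAILY, SIGMA_DAILY)
        yearly_total += r
        if r > 0:
            up_days += 1
    if yearly_total > 0:
        up_years += 1

# Daily observations are nearly a coin flip; yearly ones mostly show gains.
print(up_days / (N_YEARS * DAYS_PER_YEAR))  # close to 0.5
print(up_years / N_YEARS)                   # clearly above the daily fraction
```

Checking daily, you mostly observe noise and the attendant stress; checking rarely, you mostly observe the signal.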
The Black Swan
Black swan events have the following characteristics:
Cannot be predicted in advance
Produce major impacts
Can be explained after the fact (and therefore causality is easily imposed)
Lack of evidence for A's existence does not mean A does not exist
Europeans had seen only white swans, so they believed all swans were white, until black swans were discovered in Australia. A turkey, fed and cared for by the farmer for a thousand-plus days, believes the farmer will always care for it, until Thanksgiving, when the farmer slaughters it.
Domain dependence
This is a human thinking flaw. For example, statisticians are professional in their field most of the time, but off the job, away from the institute, they easily forget to apply statistical knowledge in daily life. (I remember Taleb gave more specific examples, but I forgot and I'm too lazy to look them up.)
Is it a black swan? Is it not a black swan?
An event that is a black swan for some people may not be one for others. The 9/11 attacks were a black swan, but not for the terrorists who planned them. The 2008 financial crisis was a black swan, but not for the author of this book, who saw the collapse of the financial system as only a matter of time. A scholar eating hotpot and singing a song is suddenly robbed by bandits on the road: for the scholar this is a black swan; for the bandits it is not.
How to guard against black swans?
Black swan events cannot be predicted, but their losses can be reduced through prevention. If airports strengthen security checks, we don't need to know when terrorists plan to hijack a plane in order to lower the probability of a successful attack. If the officials whose decisions make the financial system fragile are fired, a financial crisis becomes less likely, or its losses smaller. If the scholar had brought more bodyguards, or simply worked in another line of business, he might have avoided the disaster. Don't look only at the probability of an event; look at whether you could bear the consequences if it occurred.
Asymmetry
If you measure the heights and weights of 1,000 people and then add the largest person in the world, say 2.6 meters tall and 180 kilograms, he won't significantly affect the group's average height or weight. But if you measure the wealth of 1,000 people and then add the richest person in the world, the group's average wealth increases enormously. Banks can lose decades of profits in a few days (the 2008 financial crisis). Half of a stock's gains over thirty or forty years can occur in just a few days.
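The height-versus-wealth contrast is easy to verify with toy numbers (all values below are assumed for illustration):

```python
# A sample of 1,000 identical people, plus one extreme outlier each time.
heights = [1.7] * 1000          # meters
wealth = [100_000] * 1000       # dollars

heights.append(2.6)             # an extremely tall person
wealth.append(100_000_000_000)  # a roughly $100B fortune

# Ratio of the new mean to the old mean in each case.
height_shift = (sum(heights) / len(heights)) / 1.7
wealth_shift = (sum(wealth) / len(wealth)) / 100_000

print(round(height_shift, 4))  # 1.0005: the outlier barely moves the mean
print(round(wealth_shift, 1))  # 1000.0: the outlier multiplies the mean a thousandfold
```

Height lives in "Mediocristan," where no single observation can dominate the total; wealth lives in "Extremistan," where one observation can outweigh everything else combined.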
Therefore, Taleb argues you should not trust financial mathematical models because they do not account for black swan events, which can have effects too large to ignore.
Nonlinearity
When your monthly salary is 1,000, a raise of 3,000 excites you; when your monthly salary is 50,000, a raise of 3,000 brings you much less joy than when you were earning 1,000. (Taleb gave other examples, which I forgot.)
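One common way to formalize this (an assumption for illustration; Taleb does not prescribe this exact model) is logarithmic utility, under which the value of a raise depends on the salary it is added to:

```python
import math

def utility_gain(salary, raise_amount):
    # Log utility: equal *ratios* of income feel equally valuable.
    return math.log(salary + raise_amount) - math.log(salary)

low = utility_gain(1_000, 3_000)    # the raise quadruples a 1,000 salary
high = utility_gain(50_000, 3_000)  # the same raise barely moves a 50,000 salary

print(round(low, 3))   # 1.386
print(round(high, 3))  # 0.058
```

The same 3,000 delivers over twenty times more "joy" at the low salary, because what matters is the proportional change, not the absolute amount.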
Antifragile
The opposite of fragile is not robust but antifragile. If something is damaged by external stress and can repair itself, we call it robust; if something benefits from external stress rather than being harmed by it, we call it antifragile.
Modern financial systems are more efficient than before, but also more fragile. Fewer companies go bankrupt, but when one does fail, the impact is enormous; junk companies that should have gone under are rescued by governments because they are "too big to fail," as in the 2008 financial crisis. Modern transportation is likewise more efficient than before, but disease also spreads faster; COVID-19 was a black swan event.
Practice is antifragile; we continuously improve ourselves from mistakes in practice. There weren't textbooks full of formulas teaching option traders how to trade at the beginning; early traders learned by trial and error, and later scholars compiled textbooks based on that. There weren't medical textbooks guiding doctors at the beginning; generations of doctors explored and wrote medical textbooks.
Biology is antifragile. After humans do physical labor, muscles suffer tiny tears, then repair and become stronger than before.
The antifragility of a group is built on the fragility of individuals: individuals who cannot adapt to nature are eliminated, making the group stronger.
Speech is antifragile: the more information is suppressed, the more its spread is encouraged. (This statement is somewhat absolute; information circulation has costs, and many suppressed messages are unknown to many people today.)
Do not intervene in a complex system unless necessary. If an illness is mild, don't take medicine recklessly; rely on your body's own antifragility to heal. Don't try to intervene in an entity as large as a country; intervention can produce unforeseen consequences: the U.S. once supported the Taliban against the Soviet Union, and the Taliban later became its adversary. Don't prop up a financial system; companies that should go bankrupt should be allowed to go bankrupt.
Nature is antifragile, and nature does not like things that are too large: oversized things are fragile. Consider a food chain: only about 10% of the energy at one trophic level is available to the next, so when energy in the ecosystem fluctuates, the top predators are affected most. Heavy metals also accumulate most heavily in top predators. The same pattern appears in society: when companies merge into larger companies, they become more fragile. You might think mergers increase efficiency, but according to the author, the vast majority of mergers do not improve it.
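The 10% figure implies just how thin the top of the chain is; a back-of-the-envelope sketch with assumed units:

```python
TRANSFER = 0.10          # assumed fraction of energy passed up each level
BASE_ENERGY = 1_000_000  # arbitrary units at the producer level

# Energy available at each of four trophic levels.
levels = [BASE_ENERGY * TRANSFER**i for i in range(4)]
top_share = levels[-1] / levels[0]

# The top predators live on about 0.1% of the base energy, so even a
# small shock at the bottom leaves them almost no buffer.
print(top_share)
```

Size concentrates dependence: the bigger and higher up you sit, the smaller the disturbance needed to wipe you out.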
Antifragility has limits; if you cut off your hand, don't expect a stronger, more powerful hand to grow back.
Time will eliminate fragile things; be cautious about trying things that have only existed for a few years. For example, it's better to eat foods that have existed for thousands of years rather than modern food-industry processed sugar-oil mixtures. If it's not urgent, don't try therapies invented only in recent years.
Some people transfer fragility (shift risk to others) while keeping the benefits for themselves. Executives in the 2008 financial crisis made huge profits while their actions made the system more fragile, and all taxpayers paid for it. Those who start wars do not go to the battlefield themselves; they send soldiers. If you want to consult someone about what you should do, ask what they would do in your place, so that they bear the "risk" of their advice. If officials want to start a war, someone from their family should go to the front line, so that they too bear the risk. Taleb develops this idea fully in Skin in the Game: if you want the benefits, you must bear the corresponding responsibilities and risks.
Iatrogenic harm
To be continued
Skin in the Game
The motto of a certain architecture school (I forget which; take it as given) is: we train responsible architects. Why not "we train clever engineers"? Because only responsibility builds good buildings; cleverness alone cannot. If an architect bears no responsibility when his building collapses from design flaws, you can imagine that he won't design good buildings no matter how clever he is.
To be continued.