The advice of a well-known economist and risk analyst teaches you to succeed without relying on luck and intuition: to weigh your options and to account for events and risks that seem impossible.
"Black Swans" - these are events that seem to be impossible but occurring
A person's talent is turning the signals of the environment into meaningful information. It is what made it possible to create the scientific method, to philosophize about the nature of being, and to invent complex mathematical models.
Our ability to reflect on and manage the world does not mean we do it well. Our ideas about it tend to be narrow, and once we reach a judgment, we cling to it with a death grip.
Human knowledge is constantly growing, and such a dogmatic approach is not effective. Two hundred years ago, doctors and scientists were absolutely confident in their knowledge of medicine; yet imagine going to a doctor with a runny nose and walking out with a prescription for leeches!
Confidence in our judgments makes us dismiss concepts that fall outside the framework we have accepted as true. How could anyone understand medicine without knowing that microbes exist? You can devise a reasonable-sounding explanation of a disease, but it will be wrong for lack of crucial information.
Such thinking can lead to unexpected surprises. Sometimes events surprise us not because they are random, but because our worldview is too narrow. Such surprises are called “black swans,” and they can force us to reconsider our picture of the world.
Before anyone had seen a black swan, everyone assumed swans could only be white; whiteness was considered an essential part of the bird. The first sighting of a black swan radically changed people's idea of it. Black swans turned out to be as real as white ones, and a metaphorical “black swan” can be as fatal as bankruptcy in a stock-market crash.
"Black Swans" can have fateful consequences for those who are blind to them
The effect of a “black swan” is not the same for everyone. Some suffer seriously from it, while others do not even notice. Access to relevant information matters: the less you know, the greater your risk of falling victim to a “black swan.”
Example. Imagine that at the races you bet on your favorite horse, named Rocket. Given the horse's build, her list of wins, the jockey's skill, and the sluggish competition, you put all your money on her to win. Now imagine your surprise when Rocket not only failed to run at the start, but chose to lie down instead. This is a "black swan." Given the available information, Rocket should have won, yet somehow you lost all your money. Rocket's owner, by contrast, got rich betting against her. Unlike you, he knew that Rocket would go on strike to protest animal cruelty. That knowledge saved him from the "black swan."
The influence of “black swans” can extend beyond individuals to entire societies. In such cases a “black swan” can change the world, affecting, for example, philosophy, theology, and physics.
Example. Copernicus proposed that the Earth is not the center of the universe, and the consequences were colossal: the discovery cast doubt on the authority of the ruling Catholic Church and on the Bible itself.
Subsequently, this "black swan" laid the foundation for a new European society.
Even elementary logical errors easily mislead us
People often err by predicting the future from what they know of the past. Assuming the future mirrors the past is a mistake, because many unknown factors run counter to our assumptions.
Example. Imagine you are a turkey on a farm. For years the farmer has fed you, cared for you, and cherished you. Judging by the past, there is no reason to expect change. Alas, on Thanksgiving you are beheaded, roasted, and eaten.
Forecasts based on the past are mistaken, and this leads to serious consequences. A related error is confirmation bias: we seek evidence only for beliefs we already hold.
We do not accept information that contradicts what we already believe, and we are unlikely to investigate further. And if we do decide to look into it, we seek out sources that dispute the contradicting information.
Example. If you are firmly convinced that climate change is a conspiracy and then see a documentary titled “Indisputable Evidence of Climate Change,” you are very likely to be upset. And if you go looking for information on the Internet, your search terms will be “climate change is a hoax,” not “evidence for and against climate change.”
That is, we unwittingly draw the wrong conclusions: it is in our nature.
Our brain groups information in a way that makes it difficult to make accurate predictions.
Over the course of evolution, the human brain learned to classify information in order to survive in the wild. But when we need to learn about and quickly adapt to a dangerous environment, this habit fails us.
This faulty classification of information is called the narrative fallacy: a person creates linear stories about the current situation. Because of the huge amount of information we receive daily, our brain keeps only what it considers important.
Example. You probably remember what you ate for breakfast, but you are unlikely to name the shoe color of every passenger on the subway.
To give meaning to the information, we link it. So, thinking about your life, you mark certain events as significant, and build them into a narrative explaining how you became who you are.
Example. You love music because your mother sang to you before going to bed.
This way of thinking cannot fully grasp the world. The process looks only backward and ignores the almost limitless interpretations of any event. Even tiny events can have unpredictable, important consequences.
Example. A butterfly, flapping its wings in India, causes a hurricane in New York a month later.
If we could arrange causes and effects in the order they occur, we would see clear causal links between events. But since we see only the result (a hurricane), we can only guess which of the many simultaneous events actually produced it.
It's hard for us to distinguish between scalable and non-scalable information
We distinguish poorly between two types of information, “scalable” and “non-scalable.” The difference between them is fundamental.
Non-scalable information, such as body weight or height, has statistical upper and lower limits. Body weight is non-scalable because it is physically constrained: no one can weigh 4500 kg. Because such quantities are bounded, you can make meaningful predictions about their averages.
But non-physical or fundamentally abstract things, such as the distribution of wealth or album sales, are scalable.
Example. If an album is sold through iTunes, there is no limit on sales: they are not constrained by a stock of physical copies. And since the transactions happen online, there is no shortage of physical currency, and nothing stops you from selling trillions of copies.
The difference between scalable and non-scalable information is crucial for seeing an accurate picture of the world. If rules that are effective for non-scalable information apply to scalable information, errors will occur.
Example. You want to measure the wealth of the population of England. The easiest way is to compute wealth per capita: add up the incomes and divide by the number of citizens. But wealth is scalable: a tiny percentage of the population can own an enormous percentage of the wealth.
Per capita figures will not reflect the real income distribution.
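The gap between the per-capita figure and a typical citizen's wealth is easy to demonstrate. Here is a minimal Python sketch with invented numbers: a single extreme outlier drags the mean far above the median, which is exactly how a scalable quantity misleads.

```python
# Sketch with invented numbers: the per-capita (mean) figure hides
# the skew that a scalable quantity like wealth can have.
import statistics

# 99 citizens with modest wealth plus one extremely rich outlier
wealth = [30_000] * 99 + [1_000_000_000]

mean = statistics.mean(wealth)      # the "per capita" number
median = statistics.median(wealth)  # what a typical citizen has

print("per-capita mean:", mean)    # 10029700
print("median:", median)           # 30000.0
```

The mean says the "average" citizen is a multimillionaire; the median shows that 99 out of 100 are not. For bounded, non-scalable quantities like height the two numbers would nearly coincide.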
We are too confident in what we think we know
Everyone wants to protect themselves from danger. One way is to assess and manage risks; hence we buy insurance and try "not to put all our eggs in one basket."
Most people do their best to assess risks as accurately as possible, so as not to miss opportunities yet avoid doing something they will regret. To do this, you need to identify all the risks and then estimate the likelihood that each one materializes.
Example. Say you are going to buy insurance, but do not want to waste money. Then you must assess the threat of illness or accident and make an informed decision.
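As a sketch, this is how such an "informed decision" is usually computed: multiply each loss by its probability and compare the total expected loss with the premium. All the probabilities and costs below are invented for illustration, not taken from the book.

```python
# Illustrative sketch with invented numbers: the textbook way to
# weigh risks is to sum probability * cost over the risks you know.
risks = {
    "minor illness":  (0.20,  1_000),   # (probability per year, cost)
    "serious injury": (0.01, 50_000),
}

expected_loss = sum(p * cost for p, cost in risks.values())
premium = 600

print("expected loss:", expected_loss)            # 700.0
print("worth insuring:", premium < expected_loss)  # True

# The catch: this sum covers only the risks you thought to list.
# The risk that ruins you is often the one missing from the dict.
```

The calculation itself is trivial; the hard part, as the next paragraphs argue, is that the list of risks is never complete.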
Unfortunately, we are convinced that we already know every risk we must protect ourselves from. This is the ludic fallacy: we tend to treat risk like a game, with a fixed set of rules and probabilities that can be determined before it begins.
Considering risk in this way is very dangerous.
Example. Casinos want to earn as much money as possible, so they build security systems and bar players who win too much, too often. But this whole approach treats risk as a game. The main threats to a casino are not lucky players or thieves but, say, a kidnapper who takes the owner's child hostage, or an employee who fails to file the casino's tax return. The serious hazards to a casino are completely unpredictable.
No matter how hard we try, it is impossible to foresee every risk.
Why is it necessary to realize one’s ignorance?
Understanding how little you know helps you assess risks better
Everyone knows the phrase: "Knowledge is power." But when knowledge is limited, it is more profitable to admit it.
By focusing only on what you know, you limit your perception of an event's possible outcomes, creating fertile ground for a “black swan.”
Example. You want to buy shares in a company but know too little about the stock market. You watch a few dips and rises, but on the whole you note only that the trend is positive. Assuming the trend will continue, you spend all your money on the stock. The next day the market crashes, and you lose everything you had.
Had you studied the subject a little more, you would have seen the market's many ups and downs throughout history. By focusing only on what we know, we expose ourselves to serious risk.
If you admit that you do not know something, you can significantly reduce the risk.
Example. Good poker players know this principle is critical to success. They understand that their opponents' cards may be better, but they also know there is information they do not have, such as an opponent's strategy and how determined he is to go all the way.
Aware of these unknown factors, players focus solely on their own cards and assess the possible risks more accurately.
Understanding our limitations helps us make better choices
The best defense against cognitive traps is a good understanding of forecasting tools and their limitations. This may not save you from every miss, but it will reduce the number of bad decisions.
If you know that you are prone to confirmation bias, it is much easier to notice when you are seeking only information that supports your existing beliefs. And knowing that people like to reduce everything to tidy causal narratives, you will be inclined to look for additional information to get a better view of the whole picture.
You need to know about your shortcomings.
Example. If you understand that unforeseen risks always exist, however promising an opportunity looks, you will be more careful about investing large sums in it.
We cannot eliminate randomness, or our limited grasp of the world's complexity, but we can at least mitigate the damage our ignorance causes.
The most important thing
Although we constantly make predictions, we are not good at them. We are too confident in our knowledge and underestimate our ignorance. Our inability to understand and account for randomness, and indeed our very nature, contribute to poor decisions and to the appearance of “black swans”: events that seem impossible and force us to rethink our understanding of the world.
Distrust "because." Instead of forcing events into a single clear causal chain, consider a range of possibilities without fixating on one.
Admit that you don't know something. To make meaningful forecasts about the future, whether you are buying insurance, investing, or changing jobs, it is not enough to consider only what you "know"; that gives a partial picture of the risks at best. Instead, admit that there are things you don't know, so you don't needlessly limit the information you work with.