When Knowledge Meets Reality
- Yoram Zahavi יורם זהבי
- Feb 7
- 4 min read
Most of us, most of the time, are convinced that we know far more than we actually do, even about the very extent of our own knowledge. Let's start with something we know to be true. Something simple, like the fact that the moon's gravity affects the tides. How many of us could actually prove this, or present an argument that would survive scrutiny from a skeptical reader?
Most of us accept the fact that the moon's gravity influences tides based on sources we trust. In reality, most of what we think we know comes from what we have read or heard from someone, who in turn received that knowledge from someone else… The vast majority of what we typically consider “knowledge” is not even second-hand but third-hand (or further) and far removed from our direct experience. A significant portion of what we believe we know is socially constructed—taught to us by parents, in schools, through media, gossip, in the workplace, and from other sources.
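For the curious, here is a sketch of the standard argument, a back-of-the-envelope version rather than a rigorous proof: the moon pulls the near side of Earth slightly harder than Earth's center, and the center slightly harder than the far side. That difference is the tidal acceleration, roughly

\[ a_{\text{tidal}} \approx \frac{2\,G M_{\text{moon}} R_\oplus}{d^3} \approx \frac{2 \cdot 6.67\times10^{-11} \cdot 7.35\times10^{22} \cdot 6.37\times10^{6}}{\left(3.84\times10^{8}\right)^{3}} \approx 1.1\times10^{-6}\ \mathrm{m/s^2}, \]

a tiny acceleration that nonetheless stretches the oceans into two bulges, one facing the moon and one opposite it. How many of us could have produced even this rough version from memory?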
Furthermore, knowledge changes over time, and sometimes we hold on to knowledge that is simply incorrect. Whales were long classified as fish (Linnaeus only moved them to the mammals in the 10th edition of Systema Naturae, in 1758), the world was thought to be built from just four elements (a view modern chemistry dismantled well before Dmitri Mendeleev organized the known elements into the periodic table in 1869), and the universe was perceived as static (until Edwin Hubble showed in 1929 that distant galaxies are receding from us, the farther, the faster).
This is true not only of general knowledge but also of things we consider common sense. It sounds logical that eating fat makes you gain weight, or that a heavier object falls faster. Right? In fact, weight gain is driven by overall energy balance rather than by dietary fat alone, and, absent air resistance, all objects fall at the same rate.
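Why does mass not matter? A one-line sketch from Newton's second law, assuming negligible air resistance: the gravitational force on an object is proportional to its mass, so the mass cancels,

\[ a = \frac{F}{m} = \frac{mg}{m} = g. \]

The acceleration is the same for a hammer and a feather, as the famous Apollo 15 demonstration on the moon confirmed.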
So what is it about us that misleads us regarding our own knowledge?
Discovering everything on our own is simply impossible, and sometimes we don't even try. We lack the time, the computational capacity, and the access to resources, and in some cases seeking knowledge firsthand is outright dangerous: it is much safer to learn what is toxic from others than to discover it by tasting. Most of us don't really know whether a manager has the skills they claim to have, and we certainly don't ask to see their degrees or certifications. Investigating social matters carries social risks; it may break a taboo or offend others.
We often fail to distinguish reliable information from misleading information. We may give too much weight to the words of authority figures instead of trusting our own experience (Authority Bias), or go along with the majority view even when it contradicts what we ourselves feel or know (Conformity Bias).
We have a tendency to fill in the gaps (Narrative Bias, Gap-Filling Heuristic). Imagine sitting with friends at a meal, and the last bite of dessert remains on the plate. No one touches it. Some might think, “Everyone is too polite to take it,” or “Maybe this bite belongs to someone else and not me,” or “Maybe no one likes it,” or “Maybe everyone is just full.” Our minds naturally fill in missing details based on past experiences, fears, or partial information to create a complete story. We do this when recalling past events or interpreting present situations.
Hindsight Bias. “I knew they would win, I felt it coming!” or “It was obvious the stock would go up, all the signs were there.” People convince themselves that they knew something was going to happen before it actually did: the tendency to overestimate, after the fact, our ability to have predicted the outcome. Why does this happen? Because we look for logic and patterns in events, filter out contradicting information, and want to reinforce our confidence and reduce uncertainty.
Overconfidence Bias. For example, when people are asked to estimate the weight of a Boeing 747 and to give a range they are 90% confident contains the correct answer, studies show that only about 40-50% of such ranges actually include the true value. We tend to overestimate the accuracy of our predictions and believe our knowledge is more reliable than it actually is. Why does this happen? Because we think we know more than we really do, because we are overly confident in our ability to guess correctly (even in fields where we have no expertise), or because we want to appear knowledgeable and avoid giving an embarrassingly wide range.
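If you want to test your own calibration, the exercise from these studies is easy to reproduce. Below is a minimal Python sketch; the quiz items and bounds are hypothetical placeholders, and the reference values are approximate figures from public sources. Swap in your own 90% ranges before peeking at the answers, then count how many ranges capture the truth.

```python
# Minimal self-calibration test: for each quantity, write down a low and a
# high bound you are 90% confident will bracket the true value, then count
# how many of your ranges actually contain it.
# All quiz items, bounds, and reference values below are illustrative
# placeholders (reference values are approximate).

estimates = [
    # (quantity, your_low, your_high, true_value)
    ("Length of the Nile (km)",                  5000, 6000, 6650),
    ("Year the telescope was invented",          1500, 1700, 1608),
    ("Boeing 747-8 max takeoff weight (tonnes)",  300,  400,  447),
    ("Diameter of the moon (km)",                2000, 3000, 3475),
]

hits = sum(low <= truth <= high for _, low, high, truth in estimates)

print(f"Captured {hits} of {len(estimates)} true values "
      f"({hits / len(estimates):.0%}).")
print("Well-calibrated 90% ranges should capture about 90% over many questions.")
```

Over many such questions, the studies cited above suggest most people's hit rate lands closer to 40-50% than to the 90% they intended.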
Perhaps the most troubling area where we deceive ourselves is in how we perceive our own cognitive processes. Introspection—the process of looking inward to examine our thoughts, feelings, and mental processes—sounds like a reliable way to understand ourselves. However, research in cognitive psychology and neuroscience shows that our intuitive insights about ourselves are often misleading. We tend to believe that we understand how our own brains work better than we actually do.
I admit that this (partial) list might seem frustrating, but in reality, we seem to navigate our way through life quite well with the knowledge we possess.
The best way to cope with an evolving knowledge landscape is not to pretend to know everything but to cultivate a constant awareness of uncertainty. We can bridge the gap by adopting critical thinking: evaluating sources, seeking refutations as well as confirmations for claims that seem obvious, and never ceasing to ask questions, especially the ones we tend to avoid. Finally, we can develop cognitive flexibility: recognizing that knowledge is provisional and subject to change, and being willing to update our beliefs in light of new information. That willingness is what lets us adapt to change more effectively.
Thoughts for Reflection
🔹 When have you experienced a moment of uncertainty?
🔹 Which claim from this post would you want to put to the test?
🔹 And what, if anything, are you willing to change in your perception?
