Most would accept that the complexity of the world far exceeds our capacity to fully comprehend it. This means we are each forced to rely on various methods of simplification as we interpret things and make sound decisions in our lives; metaphors, rules, and generalizations are some of the tools we depend on to do just that.
Our brain is constantly mapping the world in this way, and, as with all map projections, some distortion of reality is necessary to render it in comprehensible form. This distortion is what's known as the Principle of Fallibility, and all humans are subject to it. That shouldn't be news to anyone who is paying attention. But the extent of the problem, and how it works against us, isn't as widely accepted as I believe it should be.
Here's an easy way of looking at the problem of fallibility. Think of a map for a moment. A map will always fall short of portraying the actual state of the environment it attempts to depict. You don't look at a map and believe for one second that it truly represents that environment. It's a lower-resolution impression of reality, limited by its scale, detail, time period, and so on. If a map were truly capable of representing reality with 100% accuracy, it would literally have to be the size of the territory itself to account for every single detail, remodeling itself in real time to capture any new variable that arose, like a huge slab of rock falling into the sea. Could a map tell me how many trees are in a forest? Or how cold the river is right now? Almost certainly not. A map's utility is confined to its capacity to resemble reality.
One's worldview is much the same. It's a map in its own right: an impression of the world, restricted by the limits of our perceptual faculties and our own prejudices. And like a map, the utility of our worldview is confined to its capacity to resemble reality. Could my worldview tell me where to invest my money? Or how to handle a troubled relationship? That depends on how accurately one's map represents reality. The closer your map comes to portraying the world objectively, the more likely your strategies are to succeed and the better you're able to understand the world around you. But know this: your map's impression of the external world is never fully objective. It can always be improved upon.
Debate and disagreement with others are an inevitable part of life as we each attempt to protect the integrity of our maps. Argument essentially involves two or more parties laying their maps out on the table and trying to win by convincing the other side that their map is superior.
In a traditional argument, a set of opposing ideas or concepts clashes, marking the verbal declaration of war, and each person then attempts to out-reason the other to win. Unfortunately, truth is rarely the aim of the debate; the ability to proclaim your map the superior one is. When a person collaborates for truth instead, they take ownership of their worldview by integrating any new information that improves upon it. With such a mindset, one doesn't have to rely on controlling the debate, only oneself. Each debate becomes a chance to get stronger, a chance to strengthen the integrity of one's outlook.
For most of mankind, it's never been about what's true and what's not; it's survival first. For most people, facts are 'true' only insofar as they serve them, and facts that act against them are dismissed. When layer upon layer of one's lifestyle is built upon 'truths' that must remain valid in order for one to accumulate power, respect, and security, then objectivity may quite literally threaten the walls of the subjective order in question. The inner voice of that order's naive beneficiaries will validate their own way and oppose all who disagree, with merciless ease. Look at the sheer number of systems around the world built upon ideas that are contradictory, self-verifying, illogical, and entirely closed off to the notion of fallibility. It's not hard to see why conflict is often inevitable in this game. Any thinker or system that refuses to admit its fallibility is a problem waiting to happen.