The Reasoning Gap: Why Access to Information Hasn't Made Us Wiser
- Kathy Postelle Rixon

A philosophical investigation into critical thinking in 2026
By Kathy Postelle Rixon | Chair, The Philosophical Society: Oxford | Researcher, University of Cambridge
There is a paradox at the heart of modern life. We have more access to information than any generation in human history. Within seconds, we can consult the collected knowledge of civilisations, read the findings of peer-reviewed science, examine primary sources that once required years of archival research to locate. The sum total of human understanding sits, quite literally, in the palm of our hands.
And yet, by almost every meaningful measure, our reasoning appears to be worse. Political polarisation has reached historic levels. Conspiracy theories spread faster than corrections. Educated people hold mutually incompatible beliefs about basic empirical questions. Institutions built on the premise of shared reason are under extraordinary strain.

How do we explain this? Why has the democratisation of information not produced a corresponding democratisation of wisdom? This is not merely a political question, nor a technological one. It is, at its core, a deeply philosophical problem, and one that thinkers across the centuries have given us considerable resources to understand.
We do not have a facts problem. We have a reasoning problem.
I. The Kantian Diagnosis
Immanuel Kant, writing in the eighteenth century, made a distinction that resonates with uncomfortable precision today. In his Critique of Pure Reason, Kant argued that human beings never encounter reality directly. We do not simply receive the world as it is. Instead, we experience it through cognitive frameworks, what he called the a priori structures of the mind, which organise raw sensation into meaningful experience.
What this means, translated into the language of our current moment, is that information does not arrive in the mind as neutral data and remain there, unchanged. It is immediately processed, filtered, categorised, and interpreted through frameworks built from prior experience, cultural conditioning, emotional memory, and deeply held belief. Two people can read an identical article and emerge with entirely different understandings of what it said - not because one of them is lying, but because they are, quite literally, perceiving it differently.
This is not a flaw in human cognition that we might correct with better education or faster fact-checking. It is a structural feature of how minds work. Kant's insight suggests that the assumption underlying our information age, that if people simply had access to more facts, they would reach better conclusions, rests on a fundamental misunderstanding of the nature of human reasoning.
More information does not automatically produce better thinking. It produces more material for existing frameworks to process and, potentially, to distort.
II. Bacon's Idols, Revisited
Francis Bacon, writing more than a century before Kant, had already identified this problem with characteristic precision. In his Novum Organum of 1620, Bacon described what he called the Idola, the idols of the mind, systematic biases that prevent human beings from reasoning clearly.
The Idola Tribus, the Idols of the Tribe, are the errors common to all human beings: the tendency to see patterns where none exist, to give more weight to confirming evidence than disconfirming evidence, to allow hope and fear to distort perception. The Idola Specus, the Idols of the Cave, are the individual distortions that arise from personal history, temperament, and the particular intellectual passions that blind us to contrary evidence.
Most prescient for our purposes are the Idola Fori, the Idols of the Marketplace. These are the errors that arise from language itself: from the imprecision of words, from the way that terms carry emotional freight that distorts argument, from the way that public discourse simplifies and polarises complex realities into competing slogans. Bacon could not have imagined social media, but he described its epistemological consequences with remarkable accuracy.
What Bacon understood, and what we seem to have forgotten, is that the remedy for poor reasoning is not more information. It is disciplined method. It is the cultivation of habits of mind that actively resist our natural cognitive tendencies, habits that must be taught, practised, and continually renewed.
III. The Cognitive Science Confirmation
Modern cognitive science has given empirical substance to these philosophical intuitions. The research is, by now, extensive and sobering.
Confirmation bias, our tendency to seek, interpret, and remember information in ways that confirm what we already believe, operates at a level largely beneath conscious awareness. We do not experience ourselves as filtering information selectively. We experience ourselves as seeing clearly. This is, of course, precisely what makes the bias so powerful and so difficult to correct.
The psychologist Jonathan Haidt has argued, controversially but compellingly, that moral and political reasoning is largely post-hoc rationalisation. We reach our conclusions first through rapid, intuitive, emotionally driven processes and then construct arguments to justify them. The arguments feel like causes. In reality, they are often effects. We are not reasoning our way to beliefs; we are believing our way to reasons.
Perhaps most unsettling is the research on what has been called the backfire effect: the documented tendency of some people, when presented with evidence that contradicts a deeply held belief, to hold that belief more firmly than before. Correction can entrench error. Information can deepen ignorance. This is not irrational in the narrow sense. It is a predictable consequence of the way identity and belief are intertwined. To abandon a deeply held belief is not merely an intellectual act. It can feel like a threat to the self.
We believe our way to reasons, rather than reasoning our way to beliefs.
IV. The Socratic Remedy
Socrates, whose philosophical method was built entirely on the recognition of human cognitive fallibility, understood something that we have been slow to relearn: that the path to wisdom does not begin with the acquisition of information. It begins with the acknowledgement of ignorance.
The famous declaration attributed to Socrates, that he was the wisest of men because he alone knew that he knew nothing, is not false modesty. It is a precise epistemological claim.
The person who believes they already know the answer to a question cannot genuinely inquire into it. Real thinking requires what Zen tradition calls shoshin, or beginner's mind: the willingness to approach even familiar questions as if for the first time.
The Socratic method, the dialectical exchange of question and answer, the patient examination of assumptions, the willingness to follow an argument wherever it leads regardless of personal discomfort, was designed not to transmit information but to cultivate a particular quality of mind. A mind alert to its own limitations. A mind willing to be wrong.
This quality of mind is not natural. It is, in fact, deeply counter-natural. It requires us to actively resist cognitive tendencies that evolution selected precisely because they were, in many contexts, useful. The problem is that the contexts in which lazy heuristics and in-group loyalty served our ancestors well are very different from the contexts in which we now need to reason clearly about complex, novel, and genuinely uncertain questions.
V. What Philosophy Offers to the Reasoning Gap
I want to be careful not to suggest that philosophy is a cure for the ailments I have described. It is not. Philosophers are as susceptible to motivated reasoning, confirmation bias, and in-group loyalty as anyone else, perhaps more so, given the particular pride of intellect that the discipline can cultivate.
But philosophy, practised seriously, offers something that I believe is genuinely rare and genuinely valuable in the present moment: a set of tools for thinking about thinking. Logic, epistemology, and the philosophy of mind give us ways to examine not just what we believe but how we came to believe it, and whether the process by which we arrived at our beliefs is one we should trust.
Philosophy also offers something perhaps even more fundamental: a tradition of sitting with uncertainty. The history of philosophy is, in large part, a history of questions that resist definitive answers: questions about the nature of knowledge, consciousness, reality, and value that have occupied the greatest minds across the centuries without resolution. Learning to think well in the presence of uncertainty, to reason carefully without demanding premature closure, is perhaps the most important intellectual skill of our age.
We live in a moment of extraordinary epistemic pressure. The volume of information competing for our attention is unprecedented. The speed at which claims circulate has far outpaced our ability to evaluate them. The social and political stakes attached to what we believe have never been higher. In this environment, the temptation to retreat to simple certainties, to the comfort of a tribe, an ideology, or a set of unquestionable truths, is understandable and powerful.
Philosophy, at its best, resists that temptation. It insists on the hard work of genuine inquiry, which is the work of examining assumptions, following arguments to uncomfortable conclusions, and maintaining intellectual humility in the face of genuine complexity.
The goal of philosophy is not to provide answers. It is to ensure that our questions are worthy of the answers we seek.
VI. Closing Thoughts: The Examined Life, Revisited
Socrates, famously, declared that the unexamined life is not worth living. This claim is often read as a kind of intellectual elitism, a philosopher's assertion that only those who engage in sustained philosophical reflection are truly alive. I read it differently.
I read it as a statement about what human beings are: creatures capable of reflection, of questioning, of revising our beliefs in light of better reasons. When we stop doing these things, when we allow our cognition to be captured by habit, by tribal loyalty, by the sheer momentum of information flow, we are not living less well. We are, in a meaningful sense, living less humanly.
The reasoning gap is real. The information age has given us extraordinary resources and has, simultaneously, made clear that the bottleneck in human understanding has never been access to facts. It has always been the quality of the minds that receive them.
Cultivating that quality, which means nurturing genuine critical thinking, intellectual humility, and the courage to follow an argument where it leads, is not a luxury. In 2026, it may be among the most urgent tasks we face.
It is, in any case, the task to which philosophy has always been devoted. And it is the task to which, in this space, I intend to return again and again.
— — —
About the Author
Kathy Postelle Rixon is Chair of The Philosophical Society: Oxford and a Researcher at the University of Cambridge, where her work explores the intersections of plasma physics, quantum mechanics, and conscious awareness. Through Magic in Harmony, she guides others toward deeper understanding of consciousness and transformative practice. She writes at the intersection of scientific inquiry, ancient wisdom traditions, and philosophical investigation.
Contact: kathy@magicinharmony.com