By Stephanie Zvan
Reprinted from the blog Almost Diamonds with permission
I am not a rationalist.
I have friends who are rationalists. I do my best to think of it as a nice little hobby of theirs. I do cryptograms and other puzzles in my down time. They spend time hacking their thinking processes, or trying to. We’ve all got our thing.
Every once in a while, though, they’ll promote some argument or other from another rationalist, and I have to speak up. Why? Because the argument is a dreadful bundle of wrong wrapped up in a “logic” bow. Why? Because it doesn’t matter how well you regulate your thinking. You could overcome the limitations of the human brain and turn yourself into a computer. (You can’t, but bear with me here.) You’re still going to get garbage out if you put garbage in.
I’m not a rationalist because I’m an empiricist. I find no value in “logical” arguments that are based in intuition and “common sense” rather than data. Such arguments can only perpetuate ignorance by giving it a shiny veneer of reason that it hasn’t earned.
I boggle that we haven’t sorted this out yet. I particularly boggle that atheists of my acquaintance promote rationalism over empiricism. The tensions between basic rationalism and empiricism parallel the tensions between church theology and the philosophy of science. We have no problem rejecting church theology as not being grounded in evidence. Why do so many atheists praise rationalism?
Let me stop here and make it clear that I’m not rejecting logic or critical thinking. Goodness knows that I’ve spent hours just this summer helping people share useful heuristics that will, in general, help them get to the right answers more often. I’ve led workshops and panels on evaluating science journalism and scientific results. When I’ve spoken to comparative religion classes in the past, I’ve talked about religious skepticism with an emphasis on the basics of epistemology.
The problem isn’t logic or critical thinking. The problem is a tendency to view those skills as central to getting the right answers. The problem is a tendency to view them as the solution. They’re not, and the idea that they are is in distinct contrast with the way humanity has actually grown in knowledge and understanding of the world.
Rationalism is, at heart, an individualist endeavor. It says that the path to getting things right lies in improving the self, improving the thinking of one person at a time. It’s not surprising that the ideology and movement appeal largely to the young, to men, to white people, to libertarians. It focuses primarily on individual action.
That’s not how we’ve come to learn about our world, though. It’s not how science or any other field of scholarship works. Scholarship is a collaborative process. And I don’t just mean peer review and working groups, though those are important as well.
Scholars add to our knowledge of the world by building on the work of others. They apply tools and methods developed by others to new material and questions. They study the work of other scholars to inspire them and to give them the background to ask and answer new questions. They evaluate the work of others and consolidate the best of it into larger theoretical frameworks. Without the work of the scholars before them, scholars today and evermore would forever be recreating basic work and repeating basic errors.
All too often, I find rationalists taking this repetitive approach. They think but they don’t study. As a consequence, they repeat the same naïve errors time and again. This is particularly noticeable when they engage in social or political theorizing by extrapolating from information they learned in secondary school and 101-level college classes, picked up in pop culture, or provided by people pushing a political cause. Their conclusions are necessarily as limited as their source material and reflect all its cultural biases.
The situation is worse than that, however. Not only do these rationalists come up with poor conclusions, but they’re frequently convinced that they must be right because they know how to think better than other people do. A greater understanding of cognitive biases and traps should engender epistemic humility. It should build comfort with uncertainty and some ability to estimate how much uncertainty is likely to exist around a question. Instead, it often seems to foster arrogance, as though avoiding certain errors makes someone’s conclusions correct.
It doesn’t. Neither does identifying yourself as a logical person or critical thinker. If anything, those make the problem worse.
The problem is that there aren’t any real shortcuts to getting things right. If there were, someone would have gotten there long before us. Logic isn’t new. Critical thinking isn’t new. Rationalism certainly isn’t new. And none of them have ever stopped anyone from finding some subject to be grossly wrong about.
They never will. Hell, empiricism never will either. There are just too many topics for people to become educated enough in all of them that they’ll stop getting things grossly wrong. Adherence to empiricism does, however, limit the topics on which people feel qualified to opine. It reminds people that their conclusions are only as good as the information and evidence on which they’re based. It reminds them that experts matter.
Rationalism doesn’t do that. It doesn’t do it in theory, with the emphasis it puts on how one thinks. Even in the modern rationalist movement, which speaks more to collecting evidence than classical rationalism did, I have yet to see any emphasis on epistemic humility. In fact, I see calls to apply rationalism broadly to people’s decisions, which adds to the uninformed arrogance I see too much of. This may be a marketing decision, made to help sell logic and critical thinking, as well as the classes that teach them, but in the end, it still pushes people to be wrong.
That’s why I’m not part of the rationalist movement. That’s why I’m not a rationalist.