We Don’t Need More Information

How do we make sense of reality if we can’t trust facts, sources, or one another?


This image was taken by the crew of Apollo 15.

Information rarely informs. As it turns out, the availability — and even the understanding — of information often does not directly shape beliefs and preferences. Sometimes, information can have the opposite effect.

In 1979, a study run at Stanford University recruited a group of undergraduates who held opposing views on capital punishment. Half of the participants opposed the practice and doubted its effectiveness as a deterrent; the other half supported it on both counts. Both groups were given literature describing two other studies of capital punishment, one concluding that capital punishment was a demonstrable deterrent and the other concluding the opposite. Both studies contained substantial data to support their conclusions. After reviewing the studies, participants were asked to respond to them. Predictably, those who already held pro-capital punishment views retained them and cited the study in favor of it as offering the more credible evidence. Those who already opposed capital punishment preferred the other study. But here’s the catch: both studies were fake. When the participants were informed that the studies they had reviewed, along with the data supporting them, had been fabricated, they were asked again about their opinions on capital punishment. No one changed sides. In fact, the Stanford researchers observed that at each point in the study, the participants’ incoming views only intensified. As far as the participants were concerned, the truth validated their views as handily as the lie.

Studies like the one run at Stanford are quite common. Their staging — the twisting and turning of their Mamet-like plots — is intentionally devised to scrutinize the effect of information on belief. Some of them have multiple turns, each one objectively undercutting what had briefly served as supporting evidence for opposing views, and each one only calcifying the participants’ prior commitments. Time and again, these studies demonstrate that our views, whether casual opinions or sacred beliefs, are shaped mostly by confirmation bias and the social foundations of everyday reason. Confirmation bias — our tendency to prefer information that supports what we already think — is a well-known pitfall of the human mind. But the evolutionary root of reason itself, which is theorized to be the result of emotionally driven group dynamics, is fascinating. It suggests that reason is not, as we like to think, a mathematical program running in our brains. Instead, it is an emotional function. A more recent study run by Andy Norman at Carnegie Mellon University explores that connection between the evolutionary origins of reason and the modern dissonance between information and belief.

“The human capacity for reason probably evolved to align our mental states both with circumstances in the world and with the mental states of others…We use reason to advance various proximal ends, but in the main, we do it to overwrite the beliefs and desires of others: to get others to think like us.” - Andy Norman, Carnegie Mellon University

There is an inherent irony to these studies. They demonstrate that facts don’t directly shape beliefs, which is itself a fact we hardly need in order to accept the premise. The truth is, we don’t need studies like this to explain what we already experience every day.

 

 

Just recently, I was chatting with a contractor working on my house. Somehow, the subject of conspiracy theories came up. (Though those who know me well would find this no surprise, I promise you that I do not go out of my way to inquire after others’ paranoia or to spread my own. Those of you who don’t know me as well should know I write the preceding with a smile and a wink. I’m not that paranoid, but I’m not not paranoid.)

I wasn’t surprised to hear that my new friend was skeptical of the pandemic, skeptical of masks, and skeptical of the vaccines. And I wasn’t surprised to be dished an extra helping of suspicion-stew either — toward governments, corporations, any group larger than a few people, really — within just a few minutes of striking up this conversation. And when he paused, took a quick scan of the area for listeners-in, then stepped a bit closer and said, “Hey, but I got a real big one for ya,” I knew exactly where he was going: “We didn’t go to the moon.”

I told him I was familiar with that one. He grinned. “All right then,” he clapped, “Lemme take ya deeper!” I stood my ground, poker-face intact, showing him I was ready for anything. “The world ain’t no globe, brother.” And there it was. The big one.

The pièce de résistance — the grand conspiracy theory of everything — is The Flat Earth. For it to be true, an incredibly complex armature of lies must support the reality that unbelievers — we Globers — accept. “I’ve heard that story, too,” I said, wary of disappointing him. “But tell me, why are you persuaded that the Earth is flat?” He answered with a question of his own. “You ever been up high enough to see the curve of the Earth?” I said no. “Well, then. Neither have I, brother. No one has.” I didn’t point out that, actually, some people have, because I already knew that in order to go full Flat Earth, you have to believe that NASA is part of the conspiracy. You have to doubt every bit of photographic and documentary evidence available. Now, rather than recount our chat word for word, I’ll fill in a bit here for the uninitiated. Those who believe that the Earth is flat in 2021 don’t believe that you can sail a ship off the edge and plummet into the void. They believe that Antarctica, that massive continent at the South Pole — the “bottom” of our roughly spherical planet — is actually an impassable ice ring around the edge of a flat disc world, enclosed by an invisible but impenetrable dome. It’s been my experience that none of those “facts” is worth debating with a believer. So I asked my new friend, “But have you been to the edge and seen the ice for yourself? Have you touched the dome?” He laughed. “Hell no!”

My contractor and I don’t see the world the same way. Because I’m fascinated by conspiracy culture, I’m familiar with much of the same information that has shaped his worldview. I’ve read it all; I understand the arguments and interpretations. Yet I have reached very different conclusions about the world, about reality.

 

 

How can two people know the same facts but hold completely contrary understandings of the world? For two weeks, I observed this man carefully measure things and carefully take into account mathematical, physical, and chemical truths about our world in order to do fine, professional work on our property. And yet, this is a man who believes he resides in a reality very different from mine. How can we both successfully operate in the same world when we disagree on fundamental truths?

I think the reason is obvious: we don’t reach conclusions about reality based upon facts alone. Nor are all facts created equal when it comes to supporting perceptions and conclusions. Reality is something I believe we all know — either consciously or subconsciously — to be too complex to be fully understood. There are aspects of reality we can understand, and there are aspects that not only confound us but elude our very perception. Making judgements about reality based only upon that which we can perceive is necessary to life. But we can still reason that perception only partially illuminates our view. Some of us do, and some of us do not.

Regardless of our conclusions, we share something more important than just a common set of facts: we are all subject to the same limits when it comes to verifying those facts. Neither my contractor nor I, for example, can personally prove many of the very facts that support our most important conclusions about the reality in which we exist. He hasn’t touched the dome or seen the ice ring; I haven’t been to space and looked back upon the globe. I could have become an astronaut and verified the curvature of the Earth by seeing it with my very own eyes. Perhaps. But at this point in my life, that door is almost certainly closed. So I have to trust those who have. Could I measure the curvature of the Earth for myself, from right here on the ground? With some travel, some time, and some patience, yes, I could. I have not done this, and it’s not at the top of my to-do list, because mathematically verifying the curvature of the Earth won’t change my existing belief that it is curved. I don’t need more information to accept that the Earth is not flat.
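For the curious, here is a minimal sketch of the arithmetic behind one such ground-level check: the shadow-angle method usually credited to Eratosthenes. The function name and the numbers are my own illustrative assumptions, not measurements I have actually taken.

```python
def earth_circumference_km(distance_km: float, angle_a_deg: float, angle_b_deg: float) -> float:
    """Estimate the Earth's circumference from two noon shadow angles.

    Measure the angle of a vertical stick's shadow at local noon in two
    places separated by a known north-south distance. The difference
    between the two angles is the slice of the full 360-degree circle
    that the distance spans, so the whole circumference scales from it.
    """
    arc_deg = abs(angle_a_deg - angle_b_deg)
    return distance_km * 360.0 / arc_deg


# Roughly Eratosthenes' numbers (illustrative, not my own measurements):
# two points about 800 km apart, shadow angles of 7.2 and 0 degrees.
print(round(earth_circumference_km(800, 7.2, 0.0)), "km")  # about 40000 km
```

The point isn’t the number; it’s that the check is within reach of anyone with a stick, a map, and a couple of sunny afternoons, and that almost none of us will ever bother.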

Epistemology is a house of cards. A theory of knowledge that does not depend upon authority may be the holy grail of the philosophers of the world, but no such theory will ever matter to most people. If we all had to directly validate every fact that supported our understanding of the world, we would not have time to do anything else. Practical epistemology — a constant, forensic analysis of what differentiates opinions from justified beliefs — would be totally incapacitating. That is why we defer to authority.

 

 

Most of us find ourselves in positions of persuasion, some more frequently than others. As a designer, I am constantly persuading people to make choices. Though I often draw upon data to strengthen my recommendations, rarely do those who hear me request access to review the data for themselves. They trust me, yes, but they also know that the data are not necessary to validate the choice they make. This is, in part, because I’ve already persuaded them of that. I believe in the power of data to validate design decisions. But I also believe that good design is essentially a self-fulfilling prophecy. If the objective of a design decision is focused and clear, then the design itself should leave no room for alternative outcomes. In other words, designing for a variety of outcomes and then measuring which are preferred by users is a waste of time if the designer already has a specific preferred outcome in mind. In that case, designing for only that outcome will ensure that it occurs with far more regularity and accuracy than the more exploratory approach would. What this relies upon is, just like any other belief, a prior commitment. When I draw out those prior commitments, they are always far more persuasive than any data in any amount.

I also find myself exhausted by the increasingly warlike experience of information in our culture. Following the news often feels as if I’m the subject of one of those notoriously nested Stanford studies; what was true yesterday is disproven today, only to be validated again tomorrow. Whether it’s a matter of which diet helps or harms you or which agents are friends or foes of democracy, it can be easy enough to lose the thread; maintaining a grip on it yields ever-diminishing returns. But the emotional, evolutionary root of reason offers a key for anyone earnestly navigating this ever more complex world. What matters more than facts are feelings. Feelings are facts, too. Early in my career, I was counseled to remember that “they won’t remember what you said, but they will remember how you said it” — that the emotional reality of what I do will outlast its factual grounding. Few truths have maintained their emotional and intellectual persuasion like that one.



Written by Christopher Butler on May 14, 2021, in Essays

