Facebook Is Not Necessary

I wrote this shortly after the 2016 election. Though it exposed deeper divisions among Americans than were commonly acknowledged beforehand, I refuse to believe that division is the permanent nature of our country.

The first time I saw Facebook, I was looking over a friend’s shoulder at his laptop, which sat open on a chair in his room. We were on our way somewhere, and he left his machine open as we walked out. I saw his profile, with his photo on the top left and a stream of messages on the right, and I wondered whether I should have one, too.

The first time I saw a facebook, I was a sophomore in college and had just started my first year as an R.A. One of the veteran R.A.s took me aside at the Residence Life Office and pulled a softcover book out from a drawer. It was the kind of machine-bound book you could order at a Kinko’s; the cover was some kind of lightweight cardstock, the interior right out of the photocopier. On the front was the school’s logo, and inside were pages filled with photos of the current student body, arranged in alphabetical order. I learned that day that a facebook was made every year for the major offices around campus — Residence Life, Student Life, Academic Affairs, Public Safety, etc. — but not for the students. But I also learned that, every year, someone borrowed one for the night and made bootleg copies to spread around. Somehow, something that would become the default structure of most students’ online experience in just a matter of years was, then, contraband. It’s as if the school thought that having access to that sort of thing might not lead to much good.

Here’s the thing. I don’t like Facebook. If you’ve read me before, you’ll already know that I don’t think they’re a social network; that there really is no such thing, in the technological sense; that they’re nothing more than a data farm for advertisers, and we the seeds. Facebook’s mission statement is, ironically, “to make the world more open and connected.” But they do nothing of the sort. It’s becoming clear that they do the opposite.

Facebook wants you and me to spend as much time as possible within Facebook. So they’ve made deals to bring as much content as possible to Facebook, rather than to distribute Facebook’s connective tissue throughout the web. Only one of those approaches would have made the world more open. But the one they’ve chosen simply rebuilds the world within Facebook. They are re-making the world within their walls, and closing it off to anyone without a username and password. That’s their prerogative — and really, there’s nothing particularly wrong with that, hubris notwithstanding — but it’s not what they say they’re doing. They are not making the world more open.

Facebook analyzes the data they harvest from us in order to draw conclusions about who we are, what we like, what we don’t like, what we’re doing, what we want to do, who we know, who we love, who we don’t, what we want, what we don’t want, what we have, what we don’t have, where we live, where we’ve been, where we want to go, what we believe, what we don’t believe, and on and on and on. They do this for their customers so that their customers can get better at selling us things without it feeling like they’re selling us things. And so that our experience within Facebook feels less and less like what we don’t like about the world outside of Facebook. So that we encounter more agreement and less resistance. Before Facebook shows me information — even a status message from someone it knows is my friend — its algorithm decides whether it should appear based upon a filtering decision that comprises countless individual factors. The more time I spend in Facebook’s world, the more it feels like a place that gives me what I want. Facebook hopes that such a fantasy world will be difficult for me to leave. Fantasy worlds are not necessarily bad. We’ve made them as long as we’ve made things — we’ve written them down, bound them in paper, drawn and painted them, sculpted them, put them on stage, filmed and projected them — because human life has always needed the relief of fantasy. And however capable we’ve always been of spending too much time in fantasy worlds, fantasy worlds have always been clear about what they are. Until Facebook. The Facebook experience is as parochial and disconnected as the real world experience it supposedly wants to correct. Your Facebook is not like my Facebook. We are there together, but we are disconnected from one another.

That fantasy masquerades as reality within Facebook is more than just a troubling phenomenon. In the 1990s, virtual reality prompted a moral panic over the idea that people might never disconnect from digital worlds. But the fidelity of those worlds was just not seductive enough. The glitchy weirdness of the outside world was just better. And The Lawnmower Man was about the best anti-PR virtual reality could ever have. Since then, we’ve had our dalliances with digital fantasy addiction — people whose lives fell apart around them while they “lived” in World of Warcraft or Second Life — but they’ve been relatively limited in their scope. And plenty of people who were rescued from those fantasies would settle on metaphor to describe what it was about them they found so irresistible: “There was more truth in there than on the outside.” Again, they knew it wasn’t real.

But on Facebook, discerning between truth and fiction — between what is real and what is not — is more difficult. As we’ve all discovered in recent months, Facebook’s news feed is full of false information. Most of that stuff could be revealed for what it is with just a little digital legwork — in most cases, simply by clicking the headline and reading the story itself — but as it also turns out, Facebook has succeeded in making leaving its platform a pretty hard thing to do for the very people most likely to encounter fake news there. That is why fake news, easily debunked as it is, spreads like a virus. It’s rarely read, yet often “liked” and shared. We’ve learned that a short, well-written lie can be better at shaping collective thought than the longer, reasoned, researched, and accurately reported stuff we used to call news.

When enough fake news is spread, serious things can happen. Bad things. At a small scale, an individual’s reputation can be ruined. At a larger one, a foreign government can manipulate the outcome of a rival’s election.

The questions that the fake news phenomenon provokes are many. I want to ask two. First, who is responsible?

Is Facebook responsible for fake news? And if so, is Facebook responsible for outcomes that can be tied to the proliferation of fake news? To be fair to Facebook, it’s not a simple question, nor is it a matter of technological complexity. The basic question of whether a support structure is responsible for the content it supports has been asked since the first message board existed on the internet (and of course, prior to that in a variety of non-digital forms). Every internet platform you can name has faced the question, and when they’ve declined to answer it, the question has even been turned upon internet service providers and governments. The can just keeps being kicked. So, clearly, technology obscures the question. (Though, I would point out that it has always been obvious that television channels bear responsibility for their programming, and have always been held accountable for it — hence ratings systems and the like.) The question, then, needs to be put more simply. What responsibility does Facebook have to the truth?

Facebook, of course, has the same responsibility to the truth as you and I. Is it wrong to intentionally mislead (lie), to look the other way when someone else does (hide from lies), or to help spread information you know to be false (spread lies)? Yes. It is the moral responsibility of each of us not to do these things. So, why would it be any different for Facebook?

So far, Facebook has been reluctant to take responsibility for the material on its platform. They appear to have two reasons for this. First, they seem to indicate that it is not their place to interfere in the exchange of information within Facebook, as doing so would not help in making the world “more open and connected.” The idea that no standard for the content that makes Facebook run could exist, or that such a standard would work against Facebook’s very purpose, is ridiculous. Or perhaps more bluntly put — coming from a company that seems to be tireless in their crusade against the nipple — bullshit. Facebook is an aggressive censor of content on their platform; they’ve drawn a line between one form of self-expression and another, despite the fact that only some of Facebook’s populace is puritanical enough to find nipples objectionable. Basically, Facebook needs to be called out on its selective hyper-vigilance.

The second reason Facebook claims it cannot be responsible for the information it transacts is even more spurious. Facebook claims it doesn’t have the ability to discern the truth of information shared on its platform — that it lacks the technical facility. This is even more laughable than their first claim. Facebook’s data graph is unprecedented in its scope and depth. The sophistication of that graph is precisely what Facebook sells to its advertisers — the power to hyper-target individual consumers. They can tell advertisers more about you than you believe you have shared (or ever would share) with them. And yet they can’t trace the source of a post? They can’t evaluate the reliability of those sources? They can’t triangulate a post, its source, and the nature and influence of the groups of users that originate its viral spread? Come on, Facebook. You can’t discern fake from real, but you can find every nipple everywhere? Right. Zeynep Tufekci is charitable when she says that Mark Zuckerberg is in denial. I say he is a liar and a coward who lacks the requisite character to wield such sweeping influence in the world. Apparently, people within Facebook agree that Facebook’s current public posture is, at best, hypocritical, and at worst, cynical. (In both cases, it’s destructive.) Well, some of them agree, anyway. Word is that a group is meeting secretly to determine what they can do about this — in violation of specific direction from their superiors not to meet about such things. Facebook would like to evade this controversy, but I suspect they will not be able to. We cannot let them.

So, Facebook says that it cannot adjudicate the truth of information, and that even if it could, it shouldn’t have to. I think that’s bogus, for the reasons I’ve stated already. But let me add one more. Facebook, as a content management platform, is designed to be anti-truth. Compare it to Wikipedia, and you’ll see what I mean. On Wikipedia, you have the freedom to set up an account, log in, and edit any entry you like. You can change the name of George Washington to Billy-Bob Washypants if you like. You can say he’s still alive and sells insurance in New Jersey. You can say he has sixteen cats and has gone vegan. Whatever. Will your edits stick? No. That’s the point. The openness of Wikipedia is the outworking of a philosophy that truth, when crowdsourced, is just as strong as — if not stronger than — when managed by a select few. Now, one too many bogus edits and you’ll be banned from Wikipedia, which is, again, in line with their philosophy. If your intent toward information is chaotic-evil, fine, but you’re not entitled to go about your trickster ways anywhere you like, and Wikipedia is entitled to kick you out. So how about Facebook? On Facebook, only the original creator of a post can edit it. The crowd has no recourse but to complain to Facebook, and Facebook, of course, has already said that the content isn’t their problem. So no matter how interested the crowd might be in preserving the truth, there is no mechanism for them to do that. That’s an architectural decision that seems obviously preferable when designing a “social network.” After all, if you post a picture of your baby with humblebrag captions about how smart and perfect she is, what right do I have to change any of that, even if I find it annoying? But what if you post a story claiming something that is just simply false, like that a candidate who did not, in fact, come close to winning the popular vote won the popular vote? What then? Well, something. Something must be done then.
Otherwise, lies will eat truth.

I said earlier that the fake news phenomenon has left me with two questions, the first having to do with assigning responsibility. The second question is one that is probably not possible to answer without returning to the first. It is one of epistemology — how we know what we think we know. In a large, distributed human society such as ours, the majority of our knowledge is, philosophically speaking, epistemologically weak. Most of what we know, we know because someone we trust told us so. Which means that most of our knowledge is second-hand, at best. But how could it be any other way? History is essential to understanding much of anything about our existence, and history is entirely based upon an epistemology of trust. Do we trust what someone else says about something we cannot experience ourselves? The question is a rabbit hole down which we mostly dare not descend. Significant communities have coalesced around negative answers to that question, most of which we write off as “conspiracy theorists.” Perhaps that’s all many of them are. But their existence shows that epistemology is an existential precipice, and all any one of us needs to tumble over it is the slightest push.

As far as our more immediate truth problems are concerned, we could wait for Facebook to fix itself, or we could fix the problem by doing one simple thing: leaving Facebook. Facebook cannot really fix its truth problem without undermining its economic leverage — remember, we are their product, not their customers — so, if you feel that Facebook should be held accountable to truth, consider leaving Facebook so that they must contemplate a future without their product. Facebook’s only leverage is you and me.

When I suggest that leaving Facebook is the thing to do, I tend to get a common response: “And go where?” The implication being that no one will leave Facebook without a satisfactorily similar destination; a viable alternative; a new Facebook. But why must there be an alternative? Why must there be a facebook at all? I tend to think of Facebook in a somewhat Amish way. The Amish, contrary to popular belief, don’t just reject all “technology” out of hand. Instead, they look at every technological option available to them and decide, as a group, whether that technology will benefit the community. That decision is not always simply theoretical. Often, they will adopt a technology and try it — give it a grace period, so to speak. During that time, the question will always be whether the benefits of that technology — to the community, always to the community — are worth its costs. So, we’ve tried Facebook. It has built itself around us, and we around it. And now the question must be asked: Do we keep it? Is it good for the community? And we must be courageous and intellectually rigorous enough to answer the question on behalf of the world community, not the community within Facebook. Is Facebook good for the world?

I don’t mean to suggest that this is an easy question to answer, or that Facebook is all bad. Are there good things about Facebook? Of course! Has Facebook helped people to build community (and specific communities) that would have never been possible without Facebook? Probably. But again, in totality, is it worth the cost? The question cannot be dismissed as Luddite or digitally dualist or whatever. It is not about technology; it is about impact.

That being said, at the risk of courting controversy and being the subject of a digital-dualism witch hunt, let me offer one other thought: Imagine you attended a community gathering — a neighborhood party, a family gathering, that sort of thing — and while you were there, you started to notice unfamiliar people filling the room. Throughout the evening, you watch as they observe you and your friends and family. You see them take notes about everything — what you say, who you talk to, who you are with, what you do, what you like, what you own — and you see them share those notes. How long would that go on before you — or anyone — showed them the door? We have always had a fraught relationship with commercial activity. We like it at times; abhor it at others. But few people would tolerate such commercial mediation in their homes and personal lives. And yet, we do on Facebook. We not only tolerate it; we cooperate with it. We’ve embraced it. Why? The stealth of Facebook’s commercial mediation probably has something to do with it. What we do not easily see, we easily ignore. Too easily. We must strive to perceive the commercial reality beneath every single interaction on Facebook. No interaction of yours or mine is without monetary value to Facebook and its customers. And it is that value which stands between truth and fiction on Facebook. It has nothing at all to do with making the world “more open and connected.”

As far as I’m concerned, Facebook is the Philip Morris of the 21st century. Facebook cannot sell hyper-targeted advertising while also claiming not to model and influence cultural and political engagement. The only question left is: who will blow the whistle? Or: how many of us will take the power back by logging off for good? If that sounds too extreme for you, let me leave you with a few seed thoughts.

If you believe that Democracy is vulnerable to dictators who obstruct truth, consider Advertising. We are already under its rule.

If that is too abstract for you, consider, then, our present reality. Which is more dangerous to truth in society, Trump or advertising? Who works for whom?

Are you willing to sacrifice Facebook for the sake of truth? You should be. Kick the habit. Facebook isn’t necessary, but truth is.

 

𓁏

 

Heavy Rotation

Last week, my wife and I and a few friends went to see Yeasayer at The Cat’s Cradle. We saw them there in 2012, when they were touring for their third album, Fragrant World, and, as expected, they were outstanding. They are one of the few bands who sound much, much better live than in recordings. Don’t get me wrong, their recordings are really good. But live, wow, they pack a huge punch. Maybe it’s the sound system at the Cradle, but the bass and drums, in particular, are just powerful. Actually, a moment or two into their first song, Ira Tuton’s bass broke. They were able to fix it pretty quickly, but the band without bass? No way. Their songs are built around bass, and Tuton is a great bassist. Anyway, they rocked. You should listen to their latest album, Amen & Goodbye. But my point is this: We expected them to be great. We were not let down. But, we knew nothing about Lydia Ainsworth, who opened for them. She was the big surprise of the evening. She performed solo along with many layers of digital accompaniment. I was intrigued. I made a note on my phone to look her up when I got home. I’ve been listening to her album, Right from Real, nonstop ever since. It’s beautiful, mysterious, wonderful. I feel like she made the Bat for Lashes album I wanted after The Haunted Man (no shade to The Bride, it’s just not my cup of tea), and has taken up the mantle of artists I grew up with like Kate Bush, Tori Amos, Peter Gabriel, and the like. Check her out.

Also, this is a performance using a modular synthesizer at a public library. It’s very good. There are others like it. I’ve been keeping them playing in a tab.

Recent Tabs

Every episode of Black Mirror. Who will command the robot armies? These gifs of growing mushrooms are gorgeous. This video is also beautiful. This drawing machine can draw with almost any pen and with very high precision. This is a meme. “If you’re a female sci-fi geek, you should somehow get your hands on this indie movie. Then you should show it to your boyfriend so that you can carefully femsplain to him what is going on with the plot.” This is a computer that can read lips. Daily life in the coldest city on Earth. Scientists Hook Up Brain to Tablet — Paralyzed Woman Googles With Ease. Cool. Cool cool cool. Terrifying. Goosebumps. “In The Revenge of Analog, the alluring material quality of objects is always highlighted, but ignores the fetishism that has led us to revalue it, skipping over the more simple fact that analog has become appealing for the same reason you can’t put your phone down: novelty.” So how’s the whole switching your publication to Medium going? Sad chairs of academia.

 




Written by Christopher Butler on December 4, 2016, in Essays

