Technological Luxury is Powered by Oppression

The greatest lie of technology is that it is the great equalizer. Technology has its own caste system. If you don’t know that, you’re not at the top.

“You can talk to Josh and ask him to do various tasks. Josh makes your life at home easier.”

Such is the marketing copy for Josh, “who” is a voice-activated home automation system. Yes, exactly the sort of “interface layer” I wrote about last time. A few days after I published my “rant” against the over-simplification of the interface layer — the tip of an abundantly complex technological iceberg coalescing vast database architecture, machine learning, and information logistics — my friend Elias messaged me a link to Josh on Twitter. His accompanying message: “And so it begins.” Indeed it has begun.

It is interesting how much Josh currently relies upon the high-value name recognition of brands like Sonos and Nest. According to Scott Belsky, who wrote the original “Interface Layer” essay, systems like Josh will commoditize those brands. In my response, I expressed my doubt. Josh will do its thing, but then some other system will come along — Joan, or whatever — that will do everything Josh does, with a few additions, and perhaps do it better. But that will only matter if Josh or Joan can actually work with your existing systems. Some people will have a sound system other than Sonos and pass on Josh because it doesn’t recognize theirs. Perhaps, at some point, some or all Joshes will recognize “all” the systems, but those systems will still do their thing. Josh won’t be the “face” of the code that controls your thermostat or your stereo. It’ll just be, as another friend messaged me in reply to my last essay, your butler. But that doesn’t commoditize the guts any more than an actual butler would commoditize any other home good.

Last time I checked, very few home goods were actually at the commodity level. Especially not for the sort of people who employ butlers.

Another friend of mine, Leban, nails the social fracturing of this kind of technology:

“All that ingenuity — all that ‘disruption’ — and who does it serve? Why not use that same energy to improve diplomatic relations between countries and avoid wars, or feed human beings that are starving, or create solar powered water purification systems that are operable anywhere. And so on. The problem with UX is that you can only empathize with the end user so long before you HAVE to start thinking about what’s really important: food, water, shelter, safety. Fuck the people who can afford butlers and their inability to wipe their own ass. I want to help everyday people just live.”

That we humans need help from one another to “just live” may seem like a subsistence barrier — a minimum level of existence that the future, by necessity, must leave behind. But I think not. Cooperative existence is existence. Life at its most basic yields profundity the likes of which technological complexity rarely manifests. It’s the truth. Epiphanies are had in the shower, not while dismissing notifications. To pursue that — “just” living — as a designer, or from any discipline, should be the end, not the means.

Leban’s further point about empathy is worth a bit more depth, too. He says, “you can only empathize with the end user so long before you HAVE to start thinking about what’s really important.” And yet, that seems to be exactly what doesn’t happen.

Good design requires empathy. That seems to be a basic fact. And so it follows that, in the case of designing a system like Josh, one must spend more and more time in the headspace of the sort of person who will hand his or her life over to Josh. Actually, the operative word should be “can” — the sort of person who can hand his or her life over to Josh. The sort of person for whom “just living” is taken care of, and for whom paying for home automation is even possible. In other words, the elite. Which makes for a pretty decadent feedback loop, doesn’t it? If shelter, food, water, and other basic needs are never in doubt, then the perspective and problems of those for whom they are in doubt are never seen, understood, or valued. Which means, of course, that those problems remain unsolved. The necessary focus on the “problem” of home automation does not catalyze recursive problem solving. It remains, pun very much intended, with its head in the clouds. With that in mind, it should be no surprise that, however counterintuitively, “spreading access to the Internet enhances inequality.”

There’s never a shortage of new tech things that provoke the same old question: Why this, not that? Where this is a luxury, and that a need.

Though my own worldview doesn’t quite account for such an eventuality, one might still wonder what we do once need no longer exists. There is an often unspoken assumption that the just course of humanity is in the spread of the sort of “quality of life” to which we of the “first world” have become accustomed. Plenty question that, from both sides of the equation. Plenty ignore it, and live lives of contentment and plenty completely outside the purview of modern technology. Nevertheless, there is a momentum to technology, and I write this (and you read it) from neither its spring nor its crest, but carried within its current. We’re in it, and it’s dark in here.

Technological criticism is, by its nature, a dark art. It acknowledges the wonder and beauty inherent in creating technology, but, by way of offering critique, it spends more time immersed in the travail of our collective Frankenstein. It looks at the monster out of control, not the order of the laboratory from which the monster came.

And so, to say that “we are hopelessly hooked on technology” is, of course, both hyperbolic and obvious. But the essay from which that quote comes deftly ties together the thinking of several other authors writing on the subject of technological habituation. Jacob Weisberg, writing for the New York Review of Books, provides an overview of four books: two by Sherry Turkle on the nature of human relationships mediated by digital technology, one by Joseph M. Reagle on the civility (or incivility) of online comment discourse, and one by Nir Eyal on understanding habit-forming product design.

For me, the material devoted to explaining the role of habituation in current product design trends was the most interesting, and disturbing. There does seem to be a meaningful — but often glossed-over or ignored — difference between the sort of stumbled-upon habituation of social media like Instagram, and the psychologically manipulative intent ascribed to it once it’s firmly entrenched in culture.

It’s obvious that Instagram’s value is very much predicated on its addictiveness. But I still question whether the addictiveness was intended. Was Instagram designed to be a honeypot, or was it designed to be a great photo-sharing app to which we happen to have become addicted? At the time of its acquisition by Facebook, its $1 billion price tag was astronomical and unprecedented. Yet that value was based upon the assumption that such a large and passionate base of users-turned-addicts could somehow be monetized (by selling their attention to advertisers).

Nevertheless, what troubles me is that in product design and venture capital circles, the addictiveness is not a malady — a byproduct of humans + technology worthy of study and cure — but a virtue! So much so that more than a few “experts” are now paid to help software companies design better addiction engines. This new field is called “captology,” the prefix an acronym for “computers as persuasive technology.” The biography of its founder, B.J. Fogg, reads almost like a supervillain story — the slow corruption of genius over time. He began as a graduate student in Stanford’s Human Sciences and Technologies Advanced Research Institute, studying how “computers can change people’s thoughts and behaviors in predictable ways.” Yet, instead of using such study to help humans better understand and wield their technology, Fogg runs “persuasion boot camps” for tech companies. No doubt they are willing to pay handsomely to learn the ways of the technopiate kingpin. That captology exists at all, and that the term was coined in earnest rather than as a pejorative by its critics, seems an especially dark turn for society.

The 4,000-word essay ends with a slight coda, in which Tristan Harris, a former captology acolyte, calls for a more conscientious approach to product design. This provides the hopeful endnote I needed. Yet, I couldn’t help but regret the lack of 4,000 more words on the ethics of engagement in the attention economy. After all, the author himself concludes that though Harris’s “nudges” point in the right direction, they’re not quite iconoclastic enough for an industry so bought-in that a word like “captology” no longer connotes any cynicism. However well-intended, minor design details that value a user’s time are, as Weisberg puts it, “wildly inadequate to the problem”:

“Aspirations for humanistic digital design have been overwhelmed so far by the imperatives of the startup economy. As long as software engineers are able to deliver free, addictive products directly to children, parents who are themselves compulsive users have little hope of asserting control. We can’t defend ourselves against the disciples of captology by asking nicely for less enticing slot machines.”

No, we cannot. We have to summon the courage and strength to get the hell out of the casino altogether.

On Screen

Have you seen World of Tomorrow yet? It’s a simple 16-minute animation that tells the story of a little girl who meets her own clone from 227 years in the future. It has haunted me ever since I first saw it. If you haven’t seen it, don’t Google it, don’t read anything about it. Just take 16 minutes of your day and watch it. Then read Noel Murray’s review.

After that, watch this clip from Rick and Morty, Season Two, in which Morty and Rick play a game called Roy: A Life Well Lived at an intergalactic arcade. It’s basically a four-minute reboot of The Inner Light.

Recent Tabs

“In today’s episode, we talk about how exactly we’re going to go from this, to this.”
Why the World Looks the Way it Does, Elite Edition.
Rethinking Reading on the Web.
In case you somehow missed this, one woman’s food consumption is the rest of our Ex Machina fanfic. If you like that, then trust me, the parody is even better.
The Eaglecam is back! Our favorite bald eagle couple are sitting on a couple of new eggs.
We sell books that cannot be printed.
Old Forgotten Houses.
“The internet will suck all creative content out of the world.”




Written by Christopher Butler on February 11, 2016, in Essays


