Ethical Technology, 1

A few weeks ago, I sat down for a cup of coffee and conversation with a new friend—someone who had been put in touch with me by a mutual friend of ours who knew that we had a shared interest in technology, ethics, and all the “big ideas” in between. Vance teaches this stuff—he’s an Associate Professor of Philosophy and Chair of the Philosophy department at Guilford College—while I try to figure out how to apply it in a working world where few actively or deeply consider ethics (until it’s too late, that is). Our connector was Paul, director of iBiblio.org and Clinical Associate Professor in the schools of Journalism & Mass Communications and Information & Library Sciences at UNC. These guys are both way smarter than me; I’m glad to know them both.

Anyway, there Vance and I sat for several hours batting around all kinds of issues and ideas—from monopolies of information to informational transactional responsibility to the filter bubble to the economics and politics of informational systems to automation and on and on. This was one of those conversations I wish I had recorded, because when I got home, I felt that panicky sensation as my brain began—out of necessity—to hemorrhage vast swaths of detail that had been exchanged between us. I did jot down a bunch of notes and have been adding to them here and there since.

Which brings me here. I’d like to take my note-jotting to the web. So here begins a few days of rambling on what comes to my mind when I think about ethics and technology.

I’ll start with monopolies of information. Let the unedited rambling begin:

* * *

The big news: Digitized information is centralized, and therefore, easily controlled. I know, I know—the conventional wisdom has been that the revolutionary nature of the internet is in its decentralization of information, which is disruptive to the powers-that-be. But no, I don’t think that’s actually true. The internet has created the impression of decentralization. But, once information is digitized, its bits are part of someone’s property. Maybe that someone is just some relatively unknown person who happens to have her own server, but for the most part, that someone is Google, or Facebook, or some other massive corporation that has offered storage so plentiful that it almost disappears from common perception. As James Bridle has pointed out, the cloud is a lie. The cloud is marketing speak for sprawling, hot server facilities that horizontally out-scale most architecture you’ve ever encountered. They’re not invisible, they’re not lighter than air, they’re not public. They’re massive, opaque, and very, very private. The point is that once information is digitized, ownership becomes pretty complicated—which works to the corporate advantage, because in most cases, you the uploader actually hand ownership over to the host. The question of who owns your emails, your documents, your videos, pictures, music, etc. is fraught with issues that have not been worked out with you in mind. Think about what this means in the long term.

If books are digitized as a way of “extending” their reach (the anecdotally accepted motivation behind e-books), their success in extension will create economic pressure to no longer produce any physical copies. That’s just one example—one that many are pondering right now. But once that happens, where the physical is truly eclipsed by the digital, the entity that controls the storage of the digital controls the flow of information. Perceptions and knowledge can be controlled as never before. Of course, this isn’t a novel idea. Even at the beginning of the web (at least the web as we know it today), Howard Rheingold wrote of his concern for how the internet could be too easily dominated by too few powers:

“The telecommunications industry is a business, viewed primarily as an economic player. But telecommunications gives certain people access to means of influencing certain other people’s thoughts and perceptions, and that access—who has it and who doesn’t have it—is intimately connected with political power. The prospect of the technical capabilities of a near-ubiquitous high-bandwidth Net in the hands of a small number of commercial interests has dire political implications. Whoever gains the political edge on this technology will be able to use the technology to consolidate power.”

Before digitization, we had distributed freedoms—to browse and discover information in unique ways based upon the individualized freedoms of other individuals, store owners, and librarians to curate collections. Though access was challenged in physical ways (e.g. no central inventory, things going out of print, costs barring ownership in some cases), those factors were not centrally controlled. Once a book was printed, it was very hard to alter, destroy, or control. But if all books go digital, they will be comparatively simple to alter, destroy, or control. If someone at the top of the corporate food chain wanted to (or was persuaded to) blink a text out of digital existence, it could be possible to do so. It’s ironic: digitization appeals to the desire to spread and share information, yet it makes it easier than ever before to control, alter, or censor.

In between me and discrete information are other issues of exposure and access. When I am shopping on Amazon and see “people who bought ___ also bought ____” I think I am getting an objective, qualitative recommendation. After all, if people who bought that thing I want also bought this other stuff, then maybe I’m more likely to want this other stuff than other other stuff. But the question underlying all of that is, What about the other stuff that’s not even in this ecosystem? Or, bigger yet, what about making stuff for yourself—not necessarily all of it, because the DIY thing can become an idol of its own—instead of buying everything?
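To make the mechanics of that concrete, here is a minimal, purely illustrative sketch of how an “also bought” list can be built from co-purchase counts. The item names and orders are invented, and Amazon’s actual system is far more sophisticated; the point to notice is that a recommender like this can only ever surface items already in the catalog it was fed. Everything outside the ecosystem is invisible to it.

    from collections import defaultdict
    from itertools import combinations

    # Hypothetical order data: each order is the set of items one customer bought together.
    orders = [
        {"field notes", "pilot g2 pens"},
        {"field notes", "moleskine", "pilot g2 pens"},
        {"moleskine", "book stand"},
        {"field notes", "book stand"},
    ]

    # Count how often each pair of items appears in the same order.
    co_purchases = defaultdict(lambda: defaultdict(int))
    for order in orders:
        for a, b in combinations(sorted(order), 2):
            co_purchases[a][b] += 1
            co_purchases[b][a] += 1

    def also_bought(item, top_n=3):
        """Return the items most often bought alongside `item`, ranked by co-purchase count."""
        ranked = sorted(co_purchases[item].items(), key=lambda kv: kv[1], reverse=True)
        return [name for name, _ in ranked[:top_n]]

    print(also_bought("field notes"))  # ['pilot g2 pens', 'moleskine', 'book stand']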

And what about the role of privacy in all of this? What is amazing to me, now reading back through Rheingold’s writings, is not that some of the outcomes he foresaw decades ago have indeed come to pass (he saw many of them coming) but that the possible human responses to some of those outcomes elicited a could-you-imagine?! incredulity that, in its naiveté, makes me shake my head in sadness. Example:

“The second school of criticism focuses on the fact that high-bandwidth interactive networks could be used in conjunction with other technologies as a means of surveillance, control, and disinformation as well as a conduit for useful information. This direct assault on personal liberty is compounded by a more diffuse erosion of old social values due to the capabilities of new technologies; the most problematic example is the way traditional notions of privacy are challenged on several fronts by the ease of collecting and disseminating detailed information about individuals via cyberspace technologies. When people use the convenience of electronic communication or transaction, we leave invisible digital trails; now that technologies for tracking those trails are maturing, there is cause to worry. The spreading use of computer matching to piece together the digital trails we all leave in cyberspace is one indication of privacy problems to come. Along with all the person-to-person communications exchanged on the world’s telecommunications networks are vast flows of other kinds of personal information—credit information, transaction processing, health information. Most people take it for granted that no one can search through all the electronic transactions that move through the world’s networks in order to pin down an individual for marketing—or political—motives. Remember the “knowbots” that would act as personal servants, swimming in the info-tides, fishing for information to suit your interests? What if people could turn loose knowbots to collect all the information digitally linked to you? What if the Net and cheap, powerful computers give that power not only to governments and large corporations but to everyone? Every time we travel or shop or communicate, citizens of the credit-card society contribute to streams of information that travel between point of purchase, remote credit bureaus, municipal and federal information systems, crime information databases, central transaction databases. And all these other forms of cyberspace interaction take place via the same packet-switched, high-bandwidth network technology—those packets can contain transactions as well as video clips and text files. When these streams of information begin to connect together, the unscrupulous or would-be tyrants can use the Net to catch citizens in a more ominous kind of net.”

Not only has all of that come to pass, but we have come to pass on being concerned about it. What Rheingold characterized as “unscrupulous” and even tyrannical has become basic business practice. So things have changed, but more importantly, so have we. The connection between information and democracy is extraordinarily meaningful, yet it seems to be one we take largely for granted. Of course, Rheingold continues on in that chapter to rail against “the selling of democracy” in a way that chillingly corresponds to what is going on all around us today. You should really read that chapter. What it shows is that today’s circumstance didn’t just happen to us; it is the deliberate fulfillment of an agenda—one that should cause anyone to sincerely question the integrity of the democracy we live in.

That’s probably enough for today.



Written by Christopher Butler on December 11, 2011, in Essays

