Technology and Moral Responsibility

As I’ve been cleaning up my site, I’ll occasionally read back over something I wrote many years ago. Yesterday, I re-read a series of posts I wrote over a decade ago on ethics and technology. Here’s the first one.

What surprised me yesterday was how easily I could have written the exact same post today. The same issues — information monopolies, data integrity, automation, and the connection between information and power — are not only central to everyday experience now, but are being discussed in virtually the same way as they were more than ten years ago. The only difference is the shift from what if to what is. In 2011, the concerns I had seemed predicated on possibility — that certain bad things could happen if we let them. Well, we let them. Just about everything I looked at in just the first of five posts on technology and ethics is more established, more destructive, and more brazen today than it was in 2011.

But here’s the other surprise: In my first post — again, back in 2011 — I talked about how people like Howard Rheingold had written about nearly all the issues we’re struggling with today, and with an expectation of certainty, nearly twenty years before then. It’s been thirty years now since Rheingold wrote The Virtual Community, and I’d say it has earned status not only as a classic but as an oracle. Here’s just a taste — how Rheingold opens his tenth chapter, titled “Disinformocracy,” and a few clips that follow:

“Virtual communities could help citizens revitalize democracy, or they could be luring us into an attractively packaged substitute for democratic discourse… [The] notion of online subscribers as commodities isn’t likely to go away… The telecommunications industry is a business, viewed primarily as an economic player. But telecommunications gives certain people access to means of influencing certain other people’s thoughts and perceptions, and that access–who has it and who doesn’t have it–is intimately connected with political power. The prospect of the technical capabilities of a near-ubiquitous high-bandwidth Net in the hands of a small number of commercial interests has dire political implications. Whoever gains the political edge on this technology will be able to use the technology to consolidate power… High-bandwidth interactive networks could be used in conjunction with other technologies as a means of surveillance, control, and disinformation… We live in a hyper-reality that was carefully constructed to mimic the real world and extract money from the pockets of consumers.”

We have a responsibility to think about the moral inputs and outcomes of technology before we create things. And to continue doing so as we use them. And then to continue doing it some more. It’s not a one-time thing, and it should never be a reaction. That’s what Rheingold implies throughout his entire book. (Rheingold is a fascinating person; still active today; still thinking about information and people and how the two construct reality together. Look him up.)

I must admit, though, that stumbling upon this old article was disappointing to me. Not just because here we are, over a decade later, pulled further into the quicksand of our digital situation, but because I haven’t done anything about it. Sure, I could say, “But what can I do about it anyway? I’m just one person, and the ‘situation’ is controlled by multi-billion-dollar corporations now.” That would be reasonable, I suppose. But would it be acceptable? I’m not so sure.

What can I do about information monopolies, about the easy exchange of privacy, access, and truth for convenience, about the willful blind eye we turn to the hard environmental costs of our digital technology? Surely something rather than nothing. I need to think on this and not let myself off the hook until I’ve come up with something to do.



Written by Christopher Butler on April 16, 2023