Ethical Technology, 4

So far, my ramblings have gone from monopolies of information to the filter bubble to the economics of the internet.

Today, what about automation in general? Is it ultimately dehumanizing?

Cyborgology had a good post recently about automation called Commentary on Race Against the Machine, in which they noted that Norbert Wiener, mathematician, father of cybernetics, and author of The Human Use of Human Beings, wrote that automation is essentially…

“…the precise economic equivalent of slave labor.”

In other words, Wiener believed that if any process can be automated or maintained/accelerated by robotics, then the only hope to compete “organically” (i.e. using human beings) would be to employ slave labor. I think what that really means is that there is a limit to growth, or that growth is unsustainable without either moving toward full automation or treating people like slaves. In either case, the ranks of the poor grow.

So we need to think a bit more critically about progress. What it means and what we’re willing to do to create it. Oh, and what the world might actually be like if that progress is made.

“Any increase in productivity required a corresponding increase in the number of consumers capable of buying the product.”

That comes from an Economist story about automation and artificial intelligence that begins with an anecdote about Ford Motor Company. Henry Ford famously paid his workers top wages in order to ensure that they could afford the product they had a part in creating. In doing so, he attracted the best talent 1914 had to offer to Ford and away from competitors. But the article goes on to point out that today’s technological progress, far beyond replacing a man with a motor, has reached a level of sophistication that renders not just the doer obsolete, but the entire job. Today, artificial intelligence is beginning to surpass the cognitive abilities of humankind, which means that the technologically induced obsolescence previously limited to manual labor will now expand to swallow creative work as well. Indeed, if Eric Schmidt is right, we have already turned a corner from employing machines to help us with the work our thinking has produced to asking machines what we should be thinking about. He has said:

“I actually think most people don’t want Google to answer their questions. They want Google to tell them what they should be doing next.”

Huh. I wonder, who, exactly, does he mean by “people”? Everything in me bristles at this, but you know what? I may just be a relic of the past. (Yes, I’m fishing for you to tell me otherwise.)

But back to Henry Ford. What I’m wondering is what happens when you pair his idea (paying workers enough that they could drive the cars they made) with automation. If economies are inexorably moving toward greater levels of automation, putting vast swaths of working people out of work and into poverty, at what point does the number of poor exceed the number of consumers who would buy the products being produced by automation? There must be a fine line past which imbalances are too disruptive, if not irreparable. In other words, does this situation not ultimately undermine itself? I feel that it certainly does, though that does not necessarily preclude it from happening. Myopia has always been a critical flaw of society.
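To make the shape of that feedback loop a little more concrete, here’s a toy sketch in Python. It isn’t a real economic model, and every parameter in it is invented; the point is only that if employment shrinks while output grows, there is necessarily a crossing point where the remaining consumers can’t absorb what the machines produce.

```python
# A toy model of the feedback loop described above. Every number here
# is made up, chosen only to illustrate the dynamic, not to predict it.

workers = 1000            # people employed at the start
wage = 1.0                # income per worker per period (arbitrary units)
output = 500.0            # units produced per period
automation_rate = 0.05    # share of remaining jobs automated each period
productivity_gain = 0.04  # output growth per period as machines take over

for period in range(1, 51):
    workers = int(workers * (1 - automation_rate))  # jobs lost to machines
    output *= 1 + productivity_gain                 # machines produce more
    demand = workers * wage                         # only the employed buy
    if demand < output:
        print(f"Period {period}: demand ({demand:.0f}) can no longer "
              f"absorb output ({output:.0f})")
        break
```

With these invented numbers the crossing arrives in period 8, but the specific period matters far less than the fact that one always exists when demand shrinks while supply grows.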

Doug Rushkoff has been thinking out loud that perhaps society is simply moving away from work as its central pursuit. After all, if the work gets done, why worry that it’s not us doing it? So he asks, “Are Jobs Obsolete?” I think he’s probably on to something, but unfortunately, I don’t expect working society to go quietly into the night. Even if the trajectory of technology would otherwise support us spending our time differently (isn’t that what technology is for, anyway, he points out), I wonder if good, old-fashioned human nature isn’t too large an obstruction to this kind of societal shift. It prompts many very serious questions. Who are we if we’re not working? If we’re not producing? I’m sure the answers are simple to the enlightened, but one person’s enlightenment is another person’s crazy. Between us and the kind of tomorrow Rushkoff describes is, I imagine, quite a bit of conflict.

Shifting this just a bit:

Something that fascinated me while I was reading The Numerati a few years ago was not that new software was capable of the large-scale data analysis Stephen Baker described in the book (work that people used to do, though only slightly more accurately than a coin toss), but that the major role left for human beings in all of it was sales. The few who had the brainpower to conceive of the algorithms upon which the machine “intelligence” was built did so with self-sufficiency in mind. The goal was not to design a complete system, but to design the beginning of one that would learn and develop itself into something more sophisticated than its creators were capable of building. In the interim, the only other role of note is facilities maintenance: ensuring that the world of the machine mind doesn’t run out of juice and blink out of existence. But surely that, too, is a role the machine could take on itself at some point not too far off. Ultimately, this sort of technology leaves its creators behind.

So, this is the transitional space we’re in. It’s an uncharted, hazy place in which huge leaps are being made in machine intelligence and automation that are largely unseen and unacknowledged by the majority of the working populace. When progress outpaces our ability to perceive it—to really get our minds around it—conflict is unavoidable. The “gains” of this sort of technology can only look like losses to everyone else. And of course, someone stands to profit, because someone will own that technology. But those someones become even fewer in number, an even more obscure elite. This is what prompts revolutions, which is why, before any real utopia is possible, we must first survive an even more severe winnowing of opportunity and equality.



Written by Christopher Butler on December 14, 2011, in Essays

