I follow with great unease the ongoing discussions about the state surveillance system at the NSA. The revelations combine the very big with the very small, huge systems that sweep up the most insignificant human data. I don’t really know what to think. Maybe Congress will come up with a law protecting the private sphere, i.e. individual privacy. But to what effect? I tend to think that the horse is out of the barn. The individual citizen has been turned, has turned himself or herself into a data point. With the technology too big to control, somewhere out there, data is always going to be ready at hand and ready for use.
In instances big and small, the political and ethical predicaments surrounding technology are thoroughly intermeshed and inseparable. The story goes beyond the NSA. One could pile story upon story. The big tech companies, embarrassed and now outraged about the recent NSA revelations, have been sharing personal information with other large companies for years. Journalists at Rupert Murdoch’s operation News of the World/News Corporation in England hacked into the private voicemail of some six hundred people. Rather than offer a solution or alternative, the massive data dumping by Julian Assange at WikiLeaks and by Edward Snowden and Glenn Greenwald is part of the very problem they seek to resist, namely the ready availability of massive amounts of information. Add to that instances of cyberbullying, cyberstalking, banks too big to fail, the rise of a plutocratic class in the U.S., Russia, and China, the almost universal use in the west of cell phones, credit cards, EZ Passes, online banking, internet porn, and identity theft. You get the picture. Meanwhile big neuroscience and cognitive science tell us that the sense of an individual self with freedom of choice is an illusion.
And then the robots. Last week I read a piece in the NYT Science Times about robots, about making robots more like us, and the trend toward more human-robot interaction. I wasn’t trying to be particularly prurient. I was actually trying to be serious, sort of, when I realized, as if automatically, that in 30 years’ time or less, people are going to be having sex with robots. And they won’t even blush. I’m willing to bet on it. It’s virtually inevitable, a matter of when and how, not if or whether.
I’m not sure, but I think I’m now a technological determinist, which is the notion that technological innovation determines social practice and the parameters of moral meaning. Why do kids sitting next to each other on a couch text each other instead of talk? There’s no mystery. The reason people do crazy things with technology is very simple. They do it because they can. This sudden realization about sex with robots called to mind the centrality of science and the momentum of mental inquiry in “Tomorrow,” the last essay in George Steiner’s In Bluebeard’s Castle: Some Notes Towards the Redefinition of Culture (1971). We open each successive door of knowledge represented by Bluebeard’s Castle because it’s there and because we can. This is the logic of intensification by which we become aware of our own being, and we do so knowing that the disasters are “flagrant,” that the last door opens out into cosmic dimensions beyond human comprehension and control (p. 136). (You can read “Tomorrow” here: http://www.anti-rev.org/textes/Steiner71a/4.html)
Where is the human being in all of this? When Lyotard wrote The Postmodern Condition, he found the resistance to these totalizing systems in the inhuman and the sublime. Derrida thought that there was something fundamentally uncontrollable about the archive. I’m not sure. I look for the human in a still small voice, as small as the voice of God revealed to the prophet Elijah in the desert when he fled for his life from the sovereign authority of Ahab and Jezebel; as small as the presence of God reduced in the system of the Babylonian Talmud to a bat kol (the daughter of a voice, an echo). I think we’re all still fumbling for an on/off switch that might not even exist.