(Originally Read and wrote this post September 11th 2008)
This book, which inspired the hit movie Blade Runner, is considered a classic of science fiction. The novel alternates between Rick Deckard, a bounty hunter of androids living on a post-apocalyptic Earth, and John Isidore, a “chickenhead” slowly losing his mental faculties to the radioactive dust in the air. Androids serve as slaves for humans in the colony on Mars, established after the nuclear war as a refuge from radiation poisoning. The plot begins when a group of a new model of android, the Nexus-6, almost indistinguishable from humans, escapes to Earth, and Rick Deckard must hunt down these escaped slaves.
The novel opens with the following line:
A merry little surge of electricity piped by automatic alarm from the mood organ beside his bed awakened Rick Deckard. Surprised–it always surprised him to find himself awake without prior notice–he rose from the bed, stood up in his multicolored pajamas, and stretched. Now, in her bed, his wife opened her gray, unmerry eyes, blinked, then groaned and shut her eyes again.
This passage provides a lot of insight into how the prose of science fiction functions in comparison to your average realist novel. Carl Freedman, in his Critical Theory and Science Fiction, remarks that “in some of its particulars, the passage could be the straightforward opening of a mundane novel . . . a married man, lying in bed beside his wife, awakes and is, presumably, about to start the day. The stylistic register of the paragraph, however, marks it as unmistakably science fiction. The key factor here is the reference to the mood organ.” In this world, technology has advanced far enough to produce a device called the mood organ, which can alter human emotions, desires, and moods much like drugs do. Dick offers this concept with a healthy dose of playfulness:
“Dial 888,” Rick said as the set warmed. “The desire to watch TV, no matter what’s on it.”
“I don’t feel like dialing anything at all now,” Iran said.
“Then dial 3,” he said.
“I can’t dial a setting that stimulates my cerebral cortex into wanting to dial! If I don’t want to dial, I don’t want to dial that most of all, because then I will want to dial, and wanting to dial is right now the most alien drive I can imagine.”
Rick’s wife, Iran, recognizes the manipulative nature of the technology and how hopeless the world has become. She wants to feel depressed rather than let a machine alter her mood (much like our modern habit of taking a pill for every problem), because depression is the genuine feeling one should have in a hopeless situation where the world and human existence are slowly being eradicated by nuclear fallout. The propaganda on TV and radio encourages everyone to emigrate to Mars, depicted as a paradise compared to Earth, the utopian salvation for a dystopian world. Later, the androids who have escaped from Mars reveal that this couldn’t be further from the truth. The novel makes it clear that both worlds are utter wastelands, slowly dying, horribly miserable places to live, with only fatuous programs like Buster Friendly and His Friendly Friends and old science fiction novels to relieve the boredom. These are worlds where one needs a machine capable of altering moods just to feel happy. If one needs a machine to program one’s emotions, is it an authentic emotion? Just what is an authentic emotion anyway? This questioning of authenticity seems to be a primary concern of Dick’s, one that extends much further, to the possibly artificial division between humans and androids themselves.
On the job, Rick Deckard soon learns to empathize with the androids and is no longer sure he can kill them. The key quality said to separate androids from humans is that they lack empathy. Dick dramatizes this concretely in one of the final scenes of the novel, where the androids, in the most cold and meticulous manner, rip the legs off a spider despite its being an endangered species. At this point the reader loses all sympathy for them, yet I can’t help but feel this isn’t exactly what Dick wants the reader to take away. Despite his sudden burst of empathy, Deckard kills them anyway. This emotional crisis at first comes off as rather pointless and silly. Why spend multiple chapters documenting Deckard’s newfound empathy for the androids only for it to play no role in the actual resolution of the plot?
The answer to this question, I think, brings us to one of the main themes of the novel: it critiques the very idea of empathy, calling into question the one quality the story repeatedly claims makes us human and separates us from androids and animals. It’s when Deckard sees that Luba Luft, an android he is assigned to kill, can appreciate art that he has his existential crisis. Another example comes in the form of two androids who happen to be married to each other; interestingly, the novel never explains the ritual of android marriage, yet their feelings for each other seem genuine. Another android talks warmly of the literature she read on Mars, quoting Shakespeare and other famous poets with a true appreciation of the aesthetics and emotional quality of those works that only empathy would allow. And of course there is the big, glaring question hidden in the novel’s central premise: if the machines lack higher-order thinking, if they cannot be classified as human or feel emotion or suffering, then why do they keep trying to escape from slavery on Mars in the first place? As I pointed out, the novel contrasts all this with the androids’ cold and calculated dissection of the spider. A superficial reading leads one to dismiss all the other evidence that the androids possess empathy and to agree with the assumptions of the human characters who occupy this world: that the androids are simply cold-blooded killers. I think this difficulty in sympathizing is precisely the point. We lose sympathy for the androids here, but we forget that humans have performed scientific experiments on animals and other humans (think Nazi Germany) for centuries, treating them in just as cold and callous a manner. Notice that when this happens, except for the few animal rights activists we dismiss as fanatics, we generally never blink an eye. It’s not the actual torture that is the issue, but WHO is committing it.
We deem a priori that the androids cannot be human, even though, contrary to our expectations, there is little evidence in the novel that they in fact lack empathy, other than that we are constantly told so from the human point of view. The events that supposedly show a lack of empathy are all things human beings are equally capable of doing. So is Dick suggesting the androids are living creatures and Deckard is simply committing murder (classified as not-murder because they are a slave class)? Is he implying that, for all their supposed empathy, human beings quite often show rather little of it for their fellow living creatures? I think this gets close to the core of Dick’s point.
The title, Do Androids Dream of Electric Sheep?, is meant to be ironic. Can androids dream? And if so, do they sympathize with their own kind? We are told again and again that androids would betray their own kind in a heartbeat to save their own necks. But of course human beings often do that too; I believe we call such actions scapegoating. Dick is conflating the actions of the androids and the humans, but it’s very easy to miss this point.
Having read a few other works by Dick, I get the impression he doesn’t think much of religion. The religion of this novel, Mercerism, is Dick’s way of mocking it. It is a religion that one-ups Christianity: now, with the help of the latest virtual reality technology and top-notch Hollywood special effects, you too can follow Christ’s death by experiencing it firsthand, in the form of Wilbur Mercer being stoned by a crowd. At the core of this religion is the idea of a universal human empathy (much like Christianity, in a way), except now you connect with all other human beings through a machine to share in their emotions, to be uplifted by others when you feel down and to raise others when you feel elated. All of which mocks the very core of religion: the idea of empathy. Dick makes this point in the bluntest way throughout the narrative. Although Mercerism is supposed to be about extreme empathy for all living things, and although we are told multiple times that human beings are the only creatures capable of empathy (that’s what makes them human), very few characters, human or android, show any empathy for others; at least not a substantial empathy that has any real effect on their behavior.
Rick Deckard can feel empathy for the androids and doubt his ability to kill them, but then he does it anyway. So much for empathy! Empathy becomes a shallow emotion: mere words, and perhaps a shallow feeling, rather than anything substantial that leads to different actions. You might as well be hooked up to a machine programming your mood. Oh wait, that does happen in the novel, and I believe that’s precisely the point of having it there: it’s a metaphor for phony emotions, the quick modern drug to feel something, anything, a false meaning. What Dick is ultimately getting at in this novel, I think, is that human beings aren’t very far from being robots themselves, that we are just as programmed by television, drugs, and technology. Very little separates the humans from the androids, only human illusions and ideology. He is also considering where “human being” status begins and ends. The guiding question of the novel might be stated thus: how can we be so certain that artificial intelligences, animals, or other supposedly inanimate objects aren’t human, in that they feel, think, and suffer just as we do?