One thing I thought was interesting about this section is how the humans (though I suppose we aren't really sure Rick is human) dehumanize themselves by dialing for a certain emotion on a certain day. A large part of humanity is our emotion, and without the ability to reason and to vary our opinions and actions according to those emotions, how different are we from the androids that Rick tries so hard to kill? In other words, by dialing an artificial emotion, Rick is, ironically, moving away from his humanity, even though it is humanity he wishes to protect.
For my discussion of part one of Do Androids Dream of Electric Sheep by Philip K. Dick, I would like to combine the quote "Empathy, evidently, existed only within the human community, whereas intelligence to some degree could be found throughout every phylum and order" (30) with the fact that the humans in this novel turn a dial to set, in a way, their emotional status for the day, and own empathy boxes that create in them their sense of empathy.
The fact that empathy seems to be an emotion the humans can almost artificially create makes this quote about the importance of empathy to humanity rather ironic. By dialing for emotions and connecting to a program to feel empathy, the humans of post-World War Terminus are, I feel, almost artificially creating their humanity, even though it is their most valued asset. The humans prefer biological life over androids, as seen in Rick's job killing androids and in the embarrassment of owning an electric animal instead of a real one. This would suggest that the humans would strive to keep their own bodies free of the influence of machines, but the novel shows the opposite. For instance, Rick and his wife dial emotions to end their argument and to prepare themselves for work. This takes the effort out of being human and essentially gives them a formula for dealing with other people, which to us is what being human is all about: interacting with others and learning through emotion.
In this quote, Rick notes that intelligence "to some degree could be found throughout every phylum and order." I find it interesting that he does not include androids in this assessment of intelligence. Apparently, Rick is able to completely separate biological life from robots in his mind, yet he still relies on technology, and on his electric sheep, to remain acceptable in his social sphere. It is interesting to consider that Rick vehemently separates himself from the androids while at the same time relying on technology to keep himself what he views as human.
This is a really good point, Emily, and I think it’s definitely the question at the heart of the novel: how do we define life? It is ironic that while humans use their emotions and empathy as the reason for their superiority to androids, they have come to the point where they rely on machines to access these aspects of their “humanity.” We can see this, to a lesser extent, in our society today as well. While we claim to hate the “isolating” influence of technology in our lives, we use it to carry out more and more of our human interactions. I think we could look at computers today as a sort of empathy box—they certainly allow us to view, understand and hear the stories of people from all over the world. But I feel like this is also “diluting” our humanity—to empathize, we are told to put ourselves in someone’s shoes, which invokes an image of a close physical and emotional presence to the person we are empathizing with. Is this sort of connection lost over the internet? And if we continue to rely on computers for our connections with other humans, how long will it be before we rely on them for emotions as well? How long before we have computer chips in our brains, and at that point, how do we separate the idea of “human” from that of “machine”? (Not to sound paranoid, but these are the questions that this brings up for me.)
The point made in the novel about empathy caught my attention as it did yours, Emily. The idea of an "artificially created humanity" that you mention is very interesting. In a way, it seems as though humans have become machines as well. For example, as you mentioned in your entry, dialing moods through the mood organ makes human emotion artificial. Such thoughts about technology's growing influence on humans are serious enough to consider in real-world situations. Really, do we pay enough attention to the growing influence of technology on our lives? Jill Galvan's critical essay about the post-human world--in which she stresses the need for humans to extend their empathy to modes of technology, not just to other humans--makes a very important point. We should accept technology as a vital part of our lives and create a society in which humans and technology are peacefully integrated, for only then will we "shape each other's existence," as Galvan says in her essay.
I also thought it was interesting that Meg sees computers as "a sort of empathy box." I think, Meg, that you are absolutely correct! Computers and humans have created a relationship in which each is very influential on the other. It is also interesting to see it in a slightly different way: computers, and especially the Internet, enable us to mask our emotions and display them differently. In a way, we are dialing our emotions every time we add an emoticon, an exclamation point, or a question mark when we represent ourselves on the Internet. Humans put emotions into technology, and technology, in return, manipulates humans. All in all, this interchange between technology and humans is, I think, an important aspect of the novel, as the androids could stand for any mode of technology in the real world.