Friday, November 11, 2011

Bow Down To Your Robo-Lords

All this discussion about post-humanism and cyborgs and the uncanny valley has got me thinking: would robots and cyborgs gaining control in society be that bad of a thing? Think about it: if the technology could be perfected, what would be wrong with something other than a human ruling the earth, as long as our existence was not threatened or harmed? If the robots were completely logical in the strictest sense, and had no emotional input or other human influences on their judgments, why wouldn’t we want them in positions of power?

I think it’s because, as humans, we are essentially scared of losing control of our world. This is a product of our having been in total control of our world for as long as we can remember. Turning over our power to something non-human feels like a loss, no matter how rational and right the robot-overlords may be. Also, thanks to all the movies and sci-fi stories about it, I feel we have an inner fear of robots gaining self-consciousness and self-awareness and taking on human-esque traits.

The fear of robots and humanoid robots is really interesting to me as well. After our discussion of the uncanny valley and the fear of things as they become more and more human-esque, I am finding it hard to tell whether we are scared of the human form of the robot itself, or of something close to human with greater intelligence than we could ever have. Furthermore, I am having trouble deciding where the human-robot line would fall, and at what point a cyborg would cross from human to robot. Perhaps this is hard to determine because, as we have discussed in existentialism, human existence comes before essence. Since it is so hard to determine what essentially defines a human, it is hard to draw the line between human and robot in the case of cyborgs.

If there were a race of human-designed super-robots with perfect software and infallible reasoning and logic, but no self-awareness, self-consciousness, or emotion, would you be opposed to them ruling us? Why or why not?

3 comments:

  1. Personally, I do not see a problem with robots being a part of our world. But I do not think that robots on any level are adequate to rule or hold a position of power. For instance, if a robot had no emotions and acted strictly on logic, would we have any welfare-type policies (whether you agree with them or not) or any other kinds of social adjustments driven by emotion? I believe that many of the decisions we make have emotions factored into them. I mean, am I right? Do you think that? Or are the social costs the same under purely logical and rational decisions? I mean things like abortion, welfare, war... and so on. What do you think?

  2. If such robots existed, I don't think it would be a problem if they ruled us. The issue is, however, that I'm not sure such robots could ever exist. Even the notion of "infallible reasoning and logic" is a bit tricky. This robot race would have to be created and programmed by humans, so how would we decide amongst ourselves what type of logic and reasoning can be termed infallible?

    As Phong pointed out, oftentimes situations arise that you can't solve with an algorithm. Emotions definitely do limit logical reasoning at times, but at other times I feel like they're necessary to make a decision - and also to defend that decision.

  3. I am not opposed to robots in our world, as they make our lives easier. However, I am opposed to the idea of super-robots ruling us. I believe that many humans would not like super-robots in charge of the world. I do agree that this attitude probably stems from the fear of losing control; I certainly am scared of losing it. If super-robots ruled the world, you would be placing your fate in the hands of a programmed machine of algorithms. Who's to say that there are enough algorithms to accommodate every situation? I agree with Phong and Anna that there are some situations algorithms cannot address.
