Tuesday, February 12, 2008

This I Believe...

(Yeah yeah I'm ripping the title off from NPR...I know)

This is the first entry in what I hope will be a series of entries where I give my opinion on...well...anything really. All that matters is that I'm being honest with you, my (rather anonymous) reader and putting it all out there. I'm setting aside my fears of judgment and rejection to better get to know myself. So disagree, argue, laugh, think I'm completely full of it; when it comes down to it, it isn't really about you. This might be slightly painful but hopefully a little enjoyable for all. :-D

I'm going to start with something a little easy, perhaps a little silly, but 100% honest. Here we go!

I hate robots.

Okay, okay...hate is a strong word. I strongly dislike robots.

Let me share with you a little something I wrote during my Classic Problems in Philosophy class junior year of college:

"I am not afraid to admit that I am a carbon chauvinistic speciesist. In order to be counted amongst those who actually have minds, you need to have grey matter like the rest of us.

I couldn't love someone who didn't have a mind, because any emotion they displayed, or anything they said to me would simply be the result of software. I'm not saying I would discriminate against robots, but I don't believe the artificial intelligence entity to be on the same level as those of us with brains.

Just because my Furby said "Kah may-may u-nye" (Furbish for "I love you") doesn't mean I believed it was capable of feeling love. It was simply responding to the program that says "If X happens, say Y". Consequently, I didn't feel any remorse when I accidentally gave it a heart attack with my TV remote control."

Clearly I am not a functionalist. I don't think you can recreate the human brain in silicon form. Even if you could make a computer program functionally identical to the human brain, it's still just a computer program. Maybe that makes me a bit of a soul theorist, but I think I'm okay with that.

At the end of my Theories and Methods class last week I pondered out loud whether or not Durkheim would recognize a church comprised of robots. One of the other students reminded me that computer technology is nowhere near that advanced and I had no need to fear a Matrix- or I, Robot-style revolt any time soon.

Regardless, I want to be prepared.

Summing up: Part of my personal identity is dependent on my being a carbon-based human being. I do not believe I can be functionally re-created in a computer; there is something more to me than just electrical impulses. If there comes a time in my life when we run into an A.I. issue (a la the Matrix), you'd better believe I'm investing in an EMP (electromagnetic pulse) device.

(If you have any issues that you're interested to know my opinion about, feel free to leave them in a comment!)


Laura said...

I have a few questions that I ask myself and I am always looking for another opinion.
Are humans just as predictable as computers? That is, if we were to know all of the variables, could we predict each and every human decision?
If God does exist then do we really have free will?

lindsey said...

Thanks for sharing the slightly disturbing questions. I thought I'd go ahead and respond with my opinion and anyone else is welcome to chime in too...

I think if we were able to know all the parameters and variables, then yes, we could predict human behavior. We're sort of able to do it now with people we know well.

If God exists in the Christian ideal then we need free will in order to be judged at the end of our days, unless you're a Calvinist (in that case it's all been decided already). If we don't have free will, we can't be held responsible for our actions, good or bad.

My personal opinion, well...that'll be the subject of another blog post. :-)

Fraz said...

you are totally Will Smith in I, Robot, you realize this? I have a Roomba named Sonny that might have to visit you sometime.

nate Arnold said...

i broke your Furby