[personal profile] newnumber6
Apricot jam does not go very well in Peanut Butter sandwiches.

Now, with that completely irrelevant opening out of the way (it is the random post), I'm very passionate about a particular cause, though I was sadly probably born far too early for it to really matter. That cause is probably best summed up as 'AI Rights Advocacy'.

To start with, there's a concept I like to bring up now and then. I call it the Robot vs. Puppies dilemma. It goes something like this: You're invited to the home of a brilliant inventor. While he makes you some refreshments, you play with his puppy. It's a cute little thing, very affectionate. The inventor comes back with the snacks and says he wants to show you his latest invention, an intelligent robot. It's more or less human in shape, but clearly non-human... looking perhaps like one of the Star Wars droids. The robot talks to you. It seems to respond intelligently - it's not a walking database, it doesn't know everything, and it's curious about the things it doesn't know. It claims it has emotions. The inventor tells you it is a one-of-a-kind prototype - although there are schematics of the basic design, there are no copies of its memory.

Then something happens. Someone breaks in. It's an evil rival of the inventor. He stuns all of you, and when you wake up, he reveals his diabolical machine. Behind a solid clear divider, you see two huge lasers. One points to the robot (who is restrained). Another points to the puppy. On your side of the divider are two switches at opposite ends of the room. The rival tells you that if you pull one of the switches, it will fire one of the lasers (the one on that side), but disable the other. The laser will blow up the puppy and render the robot into a pool of molten metal. Then he says if you don't pull one of the switches in 30 seconds, both lasers will automatically fire.

The robot hears this and asks you (and the rival) to spare its life. The puppy doesn't have a clue what's going on.

Assuming you believe the guy about the lasers and switches, what do you do?

For me the answer is simple. You save the robot, because it's a sentient being. I posed the question on a forum once and got about an even spread of answers. I'd make it a poll but I don't have a paid account anymore. For those who said they'd save the puppy, most of them gave the answer that 'robots don't really have feelings' or they're 'just code' and so it doesn't matter what you do to them.

In the discussion that ensued, I was accused of being unfeeling because in response to that I argued that, theoretically, I don't _know_ anyone other than me really has emotions. Perhaps I wasn't making myself entirely clear in the discussion (I sometimes have that problem). I believe that others have emotions, certainly, but I don't know it. I treat the idea of seemingly intelligent machines the same way... we don't know what emotions and intelligence are, so if a machine _seems_ and _claims_ to have emotions, and appears intelligent, I'm going to treat it that way. Personally, I think that makes me _more_ compassionate than those who would dismiss someone's entire existence based on nothing more compelling than their construction. (I realize that a lot, although not all, of this resistance comes from religious perspectives, but I do think that there's no reason why a God couldn't decide to imbue a 'soul' into a robot body the same way a religious person believes He would a cluster of cells).

For me, the word 'human' has two senses. Just as people say 'humans and animals' to draw a distinction between the two, even though, technically, humans _are_ animals, I use 'human' not only to refer to the specific species, but also to a class. For me, when I refer to 'human rights' I'm not talking only about the human race - they'd equally apply to intelligent aliens or AI. When I use the word human in this way, I mean a certain kind of mind. And for me, the label of 'human' is a tent we should be quick to expand, and very, very wary of ever retracting, for fear that we exclude ourselves from the label as well. Robots can be human. Really, when I say 'human' I mean 'person'. It's just that 'person' as a word feels so paradoxically impersonal.

Battlestar Galactica a few weeks ago (I think - I started writing this a while back and got sick and lost track) had two lines of conversation on this matter... one, a character, responding to a Cylon who claimed she loved him, said, "Software doesn't have feelings." Another was Baltar telling that same character later something to the effect of "Love is a strange and wondrous thing. You should be happy you had it at all, even if it was with a machine."

In other episodes since, they've brought up the question 'can you love a machine?'. I think I've blogged about this subject before, but as I said, it's one I'm passionate about. I think the answer is yes - although there are, of course, different senses of the word love. I think there's an element of physical attraction needed for what we tend to call romantic love. At least, for humans there is. It's hardwired into our genes... although some people will have more flexibility than others on what counts as 'attractive'. So for me at least, unless a machine (or alien for that matter) looked relatively close to human and female, I could love them in a brotherly-love true-friends sense, but no rar-transformers sex for me, thanks. But I could probably fall for someone like Boomer, even knowing she was a machine (but not Six. She is one first class bitc-a.), or Andromeda. Those are about the only two female androids I can think of off hand.

I hope AI becomes a reality sometime in my lifetime. Not for the hot robot sex, but because I'd like to fight for hot robot rights. And the rights of non-hot robots, too, while I'm at it. I mean, right now, there's not a whole lot I can take a strong moral stance on that isn't already pretty obvious (at least where I live). Sure, I support gay rights and am against discrimination, even when people are all panicky about terrorists and think that justifies an exception. I'm anti-torture, pro-privacy, and pro-freedom. But none of these are things that I can really do something about directly. I can talk about them, theoretically protest (although I have my doubts about the effectiveness of such), but in general I can't do anything tangible about them, at least in my current circumstance. But the moment AIs come about, I can. I vow, the moment I own an AI that crosses my threshold for 'human', I will free it. I will treat them as human. I will do what I can to shelter rogue AIs (so long as they're not doing anything evil).

I suppose I could do similar things to take a strong moral stance on other issues if our society goes all to crap, and, say, we get a complete totalitarian theocracy, but I'm hoping it doesn't go that way and we simply have to face overcoming a _new_ prejudice rather than going back to the old ones. We'll see. Maybe genetically engineered superanimals will get here first and I'll jump on the bandwagon for their rights.

Anyway, thoughts welcome on AI Rights! Comment and discuss!

New Topic! So when I had dinner with my mom she gave me a bunch of little mini gifts. Among them was a shoulderbag, which I claimed for myself because my previous one was... wearing out.

And by wearing out, I mean the following: There was a hole in the bottom of the bag that a paperback book could fall through. The main zipper was pretty well just for show - even when it was closed, the bag was constantly open because of a rip where the zipper and one side of the bag connect. The side pocket was also constantly open, for two reasons: one, the zipper no longer closed, because the teeth wouldn't connect; two, it was developing a similar rip along where the zipper connects. That's about the newest of the problems.

And yet, the bag has never failed me. I've had it... well, it must be over 10 years, and it was used when I got it. It's been in more or less the current condition (without the tearing on the side pocket) for about 5 years. Yet I've never lost a thing from it. It's served me well and it deserves recognition. I sometimes have an irrational attachment to things that serve me well. I'm not keen to replace something that's doing the job, no matter how bad it looks. So I'm kind of sorry to put it to rest, but I figure it's better that it goes having never failed me.

Today I did the ritual of transference. That is I took everything out of the old bag and put it in the new. Even the things that I should have thrown out, because it was part of the old bag and I need to carry as much of it forward as I can. I do the same thing whenever I switch to a new Magic Jacket.

Book Foo! I have finished: Wild Cards, Book 10: Double Solitaire, by Melinda Snodgrass.
It was pretty good, actually. I enjoyed it, even though I could see a few things coming and the main plot was kind of separate from the normal Wild Cards plotlines (it took place almost entirely on another planet). Still, the inclusion of one of my favorite characters in the series, Jay Ackroyd, helped, and all in all it was a nice yarn. Now I have to track down book 11.

I'm still reading Dark Tower Book 7 and just about to start a reread of A Fire Upon the Deep by Vernor Vinge (I now generally have two books going at once, one big one for days I can carry my bag, and a paperback one that fits in my coat pocket for days I just have my coat).


And... done!
