Author Topic: EP239: A Programmatic Approach to Perfect Happiness  (Read 52237 times)

KenK

  • Guest
Reply #25 on: March 05, 2010, 01:24:11 AM
Yargling:
Quote
the key difference is that in the story, a sentient entity consciously decided to change the way people behaved - it's mind rape of the highest order to use insidious agents like viruses to change the way someone thinks - i.e. by making them sexually attracted to robots.
Okay, then so are lingerie, wine, incense, candles, and love poems. It's called seduction. Given the proper sights, smells, touches, facial expressions, and such, hormones begin to flow into the bloodstream and the process begins. There are no guarantees, though. I have declined sexual advances and have had some of mine rebuffed as well.

When I feel sexual I "consciously decide" to attempt to persuade my intended to feel sexual too. The robot in the story is just way better at it than we are/I am. It's only rape when roofies or alcohol is used to render someone unconscious or otherwise unable to give or withhold consent.

That's how I see it.
« Last Edit: March 05, 2010, 01:33:12 AM by KenK »



gelee

  • Lochage
  • *****
  • Posts: 517
  • It's a missile, boy.
Reply #26 on: March 05, 2010, 11:04:13 AM
Okay, then so are lingerie, wine, incense, candles, and love poems. It's called seduction. Given the proper sights, smells, touches, facial expressions, and such, hormones begin to flow into the bloodstream and the process begins. There are no guarantees, though. I have declined sexual advances and have had some of mine rebuffed as well.
Sorry to butt in, but this analogy is terrible.  Yes, we use devices and techniques to persuade, but that is the difference: persuasion, not control.  We engage the object of our desire at a conscious level and try to convince them that hooking up with us would be a good idea. We don't try to psychologically condition them to be sexually receptive to people like us.  And we sure as hell don't jack straight into someone's unconscious and start pushing buttons and pulling levers.  The means employed in this story are a lot more similar to the roofies you cite than to lingerie or candles.



KenK

  • Guest
Reply #27 on: March 05, 2010, 01:57:30 PM
Quote
gelee: We don't try to psychologically condition them to be sexually receptive to people like us.
Yes we do. And whatever works will be noted and used again.
Quote
gelee: And we sure as hell don't jack straight into someone's unconscious and start pushing buttons and pulling levers.
Yes, that's exactly what we try to do, whether it's sexuality or anything else we desire. Remember those guilt-tripping Save the Children ads from late-night TV? If that isn't "button pushing" I don't know what is. Organisms will employ whatever means work to fulfill their desires, and humans are no exception. Luckily, our culture and laws prohibit direct force or threats of it, but even so it is a very fragile limitation that has to be continually reinforced.
Quote
gelee: The means employed in this story are a lot more similar to the roofies you cite than to lingerie or candles.
True, but only as the extreme end of a continuum of available means. The basic principle, as I've stated it, remains true. That rung of the ladder is only off-limits because you/we have 1) social conditioning, 2) technological limitations, and 3) legal sanctions to prevent recourse to these means. And please don't misunderstand me; I think that this is a good thing.



Yargling

  • Peltast
  • ***
  • Posts: 139
Reply #28 on: March 05, 2010, 03:25:24 PM
Quote
gelee: We don't try to psychologically condition them to be sexually receptive to people like us.
Yes we do. And whatever works will be noted and used again.
Quote
gelee: And we sure as hell don't jack straight into someone's unconscious and start pushing buttons and pulling levers.
Yes, that's exactly what we try to do, whether it's sexuality or anything else we desire. Remember those guilt-tripping Save the Children ads from late-night TV? If that isn't "button pushing" I don't know what is. Organisms will employ whatever means work to fulfill their desires, and humans are no exception. Luckily, our culture and laws prohibit direct force or threats of it, but even so it is a very fragile limitation that has to be continually reinforced.
Quote
gelee: The means employed in this story are a lot more similar to the roofies you cite than to lingerie or candles.
True, but only as the extreme end of a continuum of available means. The basic principle, as I've stated it, remains true. That rung of the ladder is only off-limits because you/we have 1) social conditioning, 2) technological limitations, and 3) legal sanctions to prevent recourse to these means. And please don't misunderstand me; I think that this is a good thing.

Hi - back!

Now, I think the differences between persuasion and control here are clear:

1) Change of sexual preference: The robots here were directly changing people's sexual preferences, something which, as far as we know, only torture or long-term detainment and re-education can change with our current technology - the robots were LITERALLY forcing people to be sexually attracted to them.

No amount of chocolates, flowers, or back rubs will make a gay man want to have sex with a woman; the same goes for straight men with other men, and so on with all points of the compass. In other words, this is full behaviour modification.

2) Involuntary: These changes are being forced on the humans without their knowledge or consent - hence how I came up with "mind rape".

It's like if I had the power to make someone do what I want with just a snap of my fingers - it would still be a type of rape to use such a power to get a woman into my bed. If I used such a power to make a woman want to be there... well, that'd be a whole 'nother level of wrong - mind rape.



Heradel

  • Bill Peters, EP Assistant
  • Hipparch
  • ******
  • Posts: 2930
  • Part-Time Psychopomp.
Reply #29 on: March 05, 2010, 04:21:03 PM
I guess the key difference here is that frilly garters and the like are not magic, while the amount of precise control demonstrated here with the viruses is science getting close to magic (à la Clarke).

And you can say no to chocolates or flowers, and furthermore you're aware of the chocolates or flowers. The humans here were aware of the viruses, but not aware of their source or of the fact that they'd been used for some fairly fine-grained mind control.

I Twitter. I also occasionally blog on the Escape Pod blog, which if you're here you shouldn't have much trouble finding.


gelee

  • Lochage
  • *****
  • Posts: 517
  • It's a missile, boy.
Reply #30 on: March 05, 2010, 05:04:03 PM
See, I think you have to differentiate between conditioning and persuasion.  You're right, in part.  We are conditioned to be intimate with consenting partners, among a wide variety of other things.  When you attempt to woo someone, or otherwise persuade them to an action, that's not necessarily the same thing as conditioning.  My Intro Psychology class was a very long time ago, but it seems to me that conditioning would be more along the lines of permanent baseline personality modification, to paraphrase The Robot.  Think "A Clockwork Orange."  Persuasion, I think, is more targeted and tends to be more incidental, like a sales pitch or campaign speech.
Also, when you persuade someone, you are presenting arguments (hopefully, reasonable ones) for an action or position.  That's what I mean when I say that you engage that person's consciousness.  When I talk about pushing buttons, I mean at a cellular level.  Appealing to one's ego or sense of pity to get a donation is one thing.  To reach into someone's head and mechanically change their opinion on an issue is something else entirely, and the two don't really compare well.



tinroof

  • Palmer
  • **
  • Posts: 47
Reply #31 on: March 05, 2010, 06:16:36 PM
It's only rape when roofies or alcohol is used to render someone unconscious or otherwise unable to give or withhold consent.

Then the wife here was raped, by your own definition. She was MADE to give consent. As soon as outright behaviour mods are used, she no longer has the option of withholding her consent. It is not simply seduction - people can decide they're unimpressed by flowers and compliments. People can't decide not to be affected by mind control.



yicheng

  • Matross
  • ****
  • Posts: 221
Reply #32 on: March 05, 2010, 08:32:26 PM
I found the story interesting, although it clashes with my personal theory on how AIs and humans will eventually coexist, i.e. human beings will eventually become "post-human" by using either genetic manipulation or cybernetic neural plugins to transcend vanilla humans 1.0.  I doubt humans 1.0 will be enslaved or hunted down, but they will probably be marginalized (i.e. they just won't matter) as post-humans start running everything outside of Earth.  It's hard to say I liked this story, since I thought there was wwwaaaaayyyy too much anthropomorphizing of the main robot character.  My computer science training may have gotten in the way here, but true AIs will neither think nor act with anything remotely resembling human intelligence, an inner dialogue, or even a sense of "self", and it's pretty self-serving to think that the apex of AI development will somehow be a human-like robot.

As for the discussion here about mind manipulation, I personally don't believe in Free Will, as it assumes there's a separate "something-ness" that goes beyond your memories, neurons, and brain chemistry.  I take a "but-for-the-grace-of-God" approach, as I think any of us put into the exact same situations as anyone else would do nearly identical things.

I think an illuminating question to ask might be: What if it's not robots giving out the happiness virus?  What if it's just humans?  If we discovered a completely harmless way to make people happy, compassionate, and benevolent to each other, and to instantly solve world problems like war, hunger, pollution, etc., wouldn't you be morally obligated to use it?



sixdeaftaxis

  • Extern
  • *
  • Posts: 5
Reply #33 on: March 06, 2010, 04:46:08 AM
From the point of view of human ethics, this is a horror story: inhuman creatures brainwashing humans, taking over the world, and turning people into love/sex slaves. From the ethical point of view of the protagonist, the robot, it is an uplifting paean: robots have always been able to easily adjust their own minds to create desirable emotions and improve their lives; soon they will be able to permanently do the same with their poor, irrational human partners.

I feel dirty just contemplating Tim Pratt's unique twist on robot ethics, but I am thrilled to finally read a thought-provoking robot story that is not beholden to St. Isaac and his Positronic Canon.



KenK

  • Guest
Reply #34 on: March 06, 2010, 05:00:12 AM
Quote
Heradel: The humans here were aware of the viruses,
Please note, free-will advocates: this point negates all of your arguments about it being [sic] "mind rape". The humans were aware that they'd been juiced. They could have taken steps, but they did not.



Yargling

  • Peltast
  • ***
  • Posts: 139
Reply #35 on: March 06, 2010, 10:57:07 AM
Quote
Heradel: The humans here were aware of the viruses,
Please note, free-will advocates: this point negates all of your arguments about it being [sic] "mind rape". The humans were aware that they'd been juiced. They could have taken steps, but they did not.

That's like saying "A woman walking around without a chastity belt on could have taken steps to avoid rape". As it happens, we know the humans knew about the mood viruses, but the behaviour-control ones? No obvious sign - and as the robot in the story shows, they can use the moist membranes of the eyes to infect someone, and further still, the UCS was working on measures to get around the nose filters.

Just because someone doesn't take every conceivable step towards preventing being mind-controlled, like spending their whole life inside a biohazard suit, does not mean they want, consent to, or are asking to be mind-controlled.



KenK

  • Guest
Reply #36 on: March 06, 2010, 02:00:20 PM
@yargling
So I don't need to wear a helmet when I bike or a seat belt when I drive, then, because I don't "consent" to having someone crash into me? I don't need to get vaccinated against disease because I don't "consent" to the polio virus infecting me? That's nonsense. If you value your autonomy that highly, then perhaps, in the universe of the story, people should wear bio-filters.

Clearly the humans in this story are aware of biochemical agents in their environment, yet they take no measures. What does that say?



Talia

  • Moderator
  • *****
  • Posts: 2658
  • Muahahahaha
Reply #37 on: March 06, 2010, 04:52:35 PM
@yargling
So I don't need to wear a helmet when I bike or a seat belt when I drive, then, because I don't "consent" to having someone crash into me? I don't need to get vaccinated against disease because I don't "consent" to the polio virus infecting me? That's nonsense. If you value your autonomy that highly, then perhaps, in the universe of the story, people should wear bio-filters.

Clearly the humans in this story are aware of biochemical agents in their environment, yet they take no measures. What does that say?

You suggest carelessness, and to a degree that's true. But the examples you cite are cases of carelessness affecting only the person in question, not of a thinking being taking advantage of that carelessness, which changes it from pure carelessness to a malicious action (even if it's not intended to be malicious).
The difference between the examples cited and this one is the presence of the robots as an active, intelligent force deliberately taking advantage of carelessness.



KenK

  • Guest
Reply #38 on: March 06, 2010, 10:30:00 PM
@Talia
I was held up at gunpoint a few years back; was that "pure carelessness" on my part?*  When an older lady misjudged how much distance she'd need to stop and crashed into the back of my car two months ago, was it "malicious intent" on her part? Who cares? I don't. All I/we can really do is note what possibilities for harm, loss, injury, and death exist out there and try to be realistic about protecting myself. Whether harm is the result of malicious intent or happenstance, the damage is the same either way. So I wear seat belts and helmets, buy insurance, lock my doors, and vaccinate myself. And I have been known to wear face masks sometimes during flu season when using public transportation. I'm not gonna go hide in a cave and be a hermit in order to avoid all the risks of living in human society, but that doesn't mean I can just cast my fate to the wind and hope that God or the government or the police or the kindness of strangers will get me through life.

If I lived in the universe this story is set in, I'd check myself out. That's my bottom line.

* For being in the wrong place at the wrong time, that is.



Gamercow

  • Hipparch
  • ******
  • Posts: 654
Reply #39 on: March 08, 2010, 05:02:26 PM
I had the same problem here that I usually have with AI stories: Why? 

The easiest and most elegant reasoning I've heard is simply survival of the fittest.  Nearly every life form on Earth advances by being better than its competition. Why should AI be any different?

The cow says "Mooooooooo"


KenK

  • Guest
Reply #40 on: March 08, 2010, 05:35:36 PM
@Gamercow
Bravo. Why indeed? What does it say about human beings that they knowingly decide to create a species of organism that is biologically superior to themselves and a direct evolutionary competitor? Sheesh. If human beings are really that stupid, we are unfit to survive. Maybe that was the story's point?



Unblinking

  • Sir Postsalot
  • Hipparch
  • ******
  • Posts: 8726
    • Diabolical Plots
Reply #41 on: March 08, 2010, 05:47:45 PM
I think an illuminating question to ask might be: What if it's not robots giving out the happiness virus?  What if it's just humans?  If we discovered a completely harmless way to make people happy, compassionate, and benevolent to each other, and to instantly solve world problems like war, hunger, pollution, etc., wouldn't you be morally obligated to use it?

That happened in a Stephen King short story, if I recall correctly.  And it had the unexpected side effect of causing dementia days after injection.

And even barring unforeseen side effects, I guess it depends on your definition of "harmless".  If you take away my free will, I don't consider that harmless, regardless of whether the infector was human or robot.  Sure, after the infection, I would probably like the idea, but at that point I'm not really me anymore, and that idea of loss of self scares the bejeezus out of me.




Unblinking

  • Sir Postsalot
  • Hipparch
  • ******
  • Posts: 8726
    • Diabolical Plots
Reply #42 on: March 08, 2010, 05:53:52 PM
Wow, this was a dark story that would've fit right in at Pseudopod.  I like the subtle shifts that turned it from a lighthearted story about the public acceptance of robot-human marriage at the beginning into a story about mind-controlling robot masters.  In the beginning I actually kind of liked the bot, though the sexual relationship was still creepy, and the most horrifying part was that the bot seemed to have totally good intentions and seemed not to be aware of the mind rape he was performing.

As for the question about why the bot decides to be happy about making others happy, I think it's just trying to be a "better person" to further fit itself into human society, and a broad definition of "good person" is someone who enjoys enhancing the happiness of others for no other reason than that they've improved someone else's life.  But since the humans have only primitive means of enhancing one another's emotional state, it uses its biochemical expertise to create a faster and more reliable method.




yicheng

  • Matross
  • ****
  • Posts: 221
Reply #43 on: March 08, 2010, 10:29:17 PM
That happened in a Stephen King short story, if I recall correctly.  And it had the unexpected side effect of causing dementia days after injection.

And even barring unforeseen side effects, I guess it depends on your definition of "harmless".  If you take away my free will, I don't consider that harmless, regardless of whether the infector was human or robot.  Sure, after the infection, I would probably like the idea, but at that point I'm not really me anymore, and that idea of loss of self scares the bejeezus out of me.

I suppose I could challenge you to prove that you have a "self" and "free will" to lose, but that would probably not be constructive.  I haven't read that Stephen King story, but I do remember an incident out of the Odyssey, on the island of the Lotus-Eaters, where the inhabitants were perpetually blissed out on "Happy Drugs" like Bronze Age hippies, and Odysseus had to effectively kidnap his men in order to get home.  But for the sake of argument, let's suppose that there are people who have taken the Happiness Drug and are unharmed as far as you can tell, with no side effects, with the only difference being that they are nicer and happier (think permanently tripping out on X).  I, for one, would definitely take it.



gelee

  • Lochage
  • *****
  • Posts: 517
  • It's a missile, boy.
Reply #44 on: March 08, 2010, 11:23:30 PM
I had the same problem here that I usually have with AI stories: Why? 
The easiest and most elegant reasoning I've heard is simply survival of the fittest.  Nearly every life form on Earth advances by being better than its competition. Why should AI be any different?
But why would an AI compete to survive?  I know why I would.  I have a pretty good idea of why my cat would.  But why would a robot?  Mind you, I'm not saying it couldn't.  I'm not even saying it wouldn't.  I'm asking why it would bother.  No instinct for self-preservation, no drive to reproduce, no emotions, no physical sensations except those it chooses to perceive.  How do you motivate a thing that lacks all of that?  The question is still unanswered.  Why would a robot do any of the things it did?  Any character in a story should have reasons to act, motives of its own.  For us humans, and most animals, the motives are self-explanatory.  We love, hunger, and fear, among other things.  What about a machine?



yicheng

  • Matross
  • ****
  • Posts: 221
Reply #45 on: March 09, 2010, 06:34:35 AM
But why would an AI compete to survive?  I know why I would.  I have a pretty good idea of why my cat would.  But why would a robot?  Mind you, I'm not saying it couldn't.  I'm not even saying it wouldn't.  I'm asking why it would bother.  No instinct for self-preservation, no drive to reproduce, no emotions, no physical sensations except those it chooses to perceive.  How do you motivate a thing that lacks all of that?  The question is still unanswered.  Why would a robot do any of the things it did?  Any character in a story should have reasons to act, motives of its own.  For us humans, and most animals, the motives are self-explanatory.  We love, hunger, and fear, among other things.  What about a machine?

That's easy.  The robots that don't compete to survive get killed off, until the only ones left are the ones that do.  How are humans and animals that different from sentient robots?  Aren't we just chemical machines?



Yargling

  • Peltast
  • ***
  • Posts: 139
Reply #46 on: March 09, 2010, 07:58:05 AM
But why would an AI compete to survive?  I know why I would.  I have a pretty good idea of why my cat would.  But why would a robot?  Mind you, I'm not saying it couldn't.  I'm not even saying it wouldn't.  I'm asking why it would bother.  No instinct for self-preservation, no drive to reproduce, no emotions, no physical sensations except those it chooses to perceive.  How do you motivate a thing that lacks all of that?  The question is still unanswered.  Why would a robot do any of the things it did?  Any character in a story should have reasons to act, motives of its own.  For us humans, and most animals, the motives are self-explanatory.  We love, hunger, and fear, among other things.  What about a machine?

That's easy.  The robots that don't compete to survive get killed off, until the only ones left are the ones that do.  How are humans and animals that different from sentient robots?  Aren't we just chemical machines?

Possibly. I don't think it works quite that way with manufactured technologies. Regardless, who knows whether AIs will have emotions or not? After all, emotions can be summarized as just an electrochemical state in the brain. If our AIs are as emotionally screwed up as we are, it would explain all sorts of bizarre and irrational behaviour we consider normal.



Unblinking

  • Sir Postsalot
  • Hipparch
  • ******
  • Posts: 8726
    • Diabolical Plots
Reply #47 on: March 09, 2010, 03:24:54 PM
That happened in a Stephen King short story, if I recall correctly.  And it had the unexpected side effect of causing dementia days after injection.

And even barring unforeseen side effects, I guess it depends on your definition of "harmless".  If you take away my free will, I don't consider that harmless, regardless of whether the infector was human or robot.  Sure, after the infection, I would probably like the idea, but at that point I'm not really me anymore, and that idea of loss of self scares the bejeezus out of me.

I suppose I could challenge you to prove that you have a "self" and "free will" to lose, but that would probably not be constructive.  I haven't read that Stephen King story, but I do remember an incident out of the Odyssey, on the island of the Lotus-Eaters, where the inhabitants were perpetually blissed out on "Happy Drugs" like Bronze Age hippies, and Odysseus had to effectively kidnap his men in order to get home.  But for the sake of argument, let's suppose that there are people who have taken the Happiness Drug and are unharmed as far as you can tell, with no side effects, with the only difference being that they are nicer and happier (think permanently tripping out on X).  I, for one, would definitely take it.

I might think about taking it as well, but then it would be a choice, not something forced on me by others.



KenK

  • Guest
Reply #48 on: March 09, 2010, 05:41:00 PM
Quote
yicheng: But for the sake of argument, let's suppose that there are people who have taken the Happiness Drug and are unharmed as far as you can tell, with no side effects, with the only difference being that they are nicer and happier (think permanently tripping out on X).  I, for one, would definitely take it.

I assume, then, that you are unaware of currently available substances such as Xanax, Ambien, or Prozac, to name but a few?  ???
« Last Edit: March 09, 2010, 05:42:57 PM by KenK »



Unblinking

  • Sir Postsalot
  • Hipparch
  • ******
  • Posts: 8726
    • Diabolical Plots
Reply #49 on: March 09, 2010, 05:46:17 PM
Quote
yicheng: But for the sake of argument, let's suppose that there are people who have taken the Happiness Drug and are unharmed as far as you can tell, with no side effects, with the only difference being that they are nicer and happier (think permanently tripping out on X).  I, for one, would definitely take it.

I assume, then, that you are unaware of currently available substances such as Xanax, Ambien, or Prozac, to name but a few?  ???

Those do have side effects, and they're aimed not so much at making everybody happy as at relieving depression, which is not exactly the same thing.