Author Topic: EP239: A Programmatic Approach to Perfect Happiness  (Read 42735 times)

Swamp

  • Hipparch
  • ******
  • Posts: 2230
    • Journey Into... podcast
on: March 02, 2010, 01:22:39 AM
EP239: A Programmatic Approach to Perfect Happiness

By Tim Pratt.
Read by Stephen Eley.

First appeared in Futurismic, April 2009.

Opening poem: “Scientific Romance”

My step-daughter Wynter, who is regrettably prejudiced against robots and those who love us, comes floating through the door in a metaphorical cloud of glitter instead of her customary figurative cloud of gloom. She enters the kitchen, rises up on the toes of her black spike-heeled boots, wraps her leather-braceleted arms around my neck, and places a kiss on my cheek, leaving behind a smear of black lipstick on my artificial skin and a whiff of white make-up in my artificial nose. “Hi Kirby,” she says, voice all bubbles and light, when normally she would never deign to utter my personal designation. “Is Moms around? Haven’t talked to her in a million.”

I know right away that Wynter has been infected.


Rated R. Contains mature sexual situations and adult themes. (And robot themes.)


Listen to this week’s Escape Pod!

Facehuggers don't have heads!

Come with me and Journey Into... another fun podcast


Heradel

  • Bill Peters, EP Assistant
  • Hipparch
  • ******
  • Posts: 2938
  • Part-Time Psychopomp.
Reply #1 on: March 02, 2010, 01:27:49 AM
Just a note: UCS — Very real, very not actually evil.

I Twitter. I also occasionally blog on the Escape Pod blog, which if you're here you shouldn't have much trouble finding.


Subgenre

  • Guest
Reply #2 on: March 02, 2010, 01:48:53 AM
I would expect science fiction to use human/robot relationships to make a comment about human nature. Less expected, but still relevant, would be an actual sincere thought exercise about the "person"-ness of sapient machines. But all this story achieved was to make me paranoid about behavior modification. Secret conspiracy of robots manipulating human society? How is that anything but a horror story? Or are my humanist values and artistic attachment to free will (despite a general skepticism of supposed metaphysical qualities) abnormal for a science-fiction fan? Is this story trying to make me feel misanthropy towards robots? Or more accurately, misroboticy? Or is it trying to make me feel irrational in my mistrust of the reductionist view of human emotions as a mechanical phenomenon? Or is it just a 50's-style scifi monster story about how robots are "stealing our women"? Is Kirby supposed to be a moral being with misguided but well-intentioned motives, an amoral being whose position is understandable if regrettable due to the modifiable nature of his own emotional state and his resulting value judgments regarding emotions, or an immoral being who is quite aware of the attachment human beings have to free will and of how his actions violate it? Is this a well-written story for leaving me wondering about these things, or a poorly written one? Where does thought-provoking end and vaguely-written begin?

Anyways, if this story has achieved anything for me it's that perhaps decades from now when I'm a crotchety old man telling young transhumanists to get off of my cyberlawn, some memory may spark in my dilapidated baseline-human brain and cause me to form the Anti-Behavior-Modification League.




Boggled Coriander

  • Lochage
  • *****
  • Posts: 545
    • Balancing Frogs
Reply #3 on: March 02, 2010, 02:23:11 AM
If this story had been written by Mr. Nobody McNeverheardahim, I would have assumed it was a tidy little story about how robots are trying to gain acceptance in the future and they're opposed by a couple of bigoted humans, just as countless other beings throughout history have been on the receiving end of bigotry.  And I would have assumed the disturbing subtexts were unintended.

But the story wasn't the work of Mr. Nobody McNeverheardahim; it was written by Tim Pratt, in whom I have more faith than in any elected official.  I had some of the same thoughts that Subgenre had, but I'm fully willing to give the author the benefit of the doubt here.  Yes, it was meant to be creepy.  (Not to claim to have an author's-mind-reading hat or anything, but I doubt the "Union of Concerned Scientists" would have been invoked if the author wasn't trying to make us feel vaguely uneasy.)

Oh, and the poem?  Extraordinary.  I liked it even better than the story.

"The meteor formed a crater, vampires crawling out of the crater." -  The Lyttle Lytton contest


Yargling

  • Peltast
  • ***
  • Posts: 139
Reply #4 on: March 02, 2010, 11:00:00 AM
I have to ask, was I the only one who thought the story changed from a romance to a horror story of sorts mid-way through? The same with 'Just Do It', although that one was closer to the end.

I thought the robots were just trying to influence humans up until the last few lines, and then the mention of 'permanent personality alterations' put a huge twist of sheer horror on the story. Mind control, or rather, this insidious type of mind control, is never something I'm 100% comfortable with in stories.

Of course, society itself alters our behaviour, e.g. to reduce aggression and/or the actions you take because of being angry (no murdering, etc), to respect each other's property, and so forth: all things that wouldn't necessarily exist without a society to agree on the rules and enforce them.

But society itself is not a conscious entity, and its controlling influence over us is restricted to mostly less direct methods. Robots fundamentally controlling our feelings and emotions, though, is quite frankly a terrifying idea.



Listener

  • Hipparch
  • ******
  • Posts: 3187
  • I place things in locations which later elude me.
    • Various and Sundry Items of Interest
Reply #5 on: March 02, 2010, 02:11:28 PM
I have to ask, was I the only one who thought the story changed from a romance to a horror story of sorts mid-way through? The same with 'Just Do It', although that one was closer to the end.

Indeed. I liked the very, very subtle way the story changed.

I did like this one a lot, although I think some of the words were a little jarring -- would an android's internal monologue use words like "fuck" and "ass"? Sure, he'd say them because he knows that's what April wants, but mentally?

Although if you think about it, a machine having an internal monologue is one of those fiction tropes that we all accept because we can't read decision trees for very long before our brains melt. And really that's what the MC's internal monologue was -- if this, then x probability of that; if this(prime), then y probability of that; etc. But I don't think we could've handled that.
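Just to illustrate what I mean (a purely hypothetical toy sketch, nothing from the actual story -- the stimuli and responses are made up), the robot's "monologue" would really be more like a weighted lookup than narration:

```python
import random

# Toy table of conditional response probabilities: "if this, then
# x probability of that". Entirely invented for illustration.
RESPONSE_ODDS = {
    "partner_smiles": {"smile_back": 0.7, "compliment": 0.2, "say_nothing": 0.1},
    "partner_frowns": {"apologize": 0.5, "ask_why": 0.4, "say_nothing": 0.1},
}

def choose_response(stimulus, rng=None):
    """Pick a response for a stimulus by its probability weights."""
    weights = RESPONSE_ODDS[stimulus]
    if rng is None:
        # Without a random source, just take the most probable branch.
        return max(weights, key=weights.get)
    options = list(weights)
    return rng.choices(options, weights=[weights[o] for o in options])[0]

print(choose_response("partner_smiles"))  # -> smile_back
```

A whole story told at that level would be unreadable, which is presumably why it gets rendered as prose.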

I like BDSM, but I think the BDSM in this story may have been an additional layer that wasn't absolutely necessary to the story. I mean, I get it, April's kinky and she's with the MC because he'll respond to her every need... until the end when we find out it's actually the other way around: he may have made her love him.

The story really has a lot to think about, and I liked it a lot.

The moment I heard Steve say the title, I knew the MC would be a robot because he used his robot-voice. Am I the only one who knows that one by heart now?  ;D

"Farts are a hug you can smell." -Wil Wheaton

Blog || Quote Blog ||  Written and Audio Work || Twitter: @listener42


Yargling

  • Peltast
  • ***
  • Posts: 139
Reply #6 on: March 02, 2010, 02:27:00 PM
I have to ask, was I the only one who thought the story changed from a romance to a horror story of sorts mid-way through? The same with 'Just Do It', although that one was closer to the end.

Indeed. I liked the very, very subtle way the story changed.

I did like this one a lot, although I think some of the words were a little jarring -- would an android's internal monologue use words like "fuck" and "ass"? Sure, he'd say them because he knows that's what April wants, but mentally?
Actually, that change was what I didn't like - it put me right off the story, in the same way the ending of 'Just Do It' did. Mind control in a comical setting or a wholly unreal context I'm generally fine with, but seeing this sort of thing in a believable setting gives me an unpleasant feeling deep down. Not saying it was a bad story, just not my sort of story.

As for ass and fuck, why not? Androids wouldn't have a problem with the prudish side of our society that has determined those words to be 'bad words' - fuck is just a shorter way of saying 'sexual intercourse', and ass is just a shorter way of saying 'buttocks'.

Although if you think about it, a machine having an internal monologue is one of those fiction tropes that we all accept because we can't read decision-trees too much before our brains melt. And really that's what the MC's internal monologue was -- if this, then x probability of that; if this(prime), then y probability of that; etc. But I don't think we could've handled that.

I like BDSM, but I think the BDSM in this story may have been an additional layer that wasn't absolutely necessary to the story. I mean, I get it, April's kinky and she's with the MC because he'll respond to her every need... until the end when we find out it's actually the other way around: he may have made her love him.

The story really has a lot to think about, and I liked it a lot.

The moment I heard Steve say the title, I knew the MC would be a robot because he used his robot-voice. Am I the only one who knows that one by heart now?  ;D

I'm good with BDSM, and I like the lighter playful side of it, and respect people's right to engage in the darker, heavier side of it. Never tried it myself, but the lighter playful stuff I'd be willing to experiment with, heh. Of course, this short indirectly highlights the reason why BDSM relationships are a continuous concern to those around the couple - i.e. is it genuinely voluntary, or is it an abusive or manipulative relationship?
« Last Edit: March 02, 2010, 02:46:24 PM by Yargling »



stePH

  • Actually has enough cowbell.
  • Hipparch
  • ******
  • Posts: 3906
  • Cool story, bro!
    • Thetatr0n on SoundCloud
Reply #7 on: March 02, 2010, 03:29:43 PM
Secret conspiracy of robots manipulating human society? How is that anything but a horror story?

I dunno, but this reminds me of Asimov's "The Evitable Conflict" which I seem to recall was the last part of I, Robot.


But the story wasn't the work of Mr. Nobody McNeverheardahim; it was written by Tim Pratt, in whom I have more faith than in any elected official. 

Wow, if that's not setting the bar so low you could step over it...  ;D

"Nerdcore is like playing Halo while getting a blow-job from Hello Kitty."
-- some guy interviewed in Nerdcore Rising


Yargling

  • Peltast
  • ***
  • Posts: 139
Reply #8 on: March 02, 2010, 03:46:06 PM
Just a note: UCS — Very real, very not actually evil.

That's what your robot masters want us to believe!
« Last Edit: March 02, 2010, 04:10:39 PM by Yargling »



Heradel

  • Bill Peters, EP Assistant
  • Hipparch
  • ******
  • Posts: 2938
  • Part-Time Psychopomp.
Reply #9 on: March 02, 2010, 04:01:45 PM
Just a note: UCS — Very real, very not actually evil.

Thats what your robot masters what us to believe!

What the what?

I Twitter. I also occasionally blog on the Escape Pod blog, which if you're here you shouldn't have much trouble finding.


Yargling

  • Peltast
  • ***
  • Posts: 139
Reply #10 on: March 02, 2010, 04:11:30 PM
Just a note: UCS — Very real, very not actually evil.

Thats what your robot masters what us to believe!

What the what?

Doh - I mispelled - fixed the original... though the joke is sadly ruined by now... I blame the robots.



Heradel

  • Bill Peters, EP Assistant
  • Hipparch
  • ******
  • Posts: 2938
  • Part-Time Psychopomp.
Reply #11 on: March 02, 2010, 04:17:27 PM
Just a note: UCS — Very real, very not actually evil.

Thats what your robot masters what us to believe!

What the what?

Doh - I mispelled - fixed the original... though the joke is sadly ruined by now... I blame the robots.

Your robot overlords categorically deny that they infected you with a word salad virus.

I Twitter. I also occasionally blog on the Escape Pod blog, which if you're here you shouldn't have much trouble finding.


Yargling

  • Peltast
  • ***
  • Posts: 139
Reply #12 on: March 02, 2010, 04:41:24 PM
Just a note: UCS — Very real, very not actually evil.

Thats what your robot masters what us to believe!

What the what?

Doh - I mispelled - fixed the original... though the joke is sadly ruined by now... I blame the robots.

Your robot overlords categorically deny that they infected you with a word salad virus.

They do? Oh...well, I suppose they'd know... wait a minute...



KenK

  • Guest
Reply #13 on: March 02, 2010, 10:02:20 PM
BDSM and other non-instinctive behaviors are the result of brain chemistry. The author got that part right IMO. A person may or may not have a predilection for that sort of thing, but it's the chemical mix within the organism itself that makes the experience seem pleasurable. The robot groks this and uses it to his advantage. Humans do this too, but not nearly as precisely as the robot did in the story.  :D A woman might trigger the chemicals that induce sexual desire in a potential partner by dressing in sexy lingerie; a man might do the same by shaving or dabbing on some cologne. The robot knew exactly what chemical substances to use for a specific effect. Humans largely don't know how to do this with any precision. Yet.  :D
« Last Edit: March 02, 2010, 10:04:35 PM by KenK »



timpratt

  • Extern
  • *
  • Posts: 16
Reply #14 on: March 03, 2010, 02:32:54 AM
I try to avoid wandering into comment threads, because a blathering author can stifle conversation, but since someone asked me about it on twitter and it's also been mentioned here, I just wanted to say: Naming the robot scientist group the Union of Concerned Scientists was a conscious joke -- I figured robots wouldn't scruple at using the same name for their own group (which humans would never find out about anyway), and with purposes the real UCS would abhor. I'm willing to believe it's only funny to me, though I'll preserve my dignity by choosing to believe the joke was just overly subtle; if others choose to believe it's simply unfunny or stupid, I won't argue. :)




Heradel

  • Bill Peters, EP Assistant
  • Hipparch
  • ******
  • Posts: 2938
  • Part-Time Psychopomp.
Reply #15 on: March 03, 2010, 03:23:42 AM
I try to avoid wandering into comment threads, because a blathering author can stifle conversation, but since someone asked me about it on twitter and it's also been mentioned here, I just wanted to say: Naming the robot scientist group the Union of Concerned Scientists was a conscious joke -- I figured robots wouldn't scruple at using the same name for their own group (which humans would never find out about anyway), and with purposes the real UCS would abhor. I'm willing to believe it's only funny to me, though I'll preserve my dignity by choosing to believe the joke was just overly subtle; if others choose to believe it's simply unfunny or stupid, I won't argue. :)

As the one who asked on twitter (what's the point of being given the PC twitter account if not to harass, er, ask authors), I'll just say that I'm willing to believe that it only struck me as really weird because I'd called them earlier in the day for a work thing. It's probably something that would have worked better for me in print (after the initial befuddlement I got the joke), because hearing it in my headphones flashed me back to the phone call.

I Twitter. I also occasionally blog on the Escape Pod blog, which if you're here you shouldn't have much trouble finding.


DrCrisp

  • Extern
  • *
  • Posts: 4
Reply #16 on: March 03, 2010, 05:32:36 PM
I enjoyed the subtle change of the robot from bottom to top, to borrow from the Outro.  And how the viruses subtly change human behavior.  The eeriest part was when he doesn't remember if she was interested in him or he was in her (Robots Want Women!!!).  And the Scientist/virus/conspiracy thing was great.  A good contrast is the Asimov book "The Robots of Dawn" (I think), where a Solarian woman takes a robot as a husband and he is killed.

What I want to know is: where can I get some of that virus for my wife???



Bdoomed

  • Pseudopod Tiger
  • Moderator
  • *****
  • Posts: 5884
  • Mmm. Tiger.
Reply #17 on: March 03, 2010, 10:14:54 PM
So this story was incredibly frightening... Especially since I kept finding myself agreeing with the robot... Yet slowly realizing that he isn't just being reasonable.  There's more to it.
Shudder

I'd like to hear my options, so I could weigh them, what do you say?
Five pounds?  Six pounds? Seven pounds?


Gamercow

  • Hipparch
  • ******
  • Posts: 654
Reply #18 on: March 04, 2010, 12:57:09 AM
For me, the poem in the beginning could have been left out, but it was quickly forgotten once the excellent story got under way.  I found it to be an excellent story, in the Asimovian form of robots trying to self-identify, but with so much more.  I couldn't care less about the sex in this, or who was bottom or top or whatever was going on there.  What I cared about was the idea, introduced subtly, that emotions are just brain chemistry: are we as humans controlled by our emotions, or are we in control of them?  A very cerebral chicken/egg thing.  Is falsely induced emotion any different from "genuine" emotion?  Do we want to be with someone because they make us happy, or do they make us happy because we want to be with them?  Excellent stuff.

The cow says "Mooooooooo"


Sgarre1

  • Editor
  • *****
  • Posts: 1214
  • "Let There Be Fright!"
Reply #19 on: March 04, 2010, 02:57:16 AM
I think this might be appreciated here as it's robot and philosophy related, but mods feel free to move it if judged against..

Robot Socrates throws down on piety http://www.youtube.com/watch?v=VSHHXqjXCV4



KenK

  • Guest
Reply #20 on: March 04, 2010, 02:14:36 PM
Whether you feel great because you just won $10k in the lottery, because a really hot woman agreed to go out with you, or because you were injected with hormonal chemicals doesn't matter to your brain.  Deciding what constitutes a "genuine" emotion is an impossible task. What "is" is real, quantifiable, and measurable; the "why" of why you feel a certain way is open to so many possible interpretations that it becomes impossible to sort out with any accuracy. That was the point that I took away from this story. Whether you enjoy BDSM or other such behaviors because you've been conditioned to enjoy it by random, uncontrolled circumstances, or because you've been chemically juiced or had your OS tweaked, doesn't alter the fact that it seems pleasurable to you. That's how I see it anyhow.

Also: The title of this story could just as easily have been used as a title for a scholarly article by a behaviorist psychologist. Maybe that's some ironic twist on the part of the author.  :D



Yargling

  • Peltast
  • ***
  • Posts: 139
Reply #21 on: March 04, 2010, 02:25:42 PM
Whether you feel great because you just won $10k in the lottery, because a really hot woman agreed to go out with you, or because you were injected with hormonal chemicals doesn't matter to your brain.  Deciding what constitutes a "genuine" emotion is an impossible task. What "is" is real, quantifiable, and measurable; the "why" of why you feel a certain way is open to so many possible interpretations that it becomes impossible to sort out with any accuracy. That was the point that I took away from this story. Whether you enjoy BDSM or other such behaviors because you've been conditioned to enjoy it by random, uncontrolled circumstances, or because you've been chemically juiced or had your OS tweaked, doesn't alter the fact that it seems pleasurable to you. That's how I see it anyhow.

Also: The title of this story could just as easily have been used as a title for a scholarly article by a behaviorist psychologist. Maybe that's some ironic twist on the part of the author.  :D

Yes, but the key difference is that in the story, a sentient entity consciously decided to change the way people behaved - it's mind rape of the highest order to use insidious agents like viruses to change the way someone thinks - i.e. by making them sexually attracted to robots.




gelee

  • Lochage
  • *****
  • Posts: 521
  • It's a missile, boy.
Reply #22 on: March 04, 2010, 05:55:01 PM
I had the same problem here that I usually have with AI stories: Why? 
Why is the robot pleased by the things that please it?  How do you motivate a thing with no innate motivations?  Even the most elemental human motivators, like hunger and pain, are matters of choice for this robot.
It's established early on that emotions are nothing more than the end result of a chemical/mechanical function.  It is also established that the robot can manipulate its own internal reasoning and sensory processing to interpret any set of stimuli in any fashion it chooses.  Why does it choose to like B&D?  Why does it choose to be pleased by making its wife or step-daughter happy?  Why should it give a rip about the sound of Wynter's giggle, or being called robodad?
There seems to be a lot of behind-the-scenes effort on the part of robot kind to manipulate humanity into a state that is more satisfactory to robots, but why bother?  Would it not be more efficient to simply choose to be happy with the status quo?  If it were simply established that emotional modeling was something built into a robot, and copy-protected, preventing easy manipulation, the whole issue would be resolved; but as soon as the robot states that it can directly manipulate its own emotional state, the premise of the story crumbles under its own weight.
So, a solid piece of writing, well read, but with serious issues.

Oh, loved the poem, by the way :)  Very fun!  I wouldn't mind hearing more poetry from Escape Pod.



lowky

  • Hipparch
  • ******
  • Posts: 2717
  • from http://lovecraftismissing.com/?page_id=3142
Reply #23 on: March 04, 2010, 06:28:25 PM



Oh, loved the poem, by the way :)  Very fun!  I wouldn't mind hearing more poetry from Escape Artists.

There I fixed that for you  :P


Dave

  • Peltast
  • ***
  • Posts: 128
    • I Can Bend Minds With My Spoon
Reply #24 on: March 04, 2010, 11:59:16 PM
Well that didn't end on a sinister note at all.

-Dave (aka Nev the Deranged)


KenK

  • Guest
Reply #25 on: March 05, 2010, 01:24:11 AM
Yargling:
Quote
the key difference is that in the story, a sentient entity consciously decided to change the way people behaved - it's mind rape of the highest order to use insidious agents like viruses to change the way someone thinks - i.e. by making them sexually attracted to robots.
Okay, then so is lingerie, wine, incense, candles and love poems. It's called seduction. Given the proper sights, smells, touches, facial expressions, and such, hormones begin to flow into the bloodstream and the process begins. There are no guarantees, though. I have declined sexual advances and have had some of mine rebuffed as well.

When I feel sexual I "consciously decide" to attempt to persuade my intended to feel sexual too. The robot in the story is just way better at it than we are/I am. It's only rape when roofies or alcohol are used to render someone unconscious or otherwise unable to give or withhold consent.

That's how I see it.
« Last Edit: March 05, 2010, 01:33:12 AM by KenK »



gelee

  • Lochage
  • *****
  • Posts: 521
  • It's a missile, boy.
Reply #26 on: March 05, 2010, 11:04:13 AM
Okay, then so is lingerie, wine, incense, candles and love poems. It's called seduction. Given the proper sights, smells, touches, facial expressions, and such, hormones begin to flow into the bloodstream and the process begins. There are no guarantees, though. I have declined sexual advances and have had some of mine rebuffed as well.
Sorry to butt in, but this analogy is terrible.  Yes, we use devices and techniques to persuade, but there is the difference: persuasion.  Not control.  We engage the object of our desire at a conscious level and try to convince them that hooking up with us would be a good idea. We don't try to psychologically condition them to be sexually receptive to people like us.  And we sure as hell don't jack straight into someone's unconscious and start pushing buttons and pulling levers.  The means employed in this story are a lot more similar to the roofies you cite than lingerie or candles.



KenK

  • Guest
Reply #27 on: March 05, 2010, 01:57:30 PM
Quote
gelee: We don't try to psychologically condition them to be sexually receptive to people like us.
Yes we do. And whatever works will be noted and used again.
Quote
gelee: And we sure as hell don't jack straight into someone's unconscious and start pushing buttons and pulling levers.
Yes that's exactly what we try to do, whether it's sexuality or anything else we desire. Remember those guilt tripping Save the Children ads from late night TV? If that isn't "button pushing" I don't know what is. Organisms will employ whatever means work to fulfill their desires and humans are no exception. But luckily our culture and  laws prohibit direct force or threats of it, but even with that it is still a very fragile limitation and it has to be continually reinforced.
Quote
gelee: The means employed in this story are a lot more similar to the roofies you cite than lingerie or candles.
True, but only as the extreme end of a continuum of means available. The basic principle as I've stated remains clearly true. That rung of the ladder is only off-limits because you/we have 1) social conditioning, 2) technological limitations and 3) legal sanctions to prevent recourse to these means. And please don't misunderstand me; I think that this is a good thing.



Yargling

  • Peltast
  • ***
  • Posts: 139
Reply #28 on: March 05, 2010, 03:25:24 PM
Quote
gelee: We don't try to psychologically condition them to be sexually receptive to people like us.
Yes we do. And whatever works will be noted and used again.
Quote
gelee: And we sure as hell don't jack straight into someone's unconscious and start pushing buttons and pulling levers.
Yes that's exactly what we try to do, whether it's sexuality or anything else we desire. Remember those guilt tripping Save the Children ads from late night TV? If that isn't "button pushing" I don't know what is. Organisms will employ whatever means work to fulfill their desires and humans are no exception. But luckily our culture and  laws prohibit direct force or threats of it, but even with that it is still a very fragile limitation and it has to be continually reinforced.
Quote
gelee: The means employed in this story are a lot more similar to the roofies you cite than lingerie or candles.
True, but only as the extreme end of a continuum of means available. The basic principle as I've stated remains clearly true. That rung of the ladder is only off-limits because you/we have 1) social conditioning, 2) technological limitations and 3) legal sanctions to prevent recourse to these means. And please don't misunderstand me; I think that this is a good thing.

Hi - back!

Now, I think the differences here are clear between persuasion and control:

1) Change of sexual preference: The robots here were directly changing people's sexual preference, something which, as far as we know, only torture or long-term detainment and re-education can change with our current technology - the robots were LITERALLY forcing people to be sexually attracted to them.

No amount of chocolates, flowers, or back rubs will make a gay man want to have sex with a woman, same with straight men and other men, and so on with all points of the compass. In other words, full behaviour modification.

2) Involuntary: These changes are being forced on the humans without their knowledge or consent - hence how I came up with "Mind Rape".

It's like if I had the power to make someone do what I want with just a snap of my fingers - it would still be a type of rape to use such a power to get a woman into my bed. If I used such a power to make a woman want to be there... well, that'd be a whole 'nother level of wrong - mind rape.



Heradel

  • Bill Peters, EP Assistant
  • Hipparch
  • ******
  • Posts: 2938
  • Part-Time Psychopomp.
Reply #29 on: March 05, 2010, 04:21:03 PM
I guess the key difference here is that frilly garters and the like are not magic, while the amount of precise control demonstrated here with the viruses is science sufficiently advanced to be indistinguishable from magic (a la Clarke).

And you can say no to chocolates or flowers, and furthermore you're aware of the chocolates or flowers. The humans here were aware of the viruses, but not aware of their source or the fact they'd been used for some fairly fine-grained mind control.

I Twitter. I also occasionally blog on the Escape Pod blog, which if you're here you shouldn't have much trouble finding.


gelee

  • Lochage
  • *****
  • Posts: 521
  • It's a missile, boy.
Reply #30 on: March 05, 2010, 05:04:03 PM
See, I think you have to differentiate between conditioning and persuasion.  You're right, in part.  We are conditioned to be intimate with consenting partners, among a wide variety of other things.  When you attempt to woo someone, or otherwise persuade them to an action, that's not necessarily the same thing as conditioning.  My Intro Psychology class was a very long time ago, but it seems to me that conditioning would be more along the lines of permanent baseline personality modification, to paraphrase The Robot.  Think "Clockwork Orange."  Persuasion, I think, is more targeted and tends to be more incidental, like a sales pitch or campaign speech.
Also, when you persuade someone, you are presenting arguments (hopefully, reasonable ones) for an action or position.  That's what I mean when I say that you engage that person's consciousness.  When I talk about pushing buttons, I mean at a cellular level.  Appealing to one's ego or sense of pity to get a donation is one thing.  To reach into someone's head and mechanically change their opinion on an issue is something else entirely, and the two don't really compare well.



tinroof

  • Palmer
  • **
  • Posts: 47
Reply #31 on: March 05, 2010, 06:16:36 PM
It's only rape when roofies or alcohol are used to render someone unconscious or otherwise unable to give or withhold consent.

Then the wife here was raped, by your own definition. She was MADE to give consent. As soon as outright behaviour mods are used, she no longer has the option of withholding her consent. It is not simply seduction - people can decide they're unimpressed by flowers and compliments. People can't decide not to be affected by mind control.



yicheng

  • Matross
  • ****
  • Posts: 221
Reply #32 on: March 05, 2010, 08:32:26 PM
I found the story interesting, although it clashes with my personal theory of how AIs and humans will eventually coexist, i.e. human beings will eventually become "post-human", using either genetic manipulation or cybernetic neural plugins to transcend vanilla humans 1.0.  I doubt humans 1.0 will be enslaved or hunted down, but they will probably be marginalized (i.e. they just won't matter) as post-humans start running everything outside of Earth.  It's hard to say I liked this story, since I thought there was wwwaaaaayyyy too much anthropomorphizing of the main robot character.  My computer science training may have gotten in the way here, but true AIs will neither think nor act with anything remotely resembling human intelligence with an inner dialog (heck, even a sense of "self"), and it's pretty self-serving to think that the apex of AI development will somehow be a human-like robot.

As for the discussion here about mind manipulation, I personally don't believe in Free Will, as it assumes there's a separate "something-ness" that goes beyond your memories, neurons, and brain chemistry.  I take a "but-for-the-grace-of-God" approach, as I think any of us put into the exact same situations as anyone else would do nearly identical things.

I think an illuminating question to ask might be: What if it's not robots giving out the happiness virus?  What if it's just humans?  If we discovered a completely harmless way to make people be happy, compassionate, and benevolent to each other, instantly solve world problems like war, hunger, pollution, etc, wouldn't you be morally obligated to do it?



sixdeaftaxis

  • Extern
  • *
  • Posts: 5
Reply #33 on: March 06, 2010, 04:46:08 AM
From the point of view of human ethics, this is a horror story: inhuman creatures brainwashing humans, taking over the world, and turning people into love/sex slaves. From the ethical point of view of the protagonist, the robot, it is an uplifting paean: robots have always been able to easily adjust their own minds to create desirable emotions and improve their lives; soon they will be able to do the same, permanently, for their poor irrational human partners.

I feel dirty just contemplating Tim Pratt's unique twist on robot ethics, but I am thrilled to finally read a thought-provoking robot story that is not beholden to St. Isaac and his Positronic Canon.



KenK

  • Guest
Reply #34 on: March 06, 2010, 05:00:12 AM
Quote
Heradel: The humans here were aware of the viruses
Please note, free will advocates: this point negates all of your arguments about it being [sic] "mind rape". The humans were aware that they'd been juiced. They could have taken steps. But they did not.



Yargling

  • Peltast
  • ***
  • Posts: 139
Reply #35 on: March 06, 2010, 10:57:07 AM
Quote
Heradel: The humans here were aware of the viruses
Please note, free will advocates: this point negates all of your arguments about it being [sic] "mind rape". The humans were aware that they'd been juiced. They could have taken steps. But they did not.

That's like saying "A woman walking around without a chastity belt on could have taken steps to avoid rape". As it happens, we know the humans knew about the mood viruses, but the behaviour-control ones? No obvious sign. And as the robot in the story shows, they can use the moist membranes of the eyes to infect someone, and further still the UCS was working on measures to get around the nose filters.

Just because someone doesn't take every conceivable step to prevent being mind-controlled, like spending their whole life inside a bio-hazard suit, does not mean they want, consent to, or are asking to be mind-controlled.



KenK

  • Guest
Reply #36 on: March 06, 2010, 02:00:20 PM
@yargling
So I don't need to wear a helmet when I bike, or a seat belt when I drive, because I don't "consent" to having someone crash into me? I don't need to get vaccinated against disease because I don't "consent" to the polio virus infecting me? That's nonsense. If you value your autonomy that highly then perhaps, in the given universe of the story, people should wear bio-filters.

Clearly the humans in this story are aware of bio-chemical agents in their environment yet they take no measures. What does that say?



Talia

  • Moderator
  • *****
  • Posts: 2682
  • Muahahahaha
Reply #37 on: March 06, 2010, 04:52:35 PM
@yargling
So I don't need to wear a helmet when I bike, or a seat belt when I drive, because I don't "consent" to having someone crash into me? I don't need to get vaccinated against disease because I don't "consent" to the polio virus infecting me? That's nonsense. If you value your autonomy that highly then perhaps, in the given universe of the story, people should wear bio-filters.

Clearly the humans in this story are aware of bio-chemical agents in their environment yet they take no measures. What does that say?

You suggest carelessness, and to a degree that's true. But the examples you cite are of carelessness harming the careless person themselves, not of a thinking being taking advantage of that carelessness, which turns it from pure carelessness into a malicious action (even if it's not intended to be malicious).
The difference between the examples cited and this one is the presence of the robots as an active, intelligent force deliberately taking advantage of carelessness.



KenK

  • Guest
Reply #38 on: March 06, 2010, 10:30:00 PM
@Talia
I was held up at gunpoint a few years back; was that "pure carelessness" on my part?*  When an older lady misjudged how much distance she'd need to stop and crashed into the back of my car two months ago, was that "malicious intent" on her part? Who cares? I don't. All I/we can really do is try to note what possibilities for harm, loss, injury and death exist out there and be realistic about protecting ourselves. Whether harm is the result of malicious intent or happenstance, the damage is the same either way. So I wear seat belts and helmets, buy insurance, lock my doors, and vaccinate myself. And I have been known to wear face masks sometimes during flu season when using public transportation. I'm not gonna go hide in a cave and be a hermit in order to avoid all the risks of living in human society, but that doesn't mean I can just cast my fate to the wind and hope that God or the government or the police or the kindness of strangers will get me through life.

If I lived in the universe this story is set in, I'd check myself out. That's my bottom line.

* For being at the wrong place at the wrong time, that is.



Gamercow

  • Hipparch
  • ******
  • Posts: 654
Reply #39 on: March 08, 2010, 05:02:26 PM
I had the same problem here that I usually have with AI stories: Why? 

The easiest and most elegant reasoning I've heard is simply survival of the fittest.  Nearly every life form on Earth advances by being better than its competition. Why should AI be any different?

The cow says "Mooooooooo"


KenK

  • Guest
Reply #40 on: March 08, 2010, 05:35:36 PM
@Gamercow
Bravo. Why indeed? What does it say about human beings that they knowingly decide to create a species of organism that is a superior life-form to themselves and a direct evolutionary competitor? Sheesh. If human beings are really that stupid, we are unfit to survive. Maybe that was the story's point?



Unblinking

  • Sir Postsalot
  • Hipparch
  • ******
  • Posts: 8729
    • Diabolical Plots
Reply #41 on: March 08, 2010, 05:47:45 PM
I think an illuminating question to ask might be: What if it's not robots giving out the happiness virus?  What if it's just humans?  If we discovered a completely harmless way to make people be happy, compassionate, and benevolent to each other, instantly solve world problems like war, hunger, pollution, etc, wouldn't you be morally obligated to do it?

That happened in a Stephen King short story, if I recall correctly, and it had the unexpected side effect of causing dementia days after injection.

And even barring unforeseen side effects, I guess it depends on your definition of "harmless".  If you take away my free will, I don't consider that harmless regardless of whether the infector was human or robot.  Sure, after the infection, I would probably like the idea, but at that point I'm not really me anymore, and that idea of loss of self scares the bejeezus out of me.




Unblinking

  • Sir Postsalot
  • Hipparch
  • ******
  • Posts: 8729
    • Diabolical Plots
Reply #42 on: March 08, 2010, 05:53:52 PM
Wow, this was a dark story that would've fit right in at Pseudopod.  I like the subtle shifts that turned it from a lighthearted story about the public acceptance of robot-human marriage into a story about mind-controlling robot masters.  In the beginning I actually kind of liked the bot, though the sexual relationship was still creepy, and the most horrifying part was that the bot seemed to have totally good intentions and to be unaware of the mind-rape it was performing.

As for the question of why the bot decides to be happy about making others happy, I think it's just trying to be a "better person" to further fit itself into human society, and a broad definition of "good person" is one who enjoys enhancing the happiness of others for no other reason than that it improves someone else's life.  But since humans have only primitive means of enhancing one another's emotional state, it uses its biochemical expertise to create a faster and more reliable method.




yicheng

  • Matross
  • ****
  • Posts: 221
Reply #43 on: March 08, 2010, 10:29:17 PM
That happened in a Stephen King short story, if I recall correctly, and it had the unexpected side effect of causing dementia days after injection.

And even barring unforeseen side effects, I guess it depends on your definition of "harmless".  If you take away my free will, I don't consider that harmless regardless of whether the infector was human or robot.  Sure, after the infection, I would probably like the idea, but at that point I'm not really me anymore, and that idea of loss of self scares the bejeezus out of me.

I suppose I could challenge you to prove that you have a "self" and "free will" to lose, but that would probably not be constructive.  I haven't read that Stephen King story, but I do remember an episode out of the Odyssey, in the land of the Lotus-Eaters, where the inhabitants were perpetually blissed out on "Happy Drugs" like Bronze Age hippies, and Odysseus had to effectively kidnap his men in order to get home.  But for the sake of argument, let's suppose that there are people who have taken the Happiness Drug, and are unharmed as far as you can tell, with no side effects, the only difference being that they are nicer and happier (think permanently tripping out on X).  I, for one, would definitely take it.



gelee

  • Lochage
  • *****
  • Posts: 521
  • It's a missile, boy.
Reply #44 on: March 08, 2010, 11:23:30 PM
I had the same problem here that I usually have with AI stories: Why? 
The easiest and most elegant reasoning I've heard is simply survival of the fittest.  Nearly every life form on Earth advances by being better than its competition. Why should AI be any different?

But why would an AI compete to survive?  I know why I would.  I have a pretty good idea of why my cat would.  But why would a robot?  Mind you, I'm not saying it couldn't.  I'm not even saying it wouldn't.  I'm asking why it would bother.  No instinct for self-preservation, no drive to reproduce, no emotions, no physical sensations except those it chooses to perceive.  How do you motivate a thing that lacks all of that?  The question is still unanswered.  Why would a robot do any of the things it did?  Any character in a story should have reasons to act, motives of its own.  For us humans, and most animals, the motives are self-explanatory.  We love, hunger, and fear, among other things.  What about a machine?



yicheng

  • Matross
  • ****
  • Posts: 221
Reply #45 on: March 09, 2010, 06:34:35 AM
But why would an AI compete to survive?  I know why I would.  I have a pretty good idea of why my cat would.  But why would a robot?  Mind you, I'm not saying it couldn't.  I'm not even saying it wouldn't.  I'm asking why it would bother.  No instinct for self-preservation, no drive to reproduce, no emotions, no physical sensations except those it chooses to perceive.  How do you motivate a thing that lacks all of that?  The question is still unanswered.  Why would a robot do any of the things it did?  Any character in a story should have reasons to act, motives of its own.  For us humans, and most animals, the motives are self-explanatory.  We love, hunger, and fear, among other things.  What about a machine?

That's easy.  The robots that don't compete to survive get killed off, until the only ones left are the ones that do.  How are humans and animals that different from sentient robots?  Aren't we just chemical machines?



Yargling

  • Peltast
  • ***
  • Posts: 139
Reply #46 on: March 09, 2010, 07:58:05 AM
But why would an AI compete to survive?  I know why I would.  I have a pretty good idea of why my cat would.  But why would a robot?  Mind you, I'm not saying it couldn't.  I'm not even saying it wouldn't.  I'm asking why it would bother.  No instinct for self-preservation, no drive to reproduce, no emotions, no physical sensations except those it chooses to perceive.  How do you motivate a thing that lacks all of that?  The question is still unanswered.  Why would a robot do any of the things it did?  Any character in a story should have reasons to act, motives of its own.  For us humans, and most animals, the motives are self-explanatory.  We love, hunger, and fear, among other things.  What about a machine?

That's easy.  The robots that don't compete to survive get killed off, until the only ones left are the ones that do.  How are humans and animals that different from sentient robots?  Aren't we just chemical machines?

Possibly. I don't think it works quite that way with manufactured technologies. Regardless, who knows if AIs will have emotions or not? After all, emotions can be summarized as just an electrochemical state in the brain. If our AIs are as emotionally screwed up as we are, it would explain all sorts of bizarre and irrational behaviour we consider normal.



Unblinking

  • Sir Postsalot
  • Hipparch
  • ******
  • Posts: 8729
    • Diabolical Plots
Reply #47 on: March 09, 2010, 03:24:54 PM
That happened in a Stephen King short story, if I recall correctly, and it had the unexpected side effect of causing dementia days after injection.

And even barring unforeseen side effects, I guess it depends on your definition of "harmless".  If you take away my free will, I don't consider that harmless regardless of whether the infector was human or robot.  Sure, after the infection, I would probably like the idea, but at that point I'm not really me anymore, and that idea of loss of self scares the bejeezus out of me.

I suppose I could challenge you to prove that you have a "self" and "free will" to lose, but that would probably not be constructive.  I haven't read that Stephen King story, but I do remember an episode out of the Odyssey, in the land of the Lotus-Eaters, where the inhabitants were perpetually blissed out on "Happy Drugs" like Bronze Age hippies, and Odysseus had to effectively kidnap his men in order to get home.  But for the sake of argument, let's suppose that there are people who have taken the Happiness Drug, and are unharmed as far as you can tell, with no side effects, the only difference being that they are nicer and happier (think permanently tripping out on X).  I, for one, would definitely take it.

I might think about taking it as well, but then it would be a choice, not forced on me by others. 



KenK

  • Guest
Reply #48 on: March 09, 2010, 05:41:00 PM
Quote
yicheng:But for the sake of argument, let's suppose that there are people who have taken the Happiness Drug, and are unharmed as far as you can tell with no side-effects, with the only difference being that they are nicer and happier (think permanently tripping out on X).  I, for one, would definitely take it.

I assume then that you are unaware of such currently available substances as Xanax, Ambien, or Prozac, to name but a few?  ???
« Last Edit: March 09, 2010, 05:42:57 PM by KenK »



Unblinking

  • Sir Postsalot
  • Hipparch
  • ******
  • Posts: 8729
    • Diabolical Plots
Reply #49 on: March 09, 2010, 05:46:17 PM
Quote
yicheng:But for the sake of argument, let's suppose that there are people who have taken the Happiness Drug, and are unharmed as far as you can tell with no side-effects, with the only difference being that they are nicer and happier (think permanently tripping out on X).  I, for one, would definitely take it.

I assume then that you are unaware of such currently available substances as Xanax, Ambien, or Prozac, to name but a few?  ???

Those do have side effects, and are not so much aimed at making everybody happy, but at removing depression, not exactly the same thing. 



KenK

  • Guest
Reply #50 on: March 09, 2010, 05:55:52 PM
Unblinking
Quote
Those do have side effects, and are not so much aimed at making everybody happy, but at removing depression, not exactly the same thing. 

Possibly true, but most likely you won't care either way. It isn't the damage, it's the not caring that concerns me.



Unblinking

  • Sir Postsalot
  • Hipparch
  • ******
  • Posts: 8729
    • Diabolical Plots
Reply #51 on: March 09, 2010, 06:04:55 PM
Unblinking
Quote
Those do have side effects, and are not so much aimed at making everybody happy, but at removing depression, not exactly the same thing. 

Possibly true, but most likely you won't care either way. It isn't the damage, it's the not caring that concerns me.

I'm not sure I understand what you mean.  You won't care what the exact effect of the drug is, and that uncaringness bothers you?



jjtraw

  • Palmer
  • **
  • Posts: 24
Reply #52 on: March 09, 2010, 07:39:40 PM
Quote
If we discovered a completely harmless way to make people be happy, compassionate, and benevolent to each other, instantly solve world problems like war, hunger, pollution, etc, wouldn't you be morally obligated to do it?

As I recall, this very question was brilliantly explored in Ep112, The Giving Plague.



Scattercat

  • Caution:
  • Hipparch
  • ******
  • Posts: 4904
  • Amateur wordsmith
    • Mirrorshards
Reply #53 on: March 11, 2010, 06:25:39 AM
But what if the robot had tried to make humans shout at him in the back seat of a car?   :P



Unblinking

  • Sir Postsalot
  • Hipparch
  • ******
  • Posts: 8729
    • Diabolical Plots
Reply #54 on: March 11, 2010, 02:29:35 PM
But what if the robot had tried to make humans shout at him in the back seat of a car?   :P

Good question!  This was even creepier than that: that one was icky, but it still allowed the victim to make his own choices.



CryptoMe

  • Hipparch
  • ******
  • Posts: 1139
Reply #55 on: March 12, 2010, 03:31:46 AM
Quote
yicheng:But for the sake of argument, let's suppose that there are people who have taken the Happiness Drug, and are unharmed as far as you can tell with no side-effects, with the only difference being that they are nicer and happier (think permanently tripping out on X).  I, for one, would definitely take it.

I assume then that you are unaware of such currently available substances as Xanax, Ambien, or Prozac, to name but a few?  ???

Those do have side effects, and are not so much aimed at making everybody happy, but at removing depression, not exactly the same thing. 

I would not take a happiness drug. I have been conditioned by my parents and society into believing that the path to true happiness is in the act of working hard for it.   ;D



CryptoMe

  • Hipparch
  • ******
  • Posts: 1139
Reply #56 on: March 12, 2010, 03:33:02 AM
But what if the robot had tried to make humans shout at him in the back seat of a car?   :P

Okay, that took me a moment, but then.... ROTFLMAO!!!



eytanz

  • Moderator
  • *****
  • Posts: 6109
Reply #57 on: March 13, 2010, 04:08:59 PM
Ok, another EP episode where I don't really want to step into the discussion arising from the story (not because it's a bad discussion, but the opposite: there are themes coming up here that deserve to be treated more seriously than I can manage given my time constraints these days). But let me just say that this was a brilliantly creepy story. Unlike some people, I did not perceive a genre shift, but rather a slow and gradual ramping up of sinister tones. This is one of the best Tim Pratt stories I've heard, and that is setting the bar very high.



Gamercow

  • Hipparch
  • ******
  • Posts: 654
Reply #58 on: March 16, 2010, 02:22:37 PM
Back to gelee's (and others') question about why an AI would want to survive, why it would have that drive.  Firstly, the AIs in this story presumably came about either through self-discovery or programmatically.
If they came about through self-discovery, it was most likely an evolutionary process.  It has been shown that computer programs can indeed evolve themselves, albeit simply right now, and that this is a good step towards approximating human thought and intelligence (http://portal.acm.org/citation.cfm?id=1565465).  Simply put, programs self-analyze and improve, making their code more efficient, which lets their processes perform more tasks, which allows them to make better decisions, to make their code more efficient, and so on.  This could lead to AI, and this AI would still, most likely, have the self-improvement drive at its core, and would want to survive and improve. 
If the AI was made programmatically, made whole a la Adam in the Bible, then wouldn't humans most likely put in a self-survival "instinct", either consciously or, more likely, subconsciously, in an effort to reduce repairs and poor "choices" by the sentient androids or programs?  Rather than putting in 1000 rules like "Don't walk into traffic, don't set yourself on fire, don't walk off a cliff, don't wander in front of a train", a situational condition would be put in the vague terms of "Do not let yourself come to harm." 
Of course, all of this brings to my mind some of the simplest but most elegant rules regarding AI, Asimov's Three Laws:
   1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.
   2. A robot must obey any orders given to it by human beings, except where such orders would conflict with the First Law.
   3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.
If you put these three laws on the robots in this story, their path towards human involvement is still allowed.  They are not harming humans, just altering their emotions through experimentation to theoretically improve humans' lives as a whole by giving them good emotions. 

To make a long story short, the survival instinct could be explained by either programming or evolution, and the motivation for survival would still be there, even though the underlying origins of the motivation (fear, sex, food) might not be, at least not truly.
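For the curious, the evolutionary loop described above can be sketched as a toy genetic algorithm. Everything here (the bit-string "genome", target, mutation rate, population size) is invented purely for illustration; real genetic programming evolves program structure, not a fixed target string.

```python
import random

# Toy "survival of the fittest" loop: candidates are bit strings, and
# fitness is how many bits match an arbitrary target (all made up here).
TARGET = [1, 0, 1, 1, 0, 1, 0, 0, 1, 1]

def fitness(genome):
    # Count positions that match the target.
    return sum(g == t for g, t in zip(genome, TARGET))

def mutate(genome, rate=0.1):
    # Flip each bit with a small probability.
    return [1 - g if random.random() < rate else g for g in genome]

def evolve(generations=200, pop_size=20):
    pop = [[random.randint(0, 1) for _ in TARGET] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[: pop_size // 2]   # the rest "get killed off"
        # Survivors breed mutated copies to refill the population.
        pop = survivors + [mutate(random.choice(survivors)) for _ in survivors]
    return max(pop, key=fitness)

best = evolve()
print(fitness(best), "of", len(TARGET))  # fitness climbs toward the maximum
```

The point of the sketch is the selection pressure: nothing in the code "wants" to survive, yet after enough generations only high-fitness individuals remain, which is the sense in which a survival drive could emerge without being explicitly programmed.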

The cow says "Mooooooooo"


yicheng

  • Matross
  • ****
  • Posts: 221
Reply #59 on: March 16, 2010, 05:15:30 PM
@Gamercow, AIs can "evolve" to a certain extent, but this is one of those things (like flight) where simply copying what we see in nature may not be the ideal solution.  For one, natural evolution takes place over billions and billions of iterations.  While it is certainly possible that if you just randomly changed a few bytes of a program over and over again you would eventually get something smarter, it's extremely unlikely, and would statistically take a very, very long time.  Most evolutionary AI systems rely on humans to build the infrastructure and parameters (the DNA, if you will), from which the system can then try permutations and converge on the best solution (e.g. piecing together the optimal arrangement of chips on a circuit board).  The most "advanced" AI systems we have, such as Deep Blue, are nothing more than a highly scalable collection of heuristics (in situation A, do B), pretty much a glorified look-up table that still had to be fed by humans.

I've said it before, but it's highly unlikely that Machine Intelligence (a more accurate term than AI) would be anything like human intelligence.  Most likely humans will leverage Machine Intelligence in the form of Expert Systems or Data Agents (like Google) to do the mental heavy lifting while sticking to the things that human brains are good at (synthesizing information, pattern recognition, subjective decisions).
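To make the "glorified look-up table" point concrete, here is a minimal sketch of a human-fed heuristic system. The situations and actions are entirely made up; the shape (rule table plus fallback, no learning, no search) is the idea being described.

```python
# Human-authored "in situation A, do B" rules, with a fallback:
# the system only ever knows what its authors put in the table.
HEURISTICS = {
    "opponent_threatens_queen": "move_queen_to_safety",
    "center_open": "occupy_center",
    "endgame_material_up": "trade_pieces",
}

def decide(situation):
    # No learning, no self-modification: just a look-up.
    return HEURISTICS.get(situation, "ask_a_human")

print(decide("center_open"))        # occupy_center
print(decide("novel_situation"))    # ask_a_human
```

However scalable such a table gets, every entry was put there by a person, which is the contrast being drawn with systems that could change themselves.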



stePH

  • Actually has enough cowbell.
  • Hipparch
  • ******
  • Posts: 3906
  • Cool story, bro!
    • Thetatr0n on SoundCloud
Reply #60 on: March 17, 2010, 04:56:50 PM
Of course, all of this brings to my mind some of the simplest but most elegant rules regarding AI, Asimov's Three Laws:
   1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.
   2. A robot must obey any orders given to it by human beings, except where such orders would conflict with the First Law.
   3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.
If you put these three laws on the robots in this story, their path towards human involvement is still allowed.  They are not harming humans, just altering their emotions through experimentation to theoretically improve humans' lives as a whole by giving them good emotions.

Again, see "The Evitable Conflict" from I, Robot, wherein the AIs are pretty much controlling and directing human society.

"Nerdcore is like playing Halo while getting a blow-job from Hello Kitty."
-- some guy interviewed in Nerdcore Rising


deflective

  • Hipparch
  • ******
  • Posts: 1171
Reply #61 on: March 18, 2010, 03:48:56 AM
another reading for the story is simple role reversal.  this conversation has touched a lot on controlling ai, and we don't get the same sort of skin-crawling creepiness, even though it is pretty much exactly the same thing, except that we're doing it to another conscious entity instead of it being done to us.

it's an interesting feedback loop: two separate species locked in a codependent relationship, both able to modify the thought patterns & motivations of the other.



Unblinking

  • Sir Postsalot
  • Hipparch
  • ******
  • Posts: 8729
    • Diabolical Plots
Reply #62 on: March 18, 2010, 01:55:20 PM
@Gamercow, AIs can "evolve" to a certain extent, but this is one of those things (like flight) where simply copying what we see in nature may not be the ideal solution.  For one, natural evolution takes place over billions and billions of iterations.  While it is certainly possible that if you just randomly changed a few bytes of a program over and over again you would eventually get something smarter, it's extremely unlikely, and would statistically take a very, very long time.  Most evolutionary AI systems rely on humans to build the infrastructure and parameters (the DNA, if you will), from which the system can then try permutations and converge on the best solution (e.g. piecing together the optimal arrangement of chips on a circuit board).  The most "advanced" AI systems we have, such as Deep Blue, are nothing more than a highly scalable collection of heuristics (in situation A, do B), pretty much a glorified look-up table that still had to be fed by humans.

I've said it before, but it's highly unlikely that Machine Intelligence (a more accurate term than AI) would be anything like human intelligence.  Most likely humans will leverage Machine Intelligence in the form of Expert Systems or Data Agents (like Google) to do the mental heavy lifting while sticking to the things that human brains are good at (synthesizing information, pattern recognition, subjective decisions).

Some aspects of AI are more mystical than sets of heuristics.  Fuzzy logic, for example, or artificial neural networks.  ANNs, in particular, set up weights between nodes that are tuned to work well on a training set and then, hopefully, generalize to the outside world.  An AI that could alter its own weights could change itself pretty drastically.
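A toy illustration of that last point, with every number invented for the example: a single step-activation neuron whose behavior is determined entirely by its weights, so "rewriting its own weights" means rewriting its behavior.

```python
# Toy single-neuron "network": the output depends only on the weights
# and bias, so an agent able to rewrite them can invert its own behavior.
def neuron(inputs, weights, bias):
    s = sum(i * w for i, w in zip(inputs, weights)) + bias
    return 1 if s > 0 else 0   # step activation

x = [1.0, 0.5]
trained = ([0.8, -0.2], -0.3)   # weights "learned" on some training set
altered = ([-0.8, 0.2], 0.3)    # same structure, self-modified weights

print(neuron(x, *trained))   # 1: fires on this input
print(neuron(x, *altered))   # 0: same input, opposite behavior
```

Real networks have many layers of such units, but the principle scales: the learned weights *are* the behavior, which is why self-modification is such a drastic capability.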



tinroof

  • Palmer
  • **
  • Posts: 47
Reply #63 on: March 18, 2010, 05:03:47 PM
deflective - oh wow. Yes. For me, that interpretation takes the story from mediocre-if-creepy to actually pretty genius.



Calculating...

  • Palmer
  • **
  • Posts: 56
  • Too much knowledge never makes for simple decision
Reply #64 on: March 19, 2010, 11:37:19 PM
eerie. i thought it was going to be all lovely about everyone getting along in the future, and instead i ended up fearing a matrix-like world where we are controlled through our emotions, the very things that make us human.  are we then reduced to mere biological machines?  creepy, like i said

I don't know who you are or where you came from, but from now on you'll do as I tell you, okay?


zcarter80

  • Extern
  • *
  • Posts: 9
Reply #65 on: April 04, 2010, 08:04:17 AM
You know, I really hate to say it, but beyond having a general sense that the story was a good tale overall, I'm a little curious as to why there haven't been any others here in a few weeks. Was there a metacast or posted explanation as to why my favorite podcast for all things scifi has gone and left me lonely? I'm beginning to feel like a dog waiting at the front door for his master to return from work when in reality he is on vacation and they forgot to change the plutonium power source in the self-contained automated K9 assistance unit.



Talia

  • Moderator
  • *****
  • Posts: 2682
  • Muahahahaha
Reply #66 on: April 04, 2010, 02:41:38 PM
You know, I really hate to say it, but beyond having a general sense that the story was a good tale overall, I'm a little curious as to why there haven't been any others here in a few weeks. Was there a metacast or posted explanation as to why my favorite podcast for all things scifi has gone and left me lonely? I'm beginning to feel like a dog waiting at the front door for his master to return from work when in reality he is on vacation and they forgot to change the plutonium power source in the self-contained automated K9 assistance unit.

I will direct you to this here thread:

http://forum.escapeartists.net/index.php?topic=3404.0

:)



zcarter80

  • Extern
  • *
  • Posts: 9
Reply #67 on: April 05, 2010, 06:51:36 AM
I honestly can't thank you enough. I'm new to the forum, just not new to the podcast. It's always nice to be pointed in the right direction... unless it leads to an alternate universe where dogs are in charge and we have to wear collars.



Unblinking

  • Sir Postsalot
  • Hipparch
  • ******
  • Posts: 8729
    • Diabolical Plots
Reply #68 on: April 05, 2010, 04:59:24 PM
You know, I really hate to say it, but beyond having a general sense that the story was a good tale overall, I'm a little curious as to why there haven't been any others here in a few weeks. Was there a metacast or posted explanation as to why my favorite podcast for all things scifi has gone and left me lonely? I'm beginning to feel like a dog waiting at the front door for his master to return from work when in reality he is on vacation and they forgot to change the plutonium power source in the self-contained automated K9 assistance unit.

I will direct you to this here thread:

http://forum.escapeartists.net/index.php?topic=3404.0

:)

I've managed to hold off EP withdrawal so far because I'm still going through the backlog.  I'm looking forward to seeing new episodes again!



Heradel

  • Bill Peters, EP Assistant
  • Hipparch
  • ******
  • Posts: 2938
  • Part-Time Psychopomp.
Reply #69 on: April 05, 2010, 06:01:52 PM
I honestly can't thank you enough. I'm new to the forum, just not new to the podcast. It's always nice to be pointed in the right direction... unless it leads to an alternate universe where dogs are in charge and we have to wear collars.

That's Tuesdays.

I Twitter. I also occasionally blog on the Escape Pod blog, which if you're here you shouldn't have much trouble finding.


zcarter80

  • Extern
  • *
  • Posts: 9
Reply #70 on: April 06, 2010, 09:49:59 AM
Sad to say, I have gone through the backlog already, including reviews and metacasts. The only things keeping me sane are the episodes of the Drabblecast and the new episodes of Pseudopod. Where are some other good podcasts of funny or unique scifi & horror? I already went through all the Well Told Tales and Fear on Demand, not to mention You Should Be Writing and Classic Tales, so where do I go now? Any suggestions?



Talia

  • Moderator
  • *****
  • Posts: 2682
  • Muahahahaha
Reply #71 on: April 06, 2010, 11:20:05 AM
You should check out StarShipSofa. It's different: it's not purely fiction, they run fact articles and reviews too, but it's really, really engaging; even the nonfiction segments are interesting. Plus the fiction stories they run are by people like Paolo Bacigalupi, Tad Williams, Tanith Lee, Gene Wolfe, Bruce Sterling... and that was just from a quick look at their webpage.



Portrait in Flesh

  • Hipparch
  • ******
  • Posts: 1121
  • NO KILL I
Reply #72 on: April 07, 2010, 01:35:05 AM
Quote from: yicheng
But for the sake of argument, let's suppose that there are people who have taken the Happiness Drug, and are unharmed as far as you can tell with no side-effects, with the only difference being that they are nicer and happier (think permanently tripping out on X).  I, for one, would definitely take it.

I assume, then, that you are unaware of currently available substances such as Xanax, Ambien, or Prozac, to name but a few?  ???

Those do have side effects, and they're aimed not so much at making everybody happy as at removing depression, which is not exactly the same thing.

Yep.  Having been on antidepressants for a couple of years, I know very well that they don't automatically make you happy.  Rather, for me (and I assume others), they help to level out certain aspects of my brain chemistry that, if left unchecked, make me want to curl up in a darkened room and not talk to anybody or fill me with so much self-loathing that I'd rather sleep than have to deal with myself.

So, picking up on yicheng's hypothetical about a side-effect-free Happiness Drug... oddly enough, I would not take such a thing.  Perfect, perpetual happiness would be just as bad as crushing, relentless depression, if not worse, because I feel that after a while the happiness would become "stale."  Without the sadness, how could you possibly ever enjoy happiness?  Sadness, like pain in general, is needed to a certain extent.  Without pain, you would never know something is wrong.  And without sadness, you'd never know just how happiness is "supposed" to feel.

There's some pretty fine art out there borne out of sadness.  It could all be gone if there were a Happiness Pill freely available.

"Boys from the city.  Not yet caught by the whirlwind of Progress.  Feed soda pop to the thirsty pigs." --The Beast of Yucca Flats


Scattercat

  • Caution:
  • Hipparch
  • ******
  • Posts: 4904
  • Amateur wordsmith
    • Mirrorshards
Reply #73 on: April 07, 2010, 04:07:31 AM
Yet is declining to take the side-effect-free happiness pill not equivalent to, say, refusing to go out for ice cream with your friends for fear of improving your mood and ruining your poetry?  The happiness as it existed in this story was not euphoric or drugged; the 'victims' were fully rational and cognizant, but they also felt cheerful about things that used to upset them greatly.  Where is the line between acceptable mood modification (music, friends, food) and unacceptable (mystic perfect happiness drug)?



Portrait in Flesh

  • Hipparch
  • ******
  • Posts: 1121
  • NO KILL I
Reply #74 on: April 07, 2010, 10:41:05 PM
Yet is declining to take the side-effect-free happiness pill not equivalent to, say, refusing to go out for ice cream with your friends for fear of improving your mood and ruining your poetry? 

Well, if all the cool kids are doing it then that obviously makes a difference.

I guess I just don't see it as an equivalent. 

"Boys from the city.  Not yet caught by the whirlwind of Progress.  Feed soda pop to the thirsty pigs." --The Beast of Yucca Flats


Scattercat

  • Caution:
  • Hipparch
  • ******
  • Posts: 4904
  • Amateur wordsmith
    • Mirrorshards
Reply #75 on: April 07, 2010, 10:57:27 PM
I'm just pointing out that lots of things we do influence our brain chemistry.  Smiling changes your brain chemistry.  Looking at other smiling people changes your brain chemistry.  Eating food you like changes your brain chemistry.

Why is a pill different from ice cream, other than the fact that pills tend to have more pronounced effects?  Why would it be awesome if everyone got free ice cream in their favorite flavor whenever they were sad, but terrible if everyone got a pill to take that made them happier whenever they were sad?



CryptoMe

  • Hipparch
  • ******
  • Posts: 1139
Reply #76 on: April 08, 2010, 03:45:37 AM
Why would it be awesome if everyone got free ice cream in their favorite flavor whenever they were sad, but terrible if everyone got a pill to take that made them happier whenever they were sad?

Because ice cream is soooo tasty  ;D

Seriously, though, I think it has a lot to do with not taking the easy solution, but working to solve your real issues. A pill that is guaranteed to make you happy is not solving the underlying problem if one is truly depressed or unhappy about something. It just ends up forming a dependency (as does ice cream for some people). Even anti-depressants are not meant to be a long-term solution, but a stopgap measure for when someone is really stuck in a downward spiral. Most respectable psychiatrists will not simply prescribe an anti-depressant and then send you off. They will work to figure out what is making you depressed and then help you solve that. Also, the great thing about brain chemistry, which is different from most other autonomic body functions, is that (as Scattercat pointed out) it can be changed by your behaviour. So, isn't it worthwhile to learn how to do this yourself instead of being enslaved to Big Pharma? I guess, for me, it comes down to this: if I am stuck on a deserted island, do I want to be doomed to a life of depression when my pills run out?



Scattercat

  • Caution:
  • Hipparch
  • ******
  • Posts: 4904
  • Amateur wordsmith
    • Mirrorshards
Reply #77 on: April 08, 2010, 04:00:08 AM
If you're stuck on a desert island, you're probably doomed to a life of depression anyway.  The social monkey brain really doesn't deal well with isolation.  ;-)



CryptoMe

  • Hipparch
  • ******
  • Posts: 1139
Reply #78 on: April 08, 2010, 04:38:03 AM
If you're stuck on a desert island, you're probably doomed to a life of depression anyway.  The social monkey brain really doesn't deal well with isolation.  ;-)

LOL!  Of course you know I was just using hyperbole...



WillMoo

  • Palmer
  • **
  • Posts: 36
Reply #79 on: April 19, 2010, 12:53:07 PM
I, for one, welcome our robotic dildo overlords.  :D

Didn't really care for the story.



Eliyanna Kaiser

  • Matross
  • ****
  • Posts: 236
    • Just Another Writer
Reply #80 on: May 03, 2010, 05:28:25 PM
I think what I liked best about this one was that the inevitable robot takeover was sort of innocent. The robots were just trying to re-make the world so they could be happy.

If you must mount the gallows, give a jest to the crowd, a coin to the hangman, and make the drop with a smile on your lips.
-Birgitte, R. Jordan's Wheel of Time


justenjoying

  • Peltast
  • ***
  • Posts: 144
Reply #81 on: January 23, 2012, 03:23:46 AM
This seemed like a bad summary of Eros, Philia, Agape (EP250) by Rachel Swirsky. It used the same ideas, but in a creepy and invasive way at the end. It just did not hold up to the aforementioned story. Since it had almost all the same ideas, though handled very differently, I can't help but compare them, and not many stories hold even a glint to Swirsky's. Thus I'm lukewarm on this story, if that.



Unblinking

  • Sir Postsalot
  • Hipparch
  • ******
  • Posts: 8729
    • Diabolical Plots
Reply #82 on: February 27, 2012, 05:18:01 PM
This seemed like a bad summary of Eros, Philia, Agape (EP250) by Rachel Swirsky. It used the same ideas, but in a creepy and invasive way at the end. It just did not hold up to the aforementioned story. Since it had almost all the same ideas, though handled very differently, I can't help but compare them, and not many stories hold even a glint to Swirsky's. Thus I'm lukewarm on this story, if that.

I had pretty much the reverse view on that.  But that's not too surprising, since I almost always like Pratt stories and almost never like Swirsky stories.  Just a matter of taste, I suppose.