Author Topic: EP659: Caesura  (Read 2707 times)

divs

  • Moderator
  • *****
  • Posts: 175
on: January 11, 2019, 10:02:55 PM
Escape Pod 659: Caesura


AUTHOR: Hayley Stone
NARRATOR: Stephanie Morris
HOST: Tina Connolly

---

Priya begins by striking the words love, hate, heart, and feel from the computer’s vocabulary, and blocks the internet. It isn’t with malicious intent. She does it on a whim, as with most things: fixing herself tacos at eleven o’clock at night, taking a right instead of a left turn against the advice of her GPS, showing up to her brother’s funeral in bright pink and yellow leopard-print high-tops.

“Your shoes look like they’re wanted for the murder of a Lisa Frank poster,” Demetri said when she first bought them, after nearly shooting Pepsi through his nose.

“You’re just jealous because I look fly, and you’d get shot wearing these around the city,” Priya said.

“Fly? So you’re a little gangster now, huh?”

“More than you.”

He did get shot. But it wasn’t over shoes.


Listen to this week’s Escape Pod!




CryptoMe

  • Hipparch
  • ******
  • Posts: 1146
Reply #1 on: January 25, 2019, 05:03:44 PM
I tend to like machine-develops-sentience (or semblance of it) stories. This one was quite well done. I particularly like how the MC's relationship with the machine develops, all with her brother's death as a backdrop. Great juxtaposition.



Ichneumon

  • Matross
  • ****
  • Posts: 219
Reply #2 on: January 28, 2019, 04:09:18 PM
I liked all of the different things Priya accomplished with her Demi project. She was grieving for her brother, exploring an academic topic, nurturing a friend, exercising her creativity, and more.
Artificial intelligence is one of the topics in science fiction that I still struggle with. I just don't know how to feel about it. How do (and how will) human concepts of morality and emotion apply to a computer? Can human feelings and choices be distilled to code? My brain can't handle it!



CryptoMe

  • Hipparch
  • ******
  • Posts: 1146
Reply #3 on: February 13, 2019, 03:25:12 PM
Quote from: Ichneumon
How do (and how will) human concepts of morality and emotion apply to a computer?

My philosophy has always been: if a machine can ask for the rights of personhood, it should be granted those rights, both morally and legally. But then it is also subject to the responsibilities of personhood, again, morally and legally. Everything else can be figured out from there.



Fenrix

  • Curmudgeonly Co-Editor of PseudoPod
  • Editor
  • *****
  • Posts: 3996
  • I always lock the door when I creep by daylight.
Reply #4 on: October 01, 2019, 10:57:26 AM
I love it when stories tackle connected and autonomous cars, even if it's a brief sidebar.

All cat stories start with this statement: “My mother, who was the first cat, told me this...”