Escape Artists
Author Topic: EP659: Caesura  (Read 521 times)
« on: January 11, 2019, 05:02:55 PM »

Escape Pod 659: Caesura

AUTHOR : Hayley Stone
NARRATOR : Stephanie Morris
HOST: Tina Connolly


Priya begins by striking the words love, hate, heart, and feel from the computer’s vocabulary, and blocks the internet. It isn’t with malicious intent. She does it on a whim, as with most things: fixing herself tacos at eleven o’clock at night, taking a right instead of a left turn against the advice of her GPS, showing up to her brother’s funeral in bright pink and yellow leopard-print high-tops.

“Your shoes look like they’re wanted for the murder of a Lisa Frank poster,” Demetri said when she first bought them, after nearly shooting Pepsi through his nose.

“You’re just jealous because I look fly, and you’d get shot wearing these around the city,” Priya said.

“Fly? So you’re a little gangster now, huh?”

“More than you.”

He did get shot. But it wasn’t over shoes.

Listen to this week’s Escape Pod!

« Last Edit: January 11, 2019, 05:06:50 PM by divs »
« Reply #1 on: January 25, 2019, 12:03:44 PM »

I tend to like machine-develops-sentience (or a semblance of it) stories, and this one was quite well done. I particularly like how the MC's relationship with the machine develops, all with her brother's death as a backdrop. Great juxtaposition.
« Reply #2 on: January 28, 2019, 11:09:18 AM »

I liked all of the different things Priya accomplished with her Demi project. She was grieving for her brother, exploring an academic topic, nurturing a friend, exercising her creativity, and more.
Artificial intelligence is one of the topics in science fiction that I still struggle with. I just don't know how to feel about it. How do (and how will) human concepts of morality and emotion apply to a computer? Can human feelings and choices be distilled to code? My brain can't handle it!
« Reply #3 on: February 13, 2019, 10:25:12 AM »

Quote: "How do (and how will) human concepts of morality and emotion apply to a computer?"

My philosophy has always been: if a machine can ask for the rights of personhood, it should be granted those rights, both morally and legally. But then it is also subject to the responsibilities of personhood, again morally and legally. Everything else can be figured out from there.