Author Topic: EP441: Kumara  (Read 14286 times)

Unblinking

  • Sir Postsalot
  • Hipparch
  • ******
  • Posts: 8729
    • Diabolical Plots
Reply #25 on: April 15, 2014, 01:56:39 PM
Cool idea, Chris!  I think it may not have worked because trying to make that kind of alteration would probably just make the crew members crash.  A fun idea anyway, though, and it would've been nice if the story had noted it and dismissed it for some reason or other.



matweller

  • EA Staff
  • *****
  • Posts: 678
Reply #26 on: April 15, 2014, 03:15:29 PM
Wouldn't that be neat, though? If the afterlife is going to be a computer construct anyway, why not make it a copy of the ship? They might even be able to make the transition with barely a blip between reality and afterlife... just some déjà vu or something Matrix-style... Although, I have to admit, if that were the deal IRL, I think I'd rather just be dead than forced to continue in work purgatory. So maybe the answer is just that Kumara should have killed them all.



Unblinking

  • Sir Postsalot
  • Hipparch
  • ******
  • Posts: 8729
    • Diabolical Plots
Reply #27 on: April 15, 2014, 03:47:19 PM
I mean, this is heaven to them, therefore this is what makes them happy.

I disagree.  It was CALLED heaven, probably named so because it's where you put dead crew members.  But these weren't heavens made for you to be happy; they're heavens that keep you alive because they keep you striving, pushing.  Kumara didn't take away their happiness, she took away their striving.

Which to me makes the choice a much more muddied one.  Ask a lobotomy recipient if he's happy and he may well say yes, but that's not the same thing as him saying yes BEFORE, because his mind has been fundamentally changed.  But, why should the lobotomized guy care about what the pre-lobotomy guy would say as long as lobotomized guy is happy?  Why should that matter?  The reason it matters, to me, is that I'm pre-lobotomized guy thinking about how I'd like it if it happened, dammit, and I want to be me.  I'm sure Lobotomized Unblinking is a very nice fellow, and I'd be happy to buy him a Guinness if I ever run into him as long as I don't have to become him.

In the end, I don't think Kumara made an unreasonable decision given her situation.  IMO it is simultaneously wondrous and horrendous, a fine balance to strike.



matweller

  • EA Staff
  • *****
  • Posts: 678
Reply #28 on: April 15, 2014, 08:23:21 PM
...It was CALLED heaven...
That's H.E.A.V.E.N. -- the Humane Evanescent Alternative Virtual Environment Nexus (TM) -- brought to you by OmniCorp.



Alasdair5000

  • Editor
  • *****
  • Posts: 1022
    • My blog
Reply #29 on: April 15, 2014, 08:39:16 PM
Share and enjoy!



MichaelFoster

  • Extern
  • *
  • Posts: 4
Reply #30 on: April 16, 2014, 12:40:34 AM
What I really liked about this piece was how it subtly unfolded layers and layers of the actual twist at the end--and every time we get a bit deeper, we learn more about the characters and the narrator in a way that makes us sympathize with all of them. At the end, I felt like it didn't really matter who was human and who wasn't--it was a story of individuals who loved each other and did what they thought was best for each other.

Very delicately, sensitively written and a great concept.



albionmoonlight

  • Matross
  • ****
  • Posts: 213
Reply #31 on: April 16, 2014, 08:19:35 PM
Hey! Just listened to this one, and a completely different solution occurred to me in the middle. Instead of cutting out what made each of them most human to save on Heaven space (or deleting one of them, or the MacGuffin), why not translate the things that Kumara needs the memory for into those virtual heavens, and let them help her fight off the Revenants? There are possible parallels, though they don't fit perfectly; the tech officer fighting off ferocious threats would be the clearest, dialing him into the defenses. The captain's heaven could be tied into the search for the safe jump point (even though she's trying to find somewhere she hasn't been, and Kumara is trying to get home, so Kumara might have to deceive her a little). The two lovers would be the hardest to fit in a satisfying way; maybe something about combining to analyze data, or just sending it back and forth.

After I got to the bit about feeding what it got from the human crew into the MacGuffin, I didn't like my idea so well anymore, but I still wanted to share it with y'all.

That would have been a really cool place for the story to go.



evrgrn_monster

  • Lochage
  • *****
  • Posts: 356
  • SQUAW, MY OPINIONS.
Reply #32 on: April 22, 2014, 12:10:45 AM
Welp, here goes.

I did not like this story. In fact, I think my dislike of this story was directly proportional to the amount of love it has gotten from the majority of the forum.

There were many things that I agree were laudable; the writing and pacing were more than decent, and the characters were distinct and interesting. However, the different heavens were, honestly, a bit heavy-handed to me. I couldn't help but think of What Dreams May Come, that Robin Williams movie I saw as a kid. It's not that the idea in itself wasn't intriguing, or that the heavens themselves weren't well described; they just read as cheesy. In addition, I felt like having three of the four crew members' heavens be water-based made those particular places lack distinction. With literally nothing but their own minds shaping their worlds, it is less interesting to me that the majority of the group would pick such similar destinations. Thinking on it as I write, the fact that those three were at such peace in their heavens, while the one in the mountains was in such turmoil, was another thing that took away some of the appeal this story could have had.

I did not like the twist. Not the twist that the one who made the decision was Kumara, but who the AI was speaking to and attempting to explain herself to. I felt like the entire reason behind the fighting itself, against the Machine for something stolen, was poorly explained, so the entity she was speaking to was, by extension, hard to really understand in a meaningful way. I don't really get how the things she took from the crew members were fine when being used by it, but not when being used by the crew themselves.

On a philosophical level, I believe she made the full-on wrong decision. I'd rather die than be less of myself, especially if the thing taken from me was as defining of my personality and soul as the things she took from the crew. That choice was the one thing that I agree with the forum on, though. It was a great moral quandary to think on; I just didn't really like the packaging the question came in.


Rose Embolism

  • Extern
  • *
  • Posts: 3
Reply #33 on: April 23, 2014, 01:48:35 AM
It was weeks ago that I listened to this podcast, but just now, reading the forums, something occurred to me: By taking the actions she did, Kumara didn't just harm her crew, she ALSO screwed up the mission.

Recall that the objective was to get a sample from the Machine God, and then grow it to see what it would become. But she went and injected the sample with the human drives and personality elements of her crew, which means she hopelessly contaminated it. It's as if a researcher pithed their fellow team members in order to get a sample of Lake Vostok, and then dumped their pee in the sample.

So not only did Kumara wreck the minds of the crew, she rendered the entire mission, and the actions she took in order to complete the mission, pointless. Man, the things we do for love.



TheFunkeyGibbon

  • Palmer
  • **
  • Posts: 23
Reply #34 on: May 13, 2014, 01:10:17 PM
I loved the reveal and it made me smile. That Kumara pleaded with the non-existent Systems Officer was actually brilliant for me, because it made me think about how we all wrestle with decisions and sometimes vocalise those questions despite already possessing the answer.

I normally hate open-ended endings because they frustrate with their lack of resolution, but this did actually feel like a complete story, and the adventures of the 'other' would be a whole new story. Which I would love to hear, by the way...



meggzandbacon

  • Extern
  • *
  • Posts: 6
Reply #35 on: May 13, 2014, 02:18:31 PM
Add me to the list of love for this one.  I've been thinking about it all day, actually.  You know it's a fantastic story when it has you doing that!  :)



SonofSpermcube

  • Guest
Reply #36 on: May 13, 2014, 05:12:53 PM
The idea that the simulation could not be stopped doesn't make any sense in a computer that is itself capable of stopping and restarting without loss.  Consider emulator savestates, or hibernation savestates.  Externally, the emulation (or simulation if you like) stops, but internally to that simulation it is continuous.  The only way that this idea makes sense is if the computer on which the minds are being emulated is itself a brain; AND that also requires that you accept that the reason brains aren't good at being restarted without loss must be less to do with what happens after the mind stops minding or what causes it to do so, and more to do with the actual stoppage itself. 
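
Here's a toy sketch of what I mean (Python; the names, like subjective_ticks, are made up purely for illustration). Externally the loop can stop for any amount of wall-clock time, but from inside the saved state the run is seamless:

    import pickle

    # Toy model of the savestate argument: the "mind" is just a dict
    # holding a tick counter.
    def step(state):
        state["subjective_ticks"] += 1

    state = {"subjective_ticks": 0}
    for _ in range(100):
        step(state)

    # "Halt" the simulation: serialize the complete state.
    frozen = pickle.dumps(state)

    # ...arbitrary wall-clock time passes; the process could even exit...

    # "Resume": restore the state and keep stepping.
    state = pickle.loads(frozen)
    step(state)

    # From the inside, tick 100 is followed by tick 101 with no gap:
    assert state["subjective_ticks"] == 101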

On a bad-sci-fi scale of 1 to 10 with the human batteries premise in The Matrix being about an 8 and every sci-fi themed pop song being a 10, this is about a 4 or 5; it was clearly required to make the conflict work, but it just stood out like a hangnail for me.  Good story otherwise. 



Fenrix

  • Curmudgeonly Co-Editor of PseudoPod
  • Editor
  • *****
  • Posts: 3996
  • I always lock the door when I creep by daylight.
Reply #37 on: May 20, 2014, 09:29:09 PM
Something that has not been mentioned about the reveal that Kumara is the Systems Officer is how critical the deception is to the story's craftsmanship. If we did not have Kumara pleading with the Systems Officer to hurry, emphasizing the external threat and the time crunch, the story would have a fraction of its dramatic tension, and there would be mass complaining about the computer dawdling under fire.

Good stuff, and a really nice pickup for EscapePod as an original publication. Well done!

All cat stories start with this statement: “My mother, who was the first cat, told me this...”


PotatoKnight

  • Palmer
  • **
  • Posts: 51
Reply #38 on: May 20, 2014, 11:20:48 PM
On a bad-sci-fi scale of 1 to 10 with the human batteries premise in The Matrix being about an 8 and every sci-fi themed pop song being a 10

Now I'm just curious what sci-fi themed pop songs you are using to calibrate your scale.



Scattercat

  • Caution:
  • Hipparch
  • ******
  • Posts: 4904
  • Amateur wordsmith
    • Mirrorshards
Reply #39 on: May 22, 2014, 11:48:03 AM
On a bad-sci-fi scale of 1 to 10 with the human batteries premise in The Matrix being about an 8 and every sci-fi themed pop song being a 10

Now I'm just curious what sci-fi themed pop songs you are using to calibrate your scale.

A suggestion.



sethdickinson

  • Extern
  • *
  • Posts: 1
Reply #40 on: May 22, 2014, 08:42:07 PM
The idea that the simulation could not be stopped doesn't make any sense in a computer that is itself capable of stopping and restarting without loss.  Consider emulator savestates, or hibernation savestates.  Externally, the emulation (or simulation if you like) stops, but internally to that simulation it is continuous.  The only way that this idea makes sense is if the computer on which the minds are being emulated is itself a brain; AND that also requires that you accept that the reason brains aren't good at being restarted without loss must be less to do with what happens after the mind stops minding or what causes it to do so, and more to do with the actual stoppage itself. 

On a bad-sci-fi scale of 1 to 10 with the human batteries premise in The Matrix being about an 8 and every sci-fi themed pop song being a 10, this is about a 4 or 5; it was clearly required to make the conflict work, but it just stood out like a hangnail for me.  Good story otherwise. 

This was definitely on my mind while writing the story - it's pretty fundamental to the nature of computing that the simulation should be able to halt. Ultimately I resorted to a kind of cheat, invoking that dread term 'quantum'. Kumara's heaven mainframe is a topological quantum computer, and under these operating conditions, the minds stored within will become lossy and decohere if the simulation is halted. I don't know if you'll find this a satisfying answer, but I hope it's worth something that it was in my thoughts!
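
For what it's worth, there's one piece of real physics that makes this less of a pure handwave than it sounds: an unknown quantum state can't be copied out to classical storage (no-cloning), and the only read operation available, measurement, collapses it. A toy numpy sketch of that one general principle (not the story's actual mechanism):

    import numpy as np

    rng = np.random.default_rng(0)

    # A qubit in superposition: amplitudes for |0> and |1>.
    psi = np.array([np.sqrt(0.3), np.sqrt(0.7)])

    def measure(state):
        """Projective measurement: returns a classical bit, collapses the state."""
        p0 = abs(state[0]) ** 2
        outcome = 0 if rng.random() < p0 else 1
        collapsed = np.zeros(2)
        collapsed[outcome] = 1.0
        return outcome, collapsed

    # The only way to "dump" an unknown quantum state to classical
    # storage is to measure it -- and measuring destroys the amplitudes.
    outcome, psi_after = measure(psi)
    print(outcome)    # a single classical bit, 0 or 1
    print(psi_after)  # [1. 0.] or [0. 1.]; the 0.3/0.7 superposition is gone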

Kindest thanks to all of you who've responded.



Scattercat

  • Caution:
  • Hipparch
  • ******
  • Posts: 4904
  • Amateur wordsmith
    • Mirrorshards
Reply #41 on: May 24, 2014, 12:14:50 AM
"Quantum" is to modern SF the way "nuclear" or "magnetic" was to 1950's SF.  Someday, people besides the experts will be able to go, "What?  That makes no sense whatsoever," but in the interim, it's a useful stand-in for phlebotinum or unobtainium.  ;-)



hardware

  • Matross
  • ****
  • Posts: 192
Reply #42 on: July 29, 2014, 03:15:29 PM
Hmm, strangely I was not a huge fan of this story. I felt like the unreliable narrator trick was played in a way that was only there because the story needed it, not because the narrator had any real motivation behind it. Especially since the narrator has to turn the tables on itself, which felt like it came out of nowhere.

Similarly, the science and technology of the world feels way too convenient for the story, bordering on deus ex machina territory. The characters are not particularly interesting, and the whole heaven construct basically reduces them to a single trait.

Those complaints aside, I found the narrative drive and writing good, and the worldbuilding had seeds of something very interesting. I remain curious to see what the author comes out with in the future.



Gamercow

  • Hipparch
  • ******
  • Posts: 654
Reply #43 on: July 29, 2014, 08:06:41 PM
The idea that the simulation could not be stopped doesn't make any sense in a computer that is itself capable of stopping and restarting without loss.  Consider emulator savestates, or hibernation savestates.  Externally, the emulation (or simulation if you like) stops, but internally to that simulation it is continuous.  The only way that this idea makes sense is if the computer on which the minds are being emulated is itself a brain; AND that also requires that you accept that the reason brains aren't good at being restarted without loss must be less to do with what happens after the mind stops minding or what causes it to do so, and more to do with the actual stoppage itself. 

On a bad-sci-fi scale of 1 to 10 with the human batteries premise in The Matrix being about an 8 and every sci-fi themed pop song being a 10, this is about a 4 or 5; it was clearly required to make the conflict work, but it just stood out like a hangnail for me.  Good story otherwise. 

This was definitely on my mind while writing the story - it's pretty fundamental to the nature of computing that the simulation should be able to halt. Ultimately I resorted to a kind of cheat, invoking that dread term 'quantum'. Kumara's heaven mainframe is a topological quantum computer, and under these operating conditions, the minds stored within will become lossy and decohere if the simulation is halted. I don't know if you'll find this a satisfying answer, but I hope it's worth something that it was in my thoughts!

Kindest thanks to all of you who've responded.

I'm a sysadmin.  I work with virtual machines every day, and I actually loved the idea that a person's mind could not be stopped and restarted.  If a ship member's mind lives completely in memory, stopping and starting that mind would, to me, at the very least revert it to the time it was put into the system in the first place: the snapshot of that mind.  If there was no snapshot, then removing them from memory would absolutely wipe them out.
When you turn off a virtual machine, you have to "quiesce" its virtual memory first.  If you don't, you are very likely to cause problems with that virtual machine.  There are some ways around this, including caching, snapshots, and shared storage, but if a machine is 100% in memory, you're boned.
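
Here's a toy version of what I mean (Python standing in for the hypervisor; the names are made up for illustration):

    import copy

    class MindVM:
        """A 'mind' living purely in volatile memory."""
        def __init__(self):
            self.memory = {"experiences": []}
            self.snapshot = None

        def live(self, event):
            self.memory["experiences"].append(event)

        def take_snapshot(self):
            self.snapshot = copy.deepcopy(self.memory)

        def power_cycle(self):
            self.memory = None  # volatile memory is lost on power-off
            if self.snapshot is None:
                raise RuntimeError("no snapshot: the mind is simply gone")
            # The best a restart can do is revert to snapshot time.
            self.memory = copy.deepcopy(self.snapshot)

    vm = MindVM()
    vm.live("boarded the ship")
    vm.take_snapshot()
    vm.live("everything experienced since upload")  # never snapshotted

    vm.power_cycle()
    assert vm.memory["experiences"] == ["boarded the ship"]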

Up until that explanation, I was skeptical of the decision because it was a simple "just hibernate them" answer for me.  Thank you for closing that loophole. 

As for the decision itself, I had the opposite problem to Kumara.  The captain wanted to die to save the ship: kill her.  The pair were so wrapped up in each other and themselves that they were communicating millions of times the amount of data, therefore experiencing millions of times the subjective time, therefore they were good candidates to be killed.  The tech officer was basically in his own Sisyphean hell and would have been better off not existing, so kill him.  This choice would not have taken me long; I probably would have stopped at the captain.  But all 4 were justifiable.

The cow says "Mooooooooo"