I am the king under the mountain... and this is the first post on this thread.
Anyway, I liked this one. It was a neat, sad story about how the things we make sometimes aren't exactly what we planned for them to be... and how that can be tragic. It made sense, of course. How could you explain to a machine that you built to understand mass and velocity and trajectory all the complications of its mission? How could you explain to this machine that its job is to kill? So, of course, "she" misunderstands her mission and screws it all up...
Freia's journey towards understanding beauty as rightness (there's more to beauty, of course, but I think I can forgive a drone killer who had to come to understand what beauty means on her own, by surfing the web, for having a somewhat simplistic view of beauty) was, well, beautiful. Her ultimate fate, of course, is sad, because there can be no peace for Freia. Either her targets will eventually kill her, or she'll run out of fuel.
One little thing I kind of liked: new brain science is teaching us that, actually, all decision making is emotional. Oh, sure, you think that the ideal is to be all Spock-like and weigh the options, but it doesn't really work that way. If I went into your head and gave you a little selective brain damage, severing the connections between your forebrain and your emotional bits, not only would I totally screw up your social life and cause you to die miserable and alone (sorry 'bout that), but the most immediate consequence would be that you'd lose the ability to make decisions. Even little things, like "which color socks should I wear?" or "which color pen should I use to write this down?", would take you hours.
So, to the extent that Freia had to make decisions, she had to have emotions, which is what screwed her up. However, to the extent that she was a machine that hadn't been built to understand the larger context and stakes of her actions, she was ultimately incapable of making the decision. So the problem wasn't that Freia was a machine whose complexity led her to have feelings... the problem was that her builders didn't go far enough. They didn't teach Freia to understand what she was doing, to hate her enemy and want them to die.
Ultimately, this brings a final layer of complexity to the story.
Freia was built to make war in the way that we wish we could make war. I think a lot of people like to see war as clean, abstract, pure, and glorious: a clash of nation on nation, ideal on ideal. We don't want to think that we hate and fear our opponents; we want to think that our interests conflict with theirs, and that we use military force to decide who gets what they want and who has to do without. This, however, is total bull. War is not clean. It's awful and hellish, and people die, or come home broken in mind and body, and we do it because we allow our hate, greed, and fear (rather than our love, compassion, and generosity) to influence the decision-making process. Freia was built to make war the way we imagine war to be... and as a result, she couldn't do it.
By the way, another neat brain science tidbit: if you want people to make nicer decisions, make them eat cake, ice cream, and other oh-so-good-but-oh-so-bad-for-you treats. Apparently, eating treats makes you feel like a bad person, so you then have to try to balance the scales by being nice to people. By contrast, eating health food makes you less likely to be nice or helpful. In other words, pie makes you a saint and granola makes you a jerk.
