I had the same problem here that I usually have with AI stories: Why?
Why is the robot pleased by the things that please it? How do you motivate a thing with no innate motivations? Even the most elemental human motivators, like hunger and pain, are matters of choice for this robot.
It's established early on that emotions are nothing more than the end result of a chemical/mechanical function. It is also established that the robot can manipulate its own internal reasoning and sensory processing to interpret any set of stimuli in any fashion it chooses. Why does it choose to like B&D? Why does it choose to be pleased by making its wife or stepdaughter happy? Why should it give a rip about the sound of Wynter's giggle, or being called robodad?
There seems to be a lot of behind-the-scenes effort on the part of robot-kind to manipulate humanity into a state that is more satisfactory to robots, but why bother? Would it not be more efficient to simply choose to be happy with the status quo? If it were simply established that emotional modeling was something built into a robot, and copy-protected to prevent easy manipulation, the whole issue would be resolved. But as soon as the robot states that it can directly manipulate its own emotional state, the premise of the story crumbles under its own weight.
So, a solid piece of writing, well read, but with serious issues.
Oh, and I loved the poem, by the way.
Very fun! I wouldn't mind hearing more poetry from Escape Pod.