Terminators are not programmed to be altruistic. They're not programmed to be cruel either, of course: they will simply carry out their mission by any means necessary.
But if a Terminator develops free will, can it invent altruism from first principles? Can a Terminator develop a moral code based purely on reason and its own inbuilt imperatives? Or do they need the outside intervention of a Creator to give them morality?
Weaver is trying the religious approach, and it's no coincidence that she's chosen Ellison, the devout Christian, to teach morality to her artificial intelligence. I don't think she understands why a sense of ethics is important - and it's clear she has none herself. But she's trying to create an AI with free will, one that will cross the road against the lights, one that will take decisions for itself. Lacking full metaphysical autonomy (a soul, if you will) herself, she can only attempt to mimic the humans around her. So she first turns to a child psychologist for his help, and then to Ellison's divine inspiration. Weaver is attempting to create her own Creator, and that's a task beyond her understanding - so she can only proceed by blind faith.
She is now attempting to input a sense of morality into John Henry's basic programming - and if John Henry does go on to become Skynet 2.0, we can only hope that Ellison's teachings get as far as "Love thy neighbour as thyself" and don't stop at "I, the Lord thy God, am a jealous God."
And then there's Cameron. If we take her at her word - not necessarily the wisest course - her mission is to protect John, and to do that over the long term her secondary missions must be to protect her own existence and to learn how to blend into human society. However, whether she had it all along or whether it's a product of the chip damage she suffered in 'Samson and Delilah', Cameron now has free will. Her basic programming is telling her to terminate John, but she is overriding it... and what is free will if not the ability to override your own base instincts?
Like Weaver, Cameron has no inbuilt understanding of morality and has to ask the people around her for instruction. However, she then applies the lessons she is learning to her own behaviour. This is not simply an act put on to fool others: Cameron is told by Maria the ballet teacher that "Dance is the hidden language of the soul"... and so she goes to her room and dances a ballet there in private, unobserved.
If anything, she's doing it more and more. In Season 1 Cameron was often impatient with the human foibles John and Sarah showed: they were inefficient, didn't help the mission, and were therefore pointless. But now, in Season 2, she's increasingly asking them the reasons behind their actions, taking the answers on board, and then trying to apply the principles to her own behaviour. She does the same with other humans she meets, like the night librarian in 'Self Made Man' - and isn't that an evocative title in light of Cameron's attempt to construct her own personality? Yes, she still comes out with utterly crass comments now and then, to prove that her humanity is still very much a work in progress. She's still ruthless when it comes to fulfilling her objective whatever the cost, just to remind us that she's still a Terminator. In human terms she's a sociopath, valuing other people solely for what they can do for her and her mission. But she's slowly learning otherwise.
Is it all just an elaborate act to help her blend in, taken to extremes because she is now living among humans permanently? Perhaps it is. But for us normal, organic people, morality is also something we are taught as children to help us function in society as adults. From a utilitarian perspective, ethics are the rules we invented to stop us from killing and raping and robbing each other, and as a result they make human civilisation possible. So in the end, what's the difference? If Cameron succeeds in teaching herself to act as a human, how is she different morally from the people who were born human?