> (g) The ten commandments (or another set of moral codes).
I'm sorry to piss on your fire, folks, but it will not be necessary to instill moral codes into our androids. You design a system for what you want it to do, not for what you don't want it to do. You're assuming that "intelligence" (as in humans) implies "free will", i.e. that any possibility can be selected in pursuit of any self-selected goal, in the service of an equally self-selected motivation.

Humans kill, rape and do all these terrible things because our base motivation is malevolent. That goes for everyone, not just the people who actually end up in the situations that make these things necessary. Don't forget that we're genetic beings, evolved over millions of years from a seething mass of competition and death. Robotic slaves made by humans will have a particular purpose (maybe ironing, warcraft piloting, etc.), none of which resembles the domain of natural selection that humans and the other species have been through. Their base motivation will be whatever we make it, and we're hardly likely to make it "survive at all costs, and try to rule the world". A robot will not spontaneously decide to do anything that is not a solution to its base motivations (just like us). We humans just like to glorify ourselves, and in doing so we project the abstract concepts of our superiority and "free will" onto the other "intelligences" that we create.

Let's discuss this further. I'm sure that some of you won't be happy with the above, so we can clarify this most interesting point with some more discussion!