RE: Singularity: AI Morality

Billy Brown
Wed, 9 Dec 1998 12:32:51 -0600

Samael wrote:
> The problem with programs is that they have to be designed to _do_
> something.
> Is your AI being designed to solve certain problems? Is it being designed
> to understand certain things? What goals are you setting it?
> An AI will not want anything unless it has been given a goal (unless it
> accidentally gains a goal through sloppy programming, of course).

Actually, it's Eliezer's AI, not mine - you can find the details on his web site.

One of the things that makes this AI different from a traditional implementation is that it would be capable of creating its own goals based on its (initially limited) understanding of the world. I think you would have to program in a fair number of initial assumptions to get the process going, but after that the system would evolve on its own - and it could discard those initial assumptions if it concluded they were false.
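Just to make the idea concrete: here is a toy sketch of that kind of loop - seed assumptions, derive goals from whatever assumptions survive, drop an assumption when evidence contradicts it. This is purely illustrative (the class, the belief strings, and the refutation mechanism are all my own inventions, not anything from Eliezer's actual design):

```python
class Agent:
    """A toy agent whose goals are derived from revisable assumptions."""

    def __init__(self, assumptions):
        # assumptions: dict mapping a seed belief to the goal it motivates.
        # These are the "fair number of initial assumptions" programmed in.
        self.assumptions = dict(assumptions)

    def goals(self):
        # Current goals are whatever the surviving assumptions imply.
        return sorted(self.assumptions.values())

    def observe_refutation(self, belief):
        # Evidence against a seed belief discards it - and with it,
        # the goal that belief motivated.
        self.assumptions.pop(belief, None)


agent = Agent({
    "humans give reliable feedback": "ask humans when uncertain",
    "resources are scarce": "conserve resources",
})
print(agent.goals())  # both seeded goals are active
agent.observe_refutation("resources are scarce")
print(agent.goals())  # the goal tied to the refuted belief is gone
```

The point of the sketch is only the shape of the process: nothing outside the seed assumptions dictates the goal set, so once an assumption is concluded false, the behavior it motivated disappears with it.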

Billy Brown