Re: Yudkowsky's AI (again)

den Otter (neosapient@geocities.com)
Thu, 25 Mar 1999 23:17:39 +0100



> From: Eliezer S. Yudkowsky <sentience@pobox.com>

> Bryan Moss wrote:
> > Yudkowsky wrote:
> > > If your self is preserved, you wouldn't kill off your
> > > fellow humans, would you?
> >
> > Yes.
> >
> > BM
>
> Okay, we now have:
>
> The Official List of People Not To Let Anywhere Near An Uploading Device:
> 1. den Otter.
> 2. Bryan Moss.
>
> Any other volunteers?

How about yourself? Never *ever* trust someone who jumps at every opportunity to ride the moral high horse. Those are the worst witch-burners. Besides, didn't you write repeatedly that you value the Singularity above everything else? Aren't you the one who wants to place humanity at the mercy of your pet AI?

Let's cut the sanctimonious crap and debate this as reasonable people, ok?

Here's a link that may help clear things up; do click it:

http://pierce.ee.washington.edu/~davisd/egoist/articles/Egoism.Robinson.html