Carol Tilley wrote:
> > I am looking forward, with immense anticipation, to the day
> > when Eliezer looks into the bowels of his transcendent AI
> > and sees my face staring out at him.... :-)
> Yes. And we will each be standing by with a set of paddles, gleefully
> awaiting the opportunity to revive him from his unconscious cognition. Or
> was that conscious incognition?
Okay, I didn't understand this, I don't want to understand this, and I declare myself the winner of this argument on the grounds that my opponents have degenerated into incoherent gibberish.
> > ...the at-risk computational systems are those that attempt
> > to self-evolve without a backup copy.
> and here I thought that gods controlled/created their own destiny
> > Dancin on the edge,
> Well, I suppose that even your virtual edge is serving its virtual
> purpose. <repressing a smile>
> C. Tilley
Are you guys on drugs or something?
--
firstname.lastname@example.org          Eliezer S. Yudkowsky
http://pobox.com/~sentience/tmol-faq/meaningoflife.html
Running on BeOS           Typing in Dvorak          Programming with Patterns
Voting for Libertarians   Heading for Singularity   There Is A Better Way