Darin Sunley wrote:
> Spike Jones wrote:
> >Honest to god, life is fun, even those decades
> >that have 3s and 4s in the tens column. Tell us that the AI guys
> >are planning *something* as an escape mechanism, and I mean
> >something more convincing than Clarke's automatic cable cutter
> >on HAL's power cord. Let's see: packing explosives around
> >the SIAI mainframe rigged to explode should HWoMBeP's heart stop beating?
No, no, no, *no*, NO!
> I just visualized a nanite swarm bleeding out of the computer console,
> enveloping all human beings in the room, and receding, leaving only an
> elegant reliquary [sp?] [container for relics, medieval Catholic church
> thing] containing Eliezer's still-beating heart.
Pretty much, yes. There's a gruesome little story here if anyone wants to
write it.

Friendly AI, rule #7: "If the AI *wants* to violate Friendliness, you've
already lost."
If it gets to the point where explosives packed around the mainframe start to
look reassuring to the clueless, you are already screwed over so thoroughly
that a strategic nuke isn't going to help. Every non-nitwit safeguard happens
*before* a transhuman AI decides it hates you.
-- -- -- -- --
Eliezer S. Yudkowsky http://singinst.org/
Research Fellow, Singularity Institute for Artificial Intelligence