Re: Otter vs. Yudkowsky: Both!

From: D.den Otter (neosapient@geocities.com)
Date: Mon Mar 13 2000 - 16:59:50 MST


----------
> From: Eliezer S. Yudkowsky <sentience@pobox.com>

> Dan Fabulich wrote:
> >
> > So AI and space travel are totally compatible goals. d.Otter may have
> > good reasons to not want you to build the seed at all, but that's a
> > totally unrealistic thing to hope for. So where's the argument left?
>
> I'm totally on board with both. In fact, I've explicitly proposed doing
> both. What worries me is the part where Otter wants to outlaw AI.

Ah no, outlawing never stopped anything. I think the word I
used was "curbed", but I didn't write that the *state* would
have to do the curbing. Not on my website, anyway.

But hey, I'm in full agreement here: getting to space asap
is in everyone's best interest. We can always sort our
differences out later.

> You
> know and I know that as long as humanity survives, the creation of AI is
> inevitable; that doesn't mean that the creation of AI can't be delayed
> until nanocatastrophe either wipes out most of Earth's population (which
> I'd view as a partial loss) or all life in the Solar System (a total loss).

...or, until we ascend. Anyway, it doesn't really matter what
happens after you're dead 'cause you won't be there to see
it. Earth wiped out, galaxy wiped out, universe wiped out...
makes no difference to the dead. Life is *subjective*, and
goals only make sense while you're alive. When you die,
your arbitrary preferences die with you, and that's it.



This archive was generated by hypermail 2b29 : Thu Jul 27 2000 - 14:05:03 MDT