"Michael M. Butler" wrote:
> "Eliezer S. Yudkowsky" wrote:
> > No, actually the whole idea is that the nanotech/AI/superintelligence scenario is
> > hopefully alien enough to eliminate even the need for self-defensive
> > thinking. My own philosophy is that the future is so distant - not just in
> > terms of the environment, but in terms of who we will be - that the best
> > course is to try and be the most intense human you can be, here and now,
> > without moping too much over how much better you would be if you didn't have
> > to be human. Arguably, firearms - and to an even greater extent, martial arts
> > - are a part of that. The question, as always, is time.
> Cheerio. The one rhetorical card I see you palming in the above is "the
> whole idea" s/b "my [Eliezer's] whole idea". :) You're still doing 'way
> better than most politicians.
I didn't get that. And what's "s/b"?
> It can also be read as "I have a handle on a piece of what the Universe
> can dish out, insofar as I am able."
I have that right now, and while I may know which end of a safety is up and
what a roundhouse kick is, the truth is that I don't know how to shoot and I
haven't taken any martial-arts training since age nine or so. That particular
got-a-grip feeling is the result of self-awareness, knowing how your mind will
react even if you run across something you can't handle.
-- -- -- -- --
Eliezer S. Yudkowsky http://singinst.org/
Research Fellow, Singularity Institute for Artificial Intelligence