From: "Jim Fehlinger" <email@example.com>
> Also, effects that some folks on this list can contemplate
> are events that would horrify many people outside of extreme
> circles. For example, Eliezer Yudkowsky has said on many occasions that
> as long as the human race survives long enough to give birth to some
> form of superintelligence, the ultimate fate of humanity is of no
> consequence (to him or, presumably, in the ultimate scheme of things).
> I suspect that this attitude is part of what gives folks like Bill Joy
> the willies.
Please be more careful when quoting people. I'm sure the context of the word
'humanity' indicated 'human-ness'. I really don't think Eliezer means he
doesn't care about the people living now, far from it.
Personally, I'd rather have a better substrate, and one which I can
customize too. This ridiculous bag of mostly-water I pilot is a real drag. I
won't even mention the twisted cognitive machinery I was given. This primate
model is due for a complete redesign. Humanity? Ha! You can keep it!
On the point of it scaring the willies out of people, I think it's already a
lost cause to try to get everyone up to speed. The fact is, the world is
changing faster than society. This will only become more pronounced as the
curve steepens. At some point there will be a destabilization of the old
ways. <sigh> That's gonna take some fancy footwork or we're all screwed. I
have zero faith in 'humanity' to figure it out alone without bloodshed. Just
read the news.
This archive was generated by hypermail 2b30 : Mon May 28 2001 - 09:56:19 MDT