From: Mark Walker (firstname.lastname@example.org)
Date: Tue Feb 26 2002 - 05:14:18 MST
----- Original Message -----
From: "Eliezer S. Yudkowsky" <email@example.com>
> Trying to influence present-day culture in the hopes of influencing
> post-Singularity culture(*) is not the only possible way of entangling
> yourself in the Singularity. If you overestimate the relative importance of
> "cultural development" then you run the risk of investing your limited
> resources incorrectly.
> -- -- -- -- --
> Eliezer S. Yudkowsky http://singinst.org/
> Research Fellow, Singularity Institute for Artificial Intelligence
> (*) Yeah, right.
I am not sure we disagree here. I would count you among the most devoted of
the cultural activists: FAI seems to be an attempt to ensure that we have
the best probability of raising the right sort of mind children. Your
documents are not attempts at 'pop' culture, but they are contributions to
culture in a more inclusive sense. As you well know, there are some who
think that any attempt to influence the direction of the Singularity is
futile. Some of us think that how the Singularity occurs could be as
important as the fact that it does occur.
This archive was generated by hypermail 2.1.5 : Fri Nov 01 2002 - 13:37:41 MST