Olga Bourlin wrote:
> I don't know if Extropians or the Singularity Institute has a SIG which
> addresses social concerns and damage control in the here and now (before the
> Singularity). If not, I would suggest starting one. Brian Atkins has
> already made it known that he doesn't "have time for PC crap. Why not take
> it somewhere else and waste their time with it?" (I understand, he is a
> busy guy.)
The Singularity Institute does not. There are legions upon endless
legions of organizations devoted to damage control in the here and now.
Most people aren't aware that any other kind of damage control exists.
Those institutions have plenty of funding. We don't, and we are not
interested in eating our seed corn. There is no reason for the
Singularity Institute to get involved unless there's an AI or
ultratechnology or Singularity aspect to the potential problem, something
that puts it out of reach of the people who would otherwise attempt to
deal with it.
If/when we have a prototype AI, there may be certain classes of
here-and-now social problems that can only be solved by SIAI - for
example, if an infrahuman AI system in our possession were sufficient to
turn a then-modern PC into a better educational system than a current-day
public school, letting us solve the problem by throwing hardware at it.
That would also constitute a legitimate rationale for SIAI's intervention.
Finally, Brian Atkins is not the Singularity Institute. When he sneezes,
the Singularity Institute does not blow its nose. It takes a majority of
the Board, speaking in their official capacity, for that to happen. Thus,
Brian is not obligated to watch his every word in case someone
misinterprets it. Please do bear this in mind.
-- -- -- -- --
Eliezer S. Yudkowsky http://singinst.org/
Research Fellow, Singularity Institute for Artificial Intelligence
This archive was generated by hypermail 2b30 : Fri Oct 12 2001 - 14:40:14 MDT