When Eliezer speaks of "The Singularity" (tm?) he means a friendly one.
ExI also plays a big role, IMO, in helping tip the balance toward a friendly
rather than an unfriendly event horizon. Imagine if we had never had these
discussions, and AI researchers just went off doing clueless stuff that
had a much higher chance of ending badly. Hmm, well, actually they do that
anyway, but at least _some_ of them are waking up to the all-important
Friendly AI issue.
Samantha Atkins wrote:
>
> "Eliezer S. Yudkowsky" wrote:
> >
> > Zero Powers wrote:
> > >
> > > "Your dollars accomplish around a hundred times as much"??? Really?
> > [snip]
> > > Don't get me wrong, I'm not knocking Exi or agreeing with Ian. But I think
> > > your comments might be just as much an overstatement as his post was.
> >
> > Hey, get with the Singularity. Billions of lives are at stake here. If the
> > ExI has a total pre-Singularity lifetime budget of, say, $1M (sadly, probably
> > a huge overestimate) and it contributes a hundredth of one percent to the
> > "win" - assuming we win - then that's the equivalent of, say, six lives for
> > ten dollars. Alternatively, if the existence of ExI means the Singularity
> > happens four days sooner, then that's also the equivalent of six lives for ten
> > dollars. Can the Red Cross match that? I really don't think so. I think
> > they'd be hard put to save another six lives with another thousand dollars.
> >
> >
>
> I hope you are tongue in cheek. Billions of lives are at stake in the
> world in any case. It is not at all clear that getting to the
> Singularity will be a net good thing that will aid rather than
> destroy billions of lives. The work has to be much broader imho than
> simply racing toward the Singularity.
>
> - samantha
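
[For reference, a worked version of the arithmetic behind Eliezer's "six
lives for ten dollars" figure above, assuming a world population of roughly
six billion (the post only says "billions") and taking the $1M budget and
0.01% contribution at face value:]

\[
  6\times 10^{9}\ \text{lives} \times 0.0001 \;=\; 6\times 10^{5}\ \text{lives attributable to ExI}
\]
\[
  \frac{\$1{,}000{,}000}{6\times 10^{5}\ \text{lives}} \;\approx\; \$1.67\ \text{per life} \;\approx\; 6\ \text{lives per}\ \$10 .
\]

[The "four days sooner" framing gives roughly the same figure if one assumes
on the order of 150,000 deaths worldwide per day: four days is about 600,000
lives.]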
--
Brian Atkins
Director, Singularity Institute for Artificial Intelligence
http://www.singinst.org/