On Sun, 24 Sep 2000, Samantha Atkins wrote:
> "Eliezer S. Yudkowsky" wrote:
> >
> > Samantha Atkins wrote:
> > >
> > > "Eliezer S. Yudkowsky" wrote:
> > >
> > > > Oh, why bother. I really am starting to get a bit frustrated over here. I've
> > > > been talking about this for weeks and it doesn't seem to have any effect
> > > > whatsoever. Nobody is even bothering to distinguish between subgoals and
> > > > supergoals. You're all just playing with words.
> > >
> > > Hey! Wait a second. If you are going to be a leader in the greatest
> > > project in human history (or in any project for that matter) you have to
> > > learn and learn damn fast to be able to motivate and enroll people.
> >
> > No, actually I should expect that the seed AI project will have smaller total
> > expenditures, from start to finish, than a typical major corporation's Y2K
> > projects. I used to think in terms of the greatest project in human history,
> > but I no longer think that'll be necessary, and a damn good thing, as I don't
> > think we're gonna *get* the largest budget in human history.
>
>
> Huh? I wasn't using "greatest" with respect to budget or size of
> development team (although I think both will be greater than you
> think). I was using it in terms of the criticality of this project.
> You continuously tell us it is our only hope. It is difficult to
> imagine a project much "greater" than that.

And what will it accomplish in terms that laymen can understand, the way
we understand a computer playing chess better than a human or doing
arithmetic better than a human?
FWP