> What you and I need to worry about is the AIs getting their own ideas,
> completely independently of anything we did, and acting on those. You
> need to worry that the AIs will do an "Eliezer Yudkowsky" on you and
> reject the motivations they started out with, in favor of some more
> logical or rational set of goals.
Yeah, whatever, but it's still those primitive emotions that drive you on. Eliminate the passion, and... well, see below.
> I need to worry about the AI, like
> EURISKO, suddenly deciding that if it shuts itself down it won't make
> any mistakes - or making some other logical error.
Hey, if the AI thinks that shutting itself down is the Right Thing To Do, then who are *you* to question its competence? Logical error my ass, death is the meaning of life. 8-P
> Emotions don't enter into it, and neither does the way we treat them.
> AIs don't react. They act.
Without emotions or some other motivational system, I'm afraid they'll just sit on their asses and be lethargic. Not that I'm complaining, of course, since that would further the cause of uploading.