"Eliezer S. Yudkowsky" wrote:
> I grew up knowing I was a genius and knowing that I could almost
> certainly be part of something big if I played my cards right.

I have no doubt of that. However, since I am probably not a genius (yet), I have up
to this point been unable to argue with you on the finer points of your reasoning.
But after abundant contemplation, I'm beginning to notice what may be some
logical inconsistencies in your position.
> I am not acting on wishes. I do not at this time project that the Singularity
> will result in the gratification of any of the desires that initially
> motivated me, not pride, not the acknowledgement of greatness, not the fun,
> and probably not even the knowledge of success. The personal drama that once
> captivated me is irrelevant. I am acting on logic.
Ok, so you're acting on logic. If I understand you correctly, all of the things that make life enjoyable (fun) are irrelevant because logically our only outcomes are Singularity or oblivion? Either we face the inevitable or accept extinction?
> Now, I happen to think that even from humanity's perspective, a rapid
> Singularity is the best way to go, because I don't see a feasible alternative.
Can you concisely explain why a non-rapid path to the Singularity is infeasible?
> The only way that any of us
> can "not die" is through a friendly Singularity. If that's impossible, well,
> at least our deaths will be the ethically correct thing to do. Sooner or
> later human civilization will perish or go through a Singularity. This I
> guarantee. What can you possibly accomplish, for yourself or for anyone, by
> delaying it?
My life, for starters. Unlike you, Eliezer, I couldn't care less about the Singularity if it means the end of my existence. What I do care about is the continued existence of my memetic conscious self and of others in my memesphere (which includes even you, Eliezer). Now, if that means I must embrace the Singularity or face death, then you and I are in agreement. I'm very willing to embrace logical outcomes, but only under the condition of my continued existence. If somehow 'logic' prevented me from surviving a situation, then 'logic' would have to go. Call me selfish, call me egocentric; but I'm not about to put my faith in an unknowable singularity (god) over my own self-directed consciousness. I would rather lead a long and futile quest for perfection as an individual than join a potentially more satisfying collective borganism.
> But that could all be rationalization. It's not the reasoning I use. I wish
> to do the right thing, which is a question that is resolved by intelligence,
> and thus I am required to create a higher intelligence to accept orders from.
> My allegiance is to the Singularity first, humanity second, and I'm tired of
> pretending otherwise to myself.
Paul Hughes