Eugene Leitl wrote,
> ...I would get second and
> third and fourth opinions from independent experts in the area) and
> *would not cease and desist* when confronted with what they do and
> *given proper time to recant* would have to be prevented from what
> they do, *preferably by locking them up*, or, failing that, by
> *executing them*.
I don't know... This just doesn't sound very friendly to me.
In fact AI seems much more friendly.
Even if AI did not seem more friendly, I'd trust it more than this brand of "friendliness":
> ...I will be compelled to terminate these people with
> whatever means are at my disposal.
How very depressing. I suppose Joseph Stalin had similar feelings.
All right, go ahead and terminate me -- but can you please spare my kitty cat?
She's just a little furry feline critter... she likes to purr when she gets petted.
Please let her live, won't you? I mean, she won't ever hurt you or anything.
She doesn't eat or drink very much, and her atoms are small... so you won't need
to eat her until much later in your world domination plan. Can't you just maybe
let her go into the woods or something?
Her name is Eva, and she wouldn't hurt a fly (well, not maliciously anyway).
> Now assuming you're mistaken about your superhuman AI,
Yes, you do need to make such an assumption, don't you?
Sort of like, "Assuming you're mistaken about your child being human" -- then we
can go ahead and use it to fuel Eugene's Turing Police Patrol and Donut Factory.
> Enough of this crap.
Now there's a quotation worthy of the Amara Graps Collection.
"Enough of this crap."
This archive was generated by hypermail 2b30 : Mon May 28 2001 - 09:50:15 MDT