Damien Broderick <d.broderick@english.unimelb.edu.au> Wrote:
>If it [the self] is somehow separate, with its own (perhaps hard-won) canons
>of principles, it might feel swayed but still need to *choose*
Yes, it needs to choose, it needs time for processing, it needs time to think, because it doesn't know what state its mind will end up in regarding the issue in question. Turing proved that problems exist that can never be resolved into a satisfactory mental state (the computation never stops), and frustratingly he also proved there is no general way to identify such impossible problems in advance. Everybody has problems that bother them; we ignore them for a time and do other things, but they gnaw at us and we go back for another try. Sometimes we eventually win and the problem stops bothering us, and sometimes it never does.
Some things are futile to work on, but we don't know what they are.
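For anybody who wants to see why, here is a minimal Python sketch of Turing's diagonal argument; the names halts and paradox are just illustrative, and halts stands in for a claimed universal halting tester that nobody actually has:

    def halts(program, argument):
        # Hypothetical oracle: True if program(argument) eventually stops.
        # Turing showed no always-correct, always-terminating test like this can exist.
        raise NotImplementedError

    def paradox(program):
        # Do the opposite of whatever the oracle predicts about program run on itself.
        if halts(program, program):
            while True:   # oracle says it halts, so loop forever
                pass
        return            # oracle says it loops, so halt immediately

    # Now ask what paradox(paradox) does: it halts exactly when halts() says it
    # doesn't, so no correct halts() can exist -- and no general procedure can
    # tell you in advance which problems are the hopeless ones.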
>Does a totally consistent libertarian, or an enlightened and egoless Zen
>saint, or a totally depraved fiend, feel like a robot?
No, because nobody can escape Turing. Well ... an egoless man might, but I don't believe in zombies.
O'Regan, Emlyn <Emlyn.ORegan@actew.com.au> Wrote:
>Say you were a super super ... super intelligence (S^NI), modified beyond
>all comparison with the gaussian version of yourself. After a particular new
>modification to jump you up to a new level of intelligence, you find that
>you are so awesomely intelligent that you can predict with 99.99% accuracy
>the outcome of any action that you might consider,
John K Clark jonkc@att.net