Eliezer S. Yudkowsky wrote:
>JKC has noted that evolution has put a tremendous amount of effort into
>detecting and concealing lies, so a perfect truth machine is unlikely.
>I say exactly the opposite: Evolution has put so much effort into lies
>that there's probably a module of the brain devoted to lying (anyone
>know if someone's looked for it?), and thus it might be very easy to
>detect activity with an fMRI. Evolution baffles verbal and kinesic
>perceptions, but would have absolutely no reason to defend against
>neuroimaging. The inventor might not even need much cognitive science;
>a neural net might be very easily trainable to decode "lying" brain activity.
I haven't read the book, though it should arrive today from amazon.com.
I agree with Eliezer that it is likely we can find clear fMRI signatures of conscious, intentional lying. What bothers me more is that we are conscious of only the smallest fraction of what goes on in our heads. Our unconscious selves are quite familiar with "lying" to our conscious selves to help our genes reproduce.
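To make the neural-net idea concrete: the kind of classifier Eliezer has in mind could be as simple as a single logistic unit trained on labelled activation patterns. The sketch below is purely illustrative — the "scans", the voxel count, and the idea that specific voxels light up when lying are all made-up assumptions, not real neuroscience.

```python
# Illustrative only: train a one-unit logistic "net" to separate
# synthetic "lying" vs. "truthful" activation vectors. All data is fake.
import math
import random

random.seed(0)

N_VOXELS = 8          # pretend each scan is 8 voxel activations
LIE_VOXELS = {2, 5}   # hypothetical voxels elevated when lying

def fake_scan(lying):
    """Generate a synthetic activation vector with Gaussian noise."""
    return [random.gauss(1.0 if (lying and v in LIE_VOXELS) else 0.0, 0.2)
            for v in range(N_VOXELS)]

# Labelled training set: 1 = lying, 0 = truthful.
data = [(fake_scan(label), label) for label in (0, 1) for _ in range(50)]
random.shuffle(data)

# One logistic unit trained by plain gradient descent.
w = [0.0] * N_VOXELS
b = 0.0
lr = 0.5
for _ in range(200):
    for x, y in data:
        z = sum(wi * xi for wi, xi in zip(w, x)) + b
        p = 1.0 / (1.0 + math.exp(-max(-60.0, min(60.0, z))))
        err = p - y
        w = [wi - lr * err * xi for wi, xi in zip(w, x)]
        b -= lr * err

def predict(x):
    """Classify a scan: 1 = lying, 0 = truthful."""
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0

accuracy = sum(predict(x) == y for x, y in data) / len(data)
print(f"training accuracy: {accuracy:.2f}")
```

Of course, the whole point of my worry is that this only catches *conscious* lies: a sincere self-deceiver's scan would look "truthful" to any such classifier.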
Romeos sincerely believe they'll "love her forever," even though it'll be a different her in a week. Politicians sincerely believe that public schools are best for us all, even though for some reason they prefer to send their own kids to private schools. Salespeople find it easy to believe that their product really is the best for you. Most people find it easy to think they are above-average drivers, lovers, etc. "The Moral Animal" by Robert Wright discusses a lot of this sort of self-deception. See also: http://hanson.berkeley.edu/belieflikeclothes.html
As John Clark said:
>Another problem is that the most dangerous and horrible monsters are
>also sincere monsters. I don't think it was just an act, I think Hitler
>really thought Jews were subhuman and that he was doing a good thing by
>butchering them. Sincerity is a vastly overrated virtue.
With a truth machine, people would want to avoid intellectuals, cynics, and others who point out how their "sincere" beliefs are inconsistent and self-serving. After all, if they listened to such critics they might adopt beliefs that would hurt them socially: "In all likelihood, I'll only love you this much for a week." I fear people might instead begin a strong and perhaps bloody suppression of critical voices. Sorta like the wartime suppression of those who aren't sure the war is a good idea, only much more so.
email@example.com  http://hanson.berkeley.edu/
RWJF Health Policy Scholar
140 Warren Hall, UC Berkeley, CA 94720-7360
510-643-1884  FAX: 510-643-8614