Robin Hanson wrote:
> I agree with Eliezer that it is likely that we can find clear fMRI
> signatures of conscious intentional lying. What bothers me more is the
> fact that we are only conscious of the smallest fraction of what goes on
> in our heads. Our unconscious selves are quite familiar with "lying" to
> our conscious selves to help our genes to reproduce.
Heh, heh, heh! But you see, my dear Dr. Hanson, there's an immense amount of evolutionary effort behind our ability to lie to ourselves. Does everyone see where this is going?
The development of an "unrationalization" device would have an impact even larger (albeit less violent) than that of a truth machine. This may well be the most powerful form of near-current-tech intelligence enhancement I know of, perhaps more powerful than neurosurgical Algernic specialization. The impact of this device is almost impossible to estimate. If the development of a truth machine represents a change in the sociopolitical climate on the order of an Ice Age, then an unrationalization device represents an asteroid strike.
Such a device is mentioned in passing in a Story In Progress of mine, although it really deserves its own novel.
> Romeos sincerely believe they'll "love her forever", even though it'll
> be a different her in a week. Politicians sincerely believe that public
> schools are best for us all, even though for some reason they prefer to
> send their kids to private schools. Sales people find it easy to believe
> that their product is really the best for you. Most people find it easy
> to think they are above average drivers, lovers, etc. "The Moral Animal"
> by Robert Wright discusses a lot of this sort of self-deception. See
> also: http://hanson.berkeley.edu/belieflikeclothes.html
A wonderful book; it was my introduction to evolutionary psychology. Required Reading. Anyway, given the highly specialized behaviors, it isn't at all hard to believe that there are specific brain areas involved.
> As John Clark said:
> >Another problem is that the most dangerous and horrible monsters are
> >also sincere monsters. I don't think it was just an act, I think Hitler
> >really thought Jews were subhuman and that he was doing a good thing by
> >butchering them. Sincerity is a vastly overrated virtue.
> With a truth machine, people would want to avoid intellectuals, cynics,
> and others who point out how their "sincere" beliefs are inconsistent and
> self-serving. After all, if they listened to such critics they might
> adopt beliefs which would hurt them socially. "In all likelihood, I'll
> only love you this much for a week." I fear people might instead begin
> a strong and perhaps bloody suppression of critical voices. Sorta like
> what happens in war time suppressing those who aren't sure the war is a
> good idea, only much more so.
We appear to be in definite agreement on one thing: A truth machine would cause a gigantic political upheaval. A lot of social arrangements would snap like twigs and the balance of power everywhere would drastically alter. It would be the Y2K problem of the social fabric.
Politicians with all their dirty linen suddenly aired. Church officials - what happens if the machine red-lights the Pope? Dictators stamping out all traces of rebellion. Children of religious parents punished for their hidden doubts. Consensual criminals hunted down. Chairmen who don't really care about their employees ousted. Ninety-nine people with dirty linen and one altruist trying to get all use of the machine declared a civil rights violation. UN bans. Uprisings. Black-market machines used for blackmail. Congressbeing Lawyer replaced with Congressbeing Fanatic, Congressbeing Rationalizer, and Congressbeing Idiot.
On the whole, I really think it would be a good idea if the "unrationalization" machine were introduced first.
Singularity permitting, I expect the Age of the Neurohackers to begin within 15 years. An article in _Wired_ once pointed out that the reason the great biotech frontier hasn't gone mainstream is that biotech is so expensive; there are no hackers, no cowboys. The author expected biohackers in 30 years, I think. I expect the first neurohackers in 10.
Ah, it's an exciting time to be alive!
But don't forget the canned food and shotguns.
--
email@example.com  Eliezer S. Yudkowsky
http://pobox.com/~sentience/AI_design.temp.html
http://pobox.com/~sentience/sing_analysis.html
Disclaimer: Unless otherwise specified, I'm not telling you everything I think I know.