It appears as if Hal <firstname.lastname@example.org> wrote:
|Greg Egan had a short story about a guy who had some brain damage that
|left him unable to perceive beauty. So they did an operation to fix it,
|and he ended up with the ability to manually control how beautiful he
|perceived any given object or situation. He could pop up a little dial
|in his visual field and mentally twist it one way or the other, to make
|whatever he was looking at seem incredibly beautiful or very plain.
|Egan seemed to want us to draw his usual moral (IMO) about how this
|illustrates the pointlessness of everything, how our perceptions are
|simply arbitrary and meaningless, etc. My gripe was that it presented an
|over-simplified view of beauty; it is not just a single perception that
|could plausibly be tweaked as in the story. It comes from many different
|sources. Many aspects of an entity may cause it to be considered beautiful:
|its complexity, its symmetry, its concordance with other aspects of
|itself or of your beliefs and feelings. You can't just tweak a knob,
|you would need to go much deeper.
Do you suggest that ``beauty'', etc., have an objective existence outside of the neurological circuits of the mind observing the ``beauty''?
If not, then you are, in essence, discussing the complexity of the knob implementation.
If you do, then you subscribe to a mysticist world view, in which science can never do <this> or <that>. Such beliefs have cropped up before, and they have always turned out to be due to human error.
|The very idea that a trivial one-note hum could be made to seem beautiful
|in the same way as an elaborate symphony, is not only wrong but IMO
|pernicious. It cheapens our perceptions, it cheapens the inherent
|beauty and complexity of the mind itself. Yes, we are machines, but
|we are complex, intricate, endlessly fascinating machines. Egan always
|seems to miss this point. Rather than losing faith because we are mere
|machines, we should marvel at what mere machines can become.
Humans should not let the amygdala rule supreme in these matters.
The symbological threat that the ``human machine'' concept poses to your ``humanity'' means nothing, unless one embraces the belief that machines somehow have a lesser ``value'' than humans.
Personally, I perceive no reason to add a mysticist layer to my perception of the human machine to justify the notion that complex intelligence systems have a value in themselves, especially the conscious ones.