Timothy Bates wrote:
> >blah blah blah.
> the only hard problem, IMHO, is keeping all the philosophers who want to
> waste our lives talking about the hard problem instead of solving the series
> of "easy" problems, which is all there really is.
> This "hard problem" phrase is really just an ambit claim for high
> intellectual ground, ie., "oh yes, but you biopsych's only study the
> non-hard problem ..."
> ZhuangZi: Look how happy those fish are!
> HuiZi: You're not a fish; how can you know they are happy?
> ZhuangZi: You're not me; how do you know I don't know?
I agree, in the sense that there's a persistent illusion of a "hard" consciousness problem, or a perceived need for conscious perceptions to have some uniquely special status. Look at some of Searle's arguments about "semantic contents," for instance: he might as well be talking about ectoplasm or some other kind of ghostly stuff. Here on this list, we've got high-intellectual-ground arguments of a comparable nature, where maybe perfectly smart ETs wouldn't be conscious because of not having the right ectoplasm, or, excuse me, because maybe they wouldn't have the right sort of super-hypothetical physics connection, or something.
I've been inspired by reading Dennett's book _Kinds of Minds_. He makes interesting arguments about how the evolutionary process relates to the development of mental capabilities; it's all part of what he calls the Tower of Generate and Test. Meanwhile, if we really want to get back to basics, we should recognize that our mental processes are themselves real-world computational events, and therefore no *less* real than anything else in the universe. In other words, "I think, therefore I think". Also, "how do you know that I don't know whether a fish has feelings", like hey man, that's right on!
David Blenkinsop <firstname.lastname@example.org>