Re: Information

VirgilT7@aol.com
Tue, 7 Jul 1998 17:19:46 EDT

In a message dated 7/5/98 1:34:28 AM Eastern Daylight Time, johnkc@well.com writes:

<< I further predict that I'm correct so you'll never find a counterexample to prove me wrong, but I predict I can't prove I'm right either because a proof that I'm correct, that is, a way to demonstrate it in a finite number of steps, does not exist. Not that I can prove any of this of course.>>

<g> And on and on we'll go. Perhaps the problem lies in our criteria of knowledge.

          
<< >"deep reality" (and I'm really sure that I know what that means).

Great! Even though I'm the one who used the term so I'm sure it has profound significance, I really wish somebody would explain it to me because I haven't a clue.>>

It might be safe to say that it refers to possibly existing aspects of reality that can be known only through extreme extrapolation from our immediate senses, if known at all. I understand that as a definition it has an enormous number of inadequacies, but perhaps we can agree on it as a rough pointer and worry about refinements later, if at all.

<< >You're assuming mental content to be a function only of complexity.

Yes, if something is able to act in a complex way similar to the way I do then its mental content must be similar to mine.>>

Yet it's entirely possible that there could exist objects that look, and act, just as you do, but possess none of the internal mental content that you do. If mental content provides the best explanation for the object's behavior, however, then I agree that we should consider it as having mental content.

          
<< >I think that there are a lot of reasons not to assume that.

If so then religious people are right, the soul exists. I think that is, to put it politely, unlikely.>>

Hmmm... I may not have communicated clearly. I meant that there are a number of reasons to suppose that complexity by itself will not necessarily give rise to consciousness, and that particular substances and particular configurations of those substances could be required: e.g., an extraordinarily complex assortment of foam rubber might not give rise to consciousness while a less complex assortment of nerves might.

<< I think other people are conscious, at least part of the time, because they act that way, but not when they're in a deep sleep or a coma or dead. To put it another way, I assume something is conscious if it acts intelligently. I admit that all this is just an assumption but we'll never get anything better.>>

I don't agree that it's merely an assumption. I think that the fact that we are all very similar creatures physically, the fact that we developed via evolution, and a number of other facts could lead one to make the strong case that the existence of consciousness in other beings provides the best explanation of their behavior.

          
<< >>Me:
>>If you and everyone and everything you know are nothing but
>>software programs then ...

>Then what?

Then you and everything you know are information.>>

Oh... ok, I see now. But we wouldn't be the written record of the program, but the actual execution of it. And while the program might tell us how the machine is <g> executing us and thereby creating us, the execution itself is not necessarily information, even though it can of course provide information.

          
<< >I really think that the brain in the vat assumes a lot less about
>consciousness than the software analogy, which doesn't I think work
>necessarily.

Except for the complexity problem which will certainly be solved in time, what magical property does meat have that computers lack?>>

I suppose that I'm very reluctant simply because I haven't given the software analogy much thought, and am not sure of what the implications of it would be, while I am very familiar with the brain in a vat analogy.

<< Without food you die, without information a program crashes.>>

Your original comment was: "Nothing can provide anything but information." And you very conveniently quoted that in your reply. But I don't see the link between this final comment and that first comment.

          
<< >Unless you're going to make every single cause and effect sequence
>an instance of language, which is strikingly absurd

Exactly, so why go on and on about lightning being a language?>>

So every cause and effect sequence is NOT an instance of language?

<< >consciousness must be for there to be a language.

Let's see, I know other people are conscious because they use language and I can tell it's a language because the people are conscious. What's wrong with this picture?>>

:) I didn't draw the picture of any such argument. It might have some strength though, with a little tinkering.

<< >If we can say that the "grammar" of a genetic code is the fact that
>it works in 3 chemical units,

There is no "if" about it, that's the way the genetic code works.>>

There's an "if" as to whether it can be called "grammar" in a literal as opposed to a metaphorical sense.
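
Just so we're both looking at the same object, and strictly as an illustration rather than a concession about "grammar," here is a toy sketch of that triplet mapping; the table and the names are my own shorthand, and only a few of the 64 codons are filled in:

# A toy picture of the 3-chemical-unit ("triplet") structure of the genetic code.
# Only a handful of the 64 codons are listed; this is a sketch, not biology.
CODON_TABLE = {
    "AUG": "Met",   # also the usual start signal
    "CAU": "His",
    "CAC": "His",
    "GGU": "Gly",
    "UAA": "STOP",
}

def translate(mrna):
    """Read an mRNA string three letters at a time, the way a ribosome does."""
    protein = []
    for i in range(0, len(mrna) - 2, 3):
        amino_acid = CODON_TABLE.get(mrna[i:i+3], "?")
        if amino_acid == "STOP":
            break
        protein.append(amino_acid)
    return protein

print(translate("AUGCAUGGUUAA"))   # ['Met', 'His', 'Gly']

All I want from the sketch is the shape of the thing: three letters in, one product out, from a finite lookup table. Whether that shape deserves to be called a grammar is exactly the question.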

<<As I said, a language with an infinite number of letters is gibberish.>>

Not if the function of each succeeding letter were defined, after some initial set of letters in the sequence, in terms of the letters within that original set.
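
If it helps, here's one rough way to picture what I mean (my own construction, nothing in your argument depends on it): keep a small core alphabet and define every further "letter" as a combination of core letters, roughly the way ten digits define every number.

# A small core alphabet plus a rule that names every further "letter"
# in terms of core letters only -- infinitely many letters, zero gibberish.
CORE = "ABCD"

def letter(n):
    """Return the n-th letter: core letters first, then defined combinations."""
    if n < len(CORE):
        return CORE[n]
    name = ""
    while n:
        n, r = divmod(n, len(CORE))
        name = CORE[r] + name
    return "#" + name   # '#' just marks a defined (non-core) letter

print([letter(n) for n in (0, 3, 4, 17, 100)])   # ['A', 'D', '#BA', '#BAB', '#BCBA']

Each new letter is perfectly well defined, yet the alphabet never runs out, so "an infinite number of letters" and "gibberish" don't have to go together.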

<< >>Me:
>>in what way would the ribosome act differently if CAU did mean
>>something to it?

>It wouldn't, necessarily.

Apparently you don't realize it but by making this huge concession you've thrown in the towel. If meaning doesn't change anything, then there is no point in talking about it.>>

It does not *necessarily* change the behavior of the object. Whether it does or does not depends of course on many factors aside from whether the object comprehends meaning.

<< If you're right then I'd have no way of showing that language exists, not ever. You expressed incredulity that someone could believe that another human being is not conscious but now you say that behavior is no guide to the mental state of anything.>>

I really hate to cut up your argument like this, but it'll probably save time in the end. I did NOT say that behavior provides zero evidence as to whether an object has mental states or not. I said that a ribosome would not *necessarily* act differently if CAU really meant something to it, that is, that it expressed a certain concept that the ribosome understood. It might be that the ribosome would think to itself, "Oh! CAU! Okay, time to start producing x protein..."

I didn't reply to the last part of the argument because it was based on the misunderstanding.

<< One other thing, although our internal mental states and emotions may be priceless to us they are of interest to evolution only as they affect behavior, and if they don't then there is no way random mutation and natural selection could have produced it for us to enjoy. Yet we have it, or at least I do.>>

Could be a side-effect.

<< The logic is impeccable and the conclusion absurd, therefore the premise must be wrong: we CAN deduce meaning from behavior, and if something starts to attach meaning to a great many things, much, much more than the 64 simple triplets that ribosomes can handle, then we can begin to start calling something like that conscious.>>

I don't think that we can deduce meaning from behavior, for reasons already stated; furthermore, the absurd conclusion which you argued against was never drawn by me, so the above conclusion, elegant though it may be, simply won't carry.

Andrew