Re: Penrose

Hal Finney (
Mon, 10 Nov 1997 12:43:22 -0800

John Clark, <>, writes:
> Penrose's argument is that if people are algorithmic like a Turing machine
> then we couldn't jump out of the system, look at ourselves and find the
> inadequacies in the system as Godel seemed to do in his proof. I think at the
> very lowest level we are the equivalent of Turing machines, however the
> axioms are numerous but ridiculously simple at that level, "If you're in
> state A and see a 0, change it to a 1, go to state B, and move one square to
> the left". I don't think we can jump out of THAT system and change the basic
> operating principles that Turing found 60 years ago, but that very low level
can't be the level at which the conscious mind operates. Something must be running on
> top of that machinery, something that has few if any axioms and so is not a
> formal system.
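The transition rule quoted above is easy to make concrete: a Turing machine is just a table lookup from (state, symbol) to (new symbol, new state, move). A minimal sketch in Python, where the second rule and the demo run are invented purely for illustration:

```python
# A minimal sketch of the quoted transition rule. The machine is a
# table mapping (state, symbol) -> (new_symbol, new_state, move).
from collections import defaultdict

rules = {
    # "If you're in state A and see a 0, change it to a 1,
    #  go to state B, and move one square to the left."
    ('A', 0): (1, 'B', -1),
    ('B', 0): (0, 'A', +1),   # hypothetical second rule, for illustration
}

def step(state, head, tape):
    """Apply one transition; return the new (state, head) pair."""
    symbol = tape[head]
    new_symbol, new_state, move = rules[(state, symbol)]
    tape[head] = new_symbol
    return new_state, head + move

tape = defaultdict(int)        # unbounded tape of 0s
state, head = 'A', 0
state, head = step(state, head, tape)
print(state, head, tape[0])    # B -1 1
```

This is the "very lowest level" in question: nothing but mechanical table lookups, with no room to jump out of the system.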

It's not clear to me that minds and/or neurons can be expressed as formal
systems - equivalently, that there can be a computer program which
_exactly_ simulates a brain. This is because we do not have a full
understanding of the laws of physics. It is possible, as Penrose argues,
that the laws of physics are nonlocal and/or have non-algorithmic
properties. If so, his argument does not apply even at the lowest level
of our own brains.

However, dealing with an upload or an AI is a different matter. Now we
have a full understanding of the substrate which is executing the program.
It is a completely deterministic, mechanical and logical machine, and is
even relatively simple at the lowest level - as little as a dozen or so
opcodes is probably enough to be a universal machine able to run an
upload, by our current understanding.
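The "dozen or so opcodes" figure can in fact be pushed much lower: a Minsky counter machine is universal with just two instruction types over unbounded counters. A sketch (the demo program, which moves one counter into another, is invented for illustration):

```python
# A Minsky counter machine: universal with only INC and DECJZ
# acting on unbounded counters. This illustrates how simple a
# deterministic universal substrate can be.
def run(program, counters):
    pc = 0
    while pc < len(program):
        op, reg, *rest = program[pc]
        if op == 'INC':
            counters[reg] += 1
            pc += 1
        elif op == 'DECJZ':          # decrement, or jump if zero
            if counters[reg] == 0:
                pc = rest[0]
            else:
                counters[reg] -= 1
                pc += 1
    return counters

# Demo: add counter 0 into counter 1, leaving counter 0 at zero.
prog = [
    ('DECJZ', 0, 3),   # if c0 == 0, jump past the end (halt)
    ('INC', 1),
    ('DECJZ', 2, 0),   # c2 is always 0, so this jumps back to 0
]
print(run(prog, {0: 3, 1: 4, 2: 0}))   # {0: 0, 1: 7, 2: 0}
```

Every run of this interpreter is fully determined by the program and the initial counters, which is the point at issue.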

Given that the substrate is deterministic and mechanical, it follows
(I think) that any program loaded onto it shares these properties,
even if it is a complicated brain simulation. What you need to do is
to produce some kind of concrete way of defining what it means for the
simulated mind to believe a certain mathematical proposition. This would
require a complete understanding of how its consciousness worked, and
what the neural (or lower level) structure was which mapped to conscious
belief, but is possible in principle. Given this, you can set up a formal
system which expresses all possible beliefs that the simulated mind can
ever hold.
The key idea here is that this is only possible because we know the mind
is running on a deterministic, formally structured computer. It follows
that the mind is also deterministic and has a formal structure of its
own. A mind running on neurons and the laws of physics does not have
this property (unless it turns out that the laws of physics are in the
appropriate sense deterministic and algorithmic, which Penrose believes
is not the case).

The mind running on the computer, then, differs from the mind in the
brain, because we can theoretically set up a formal system which expresses
all the thoughts of the first, but we may not be able to do so (Penrose
would say cannot) for the second. Present the mind on the computer with
a Godel statement based on the formal system which constrains it, and
it will not be able to see that it is true, while the mind in the brain
supposedly has no such limitation.
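The Godel statement in question is the standard construction, not anything specific to this post: given a formal system F (here, the one expressing all the upload's possible beliefs), the diagonal lemma yields a sentence that asserts its own unprovability in F:

```latex
% The Godel sentence for a formal system F: a sentence G_F such that
G_F \;\leftrightarrow\; \neg\,\mathrm{Prov}_F\!\left(\ulcorner G_F \urcorner\right)
% If F is consistent, F cannot prove G_F; yet on that very assumption
% G_F is true. Seeing this truth "from outside" is the step Penrose
% claims a formal system cannot take about itself.
```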

> Minds are problem solvers and I don't think non-algorithmic problem solving
> is necessarily mystical. I doubt if we use axioms like "if A and B then C"
> very much, more important are billions of heuristics like "if A and B then
> usually something close to C".

True, but this is a very gross level of description. To actually pin
down those heuristics will turn them into immensely complicated but
deterministic formulas. (As discussed before, if the formula needs a
random component, it can be supplied by a deterministic random number

> If the mind is not a formal system, at least at the highest conscious level,
> then Godel does not apply to us or to an intelligent program running on a
> computer.

Right, but Penrose would claim that a mind uploaded to a computer is a
formal system, even at the highest conscious level, as I understand it.