Rob Harris <firstname.lastname@example.org> writes:
> >no matter what our level of technology, we
> >will have uncertainty due to limitations in information processing,
> >sensing abilities, information storage and complexity.
> How certain can we be that these limitations are completely insurmountable ?
First of all, we have Gödel's theorem, which shows that any sufficiently powerful formal system either contains true but unprovable statements or is inconsistent. This means there are statements that cannot be proven within your system unless you extend it further, and then the extended system will contain new unprovable statements of its own. In practice this will likely seldom be a problem, but it is a fundamental limitation, especially if the Church-Turing thesis holds (as seems likely).
Also, it is possible (Chaitin has done so) to construct well-defined sequences of numbers that are random in the sense that even if you know all the previous numbers and the rule for generating new ones, there is no shortcut for predicting the next number: the simplest method is to run the rule itself - which can of course be of arbitrary complexity. In fact, algorithmic information theory shows that most numbers are uncomputable.
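The counting argument behind incompressibility is short enough to check directly: there are simply too few short programs to describe all strings. A quick sketch in Python (the lengths n and c below are arbitrary illustrative choices, not anything from Chaitin's construction):

```python
# Counting argument: there are at most 2^k - 1 binary programs shorter
# than k bits, so fewer than a 2^-c fraction of all n-bit strings can be
# compressed by c or more bits. Most strings are incompressible.
n = 20          # string length in bits (illustrative)
c = 10          # bits of compression demanded (illustrative)

total_strings = 2 ** n
# programs of length 0, 1, ..., n-c-1, i.e. shorter by at least c bits
short_programs = sum(2 ** i for i in range(n - c))   # = 2^(n-c) - 1

fraction = short_programs / total_strings
print(f"at most {fraction:.4%} of {n}-bit strings compress by {c} bits")
```

The fraction is below 2^-c no matter how clever the compressor, since each short program can describe at most one string.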
Then we have chaos. Chaotic dynamical systems exhibit sensitive dependence on initial conditions, which means that any disturbance will grow exponentially, making prediction of the future state impossible. The best we can do is give a (statistical) description of the attractor. And chaos is everywhere, from the weather and climate to neural networks. Unless we have a full understanding of the equations describing the system and total knowledge of the initial state plus all disturbances, we cannot predict its behavior to a given precision beyond a certain time horizon - roughly the logarithm of the required precision divided by the largest Lyapunov exponent. Chaos is especially prone to occur in systems of interacting agents, such as evolution or social interactions (e.g. economics).
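To make the exponential blowup concrete, here is a minimal sketch using the logistic map at r=4, a standard chaotic system whose largest Lyapunov exponent is ln 2 per step (the initial condition and error size are arbitrary illustrations):

```python
# Sensitive dependence in the logistic map x -> 4x(1-x), a standard
# chaotic map whose largest Lyapunov exponent is ln 2 per step (r = 4).
# The initial condition and measurement error are arbitrary examples.
import math

def logistic(x, steps):
    for _ in range(steps):
        x = 4.0 * x * (1.0 - x)
    return x

x0 = 0.1
eps = 1e-10                      # initial measurement error
lam = math.log(2.0)              # largest Lyapunov exponent

# the prediction horizon from the text: ln(1/precision) / lambda steps
horizon = math.log(1.0 / eps) / lam
print(f"prediction horizon ~ {horizon:.0f} steps")   # ~33 steps

for steps in (10, 30, 50):
    gap = abs(logistic(x0, steps) - logistic(x0 + eps, steps))
    print(f"after {steps:2d} steps the trajectories differ by {gap:.2e}")
```

A ten-digit-accurate measurement buys only a few dozen steps of prediction; past the horizon the two trajectories are uncorrelated.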
We also have the limitations on information processing imposed by physics itself: arbitrarily powerful computers are ruled out by the Bekenstein bound on information density, the finite speed of light, the limited amount of available matter, and energy dissipation from irreversible computation. Sensing is limited by the Heisenberg relations and a host of other physical principles.
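For a feel of the Bekenstein bound - I <= 2*pi*R*E/(hbar*c*ln 2) bits for energy E confined to a sphere of radius R - here is a quick sketch (the 1 kg / 1 m figures are purely illustrative):

```python
# Bekenstein bound: I <= 2*pi*R*E / (hbar * c * ln 2) bits of information
# in a region of radius R containing energy E. Illustrative numbers only:
# a 1 kg mass (E = m c^2) confined to a 1 m radius.
import math

hbar = 1.054571817e-34   # J*s
c    = 2.99792458e8      # m/s

def bekenstein_bits(mass_kg, radius_m):
    energy = mass_kg * c ** 2
    return 2.0 * math.pi * radius_m * energy / (hbar * c * math.log(2.0))

print(f"{bekenstein_bits(1.0, 1.0):.3e} bits")   # ~2.6e43 bits
```

A huge number - which is the point: the limits are real but so wide that they rarely bite in practice.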
(book suggestion: John D. Barrow's _Impossibility: The Limits of Science and the Science of Limits_, which deals with the nature of the impossible)
> I say this as I notice that historically, scientists and the like are most
> eager to use the "impossible" stamp, only to take it back in the future due
> to some unprecedented workaround or something - such as parallel computing
> moving the upper limit of computation speed in computers, or the
> "impossible" sound barrier simply being broken.
Sure. I would be *happy* if some of the above stuff turned out to be wrong. But the first three limitations are inherent in mathematics itself, and not something you can just find a workaround for. The fourth group is looser; I have myself worked on a paper finding ways of approaching the limits or circumventing them (soon to be published in JTH), but it seems clear that there are physical limits (they might just be so wide that they appear irrelevant to most practical problems).
Saying the sound barrier cannot be broken is based on a lot of assumptions about flight. Pointing out a provably impossible task is something different.
> What I know for damn sure is
> that there's no way scientists know everything about information processing,
> sensing abilities and information storage at this time, much as they might
> like to think they do.
> What I'm saying is that the way in which we approach
> these problems, and the technologies implemented, will almost certainly end
> up changing in time as old methods are superceded - and who can say what
> this will mean for "absolute limits" calculated based upon the current
> For instance, (I could have the wrong end of the stick) I notice that in the
> commonly available explanation of quantum uncertainty, the provided analogy
> goes something like this:....Supposing you want to measure 2 different
> aspects of a system for a calculation requiring 2 measurements...blah
> blah.....you'd have to shoot a proton at.....blah blah.....then by the time
> the second measurement is taken......blah blah....this and that changed,
> measurements out of step, incalculable.
> So, because we can't think of a way of doing it right this second, it can't
> be done, therefore absolute uncertainty exists. It seems to me that this
> assumes an invasive method of measurement, linear and constantly progressing
> time, etc....etc...
The explanation you refer to is indeed not correct: it does not prove the Heisenberg relations (it is just a good way of showing why they are plausible). The real derivation is purely mathematical and based on the nature of the fields; it is independent of our technology. If quantum mechanics is very wrong it will not hold, but otherwise it is an inherent aspect of the kind of physics we inhabit.
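You can see that the relation falls out of the mathematics alone, with no measurement apparatus anywhere in sight: put a wavefunction on a grid and compute the spreads directly. A sketch with hbar set to 1 (a Gaussian is used because it saturates the lower bound hbar/2):

```python
# Heisenberg as pure math: compute the position and momentum spreads of
# a Gaussian wavefunction numerically. No measurement process involved -
# the bound is a property of the wavefunction itself. (hbar set to 1.)
import numpy as np

hbar = 1.0
sigma = 1.0
x = np.linspace(-10.0, 10.0, 4001)
dx = x[1] - x[0]

psi = np.exp(-x**2 / (4.0 * sigma**2))        # real Gaussian wavepacket
psi /= np.sqrt(np.sum(psi**2) * dx)           # normalize to unit norm

dpsi = np.gradient(psi, dx)                   # d(psi)/dx on the grid
delta_x = np.sqrt(np.sum(x**2 * psi**2) * dx)
delta_p = hbar * np.sqrt(np.sum(dpsi**2) * dx)

print(delta_x * delta_p, ">=", hbar / 2.0)    # product comes out ~0.5
```

Squeeze the Gaussian in x and the derivative term (the momentum spread) grows to compensate; the product never drops below hbar/2.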
If you have a physics that is invariant under the Poincaré group you get relativity; that is a logical consequence you cannot avoid. Noether's theorem shows that time translation symmetry implies energy conservation, and so on. This is not something you can change with refined technology, unless you believe it is possible to change the properties of physical law.
> In fact the whole idea of probability becomes null if time manipulation
> becomes possible!
Not necessarily. Just look at Novikov's work on a quantum mechanical treatment of closed timelike curves - he showed that in a simplified setting quantum mechanics actually enforces consistency through probability - inconsistent events get probability zero, while all others retain roughly their classical probabilities. Very interesting work.
-- 
-----------------------------------------------------------------------
Anders Sandberg                                      Towards Ascension!
email@example.com                  http://www.nada.kth.se/~asa/
GCS/M/S/O d++ -p+ c++++ !l u+ e++ m++ s+/+ n--- h+/* f+ g+ w++ t+ r+ !y