The most interesting thing about Dyson's scenario is that it imposes a number of limits. These limits make it easier to predict what life could `evolve' into. (Tipler's Omega Point in a closed universe relies on omniscience and is therefore an omniboring area of conversation.) For instance, the need for an infinite amount of memory storage (Dyson suggests analog memory) "will put severe constraints on the rate of acquisition of permanent new knowledge". He also proposes scaling laws, the need for hibernation, and limits to communication (a rough sketch of the scaling argument is below).

I'm sure an even better model of future life could be built on these constraints. For instance, are there any fundamental limits to complexity that would stop *really big* Jupiter Brains from functioning (increased complexity generally means more errors and a greater need for redundancy)? Will limits to communication (speed of light, background radiation) and size (spread of matter in the universe, rate of decay) ensure individuality?
And so on...
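
For concreteness, here is a rough numerical sketch of the scaling/hibernation argument as I read Dyson (Python; the cooling exponent p, the duty-cycle rule, and all the units are my own illustrative assumptions, not Dyson's actual constants). The idea: a creature that cools as theta ~ t^-p thinks at a rate ~ theta while awake and dissipates ~ theta^2 while awake, but can only radiate waste heat at a rate ~ theta^3, so it must hibernate more and more. For 1/3 < p < 1/2 its subjective time grows without bound while its total energy bill stays finite.

# Illustrative sketch of Dyson-style scaling with hibernation.
# All exponents, units, and constants here are assumptions for illustration.
import numpy as np

p = 0.45                              # assumed cooling exponent, 1/3 < p < 1/2
t = np.logspace(0, 12, 200_000)       # physical time, arbitrary units

theta = t ** (-p)                     # operating temperature falls as a power law
g = np.clip(theta, 0.0, 1.0)          # hibernation duty cycle: active fraction ~ theta,
                                      # so average waste heat g*theta**2 ~ theta**3
                                      # stays under the assumed radiation bound

dU = g * theta                        # subjective-time rate (thought rate ~ theta while awake)
dE = g * theta ** 2                   # energy-dissipation rate (metabolism ~ theta**2 while awake)

# Trapezoid-rule integrals over physical time
dt = np.diff(t)
U = float(np.sum(0.5 * (dU[1:] + dU[:-1]) * dt))
E = float(np.sum(0.5 * (dE[1:] + dE[:-1]) * dt))

print(f"subjective time accumulated by t = {t[-1]:.1e}: U ~ {U:.3e} (keeps growing)")
print(f"energy dissipated by t = {t[-1]:.1e}:           E ~ {E:.3e} (approaches a finite limit)")

If you push the upper time limit out further, U keeps creeping upward (it goes as t^(1-2p)) while E settles toward a constant (its integrand falls off as t^(-3p), which converges) -- which is the whole trick: unbounded subjective life on a finite energy budget, at the price of ever-longer hibernation.
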
BM